CN116185948A - Migration method and device of code virtual environment, storage medium and electronic equipment - Google Patents

Migration method and device of code virtual environment, storage medium and electronic equipment

Info

Publication number
CN116185948A
Authority
CN
China
Prior art keywords
file
library
path
filling
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211688442.0A
Other languages
Chinese (zh)
Inventor
许先才
李伟
刘顺文
熊磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Yunintegral Technology Co ltd
Original Assignee
Shenzhen Yunintegral Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Yunintegral Technology Co ltd filed Critical Shenzhen Yunintegral Technology Co ltd
Priority application: CN202211688442.0A
Publication of CN116185948A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/10File systems; File servers
    • G06F16/11File system administration, e.g. details of archiving or snapshots
    • G06F16/119Details of migration of file systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/10File systems; File servers
    • G06F16/13File access structures, e.g. distributed indices
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/30Creation or generation of source code
    • G06F8/31Programming languages or programming paradigms
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention discloses a migration method and device for a code virtual environment, a storage medium and electronic equipment. The method comprises the following steps: obtaining virtual environment data of a target programming language created on a source host, wherein the virtual environment data comprises a plurality of machine learning libraries; reading a library list of the plurality of machine learning libraries, and reading the library file and library directory path of each machine learning library; creating a package file using the library file and the library directory path; and migrating the package file to a target host so as to run a code script of the target programming language on the target host using the machine learning library. The invention solves the technical problem of low efficiency in migrating virtual environment data in the related art, improves the migration efficiency of the virtual environment data of the target programming language, and reduces the error rate of the machine learning libraries on the target host.

Description

Migration method and device of code virtual environment, storage medium and electronic equipment
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a migration method and apparatus for a code virtual environment, a storage medium, and an electronic device.
Background
In the related art, in the migration scenario of the virtual environment (referred to simply as the environment) of a programming language (such as Python), an environment that has already been built usually cannot be reused when it needs to be migrated, so the same setup work has to be repeated, which is very inefficient. Even if the installed environment is archived, compressed and copied, it still has to be released and deployed, errors occur easily, and further steps such as teaching and verification are needed after deployment; the process is tedious. There are therefore problems of low efficiency, a high error rate, and the inability to completely restore the existing environment.
In view of the above problems in the related art, no effective solution has been found yet.
Disclosure of Invention
The embodiment of the invention provides a migration method and device of a code virtual environment, a storage medium and electronic equipment.
According to an aspect of an embodiment of the present application, there is provided a migration method of a code virtual environment, including: obtaining virtual environment data of a target programming language created on a source host, wherein the virtual environment data comprises a plurality of machine learning libraries; reading a library list of the plurality of machine learning libraries, and reading a library file and a library directory path of each machine learning library; creating a package file by adopting the library file and the library directory path; and migrating the package file to a target host so as to run a code script of the target programming language on the target host by adopting the machine learning library.
Further, creating a package file using the library file and the library directory path comprises: acquiring library table entry data of the library file, and acquiring path entry data of the library directory path and a file container; determining the library table item data and path item data as index data, and determining the file container as content data; and filling the index data in the head area, and filling the content data in the load area to obtain the package file.
Further, filling the index data in a header area and filling the content data in a payload area, and obtaining the package file includes: filling file type characters in the first field of the initial file to obtain a first file; filling the library table entry data in a first intermediate field of the first file to obtain a second file; filling the path item data in a second intermediate field of the second file to obtain a third file; filling the file container in the tail field of the third file to obtain the package file, wherein the package file comprises: the first intermediate field, the second intermediate field, the tail field.
Further, filling the library table entry data in the first intermediate field of the first file to obtain a second file includes: the following is performed for each machine learning library until the last library entry: calculating a library table item value of a current library table item, wherein the library table item value is used for representing a path item starting address of the current library table item; calculating the field occupation length of the library table entry value; converting the occupied length of the field into a length character; and filling the length characters in fixed fields of the first file, and filling the library table entry values after the fixed fields.
Further, filling path item data in a second intermediate field of the second file, and obtaining a third file includes: the following is performed for each path item of the library table item until the last path item: judging the path type of the current path item; if the current path item is of a file type, configuring a first identifier, acquiring a start address of a file container of the current path item, determining the start address as a first path item value, filling the first identifier in a fixed field of the second file, and filling the first path item value after the fixed field; if the current path item is of a directory type, configuring a second identifier, acquiring a directory path of the current path item, determining the directory path as a second path item value, filling the second identifier in a fixed field of the second file, and filling the second path item value after the fixed field.
Further, filling a file container in a tail field of the third file, and obtaining the package file includes: the following is performed for each file container of the file path item until the last file container: acquiring a file path and file content of a current file container; calculating the file length of the file content; and filling the file path in a first fixed field of the third file, filling the file length in a second fixed field of the third file, and filling the file content after the second fixed field.
Further, obtaining the library entry data of the library file, and obtaining the path entry data and file container of the library directory path comprises: sequentially reading path data of each library table item according to the sequence of the library list; inquiring file content of a corresponding path based on the path data, wherein the file content is stored by adopting a file container; and reading the binary stream of the file content.
According to another aspect of the embodiments of the present application, there is also provided a migration apparatus for a code virtual environment, including: an acquisition module, used for obtaining virtual environment data of a target programming language created on a source host, wherein the virtual environment data comprises a plurality of machine learning libraries; a reading module, used for reading a library list of the plurality of machine learning libraries and reading the library file and library directory path of each machine learning library; a creation module, used for creating a package file using the library file and the library directory path; and a migration module, used for migrating the package file to a target host so as to run a code script of the target programming language on the target host using the machine learning library.
Further, the creation module includes: the acquisition unit is used for acquiring the library table entry data of the library file, and acquiring the path entry data of the library directory path and the file container; a determining unit configured to determine the library entry data and the path entry data as index data and the file container as content data; and the filling unit is used for filling the index data in the head area and filling the content data in the load area to obtain the package file.
Further, the filling unit includes: a first filling subunit, configured to fill a file type character in the first field of an initial file to obtain a first file; a second filling subunit, configured to fill the library table entry data in a first intermediate field of the first file to obtain a second file; a third filling subunit, configured to fill the path item data in a second intermediate field of the second file to obtain a third file; and a fourth filling subunit, configured to fill the file container in a tail field of the third file to obtain the package file, where the package file includes: the first intermediate field, the second intermediate field, and the tail field.
Further, the second filling subunit includes: a filling subunit for performing, for each machine learning library, the following operations until a last library entry: calculating a library table item value of a current library table item, wherein the library table item value is used for representing a path item starting address of the current library table item; calculating the field occupation length of the library table entry value; converting the occupied length of the field into a length character; and filling the length characters in fixed fields of the first file, and filling the library table entry values after the fixed fields.
Further, the third filler subunit includes: a filling subunit for performing, for each path item of the library table item, the following operations until the last path item: judging the path type of the current path item; if the current path item is of a file type, configuring a first identifier, acquiring a start address of a file container of the current path item, determining the start address as a first path item value, filling the first identifier in a fixed field of the second file, and filling the first path item value after the fixed field; if the current path item is of a directory type, configuring a second identifier, acquiring a directory path of the current path item, determining the directory path as a second path item value, filling the second identifier in a fixed field of the second file, and filling the second path item value after the fixed field.
Further, the fourth filler subunit includes: a filling subunit for performing, for each file container of the file path item, the following operations until the last file container: acquiring a file path and file content of a current file container; calculating the file length of the file content; and filling the file path in a first fixed field of the third file, filling the file length in a second fixed field of the third file, and filling the file content after the second fixed field.
Further, the acquisition unit includes: a first reading subunit, configured to sequentially read path data of each library table entry according to the sequence of the library list; a query subunit, configured to query file content of a corresponding path based on the path data, where the file content is stored by using a file container; and the second reading subunit is used for reading the binary stream of the file content.
According to another aspect of the embodiments of the present application, there is also provided a storage medium including a stored program that performs the steps described above when running.
According to another aspect of the embodiments of the present application, there is also provided an electronic device, including a processor, a communication interface, a memory, and a communication bus, where the processor, the communication interface, and the memory complete communication with each other through the communication bus; wherein: a memory for storing a computer program; and a processor for executing the steps of the method by running a program stored on the memory.
Embodiments of the present application also provide a computer program product comprising instructions which, when run on a computer, cause the computer to perform the steps of the above method.
By obtaining the virtual environment data of the target programming language created on the source host (the virtual environment data comprising a plurality of machine learning libraries), reading the library list of the machine learning libraries together with the library file and library directory path of each machine learning library, creating a package file from the library files and library directory paths, and migrating the package file to the target host so that code scripts of the target programming language can be run on the target host using the machine learning libraries, the library files and library directory paths of the machine learning libraries are packaged into a format that carries its own file paths. This reduces the error rate when the package is released on the target host and allows the virtual environment data on the source host to be completely restored, which solves the technical problem of low migration efficiency of virtual environment data in the related art, improves the migration efficiency of the virtual environment data of the target programming language, and reduces the error rate of the machine learning libraries on the target host.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiments of the invention and together with the description serve to explain the invention and do not constitute a limitation on the invention. In the drawings:
FIG. 1 is a block diagram of the hardware architecture of a computer according to an embodiment of the present invention;
FIG. 2 is a flow chart of a method of migrating a code virtual environment according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating the format of a package file according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a library table entry according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a path item in an embodiment of the present invention;
FIG. 6 is a schematic view of a file container according to an embodiment of the present invention;
FIG. 7 is a block diagram of a migration apparatus of a code virtual environment according to an embodiment of the present invention;
FIG. 8 is a block diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the solution of the present application better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without inventive effort shall fall within the scope of protection of the present application. It should be noted that, where no conflict arises, the embodiments and the features in the embodiments may be combined with each other.
It should be noted that the terms "first," "second," and the like in the description and claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that embodiments of the present application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
The method embodiment provided in the first embodiment of the present application may be executed on a server, a computer, a mobile phone, or a similar computing device. Taking a computer as an example, FIG. 1 is a block diagram of the hardware structure of a computer according to an embodiment of the present invention. As shown in FIG. 1, the computer may include one or more processors 102 (only one is shown in FIG. 1; the processor 102 may include, but is not limited to, a microprocessor (MCU), a programmable logic device (FPGA) or another processing device) and a memory 104 for storing data, and optionally a transmission device 106 for communication functions and an input/output device 108. It will be appreciated by those of ordinary skill in the art that the configuration shown in FIG. 1 is merely illustrative and is not intended to limit the architecture of the computer described above. For example, the computer may also include more or fewer components than shown in FIG. 1, or have a different configuration than shown in FIG. 1.
The memory 104 may be used to store a computer program, for example, a software program of application software and a module, such as a computer program corresponding to a migration method of a code virtual environment in an embodiment of the present invention, and the processor 102 executes the computer program stored in the memory 104 to perform various functional applications and data processing, that is, implement the above-mentioned method. Memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, memory 104 may further include memory located remotely from processor 102, which may be connected to the computer via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used to receive or transmit data via a network. Specific examples of the network described above may include a wireless network provided by a communications provider of a computer. In one example, the transmission device 106 includes a network adapter (Network Interface Controller, simply referred to as NIC) that can connect to other network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is configured to communicate with the internet wirelessly.
In this embodiment, a migration method of a code virtual environment is provided, and fig. 2 is a flowchart of a migration method of a code virtual environment according to an embodiment of the present invention, as shown in fig. 2, where the flowchart includes the following steps:
step S202, virtual environment data of a target programming language created on a source host is obtained, wherein the virtual environment data comprises a plurality of machine learning libraries;
the target programming language in this embodiment may be any programming language that requires creation of a virtual environment, such as the Python language.
Taking Python as an example, a virtual environment is a Python tool for dependency management and project isolation: it allows Python site packages (third-party libraries) to be installed in an isolated directory belonging to a specific local project, rather than globally (i.e., as part of the system-wide Python).
Step S204, reading a library list of a plurality of machine learning libraries, and reading a library file and a library directory path of each machine learning library;
the library list of the present embodiment includes a plurality of machine learning libraries, each of the plurality of machine learning libraries including library files and directory paths for the files.
Step S206, creating a package file by adopting a library file and a library directory path;
Step S208, the package file is migrated to the target host, so that the code script of the target programming language is run on the target host by using the machine learning library.
The source host and the target host in this embodiment are host devices running the same or different operating systems; environment migration between different operating systems, such as Linux and Windows, is supported.
In this embodiment, Python is installed on the source host, which has the capability to create virtual environments; at least one virtual environment has been created on the source host and a plurality of machine learning libraries are installed in it. The complete list of machine learning libraries, together with the files and directory paths belonging to each library, is acquired; a package file is created from the file and directory-path data; the package file is migrated to the target host, where it is run (loaded) and verified, and the migration is thereby completed.
When the loader loads the package file, it reads the file content stream byte by byte according to the package file format; once the corresponding number of bytes has been read, the content is parsed according to its meaning (and held in memory variables, without considering restart). After all content has been read and parsed, it is written to the target path (on disk) according to the file semantics. When verifying the package file, a message digest algorithm (MD5) may be used: for the message digest of each library table entry (covering its several files and appended at the end of the library table entry), the digest of the loaded files (written to disk) is recomputed and compared, and if the two digests are identical the verification passes.
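A minimal sketch of this digest check is given below; it assumes, purely for illustration, that the MD5 stored with a library table entry covers the concatenated bytes of that library's files, and the helper names are hypothetical.

```python
# Sketch of the MD5 verification step, assuming the digest stored with a library
# table entry covers the concatenated bytes of that library's files.
import hashlib
from pathlib import Path


def md5_of_files(paths: list[Path]) -> str:
    digest = hashlib.md5()
    for path in sorted(paths):           # deterministic order before hashing
        digest.update(path.read_bytes())
    return digest.hexdigest()


def verify_library(written_files: list[Path], expected_digest: str) -> bool:
    """Recompute the digest of the files written to disk and compare it to the stored one."""
    return md5_of_files(written_files) == expected_digest
```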
Through the above steps, virtual environment data of the target programming language created on the source host is obtained, the virtual environment data comprising a plurality of machine learning libraries; the library list of the machine learning libraries is read, together with the library file and library directory path of each machine learning library; a package file is created from the library files and library directory paths; and the package file is migrated to the target host so that code scripts of the target programming language can be run on the target host using the machine learning libraries. Because the package format carries its own file paths, the error rate when the package file is released on the target host is reduced and the virtual environment data on the source host can be completely restored. This solves the technical problem of low migration efficiency of virtual environment data in the related art, improves the migration efficiency of the virtual environment data of the target programming language, and reduces the error rate of the machine learning libraries on the target host.
In one implementation of the present embodiment, creating the package file using the library file and the library directory path includes: acquiring library table entry data of the library file, and acquiring path entry data of the library directory path and a file container; determining the library table item data and path item data as index data, and determining the file container as content data; and filling the index data in the head area, and filling the content data in the load area to obtain the package file.
An index (the library table entries and path items) is placed in the header of the package, and it records the file contents stored in the file blocks (file containers) of the package.
Optionally, obtaining the library table entry data of the library file, and obtaining the path entry data of the library directory path and the file container includes: sequentially reading path data of each library table item according to the sequence of the library list; inquiring file content of a corresponding path based on the path data, wherein the file content is stored by adopting a file container; and reading the binary stream of the file content.
The library list is acquired, the machine learning libraries are read one by one, and the corresponding files are located according to the paths in the list. The file contents are then read as binary streams, and the binary streams are written to disk one by one according to the address layout of the package file format. Both the read and write operations of this embodiment may be performed in serial mode or in parallel mode.
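The read step might look like the sketch below, which is an illustrative assumption rather than the patent's code: it walks the library list in order and reads each file as a binary stream (a thread pool could replace the loop for the parallel mode mentioned above).

```python
# Sketch: read each library's files as binary streams, in library-list order.
# The data structure and the serial loop are illustrative assumptions.
from pathlib import Path


def read_library_contents(library_list: dict[str, dict[str, list[Path]]]) -> dict[Path, bytes]:
    contents: dict[Path, bytes] = {}
    for entry in library_list.values():                   # one machine learning library at a time
        for file_path in entry["files"]:                  # follow each file path item
            contents[file_path] = file_path.read_bytes()  # binary stream of the file content
    return contents
```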
In one implementation of this embodiment, filling the index data in the header area and filling the content data in the payload area to obtain the package file includes the following steps:
S11, filling file type characters in a first field of an initial file to obtain a first file;
In one example, the package file consists of a sequence of bytes, and the header field occupies bytes 1 to 2; it holds a fixed file-type value indicating that the current file is a package of virtual environment data of the target programming language. After acquiring any file, the target host identifies the file type by reading the header field; if the file is not a package file, it cannot be loaded. This prevents files of other types (such as mp4 or docx) from being transmitted and loaded, and ensures the integrity of the package.
FIG. 3 is a schematic diagram of the format of a package file according to an embodiment of the present invention; the file type, library table entries, path items and file containers are filled in sequentially from the beginning to the end of the file.
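The file-type field described above could be written and checked roughly as follows; the two-byte magic value b"VE" is a made-up placeholder, since the patent does not disclose the actual fixed value.

```python
# Sketch: write and check the fixed 2-byte file-type field at the head of the
# package file. The magic value b"VE" is a hypothetical placeholder.
MAGIC = b"VE"


def write_file_type(out) -> None:
    out.write(MAGIC)                       # bytes 1-2 of the package file


def is_package_file(path: str) -> bool:
    with open(path, "rb") as f:
        return f.read(2) == MAGIC          # reject mp4, docx and other file types
```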
S12, filling library table entry data in a first intermediate field of a first file to obtain a second file;
In one example, the package file further includes a first intermediate field, occupying bytes 3 to n and representing the specific contents of the virtual environment software library table (referred to simply as the library table). Each library table entry corresponds to one machine learning library in the virtual environment.
In one embodiment, populating the library entry data in the first intermediate field of the first file to obtain the second file includes: the following is performed for each machine learning library until the last library entry: calculating a library table item value of a current library table item, wherein the library table item value is used for representing a path item starting address of the current library table item; calculating the field occupation length of the table entry value of the library; converting the occupied length of the field into length characters; the fixed field of the first file is filled with length characters and the library entry value is filled after the fixed field.
The library table entry data of this embodiment records the start addresses of all path items of one machine learning library; one library table entry corresponds to one machine learning library. Its structure comprises: a library table entry length field, fixed at 8 bits, recording the length occupied by the library table entry value; and the library table entry value, defined as the start address of the library's path items, whose length is not fixed but is given by the length field. When the length field takes its maximum value of 0xFF, the entry value occupies at most 255 bits, so a library table entry occupies at most 8 bits + 255 bits = 263 bits, and the farthest path-item address it can point to is the maximum entry value, namely 2^255 ≈ 5.7896×10^76, which is the theoretical upper limit.
FIG. 4 is a schematic diagram of a library table entry in an embodiment of the present invention, comprising the library table entry length and the library table entry value.
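A byte-aligned sketch of this layout is shown below; it is an illustration only, with the 8-bit length field stored as one byte and the entry value taken to be the big-endian start address of the library's path items.

```python
# Sketch: encode one library table entry as <1-byte length field><variable-length value>,
# where the value is the start address of the library's first path item.
def encode_library_entry(path_item_start_address: int) -> bytes:
    value = path_item_start_address.to_bytes(
        max(1, (path_item_start_address.bit_length() + 7) // 8), "big"
    )
    if len(value) > 0xFF:
        raise ValueError("entry value exceeds what the fixed length field can describe")
    return bytes([len(value)]) + value     # length field first, then the entry value
```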
S13, filling path item data in a second intermediate field of the second file to obtain a third file;
In one example, the package file further comprises a second intermediate field, occupying bytes n+1 to m and representing the path item table, which consists of several path items; each path item represents a file or directory under a particular library. The length of each path item is variable.
In one embodiment, filling the path item data in the second intermediate field of the second file to obtain the third file includes: the following is performed for each path item of the library table item until the last path item: judging the path type of the current path item; if the current path item is of a file type, configuring a first identifier, acquiring the start address of the file container of the current path item, determining the start address as a first path item value, filling the first identifier in a fixed field of the second file, and filling the first path item value after the fixed field; if the current path item is of a directory type, configuring a second identifier, acquiring the directory path of the current path item, determining the directory path as a second path item value, filling the second identifier in a fixed field of the second file, and filling the second path item value after the fixed field.
A path item in the path item data represents a specific file or directory under a library. Its structure comprises: a path item type, fixed at 1 bit, where 0 denotes a file and 1 denotes a directory; and a path item value, whose length is variable. When the type bit is 0, the value is 256 bits long and holds the start address of the file container; when the type bit is 1, the value is 2048 bits long and holds the directory path (256 characters), and the directory has no file container (file block). The path item value records the storage address of the actual content of the file/directory in the file container.
FIG. 5 is a schematic diagram of a path item structure in an embodiment of the present invention, including a path item type and a path item value.
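The two path-item variants could be encoded as in the byte-aligned sketch below; widening the 1-bit type flag to one byte and using 32-byte and 256-byte values (for the 256-bit address and 2048-bit path) is an assumption made purely for readability.

```python
# Sketch: encode a path item. Type 0 = file (value = 32-byte container address),
# type 1 = directory (value = 256-byte, i.e. 256-character, directory path).
FILE_ITEM, DIR_ITEM = 0, 1


def encode_path_item(is_directory: bool, container_address: int = 0, directory_path: str = "") -> bytes:
    if is_directory:
        value = directory_path.encode("utf-8")[:256].ljust(256, b"\x00")
        return bytes([DIR_ITEM]) + value
    value = container_address.to_bytes(32, "big")   # start address of the file container
    return bytes([FILE_ITEM]) + value
```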
S14, filling a file container in the tail field of the third file to obtain a package file, wherein the package file comprises: a first field, a first intermediate field, a second intermediate field, and a tail field.
In one example, the package file also includes a tail field, occupying byte m+1 to the last byte (the position of the last byte depends on the actual size of the virtual environment data) and representing the file containers.
In one embodiment, filling the file container in the tail field of the third file to obtain the package file includes: the following is performed for each file container of the file path item until the last file container: acquiring a file path and file content of a current file container; calculating the file length of the file content; the file path is filled in a first fixed field of the third file, the file length is filled in a second fixed field of the third file, and the file content is filled after the second fixed field.
A file container, also called a file block, stores the actual content of a file as a binary byte stream. Its structure comprises: a file path, fixed at 2048 bits (256 characters); a file length, fixed at 32 bits, representing the file size (e.g. 512 MB), which is sufficient for the sub-files of Python third-party libraries; and the file content itself.
FIG. 6 is a schematic diagram of a file container in accordance with an embodiment of the present invention, including a file path and a file length.
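A byte-aligned sketch of the file container follows, again as an illustration: a 256-byte path field, a 4-byte length field and then the raw content.

```python
# Sketch: encode a file container as <256-byte path><4-byte length><content>.
def encode_file_container(file_path: str, content: bytes) -> bytes:
    path_field = file_path.encode("utf-8")[:256].ljust(256, b"\x00")
    length_field = len(content).to_bytes(4, "big")   # 32-bit file length
    return path_field + length_field + content
```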
The scheme of this embodiment provides a package format that completely packages the virtual environment of a programming language: all software libraries of the virtual environment can be packaged in full, which is of great significance for improving the efficiency and quality of project deployment. It solves the problems of low efficiency, high error rate and the inability to completely restore an existing environment.
This embodiment adopts a self-developed binary package format that differs from traditional archive-and-compress techniques: the package format carries its own file paths, which reduces the error rate when the package is released; and it uses segmented address management, which supports random-access extraction and direct loading across the network, improving distribution efficiency.
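Because each file-type path item stores the start address of its file container, a single file can be pulled out of the package without unpacking the rest, roughly as sketched below (field widths follow the byte-aligned sketches above; all names are illustrative).

```python
# Sketch: random-access extraction of one file from the package, given the
# container start address recorded in its path item.
def extract_file(package_path: str, container_address: int) -> tuple[str, bytes]:
    """Read a single file container directly, without unpacking the whole package."""
    with open(package_path, "rb") as pkg:
        pkg.seek(container_address)                           # jump straight to the container
        file_path = pkg.read(256).rstrip(b"\x00").decode("utf-8")
        length = int.from_bytes(pkg.read(4), "big")
        content = pkg.read(length)
    return file_path, content
```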
From the description of the above embodiments, it will be clear to a person skilled in the art that the method according to the above embodiments may be implemented by means of software plus the necessary general-purpose hardware platform, or of course by means of hardware, although in many cases the former is preferred. Based on such an understanding, the technical solution of the present invention, or the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the method according to the embodiments of the present invention.
Example 2
This embodiment also provides a migration apparatus for a code virtual environment, which is used to implement the foregoing embodiments and preferred implementations; what has already been described is not repeated. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the apparatus described in the following embodiments is preferably implemented in software, implementation in hardware, or in a combination of software and hardware, is also possible and contemplated.
FIG. 7 is a block diagram of a migration apparatus for a code virtual environment according to an embodiment of the present invention. As shown in FIG. 7, the apparatus comprises an acquisition module 70, a reading module 72, a creation module 74 and a migration module 76, wherein:
an obtaining module 70, configured to obtain virtual environment data of a target programming language created on a source host, where the virtual environment data includes a plurality of machine learning libraries;
a reading module 72 for reading a library list of the plurality of machine learning libraries, and reading a library file and a library directory path of each machine learning library;
a creation module 74 for creating a package file using the library file and the library directory path;
a migration module 76 for migrating the package file to a target host to run code scripts of the target programming language on the target host using the machine learning library.
Optionally, the creating module includes: the acquisition unit is used for acquiring the library table entry data of the library file, and acquiring the path entry data of the library directory path and the file container; a determining unit configured to determine the library entry data and the path entry data as index data and the file container as content data; and the filling unit is used for filling the index data in the head area and filling the content data in the load area to obtain the package file.
Optionally, the filling unit includes: a first filling subunit, configured to fill a file type character in the first field of an initial file to obtain a first file; a second filling subunit, configured to fill the library table entry data in a first intermediate field of the first file to obtain a second file; a third filling subunit, configured to fill the path item data in a second intermediate field of the second file to obtain a third file; and a fourth filling subunit, configured to fill the file container in a tail field of the third file to obtain the package file, where the package file includes: the first intermediate field, the second intermediate field, and the tail field.
Optionally, the second filling subunit includes: a filling subunit for performing, for each machine learning library, the following operations until a last library entry: calculating a library table item value of a current library table item, wherein the library table item value is used for representing a path item starting address of the current library table item; calculating the field occupation length of the library table entry value; converting the occupied length of the field into a length character; and filling the length characters in fixed fields of the first file, and filling the library table entry values after the fixed fields.
Optionally, the third filling subunit includes: a filling subunit for performing, for each path item of the library table item, the following operations until the last path item: judging the path type of the current path item; if the current path item is of a file type, configuring a first identifier, acquiring a start address of a file container of the current path item, determining the start address as a first path item value, filling the first identifier in a fixed field of the second file, and filling the first path item value after the fixed field; if the current path item is of a directory type, configuring a second identifier, acquiring a directory path of the current path item, determining the directory path as a second path item value, filling the second identifier in a fixed field of the second file, and filling the second path item value after the fixed field.
Optionally, the fourth filling subunit includes: a filling subunit for performing, for each file container of the file path item, the following operations until the last file container: acquiring a file path and file content of a current file container; calculating the file length of the file content; and filling the file path in a first fixed field of the third file, filling the file length in a second fixed field of the third file, and filling the file content after the second fixed field.
Optionally, the acquiring unit includes: a first reading subunit, configured to sequentially read path data of each library table entry according to the sequence of the library list; a query subunit, configured to query file content of a corresponding path based on the path data, where the file content is stored by using a file container; and the second reading subunit is used for reading the binary stream of the file content.
It should be noted that each of the above modules may be implemented by software or hardware, and for the latter, it may be implemented by, but not limited to: the modules are all located in the same processor; alternatively, the above modules may be located in different processors in any combination.
Example 3
An embodiment of the invention also provides a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the method embodiments described above when run.
Alternatively, in the present embodiment, the above-described storage medium may be configured to store a computer program for performing the steps of:
s1, virtual environment data of a target programming language created on a source host is obtained, wherein the virtual environment data comprises a plurality of machine learning libraries;
s2, reading library lists of the plurality of machine learning libraries, and reading library files and library directory paths of each machine learning library;
s3, creating a package file by adopting the library file and the library directory path;
and S4, migrating the package file to a target host computer so as to run the code script of the target programming language on the target host computer by adopting the machine learning library.
Alternatively, in the present embodiment, the storage medium may include, but is not limited to: a usb disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a removable hard disk, a magnetic disk, or an optical disk, or other various media capable of storing a computer program.
An embodiment of the invention also provides an electronic device comprising a memory having stored therein a computer program and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
Optionally, the electronic device may further include a transmission device and an input/output device, where the transmission device is connected to the processor, and the input/output device is connected to the processor.
Alternatively, in the present embodiment, the above-described processor may be configured to execute the following steps by a computer program:
s1, virtual environment data of a target programming language created on a source host is obtained, wherein the virtual environment data comprises a plurality of machine learning libraries;
s2, reading library lists of the plurality of machine learning libraries, and reading library files and library directory paths of each machine learning library;
s3, creating a package file by adopting the library file and the library directory path;
and S4, migrating the package file to a target host computer so as to run the code script of the target programming language on the target host computer by adopting the machine learning library.
Alternatively, specific examples in this embodiment may refer to examples described in the foregoing embodiments and optional implementations, and this embodiment is not described herein.
FIG. 8 is a block diagram of an electronic device according to an embodiment of the present invention. As shown in FIG. 8, the electronic device includes a processor 81, a communication interface 82, a memory 83 and a communication bus 84, where the processor 81, the communication interface 82 and the memory 83 communicate with each other through the communication bus 84; the memory 83 is used to store a computer program, and the processor 81 is used to execute the program stored in the memory 83.
The above embodiment numbers of the present application are for description only and do not represent the relative merits of the embodiments.
In the foregoing embodiments of the present application, each embodiment has its own emphasis; for parts not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technical content may be implemented in other manners. The apparatus embodiments described above are merely exemplary; for example, the division into units is merely a logical functional division, and other divisions are possible in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling, direct coupling or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, units or modules, and may be in electrical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be embodied in essence or a part contributing to the prior art or all or part of the technical solution in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a removable hard disk, a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely a preferred embodiment of the present application and it should be noted that modifications and adaptations to those skilled in the art may be made without departing from the principles of the present application and are intended to be comprehended within the scope of the present application.

Claims (10)

1. A method for migrating a code virtual environment, comprising:
obtaining virtual environment data of a target programming language created on a source host, wherein the virtual environment data comprises a plurality of machine learning libraries;
reading a library list of the plurality of machine learning libraries, and reading a library file and a library directory path of each machine learning library;
creating a package file by adopting the library file and the library directory path;
and migrating the package file to a target host so as to run a code script of the target programming language on the target host by adopting the machine learning library.
2. The method of claim 1, wherein creating a package file using the library file and the library directory path comprises:
acquiring library table entry data of the library file, and acquiring path entry data of the library directory path and a file container;
Determining the library table item data and path item data as index data, and determining the file container as content data;
and filling the index data in the head area, and filling the content data in the load area to obtain the package file.
3. The method of claim 2, wherein populating the index data in a header area and populating the content data in a payload area to obtain the package file comprises:
filling file type characters in the first field of the initial file to obtain a first file;
filling the library table entry data in a first intermediate field of the first file to obtain a second file;
filling the path item data in a second intermediate field of the second file to obtain a third file;
filling the file container in the tail field of the third file to obtain the package file, wherein the package file comprises: the first intermediate field, the second intermediate field, the tail field.
4. The method of claim 3, wherein populating library entry data in a first intermediate field of the first file to obtain a second file comprises:
The following is performed for each machine learning library until the last library entry: calculating a library table item value of a current library table item, wherein the library table item value is used for representing a path item starting address of the current library table item; calculating the field occupation length of the library table entry value; converting the occupied length of the field into a length character; and filling the length characters in fixed fields of the first file, and filling the library table entry values after the fixed fields.
5. The method of claim 3, wherein populating path item data in a second intermediate field of the second file to obtain a third file comprises:
the following is performed for each path item of the library table item until the last path item: judging the path type of the current path item; if the current path item is of a file type, configuring a first identifier, acquiring a start address of a file container of the current path item, determining the start address as a first path item value, filling the first identifier in a fixed field of the second file, and filling the first path item value after the fixed field; if the current path item is of a directory type, configuring a second identifier, acquiring a directory path of the current path item, determining the directory path as a second path item value, filling the second identifier in a fixed field of the second file, and filling the second path item value after the fixed field.
6. The method of claim 3, wherein populating a file container in a tail field of the third file to obtain the package file comprises:
the following is performed for each file container of the file path item until the last file container:
acquiring a file path and file content of a current file container; calculating the file length of the file content; filling the file path in a first fixed field of the third file, filling the file length in a second fixed field of the third file, and filling the file content after the second fixed field.
7. The method of claim 2, wherein obtaining library entry data for the library file and obtaining path entry data and file containers for the library directory path comprises:
sequentially reading path data of each library table item according to the sequence of the library list;
inquiring file content of a corresponding path based on the path data, wherein the file content is stored by adopting a file container;
and reading the binary stream of the file content.
8. A migration apparatus for a code virtual environment, comprising:
an acquisition module, which is used for acquiring virtual environment data of a target programming language created on a source host, wherein the virtual environment data comprises a plurality of machine learning libraries;
the reading module is used for reading library lists of the plurality of machine learning libraries and reading library files and library directory paths of each machine learning library;
the creation module is used for creating a package file by adopting the library file and the library directory path;
and the migration module is used for migrating the package file to a target host so as to run the code script of the target programming language on the target host by adopting the machine learning library.
9. A storage medium comprising a stored program, wherein the program when run performs the steps of the method of any of the preceding claims 1 to 7.
10. An electronic device comprises a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory are communicated with each other through the communication bus; wherein:
a memory for storing a computer program;
a processor for performing the steps of the method of any one of claims 1 to 7 by running a program stored on a memory.
CN202211688442.0A 2022-12-27 2022-12-27 Migration method and device of code virtual environment, storage medium and electronic equipment Pending CN116185948A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211688442.0A CN116185948A (en) 2022-12-27 2022-12-27 Migration method and device of code virtual environment, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN116185948A 2023-05-30

Family

ID=86435567



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination