CN113391909A - Process creation method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN113391909A
Authority
CN
China
Prior art keywords
sub
data
creation method
instruction
created
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110720406.7A
Other languages
Chinese (zh)
Inventor
肖波 (Xiao Bo)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Sensetime Technology Development Co Ltd
Original Assignee
Shanghai Sensetime Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Sensetime Technology Development Co Ltd filed Critical Shanghai Sensetime Technology Development Co Ltd
Priority to CN202110720406.7A priority Critical patent/CN113391909A/en
Publication of CN113391909A publication Critical patent/CN113391909A/en
Pending legal-status Critical Current


Classifications

    • G06F 9/485: Task life-cycle, e.g. stopping, restarting, resuming execution (under G06F 9/48 Program initiating; program switching, G06F 9/46 Multiprogramming arrangements)
    • G06F 9/524: Deadlock detection or avoidance (under G06F 9/52 Program synchronisation; mutual exclusion, e.g. by means of semaphores)
    • G06F 9/544: Buffers; Shared memory; Pipes (under G06F 9/54 Interprogram communication)
    • G06N 20/00: Machine learning (Computing arrangements based on specific computational models)

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Machine Translation (AREA)

Abstract

The present disclosure provides a process creation method and apparatus, an electronic device, and a storage medium, wherein the method includes: in response to a sub-process creation event being triggered, a main process detects whether an intermediate process has currently been created, and, in response to the intermediate process having been created, sends a process creation instruction to the intermediate process; the intermediate process, in response to receiving the process creation instruction, creates the sub-process in a first manner. According to the embodiments of the present disclosure, an intermediate process is created and sub-processes are created through it; as long as the intermediate process is a single-threaded process, the problem of sub-process deadlock can be reduced while the creation speed is guaranteed.

Description

Process creation method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of computer application technologies, and in particular, to a process creation method and apparatus, an electronic device, and a storage medium.
Background
A deep learning framework is a framework or platform for training deep learning models. When a deep learning model is trained with such a framework, the framework's main process creates a sub-process for data processing in each training cycle; after the sub-process is created, it can call the data reading and preprocessing functions provided by the user to preprocess the sample data required for training and supply the input data needed by the model training process.
The main process can create a sub-process in two ways: the fork method and the spawn method. Creating a sub-process via spawn is slow; creating it via fork is fast, but may cause the sub-process to deadlock.
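As background context only: the patent does not name a programming language, but Python's standard multiprocessing module exposes both of these creation modes as start methods, so the trade-off can be sketched as follows (a minimal illustration, not part of the claimed method):

# Minimal illustration of the fork/spawn trade-off, assuming POSIX Python
# ("fork" is not available on Windows). spawn starts a fresh interpreter
# and re-imports the module (slow, but inherits no threads or locks);
# fork copies the parent via copy-on-write (fast, but inherits lock state).
import multiprocessing as mp
import time

def worker():
    pass  # stand-in for the data-processing sub-process

if __name__ == "__main__":
    for method in ("fork", "spawn"):
        ctx = mp.get_context(method)
        start = time.perf_counter()
        p = ctx.Process(target=worker)
        p.start()
        p.join()
        print(f"{method}: created and joined in {time.perf_counter() - start:.4f}s")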
Disclosure of Invention
Embodiments of the present disclosure provide at least a process creation method, a process creation apparatus, an electronic device, and a storage medium.
In a first aspect, an embodiment of the present disclosure provides a process creation method, including:
in response to a sub-process creation event being triggered, a main process detects whether an intermediate process has currently been created; in response to the intermediate process having been created, the main process sends a process creation instruction to the intermediate process;
and the intermediate process, in response to receiving the process creation instruction, creates a sub-process in a first manner.
In this way, the sub-process is created by the intermediate process; as long as the intermediate process is a single-threaded process, the problem of sub-process deadlock can be reduced while the creation speed is guaranteed.
In one possible embodiment, the intermediate process creating the sub-process in the first manner in response to receiving the process creation instruction includes: the intermediate process executes the fork function based on the process parameters carried in the process creation instruction to create the sub-process.
In a possible implementation manner, the process creation method further includes:
the main process, in response to the intermediate process not having been created, creates the intermediate process in a second manner, and sends the process creation instruction to the intermediate process after the intermediate process is created successfully.
In one possible embodiment, the main process creating the intermediate process in the second manner in response to the intermediate process not having been created includes: the main process executes the spawn function to create the intermediate process.
In this way, the intermediate process is created via spawn, so the intermediate process cannot suffer the process deadlock problem caused by the fork method; the sub-process is created via fork, and since the intermediate process is a single-threaded process, sub-process creation is fast and does not deadlock the sub-process.
In one possible embodiment, the sub-process creation event includes at least one of:
a new training cycle for training the deep learning model begins;
the number of batches of preprocessed sample data required for training the deep learning model falls below a preset threshold.
In this way, by setting different sub-process creation events, the creation timing of the sub-process can be flexibly controlled, which improves the flexibility of the deep learning framework.
In a possible implementation manner, the process creation method further includes:
the intermediate process creates a data communication pipeline between the main process and the sub-process.
In a possible implementation manner, the process creation method further includes:
in response to the data communication pipeline being successfully created, the main process sends a data processing instruction to the sub-process through the data communication pipeline; the data processing instruction carries a data identifier;
and in response to receiving a data processing instruction sent by the main process, the sub-process loads target data corresponding to the data identifier based on the data identifier carried in the data processing instruction, and performs preset processing on the target data.
In a possible implementation manner, the process creation method further includes:
the sub-process, after performing the preset processing on the target data to obtain result data, transmits the result data to the main process through the data communication pipeline.
In a possible implementation manner, the process creation method further includes:
in response to receiving the result data sent by the sub-process, the main process sends a process closing instruction to the intermediate process;
and the intermediate process responds to the received process closing instruction and closes the sub-process.
In a possible implementation manner, the process creation method further includes:
in response to receiving the result data sent by the sub-process, the main process uses the result data to train a deep learning model in the current training cycle;
and/or uses the result data to train the deep learning model in a future training cycle.
In this way, sample data can be preprocessed in advance, the samples required for training are available in time in each training cycle, and model training efficiency is improved.
In a second aspect, an embodiment of the present disclosure further provides a process creation apparatus, including: a main process module and an intermediate process module; wherein,
the main process module is configured to, in response to a sub-process creation event being triggered, detect whether an intermediate process module has currently been created, and, in response to the intermediate process module having been created, send a process creation instruction to the intermediate process module;
the intermediate process module is configured to create a sub-process module in a first manner in response to receiving the process creation instruction.
In an optional embodiment, the intermediate process module, when creating the sub-process module in the first manner in response to receiving the process creation instruction, is configured to: execute the fork function based on the process parameters carried in the process creation instruction to create the sub-process module.
In an optional implementation manner, the host process module is further configured to: and in response to the fact that the intermediate process module is not created, creating the intermediate process module in a second mode, and sending the process creation instruction to the intermediate process module after the intermediate process module is successfully created.
In an alternative embodiment, the intermediate process module, when creating the intermediate process module in the second manner in response to not creating the intermediate process module, is configured to: and executing the spawn function to create the intermediate process module.
In an alternative embodiment, the sub-process creation event includes at least one of:
a new training cycle for training the deep learning model begins;
the number of batches of preprocessed sample data required for training the deep learning model falls below a preset threshold.
In an optional implementation manner, the intermediate process module is further configured to create a data communication pipeline between the main process module and the sub-process module.
In an optional implementation manner, the main process module is further configured to send a data processing instruction to the sub-process module through the data communication pipeline in response to the data communication pipeline being successfully created; the data processing instruction carries a data identifier;
further including: a sub-process module, configured to, in response to receiving a data processing instruction sent by the main process module, load target data corresponding to the data identifier based on the data identifier carried in the data processing instruction, and perform preset processing on the target data.
In an optional implementation manner, the sub-process module is further configured to transmit the result data to the main process module through the data communication pipeline after performing the preset processing on the target data to obtain the result data.
In an optional implementation manner, the main process module is further configured to send a process closing instruction to the intermediate process module in response to receiving the result data sent by the sub process module;
the intermediate process module is further configured to close the sub-process module in response to receiving the process closing instruction.
In an optional implementation manner, the main process module is further configured to, in response to receiving the result data sent by the sub-process module, use the result data to train a deep learning model in the current training cycle;
and/or use the result data to train the deep learning model in a future training cycle.
In a third aspect, the present disclosure also provides an electronic device comprising a processor and a memory, the memory storing machine-readable instructions executable by the processor; the processor is configured to execute the machine-readable instructions stored in the memory, and when the machine-readable instructions are executed by the processor, the processor performs the steps in the first aspect or any one of the possible implementations of the first aspect.
In a fourth aspect, the present disclosure also provides a computer-readable storage medium having a computer program stored thereon, where the computer program, when executed, performs the steps in the first aspect or any one of the possible implementations of the first aspect.
For the description of the effects of the process creation apparatus, the electronic device, and the computer-readable storage medium, reference is made to the description of the process creation method, which is not repeated herein.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required in the embodiments are briefly described below; these drawings are incorporated in and form a part of the specification, illustrate embodiments consistent with the present disclosure, and together with the description serve to explain the technical solutions of the present disclosure. It should be appreciated that the following drawings depict only certain embodiments of the disclosure and are therefore not to be considered limiting of its scope; those skilled in the art can derive additional related drawings from them without inventive effort.
FIG. 1 is a flow chart illustrating a process creation method provided by an embodiment of the present disclosure;
FIG. 2 is a flowchart illustrating a specific example method for deep learning model training using a process creation method according to an embodiment of the present disclosure;
FIG. 3 is a diagram illustrating an apparatus for process creation provided by an embodiment of the present disclosure;
fig. 4 shows a schematic diagram of an electronic device provided by an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure clearer, the technical solutions of the embodiments are described below clearly and completely with reference to the drawings in the embodiments of the present disclosure; obviously, the described embodiments are only some, not all, of the embodiments of the present disclosure. The components of the embodiments of the present disclosure, as generally described and illustrated herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure is not intended to limit the scope of the disclosure as claimed, but merely represents selected embodiments of the disclosure. All other embodiments obtained by a person skilled in the art from the embodiments of the disclosure without creative effort shall fall within the protection scope of the disclosure.
Currently, on Linux systems, a main process generally creates sub-processes using either the fork method or the spawn method. In the spawn method, the main process runs a series of spawn functions to create the sub-process; because spawn builds the new process from scratch, creation is slow. In the fork method, the main process executes the fork function and creates the sub-process by copying the main process's current state and memory, which is fast; Copy-on-Write (COW) is used to copy the main process's memory image into the sub-process, and the sub-process inherits the state and memory data of the main process. In practice, however, if the main process contains multiple threads, a sub-process created when one of those threads executes the fork function inherits only the state of that thread, together with the memory data of the whole main process. If another thread of the main process holds a lock object at the moment the sub-process is created, the sub-process does not inherit that thread's state; since a lock object can only be released by its owner, the lock is never released in the sub-process, and the sub-process deadlocks.
For example, suppose the main process has three threads, numbered 0, 1, and 2. Thread 0 executes fork to create a sub-process; the sub-process inherits the state of thread 0 and the memory image corresponding to threads 0, 1, and 2. If thread 1 acquired some lock object before the sub-process was created, then, because the sub-process inherits only thread 0's state, thread 1 does not exist in the sub-process; the lock object can only be released by thread 1, so the sub-process can never release the lock acquired by thread 1, and the sub-process deadlocks.
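The failure mode can be made concrete with a minimal Python sketch (an illustration only, not the patent's code; it assumes a POSIX system, and newer CPython versions warn about calling fork() in a multi-threaded process for exactly this reason):

# Hedged illustration: thread 1 holds a lock at the moment of fork, so the
# sub-process inherits a locked lock whose owner does not exist in it.
import os
import threading
import time

lock = threading.Lock()

def thread_1():
    with lock:           # thread 1 acquires the lock...
        time.sleep(5)    # ...and holds it across the fork

threading.Thread(target=thread_1).start()
time.sleep(0.1)          # make sure thread 1 owns the lock first

pid = os.fork()          # "thread 0" forks; only it exists in the child
if pid == 0:
    # The lock's memory was copied in the "held" state, but thread 1 was
    # not copied, so nothing in this process can ever release it: deadlock.
    lock.acquire()
    os._exit(0)          # never reached
time.sleep(1)
print(f"sub-process {pid} is deadlocked on the inherited lock")
os.kill(pid, 9)          # clean up the stuck sub-process
os.waitpid(pid, 0)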
Based on this research, the present disclosure provides a process creation method in which an intermediate process is created and used to create sub-processes; as long as the intermediate process is a single-threaded process, the problem of sub-process deadlock can be reduced while the creation speed is guaranteed.
The drawbacks described above were identified by the inventor through practice and careful study; therefore, the discovery of these problems and the solutions the present disclosure proposes for them should both be regarded as the inventor's contribution to the present disclosure.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
To facilitate understanding of the present embodiment, the process creation method disclosed in the embodiments of the present disclosure is first described in detail. The execution subject of the process creation method provided in the embodiments of the present disclosure is generally an electronic device with certain computing capability, for example: a terminal device, which may be a User Equipment (UE), a mobile device, a user terminal, a cellular phone, a cordless phone, a Personal Digital Assistant (PDA), a handheld device, a computing device, a vehicle-mounted device, or a wearable device; or a server or other processing device. In some possible implementations, the process creation method may be implemented by a processor calling computer-readable instructions stored in a memory.
The process creation method provided by the embodiments of the present disclosure can be used in a deep learning framework, and can also be used in other application programs. The embodiments of the present disclosure take a deep learning framework as an example to explain the process creation method.
Referring to fig. 1, a flowchart of a process creation method provided in the embodiment of the present disclosure is shown, where the method includes steps S101 to S102, where:
s101: in response to a child process creation event being triggered, the host process detecting whether an intermediate process has been created currently; in response to the intermediate process having been created, sending a process creation instruction to the intermediate process;
s102: and the intermediate process responds to the received process creating instruction and creates the sub-process in a first mode.
In the embodiments of the present disclosure, the main process, in response to a sub-process creation event being triggered, detects whether an intermediate process has currently been created and, if the intermediate process has been created, sends a process creation instruction to it; the intermediate process, in response to receiving the instruction, creates the sub-process in a first manner. In this way, an intermediate process is created and sub-processes are created through it; as long as the intermediate process is a single-threaded process, sub-process deadlock can be reduced while the sub-process creation speed is guaranteed.
In the embodiments of the present disclosure, the main process is the main part of the deep learning framework; all modules and flows during training of the deep learning model are controlled and driven by the main process, including the establishment and termination of the training process, data reading and preprocessing, the execution of the operators in the deep learning model, the execution of those operators on different devices, synchronization among devices during distributed training, and so on. An Engine thread is a thread in the main process used to decompose and distribute operator computing tasks; it runs in the background, is invisible to the user, and exists throughout the training process. The main process also contains other threads that implement other functions of the model training process.
The intermediate process is designed specifically for creating sub-processes and is responsible for creating and destroying them. For example, when a new training cycle begins and the main process needs to create a sub-process, it sends a process creation instruction to the intermediate process; after receiving the instruction, the intermediate process creates the sub-process. When the sub-process needs to be closed, the main process can send a process closing instruction to the intermediate process; after receiving the process closing instruction, the intermediate process closes the sub-process.
The sub-processes include, for example, processes for data reading and preprocessing, or processes with other functions in the deep learning framework. Taking a data reading and preprocessing sub-process as an example: after the sub-process is created, the main process sends the identification of a batch of sample data to be processed to the sub-process through the data transmission pipeline between them; after receiving it, the sub-process reads the corresponding batch of sample data from external storage by calling the data reading and preprocessing functions provided by the user, preprocesses the read sample data, and then transmits the preprocessed sample data to the main process through the data transmission pipeline between them. After receiving the preprocessed sample data, the main process uses it to carry out the training of the current training cycle.
The sub-process creation event includes, for example: a new training cycle for training the deep learning model begins, or the number of batches of preprocessed sample data required for training the deep learning model falls below a preset threshold.
At the beginning of each training round (epoch), the main process needs to create a sub-process for data preprocessing; therefore, when a new training cycle for training the deep learning model begins, the main process detects whether an intermediate process has currently been created, and if it detects that one has, it sends a process creation instruction to the intermediate process.
After receiving the process creation instruction, the intermediate process creates the sub-process in the first manner according to the process parameters carried in the instruction.
In an embodiment of the present disclosure, the first manner includes executing the fork function based on the process parameters carried in the process creation instruction. Creating the sub-process in the first manner includes: the intermediate process executes the fork function based on the process parameters carried in the process creation instruction to create the sub-process.
When the intermediate process executes the fork function, the sub-process it creates inherits the state and data of the intermediate process. To ensure that creating a sub-process from the intermediate process via fork does not deadlock the sub-process the way forking from a process with multiple threads can, the intermediate process normally contains only one thread, whose functions include: creating the sub-process in response to a process creation instruction sent by the main process, and creating a data transmission pipeline between the main process and the sub-process once the sub-process is created successfully; and closing the sub-process in response to a process closing instruction sent by the main process. A hedged sketch of this single-threaded design follows.
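The following Python sketch is one purely illustrative rendering of such a single-threaded intermediate process; the command names and the protocol are assumptions for the illustration, not taken from the patent:

# Hedged sketch (POSIX Python; names and protocol are illustrative): the
# intermediate process runs a single thread that forks a sub-process on a
# "create" command and closes it on a "close" command.
import os
import signal
import multiprocessing as mp

def worker_main():
    pass  # stand-in for the data reading / preprocessing sub-process

def intermediate_main(cmd_conn):
    # Started via spawn, this process holds no inherited locks and runs
    # only this one thread, which is what makes os.fork() safe here.
    child_pid = None
    while True:
        cmd = cmd_conn.recv()
        if cmd == "create":
            child_pid = os.fork()
            if child_pid == 0:        # we are in the new sub-process
                worker_main()
                os._exit(0)
            cmd_conn.send(child_pid)  # report successful creation
        elif cmd == "close" and child_pid:
            os.kill(child_pid, signal.SIGTERM)
            os.waitpid(child_pid, 0)
            child_pid = None
        elif cmd == "quit":
            return

if __name__ == "__main__":
    ctx = mp.get_context("spawn")     # the intermediate process is spawned
    main_end, inter_end = ctx.Pipe()
    p = ctx.Process(target=intermediate_main, args=(inter_end,))
    p.start()
    main_end.send("create")
    print("sub-process pid:", main_end.recv())
    main_end.send("close")
    main_end.send("quit")
    p.join()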
Moreover, because the intermediate process exists throughout the training of the deep learning model, it can create the sub-process via fork in every training cycle; this improves sub-process creation efficiency while avoiding the sub-process deadlock that creating from a multi-threaded process can cause.
In another embodiment of the present disclosure, the method further includes: the main process, in response to the intermediate process not having been created, creates the intermediate process in a second manner, and sends the process creation instruction to the intermediate process after the intermediate process is created successfully.
In an embodiment of the present disclosure, the second manner includes executing the spawn function. The main process creating the intermediate process in the second manner in response to the intermediate process not having been created includes: the main process executes the spawn function to create the intermediate process. Because the intermediate process is created via spawn, it does not suffer the process deadlock problem that the fork method can cause.
In another embodiment of the present disclosure, the main process may instead create the intermediate process by executing the fork function. Before executing fork, the main process detects the number of threads it contains; when that number is greater than 1, it closes all threads other than the driving thread and then executes fork to create the intermediate process. After the intermediate process has been created, the closed threads are started again. In this way the main process contains only one thread at the moment fork executes, which avoids the deadlock of other processes that can occur when a main process containing multiple threads creates them via fork.
In another embodiment of the present disclosure, the process creation method further includes: after the intermediate process creates the sub-process, the intermediate process creates a data communication pipeline between the main process and the sub-process.
For example, after creating the sub-process, the intermediate process may feed the creation result, carrying the process identifier of the sub-process, back to the main process; having obtained this identifier, the main process can communicate with the sub-process based on it. In addition, after creating the sub-process, the intermediate process can pass the process identifier of the main process to the sub-process, so that the sub-process can communicate with the main process based on it; in this way the data communication pipeline between the main process and the sub-process is created. One possible wiring is sketched below.
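The patent leaves the exact wiring mechanism open; one plausible realization (an assumption for illustration) is to create the pipe ends before the intermediate process starts, so that the forked sub-process inherits its end and can talk to the main process directly:

# Hedged sketch (POSIX Python; names are illustrative): the pipe is created
# up front, its worker end is handed to the intermediate process, and the
# forked sub-process inherits that end, giving the main process a direct
# data communication pipeline to the sub-process.
import os
import multiprocessing as mp

def intermediate_main(worker_end):
    pid = os.fork()
    if pid == 0:                      # the sub-process inherits worker_end
        batch_idx = worker_end.recv()                 # data identifier
        worker_end.send(f"result data for batch {batch_idx}")
        os._exit(0)
    os.waitpid(pid, 0)

if __name__ == "__main__":
    ctx = mp.get_context("spawn")
    main_end, worker_end = ctx.Pipe()                 # the data pipeline
    p = ctx.Process(target=intermediate_main, args=(worker_end,))
    p.start()
    main_end.send(7)                                  # e.g. batch index 7
    print(main_end.recv())                            # preprocessed result
    p.join()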
In response to the data communication pipeline being created successfully, the main process sends a data processing instruction to the sub-process through the data communication pipeline; the data processing instruction carries a data identifier, for example the identifier of a batch of sample data to be preprocessed.
And in response to receiving a data processing instruction sent by the main process, the sub-process loads target data corresponding to the data identifier based on the data identifier carried in the data processing instruction, and performs preset processing on the target data.
In a specific implementation, when the sub-process loads the target data based on the data identifier carried in the data processing instruction, it may, for example, first load a data page table corresponding to the sample data; the data page table contains index information for the sample data. Based on the data identifier, the sub-process can query the page table for the physical address at which the target data is stored and for the length of the target data, and then read the target data from external storage based on that physical address and length.
After the target data has been read, a preprocessing function defined by the user is executed to preprocess the target data and obtain the result data; a minimal sketch of this lookup-and-preprocess step follows.
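In the sketch below, the file layout, names, and preprocessing function are all illustrative assumptions rather than details given in the patent:

# Hedged sketch: the data identifier indexes a page-table-like mapping of
# (offset, length) entries that locate the raw sample bytes on disk.
from typing import Callable, Dict, Tuple

# data identifier -> (byte offset, byte length) within the sample file
PageTable = Dict[int, Tuple[int, int]]

def load_and_preprocess(data_id: int,
                        sample_path: str,
                        page_table: PageTable,
                        preprocess: Callable[[bytes], object]) -> object:
    offset, length = page_table[data_id]   # query address and length
    with open(sample_path, "rb") as f:
        f.seek(offset)
        raw = f.read(length)               # read target data from storage
    return preprocess(raw)                 # user-defined preprocessing

# Example use: two samples stored back-to-back in one binary file.
# result = load_and_preprocess(0, "samples.bin",
#                              {0: (0, 4096), 1: (4096, 4096)}, bytes.upper)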
In addition, after preprocessing the target data to obtain the result data, the sub-process can transmit the result data to the main process through the data communication pipeline.
After receiving the result data transmitted by the sub-process, the main process executes the model training process using the result data.
In the examples of the present disclosure, after receiving the result data transmitted by the sub-process, the main process may use the result data to train the deep learning model in the current training cycle, and/or to train the deep learning model in a future training cycle.
In another embodiment of the present disclosure, the sub-process exits of its own accord once it has sent the result data to the main process; after the next sub-process creation event is triggered, the main process again sends a process creation instruction to the intermediate process, and the intermediate process, upon receiving it, creates a new sub-process.
Alternatively, after receiving the result data sent by the sub-process, the main process can send a process closing instruction to the intermediate process; upon receiving it, the intermediate process controls the sub-process to close.
Referring to fig. 2, an embodiment of the present disclosure further provides a specific example of deep learning model training by using the process creation method provided by the embodiment of the present disclosure, including:
s201: the deep learning framework is started and the main process of the deep learning framework starts to run.
S202: the main process responds to the triggering of the sub-process creation event and detects whether an intermediate process is created currently; if not, jumping to S203; if yes, go to S204.
S203: and creating the intermediate process by adopting a spawn mode. Jump to S204.
S204: and sending a process creation instruction to the intermediate process.
S205: and the intermediate process receives a process creation instruction sent by the main process, creates a sub-process in a fork mode, and creates a data communication pipeline between the main process and the sub-process.
S206: and after the child process is successfully created, loading a data page table corresponding to the sample data.
S207: and the intermediate process feeds back information of successful creation of the sub-process and related information of the data communication pipeline to the main process.
S208: and the main process sends identification information of the target data, such as batch indexes, to the sub-process according to the relevant information of the data communication pipeline.
S209: and the sub-process receives the identification information of the target data sent by the main process, and acquires the storage address and the data length of the target data from the data page table according to the identification information and the loaded data page table.
S210: and the subprocess reads the target data from the sample data stored in the external memory according to the storage address and the data length of the target data, and loads the target data from the external memory into the internal memory.
S211: and the subprocess preprocesses the target data by calling a preprocessing function configured by a user to obtain result data.
S212: and the sub-process sends result data to the main process through a data communication pipeline between the sub-process and the main process.
S213: and after receiving the result data sent by the subprocess, the main process utilizes the result data to train the deep learning model in the current training period, and/or utilizes the result data to train the deep learning model in the future training period.
S214: and after receiving the result data sent by the subprocess, the main process also sends a process closing instruction to the intermediate process.
S215: and after the intermediate process receives the process closing instruction sent by the main process, closing the sub-process.
Through the process, the training of the deep learning model is realized by one wheel.
Through the process, the problem of sub-process deadlock caused by the fact that the main process directly adopts a fork mode to create the sub-process can be avoided under the condition that the creating speed of the sub-process is guaranteed.
It will be understood by those skilled in the art that, in the methods of the present disclosure, the order in which the steps are written does not imply a strict order of execution or any limitation on implementation; the specific order of execution should be determined by the steps' functions and possible internal logic.
Based on the same inventive concept, a process creation apparatus corresponding to the process creation method is also provided in the embodiments of the present disclosure, and since the principle of the apparatus in the embodiments of the present disclosure for solving the problem is similar to the process creation method described above in the embodiments of the present disclosure, the implementation of the apparatus may refer to the implementation of the method, and repeated details are not described again.
Referring to fig. 3, a schematic diagram of a process creation apparatus provided in an embodiment of the present disclosure is shown; the apparatus includes a main process module 31 and an intermediate process module 32; wherein,
the main process module 31 is configured to, in response to a sub-process creation event being triggered, detect whether an intermediate process module has currently been created, and, in response to the intermediate process module having been created, send a process creation instruction to the intermediate process module;
the intermediate process module 32 is configured to create a sub-process module in a first manner in response to receiving the process creation instruction.
In an alternative embodiment, the intermediate process module 32, when creating the sub-process module in the first manner in response to receiving the process creation instruction, is configured to: execute the fork function based on the process parameters carried in the process creation instruction to create the sub-process module.
In an optional implementation manner, the main process module 31 is further configured to: in response to the intermediate process module not having been created, create the intermediate process module in a second manner, and send the process creation instruction to the intermediate process module after the intermediate process module is created successfully.
In an alternative embodiment, the main process module 31, when creating the intermediate process module in the second manner in response to the intermediate process module not having been created, is configured to: execute the spawn function to create the intermediate process module.
In an alternative embodiment, the sub-process creation event includes at least one of:
a new training cycle for training the deep learning model begins;
the number of batches of preprocessed sample data required for training the deep learning model falls below a preset threshold.
In an optional embodiment, the intermediate process module 32 is further configured to create a data communication pipe between the main process module and the sub-process module.
In an optional embodiment, the main process module 31 is further configured to send a data processing instruction to the sub-process module through the data communication pipeline in response to the data communication pipeline being successfully created; the data processing instruction carries a data identifier;
further including: a sub-process module 33, configured to, in response to receiving a data processing instruction sent by the main process module, load target data corresponding to the data identifier based on the data identifier carried in the data processing instruction, and perform preset processing on the target data.
In an optional implementation manner, the sub-process module 33 is further configured to transmit the result data to the main process module through the data communication pipeline after performing the preset processing on the target data to obtain the result data.
In an optional implementation manner, the main process module 31 is further configured to send a process closing instruction to the intermediate process module in response to receiving the result data sent by the sub-process module;
the intermediate process module 32 is further configured to close the sub-process module in response to receiving the process closing instruction.
In an optional implementation manner, the main process module 31 is further configured to, in response to receiving the result data sent by the sub-process module, use the result data to train a deep learning model in the current training cycle;
and/or use the result data to train the deep learning model in a future training cycle.
The description of the processing flow of each module in the device and the interaction flow between the modules may refer to the related description in the above method embodiments, and will not be described in detail here.
An embodiment of the present disclosure further provides an electronic device, as shown in fig. 4, which is a schematic structural diagram of the electronic device provided in the embodiment of the present disclosure, and the electronic device includes:
a processor 41 and a memory 42; the memory 42 stores machine-readable instructions executable by the processor 41, and the processor 41 is configured to execute the machine-readable instructions stored in the memory 42; when the machine-readable instructions are executed, the processor 41 performs the following steps:
in response to a sub-process creation event being triggered, the main process detects whether an intermediate process has currently been created; in response to the intermediate process having been created, a process creation instruction is sent to the intermediate process;
and the intermediate process, in response to receiving the process creation instruction, creates the sub-process in a first manner.
The storage 42 includes a memory 421 and an external storage 422; the memory 421, also referred to as internal memory, temporarily stores operation data for the processor 41 as well as data exchanged with the external storage 422, such as a hard disk; the processor 41 exchanges data with the external storage 422 through the memory 421.
For the specific execution process of the instruction, reference may be made to the steps of the process creation method described in the embodiments of the present disclosure, and details are not described here.
The embodiments of the present disclosure also provide a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program performs the steps of the process creation method described in the above method embodiments. The storage medium may be a volatile or non-volatile computer-readable storage medium.
The embodiments of the present disclosure also provide a computer program product, where the computer program product carries a program code, and instructions included in the program code may be used to execute the steps of the process creation method in the foregoing method embodiments, which may be referred to specifically in the foregoing method embodiments, and are not described herein again.
The computer program product may be implemented by hardware, software, or a combination thereof. In an alternative embodiment, the computer program product is embodied in a computer storage medium; in another alternative embodiment, the computer program product is embodied in a software product, such as a Software Development Kit (SDK).
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. In the several embodiments provided in the present disclosure, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing an electronic device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present disclosure. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that the above-mentioned embodiments are merely specific embodiments of the present disclosure, used to illustrate its technical solutions rather than to limit them, and the protection scope of the present disclosure is not limited thereto. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person familiar with the art can still, within the technical scope of the present disclosure, modify the technical solutions described in the foregoing embodiments, readily conceive of changes to them, or make equivalent substitutions for some of their technical features; such modifications, changes or substitutions do not make the essence of the corresponding technical solutions depart from the spirit and scope of the embodiments of the present disclosure, and shall all be covered within the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (13)

1. A process creation method, comprising:
in response to a sub-process creation event being triggered, a main process detects whether an intermediate process has currently been created; in response to the intermediate process having been created, the main process sends a process creation instruction to the intermediate process;
and the intermediate process, in response to receiving the process creation instruction, creates a sub-process in a first manner.
2. The process creation method of claim 1, wherein the intermediate process creating a sub-process in a first manner in response to receiving the process creation instruction comprises:
the intermediate process executes the fork function based on the process parameters carried in the process creation instruction to create the sub-process.
3. The process creation method according to claim 1 or 2, characterized in that the process creation method further comprises:
the main process, in response to the intermediate process not having been created, creates the intermediate process in a second manner, and sends the process creation instruction to the intermediate process after the intermediate process is created successfully.
4. The process creation method of claim 3, wherein the main process creating the intermediate process in the second manner in response to the intermediate process not having been created comprises:
the main process executes the spawn function to create the intermediate process.
5. The process creation method of any one of claims 1 to 4, wherein the sub-process creation event comprises at least one of:
a new training cycle for training the deep learning model begins;
the number of batches of preprocessed sample data required for training the deep learning model falls below a preset threshold.
6. The process creation method of any one of claims 1 to 5, wherein the process creation method further comprises:
the intermediate process creates a data communication pipeline between the main process and the sub-process.
7. The process creation method of claim 6, wherein the process creation method further comprises:
in response to the data communication pipeline being successfully created, the main process sends a data processing instruction to the sub-process through the data communication pipeline; the data processing instruction carries a data identifier;
and in response to receiving a data processing instruction sent by the main process, the sub-process loads target data corresponding to the data identifier based on the data identifier carried in the data processing instruction, and performs preset processing on the target data.
8. The process creation method according to claim 7, characterized in that the process creation method further comprises:
the sub-process, after performing the preset processing on the target data to obtain result data, transmits the result data to the main process through the data communication pipeline.
9. The process creation method according to claim 8, characterized in that the process creation method further comprises:
in response to receiving the result data sent by the sub-process, the main process sends a process closing instruction to the intermediate process;
and the intermediate process responds to the received process closing instruction and closes the sub-process.
10. The process creation method according to claim 8, characterized in that the process creation method further comprises:
in response to receiving the result data sent by the sub-process, the main process uses the result data to train a deep learning model in the current training cycle;
and/or uses the result data to train the deep learning model in a future training cycle.
11. A process creation apparatus, comprising: a main process module and an intermediate process module; wherein,
the main process module is configured to, in response to a sub-process creation event being triggered, detect whether an intermediate process module has currently been created, and, in response to the intermediate process module having been created, send a process creation instruction to the intermediate process module;
and the intermediate process module is configured to create the sub-process in a first manner in response to receiving the process creation instruction.
12. An electronic device, comprising: a processor and a memory, the memory storing machine-readable instructions executable by the processor; the processor is configured to execute the machine-readable instructions stored in the memory, and when the machine-readable instructions are executed by the processor, the processor performs the steps of the process creation method of any one of claims 1 to 10.
13. A computer-readable storage medium, having stored thereon a computer program which, when executed by an electronic device, causes the electronic device to perform the steps of the process creation method according to any one of claims 1 to 10.
CN202110720406.7A 2021-06-28 2021-06-28 Process creation method and device, electronic equipment and storage medium Pending CN113391909A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110720406.7A CN113391909A (en) 2021-06-28 2021-06-28 Process creation method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110720406.7A CN113391909A (en) 2021-06-28 2021-06-28 Process creation method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113391909A (en) 2021-09-14

Family

ID=77624215

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110720406.7A Pending CN113391909A (en) 2021-06-28 2021-06-28 Process creation method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113391909A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102083099A (en) * 2010-12-30 2011-06-01 中兴通讯股份有限公司 Base station and method and a device for electrifying same in single station mode
US20140129880A1 (en) * 2012-11-08 2014-05-08 Dell Products L.P. Generation of memory dump of a computer process without terminating the computer process
US20140137183A1 (en) * 2012-11-13 2014-05-15 Auckland Uniservices Ltd. Security system and method for the android operating system
US20150193392A1 (en) * 2013-04-17 2015-07-09 Google Inc. User Interface for Quickly Checking Agenda and Creating New Events
CN109144741A (en) * 2017-06-13 2019-01-04 广东神马搜索科技有限公司 The method, apparatus and electronic equipment of interprocess communication
CN111090562A (en) * 2019-11-25 2020-05-01 广东科徕尼智能科技有限公司 Business process monitoring method, equipment and storage medium of edge intelligent gateway
CN111124685A (en) * 2019-12-26 2020-05-08 神州数码医疗科技股份有限公司 Big data processing method and device, electronic equipment and storage medium
CN111414256A (en) * 2020-03-27 2020-07-14 中国人民解放军国防科技大学 Application program process derivation method, system and medium based on kylin mobile operating system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
五月君 (Wuyuejun): "Node.js 进阶之进程与线程" [Node.js Advanced: Processes and Threads], pages 1-4, Retrieved from the Internet <URL: https://zhuanlan.zhihu.com/p/94613627> *

Similar Documents

Publication Publication Date Title
CN112035238B (en) Task scheduling processing method and device, cluster system and readable storage medium
CN104391716B (en) Application program implementation method and device based on plug-in unit
CN109635523B (en) Application program detection method and device and computer readable storage medium
CN111813869B (en) Distributed data-based multi-task model training method and system
CN110765288B (en) Image information synchronization method, device and system and storage medium
CN107526623B (en) Data processing method and device
EP3151518A1 (en) Information processing device, information processing method, and information processing program
CN111240482A (en) Special effect display method and device
CN110007936B (en) Data processing method and device
CN113805962B (en) Application page display method and device and electronic equipment
CN113157345A (en) Automatic starting method and device for front-end engineering
CN114428722A (en) Hardware simulation method, device, equipment and storage medium
CN112882732A (en) Method and device for updating function codes in Software Development Kit (SDK)
CN114706633A (en) Preloading method, electronic device and storage medium
CN113190427B (en) Method and device for monitoring blocking, electronic equipment and storage medium
CN108762983B (en) Multimedia data recovery method and device
CN108829391B (en) Method and system for identifying control in Fragment
CN107633080B (en) User task processing method and device
CN111580883B (en) Application program starting method, device, computer system and medium
CN113391909A (en) Process creation method and device, electronic equipment and storage medium
CN114090183B (en) Application starting method and device, computer equipment and storage medium
CN115827457A (en) Browser compatibility testing method and related equipment
EP3848800B1 (en) Method and apparatus for displaying message box, terminal and storage medium
CN112214287A (en) Service control method and device of application software and electronic equipment
CN108959955A (en) Document handling method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination