CN113157437A - Data processing method and device, electronic equipment and storage medium - Google Patents

Data processing method and device, electronic equipment and storage medium

Info

Publication number
CN113157437A
Authority
CN
China
Prior art keywords
thread
data
data processing
service
processing
Prior art date
Legal status
Pending
Application number
CN202110237243.7A
Other languages
Chinese (zh)
Inventor
许剑飞
马原
Current Assignee
Beijing Pengsi Technology Co ltd
Original Assignee
Beijing Pengsi Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Pengsi Technology Co ltd filed Critical Beijing Pengsi Technology Co ltd
Priority to CN202110237243.7A
Publication of CN113157437A

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 — Arrangements for program control, e.g. control units
    • G06F9/06 — Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46 — Multiprogramming arrangements
    • G06F9/50 — Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F9/5005 — Allocation of resources, e.g. of the central processing unit [CPU] to service a request
    • G06F9/5027 — Allocation of resources, e.g. of the central processing unit [CPU] to service a request, the resource being a machine, e.g. CPUs, Servers, Terminals
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F2209/00 — Indexing scheme relating to G06F9/00
    • G06F2209/50 — Indexing scheme relating to G06F9/50
    • G06F2209/5018 — Thread allocation

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Stored Programmes (AREA)

Abstract

The technical solution of the present application provides a data processing method and apparatus, an electronic device, and a storage medium. The data processing method comprises the following steps: receiving a data processing request through a calling interface; calling a service module corresponding to the data processing request in a service layer to obtain task parameters of the data processing request, wherein the service layer comprises a plurality of service modules which can provide different business services and can be called in parallel; calling a processing module corresponding to the task parameters in a capability integration layer to obtain thread parameters responding to the data processing request, wherein the capability integration layer comprises a plurality of processing modules which can provide different data processing methods and can be called in parallel; and calling a thread corresponding to the thread parameters in a business thread layer and responding to the data processing request, wherein the business thread layer comprises a plurality of threads that can be called in parallel.

Description

Data processing method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of data processing, and in particular, to a data processing method and apparatus, an electronic device, and a storage medium.
Background
With the development of computer technology and big data technology, more and more application fields use these technologies, which can better meet the usage requirements of the corresponding fields. Each application field generates a large amount of highly diverse data.
Such data may contain valuable information that can be obtained by processing the data with data processing techniques. Based on the results of processing the data, various subsequent applications can be carried out more effectively.
Disclosure of Invention
The embodiment of the invention provides a data processing method, a data processing device, electronic equipment and a storage medium.
A first aspect of the embodiments of the present disclosure provides a data processing method, including: receiving a data processing request through a calling interface; calling a service module corresponding to the data processing request in a service layer to obtain task parameters of the data processing request, wherein the service layer comprises a plurality of service modules which can provide different business services and can be called in parallel; calling a processing module corresponding to the task parameters in a capability integration layer to obtain thread parameters responding to the data processing request, wherein the capability integration layer comprises a plurality of processing modules which can provide different data processing methods and can be called in parallel; and calling a thread corresponding to the thread parameters in a business thread layer and responding to the data processing request, wherein the business thread layer comprises a plurality of threads that can be called in parallel.
In one embodiment, the capability integration layer includes: a parent class processing module and a plurality of child class processing modules which can be called in parallel.
In one embodiment, the invoking a processing module corresponding to the task parameter in the capability integration layer to obtain a thread parameter responding to the data processing request includes: transmitting the task parameters to the parent processing module corresponding to the task parameters; the parent processing module distributes data corresponding to the subclass processing module in the task parameters received by the parent processing module to the subclass processing module according to a data processing method of the subclass processing module; and the subclass processing module obtains a thread parameter responding to the data processing request according to the task parameter.
In one embodiment, the task parameter corresponds to data to be processed of at least one parent class; the capability integration layer includes: and different parent processing modules are used for processing the data to be processed of different parent classes.
In one embodiment, the business thread layer comprises: a thread parent class, and a plurality of thread subclasses which are created according to the thread parent class, inherit the attributes of the thread parent class, and can be called in parallel.
In one embodiment, the invoking a thread in the business thread layer corresponding to the thread parameter in response to the data processing request includes: transmitting the thread parameter to the thread parent class corresponding to the thread parameter; the thread parent class distributes the data corresponding to the thread subclass in the thread parameters received by the thread parent class to the thread subclass according to a data processing method of the thread subclass; and the thread subclass responds to the data processing request according to the thread parameter.
In one embodiment, the task parameter corresponds to at least one parent class of data to be processed, the data to be processed of one parent class includes at least two different subclasses of data to be processed, and one of the thread parameters corresponds to one of the subclasses of data to be processed; the business thread layer comprises: and different thread parent classes are used for processing the data to be processed of different subclasses of the same parent class.
A second aspect of the embodiments of the present disclosure provides a data processing apparatus, including: the receiving module is used for receiving a data processing request through a calling interface; the first calling module is used for calling a service module corresponding to the data processing request in a service layer to obtain task parameters of the data processing request; wherein the service layer comprises: a plurality of service modules which can provide different business services and can be called in parallel; the second calling module is used for calling the processing module corresponding to the task parameters in the capability integration layer to obtain thread parameters responding to the data processing request; wherein the capability integration layer comprises: a plurality of processing modules which can provide different data processing methods and can be called in parallel; the third calling module is used for calling the thread corresponding to the thread parameters in the business thread layer and responding to the data processing request; wherein the business thread layer comprises: a plurality of threads that can be called in parallel.
A third aspect of the embodiments of the present disclosure provides an electronic device, including:
a processor;
a memory storing program instructions that, when executed by the processor, cause the electronic device to perform the processing method described above.
A fourth aspect of the embodiments of the present disclosure provides a storage medium storing a program that, when executed by a processor, executes the processing method described above.
The technical solution of the embodiments of the present disclosure includes a service layer, a capability integration layer, and a business thread layer, wherein the service layer includes a plurality of service modules which can provide different business services and can be called in parallel, the capability integration layer includes a plurality of processing modules which can provide different data processing methods and can be called in parallel, and the business thread layer includes a plurality of threads which can be called in parallel. Because the service modules provide different business services, can be called in parallel, and the called service module corresponds to the data processing request, after the service module corresponding to the data processing request is called, it and the other service modules do not affect each other, which reduces the coupling degree between the service modules.
Because the processing modules provide different data processing methods, can be called in parallel, and the called processing module corresponds to the task parameters, after the processing module corresponding to the task parameters in the capability integration layer is called, it and the other processing modules do not affect each other, which reduces the coupling degree between the processing modules.
Because the business thread layer comprises a plurality of threads which can be called in parallel and the thread corresponding to the thread parameters is called, after that thread is called, it and the other threads do not affect each other, which also reduces the coupling degree between the threads.
With the coupling degree between the service modules, between the processing modules, and between the threads all reduced, the coupling degree of the components involved in responding to the data processing request is reduced as a whole.
The service modules in the service layer, the processing modules in the capability integration layer, and the threads in the business thread layer are all plural and can be adjusted according to the business, so the scheme is extensible, and the flexibility of data processing, the speed of processing data, and the like are improved.
Drawings
Fig. 1 is a schematic flow chart of a data processing method according to an embodiment of the present disclosure;
fig. 2 is a schematic flow chart illustrating a process of obtaining thread parameters according to an embodiment of the disclosure;
fig. 3 is a schematic flow chart of responding to the data processing request according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present disclosure;
fig. 5 is a schematic diagram of another data processing method according to an embodiment of the present disclosure;
fig. 6 is a schematic diagram of another data processing method according to an embodiment of the present disclosure.
Detailed Description
The technical solution of the present invention is further described in detail with reference to the drawings and the specific embodiments of the specification.
Common data processing approaches include a synchronous processing mode and an asynchronous processing mode. When data in an application scenario containing massive data is processed synchronously, the utilization rate of the processor is not high. Even on a multiprocessor, synchronous processing of a large amount of data can only proceed as a single-task sequence, so when one task occupies a processor for a long time, the other tasks wait for a long time and cannot be processed, resulting in poor concurrency. Processing such a large amount of data therefore takes a long time, and the execution efficiency is low.
Processing the data asynchronously, on the other hand, depends heavily on the processing framework, makes the program overly complicated, hides the underlying implementation, and makes the program difficult to split, which to some extent violates the low-coupling, high-cohesion principles of programming languages. Excessive dependence on frameworks may lead to conflicts between frameworks and to problems during later expansion; for example, replacing one module may cause a series of subsequent problems.
Referring to fig. 1, a schematic flow chart of a data processing method provided in the technical solution of the present application is shown, where the data processing method mainly includes the following steps:
step S100, receiving a data processing request through a call interface.
Step S200, calling a service module corresponding to the data processing request in a service layer to obtain task parameters of the data processing request; wherein the service layer comprises: a plurality of service modules which can provide different business services and can be called in parallel.
Step S300, calling a processing module corresponding to the task parameter in a capability integration layer to obtain a thread parameter responding to the data processing request; wherein the capability integration layer comprises: a plurality of processing modules that can provide different data processing methods and that are called in parallel.
Step S400, calling a thread corresponding to the thread parameter in a service thread layer, and responding to the data processing request; wherein the business thread layer comprises: multiple threads that can be called in parallel.
The relationship between the service layer, the capability aggregation layer, and the business thread layer can be referred to in FIG. 5.
In step S100, before data processing, a data processing request needs to be received, and the data is processed according to the received request. Specifically, the data processing request may be received through a calling interface, that is, an interface which can be called and which can further provide the corresponding service and perform the corresponding processing. The calling interface may be an interface provided by a calling framework, through which the reception of data processing requests and the like can be realized.
The calling interface provides a calling service to the outside, and any external caller with access to the calling interface can call it.
In one embodiment, the calling interface may be a unified calling interface for all data processing requests; in this case, the computing system provides only one external calling interface.
In another embodiment, there may be multiple calling interfaces, and different calling interfaces may receive different processing requests, so that the processing of different requests can be realized. The same calling interface can also receive different processing requests and then call the corresponding service module according to each data processing request.
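By way of illustration only, the calling interface can be pictured as a small Java interface that accepts requests. The names below (DataProcessingRequest, CallInterface, serviceId, payload) are assumptions made for this sketch and are not identifiers defined by the present disclosure.

```java
import java.util.Map;

/** Illustrative request carrying a service identifier and the data to be processed. */
final class DataProcessingRequest {
    final String serviceId;              // identifies the business service being requested
    final Map<String, Object> payload;   // the data to be processed

    DataProcessingRequest(String serviceId, Map<String, Object> payload) {
        this.serviceId = serviceId;
        this.payload = payload;
    }
}

/** A unified calling interface; several such interfaces could also coexist. */
interface CallInterface {
    void receive(DataProcessingRequest request);
}
```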
Step S200 may be considered as a first implementation phase of processing a data processing request.
After the data processing request is received, because the processing of the request is realized through the services provided and the request corresponds to a particular business service, the service module corresponding to the data processing request can be called according to the request. The service module provides a business service, and the processing of the data processing request is carried forward through the business service provided by the called service module.
The service module corresponding to the data processing request is called, and the received data processing request is processed through the called service module to obtain the task parameters of the data processing request. That is, the task parameters are the parameters obtained after the data processing request is processed by the called service module corresponding to it.
The service module called in this step is a service module included in the service layer. The service layer comprises a plurality of service modules which can provide different business services and can be called in parallel: different service modules provide different business services, and different service modules can be called at the same time. The service modules do not affect each other, and after one service module is called, another service module can still be called. Because the service modules can be called in parallel and are relatively independent, the coupling degree of the service modules is reduced and the interdependence among the service modules is reduced.
Because different data processing requests require different business services and correspond to different service modules, the business services provided by the service modules vary with the data processing requests.
For example, the service layer may include a data sending service module and a data receiving service module: the data sending service module provides a data sending service for data sending requests, and the data receiving service module provides a data receiving service for data receiving requests. The service layer may also include a data preprocessing service module and a data verification service module, which provide a data preprocessing service for data preprocessing requests and a data verification service for data verification requests, respectively.
For another example, the service layer includes a service module for image data processing and a service module for text data processing, the service module for image data processing may provide a service for image data processing, and the service module for text data processing may provide a service for text data processing. Image data processing and text data processing are different processing modes.
The number of service modules included in the service layer is not fixed; the service modules can be expanded horizontally according to the actual needs of the business, and expanding one service module does not affect the others. Different service modules provide different business services in parallel, while the same service module provides the same business service, so the same kind of data processing request is served by the same service module. The calling interface can be regarded as a parent class and the service modules in the service layer as subclasses; through this parent-class/subclass design between the calling interface and the service modules, more and richer services can be used to execute the logic, giving high reusability.
For example, the service layer includes three service modules A, B and C which can provide different business services and can be called in parallel. A data processing request is sent externally through the calling interface, and after the request is received through the calling interface, the service module A corresponding to the request is called according to the correspondence between the data processing request and the service module. This correspondence can be the correspondence between processing and being processed, or between serving and being served. The service module A provides the business service corresponding to the data processing request; that is, the business service provided by the service module A processes the data processing request to obtain the task parameters.
At this time, other data processing requests can be received through the calling interface, and then other service modules B or C in the service layer are called to obtain corresponding task parameters. The calling of a plurality of service modules is not influenced mutually, so that the coupling degree is reduced.
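A minimal Java sketch of such a service layer follows. The registry-based lookup and the names ServiceLayer, ServiceModule and TaskParameter are assumptions made for illustration, not the disclosed implementation.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

/** Illustrative task parameter: which processing module to call next, and with what data. */
record TaskParameter(String processingModuleId, Object dataToProcess) {}

/** A service module turns a data processing request into task parameters. */
interface ServiceModule {
    List<TaskParameter> handle(Object requestPayload);
}

/** The service layer keeps independent, parallel-callable service modules in a registry. */
final class ServiceLayer {
    private final Map<String, ServiceModule> modules = new ConcurrentHashMap<>();

    void register(String serviceId, ServiceModule module) {
        modules.put(serviceId, module);
    }

    /** Only the module matching the request is called; the other modules are untouched. */
    List<TaskParameter> call(String serviceId, Object requestPayload) {
        return modules.get(serviceId).handle(requestPayload);
    }
}
```

Registering a new service module only adds an entry to the registry, which is one way the horizontal expansion described above could be realized without touching existing modules.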
In an embodiment, after receiving the data processing request, the service layer obtains one or more tasks responding to the data processing request according to a service identifier carried by the data processing request and/or the data to be processed carried by the data processing request. Wherein one of said tasks corresponds to a set of said task parameters. The task parameters at least comprise: the task identifier or the identity of the processing module executing the corresponding task.
The task parameters are obtained by processing the data processing request through the business service provided by the service module; different service modules providing different business services for different data processing requests produce different task parameters.
The task parameters may include information about the corresponding processing modules in the capability integration layer, such as identifier information that uniquely identifies each processing module, which may be an identification (ID). Different processing modules have different IDs, so the corresponding processing module can be called through its ID. The method name of a processing module can also be used: processing modules with different method names have different functions and provide different data processing methods, and the corresponding processing module can be called according to the method name. Of course, any other information that uniquely identifies a processing module in the capability integration layer is also possible.
For example, when the data processing request is a data verification request, the data verification service module provides a data verification service for the request and obtains the ID information of the corresponding verification module in the capability integration layer that provides the data verification method; the corresponding verification module in the capability integration layer can then be called through that ID.
For another example, when the service module in the service layer is a service module for image data processing, the task parameters obtained include the identity identifier of the image recognition processing module, or the method name of the image recognition processing module, through which the image recognition processing module can be called. The task parameters may also include the identity identifier or method name of other processing modules parallel to the image recognition processing module, such as that of a black/white list comparison processing module.
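The following Java sketch shows one possible shape of task parameters that name several processing modules by identity identifier or method name. All string identifiers are invented for the example and are not values defined by the present disclosure.

```java
import java.util.List;

/** Illustrative reference to a processing module in the capability integration layer. */
record ProcessingModuleRef(String id, String methodName) {}

/** Illustrative task parameters: the referenced modules plus the data to be processed. */
record TaskParameters(List<ProcessingModuleRef> modules, Object dataToProcess) {}

class TaskParameterExample {
    /** Builds the task parameters for an image-processing request (hypothetical identifiers). */
    static TaskParameters forImageRequest(Object imageData) {
        return new TaskParameters(
                List.of(new ProcessingModuleRef("IMG_RECOGNITION_01", "recognizeImage"),
                        new ProcessingModuleRef("BW_LIST_COMPARE_01", "compareBlackWhiteList")),
                imageData);
    }
}
```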
Step S300 may be considered as a second implementation phase of processing the data processing request.
The service module provides a business service for the data processing request, while the processing module in the capability integration layer further processes the result obtained by the service module, determines the specific data processing method for the request, and responds to the request according to that method so as to process the data further. After the task parameters are obtained through the called service module, the module corresponding to the task parameters is called according to the task parameters, and the task parameters are processed by that module to obtain the thread parameters responding to the data processing request, so that a thread can respond to the request.
The module called in this step is a processing module included in the capability integration layer. The capability integration layer comprises a plurality of processing modules which can provide different data processing methods and can be called in parallel: different processing modules provide different data processing methods, and different processing modules can be called at the same time. The processing modules do not affect each other, and after one processing module is called, another processing module can still be called. Because the processing modules can be called in parallel and are relatively independent, the coupling degree of the processing modules is reduced and the interdependence among the processing modules is reduced.
Because the task parameters include the identifier information of multiple processing modules, multiple different processing modules need to be called to provide different processing methods for different parts of the processing, thereby completing the processing of the data to be processed.
For example, the capability integration layer includes a processing module for data verification, which is called according to the ID information of the data verification processing module included in the task parameters and which provides the processing method for verifying the data.
For another example, the capability integration layer includes an image recognition processing module and a black/white list comparison processing module. The image recognition processing module is called according to the identity identifier or method name of the image recognition processing module included in the task parameters, and the black/white list comparison processing module is called according to its identity identifier or method name. The image recognition processing module provides an image recognition processing method, and the black/white list comparison processing module provides a black/white list comparison method.
The number of processing modules included in the capability integration layer is also not fixed, and the processing modules can be adjusted according to actual service requirements, expanded and the like. The expanded processing module and other processing modules can be called in parallel, and the expanded processing module and other processing modules have no dependency relationship and do not influence each other.
For example, the capability integration layer includes three processing modules D, E and F that provide different data processing methods and can be called in parallel. The processing module D corresponding to the task parameters is called according to the correspondence between the task parameters and the processing modules; the processing module D provides the processing method for those task parameters, that is, it processes the task parameters according to its data processing method to obtain the thread parameters.
At this time, the other processing modules E and F may also be called, and the corresponding task parameters are processed according to the data processing methods provided by the processing module E and the processing module F to obtain the corresponding thread parameters. The calls to the several processing modules do not affect one another, so the coupling degree is also reduced.
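A hedged Java sketch of such a capability integration layer follows: independently callable processing modules are looked up by identifier and invoked on a thread pool so that calls to different modules do not affect each other. The use of a thread pool and the names CapabilityIntegrationLayer, ProcessingModule and ThreadParameter are assumptions of the sketch, not requirements of the scheme.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.*;

/** Illustrative thread parameter produced by a processing module. */
record ThreadParameter(String threadId, Object data) {}

/** A processing module turns the data in a task parameter into thread parameters. */
interface ProcessingModule {
    List<ThreadParameter> process(Object taskData);
}

/** The capability integration layer: independent modules called in parallel by identifier. */
final class CapabilityIntegrationLayer {
    private final Map<String, ProcessingModule> modules = new ConcurrentHashMap<>();
    private final ExecutorService pool = Executors.newCachedThreadPool();

    void register(String moduleId, ProcessingModule module) {
        modules.put(moduleId, module);
    }

    /** Calls to different modules are submitted independently and do not block each other. */
    Future<List<ThreadParameter>> call(String moduleId, Object taskData) {
        ProcessingModule module = modules.get(moduleId);
        Callable<List<ThreadParameter>> task = () -> module.process(taskData);
        return pool.submit(task);
    }
}
```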
The thread parameters are obtained by processing the task parameters with the processing method provided by the processing module; different processing modules processing the task parameters with different methods produce different thread parameters.
The thread parameters may include the identity identifier information of the corresponding thread in the business thread layer, which may be an identity ID. Different threads correspond to different IDs, so the corresponding thread can be called through its ID. The method name of the thread can also be used: threads with different method names have different response functions and provide different responses to the data processing request.
For example, the threads in the business thread layer include an image feature extraction thread and a data forwarding thread, and the thread parameters obtained by the image recognition processing module include the identity identifier or method name of the feature extraction thread, the identity identifier or method name of the data forwarding thread, and so on.
Step S400 may be considered as a third stage of data processing.
After the thread parameters are obtained, the thread corresponding to the thread parameters is called according to the thread parameters, and the data processing request is responded to through that thread, completing the processing of the request. This phase is the final execution and response phase of the data processing request: the execution and response of the request are realized through the threads, and the response result of the request is obtained from the corresponding thread.
The threads in this step are threads in the business thread layer, which comprises a plurality of threads that can be called in parallel. Different threads respond to different data processing requests according to the thread parameters, and because the different threads can be called in parallel, they do not affect each other, and after one thread is called another thread can still be called. In this way, the coupling degree among the threads is reduced and the interdependence among the threads is reduced.
It should be noted that the processing module in the capability integration layer needs to be called according to the task parameters, and only then is the thread in the business thread layer called according to the thread parameters; the thread in the business thread layer cannot be called directly from the task parameters. The service modules in the service layer and the threads in the business thread layer are connected in series through the capability integration layer, so that each service module corresponds to a processing module and each processing module corresponds to a thread, while the modules or threads in each layer remain independent and extensible.
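The series connection of the three layers can be pictured with the following compact Java sketch, in which the business thread layer is reached only through the capability integration layer. The functional-interface representation and all names are an illustrative simplification, not the disclosed implementation.

```java
import java.util.List;
import java.util.function.Function;

/** The three stages chained in series; task parameters never reach a thread directly. */
final class DataProcessingPipeline {
    private final Function<Object, List<Object>> serviceModule;     // stage 1: request -> task parameters
    private final Function<Object, List<Object>> processingModule;  // stage 2: task parameter -> thread parameters
    private final Function<Object, Object> threadResponder;         // stage 3: thread parameter -> response

    DataProcessingPipeline(Function<Object, List<Object>> serviceModule,
                           Function<Object, List<Object>> processingModule,
                           Function<Object, Object> threadResponder) {
        this.serviceModule = serviceModule;
        this.processingModule = processingModule;
        this.threadResponder = threadResponder;
    }

    void respond(Object request) {
        for (Object taskParameter : serviceModule.apply(request)) {               // service layer
            for (Object threadParameter : processingModule.apply(taskParameter)) { // capability integration layer
                threadResponder.apply(threadParameter);                            // business thread layer
            }
        }
    }
}
```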
According to the technical scheme of the embodiment, data processing is divided into three stages, modules or threads corresponding to each stage are relatively independent and do not influence each other, expansion can be performed according to business needs, and each processing stage and the whole processing process have low coupling degree.
In another embodiment, the capability integration layer includes: a parent class processing module and a plurality of child class processing modules which can be called in parallel.
The capability integration layer adopts a parent class and subclass mode, task parameters are processed through an abstract parent class processing module and a specifically called subclass processing module, and the subclass processing module is used for providing a specific data processing method. The parent class processing module may be an abstract collection of child class processing modules.
The parent processing module comprises a plurality of subclass processing modules; the subclass processing modules can be created from the parent processing module, and the data processing methods they provide can be obtained on the basis of the parent processing module. A subclass processing module inherits at least some of the attributes of the parent processing module and thus has at least some of the capabilities provided by the parent processing module.
As shown in fig. 5, the capability aggregation layer includes a parent processing module 1, the parent processing module 1 includes a child processing module 1, a child processing module 2, and the like, and the child processing module can perform horizontal expansion according to the service requirement. The subclass processing modules can be called in parallel, different subclass processing modules can provide different data processing methods, and different subclass processing modules can perform different processing on task parameters. Different subclass processing modules are not affected with each other, are relatively independent and have low coupling.
Since different subclasses are relatively independent and can be called in parallel, a plurality of subclass processing modules can process a plurality of tasks in parallel. When a plurality of subclass processing modules are called to process the task parameters, the data processing speed is increased, and therefore the data processing efficiency is improved.
Referring to fig. 2 and fig. 5, a schematic flow chart of obtaining the thread parameters according to another embodiment is provided. Step S300, calling a processing module corresponding to the task parameters in the capability integration layer to obtain thread parameters responding to the data processing request, includes:
step S301, transmitting the task parameters to the parent processing module corresponding to the task parameters.
Because the capability integration layer includes parent processing modules and subclass processing modules, and the subclass processing modules process data in different ways on the basis of the parent processing module, the task parameters first need to be transmitted to the parent processing module. The parent processing module corresponding to the task parameters is determined according to the data that the subclass processing modules under that parent can process, and the task parameters are transmitted to it.
Step S302, the parent processing module distributes data corresponding to the subclass processing module in the task parameters received by the parent processing module to the subclass processing module according to the data processing method of the subclass processing module.
And after the parent processing module receives the transmitted task parameters, distributing the data corresponding to the subclasses in the task parameters to the subclass processing modules according to the data processing method of each subclass processing module. In this way, different subclass processing modules can process data corresponding to the subclass processing modules.
Step S303, the subclass processing module obtains the thread parameter responding to the data processing request according to the task parameter.
After obtaining the data to be processed in the task parameters, each subclass processing module processes the data to obtain thread parameters responding to the data processing request, so that the thread in the business thread layer can respond to the data processing request according to the thread parameters.
In one embodiment, the service layer sends the generated task parameters to a parent processing module of the capability integration layer, and the parent processing module distributes the task parameters to the subclass processing modules capable of executing the corresponding tasks according to the content of the task parameters. The task parameters at least include: the task identifier, or the identity of the subclass processing module executing the corresponding task.
In one embodiment, the subclass processing module in the capability integration layer combines the functions to be executed by a single thread according to the received task parameters to obtain one or more sets of thread parameters. A set of thread parameters is distributed to a thread subclass, so that the corresponding thread executes a subtask responding to the data processing request. The thread parameters at least include: the identification of the thread subclass, or the subtask identification of the task executed by the thread.
Through the design of the parent class processing module and the subclass processing module, different data processing methods can be provided by different subclass processing modules on the basis of the parent class processing module, so that more and richer business logic execution modes are provided, and high reusability is achieved.
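A possible Java rendering of the parent/subclass processing-module design of steps S301 to S303 is sketched below. The method names handledDataKey and toThreadParameters are assumptions used to make the distribution step concrete; the disclosure only requires that the parent hand each subclass the data matching its data processing method.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

/** A subclass processing module provides one specific data processing method. */
abstract class SubclassProcessingModule {
    abstract String handledDataKey();                       // which part of the task data this subclass handles
    abstract List<Object> toThreadParameters(Object data);  // the subclass's own data processing method
}

/** The abstract parent distributes slices of the task parameters to its subclasses. */
class ParentProcessingModule {
    private final List<SubclassProcessingModule> subclasses = new ArrayList<>();

    void add(SubclassProcessingModule subclass) {
        subclasses.add(subclass);
    }

    /** S301-S303: receive the task parameters, distribute them, and collect thread parameters. */
    List<Object> process(Map<String, Object> taskParameters) {
        List<Object> threadParameters = new ArrayList<>();
        for (SubclassProcessingModule subclass : subclasses) {
            Object data = taskParameters.get(subclass.handledDataKey());
            if (data != null) {
                threadParameters.addAll(subclass.toThreadParameters(data));
            }
        }
        return threadParameters;
    }
}
```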
In another embodiment, each task parameter corresponds to data to be processed of at least one parent class, and the capability integration layer includes: different parent processing modules used for processing the data to be processed of different parent classes.
In this embodiment, the capability integration layer includes a plurality of parallel parent processing modules. Each parent processing module is independent of the others, different parent processing modules do not affect each other, and the parent processing modules therefore also have low coupling. Different parent processing modules correspond to different data to be processed, which may be the data to be processed carried by the task parameters. The data to be processed corresponding to different parent processing modules in the capability integration layer are different and are referred to as the data to be processed of different parent classes. For example, when the task parameters correspond to data to be processed of several parent classes, one parent processing module corresponds to the data to be processed of one parent class, and another parent processing module corresponds to the data to be processed of another parent class.
Because the parent class processing module comprises a plurality of subclass processing modules which can be called in parallel, the to-be-processed data of the parent class corresponding to each parent class processing module comprises a plurality of to-be-processed data of different subclasses, and the to-be-processed data of each subclass corresponds to one subclass processing module. The subclass processing module can process the data to be processed of the corresponding subclass according to the data processing method of the subclass processing module.
The number of the parent processing modules is also not fixed, the number can be adjusted according to actual service requirements, and the parent processing modules can be expanded. Each parent processing module includes a plurality of child processing modules that can be called in parallel, please refer to the description of the parent processing module in the above embodiment, which is not described here.
In another embodiment, the business thread layer includes: a thread parent class, and a plurality of thread subclasses which are created according to the thread parent class, inherit the attributes of the thread parent class, and can be called in parallel.
The business thread layer also adopts the parent-class/subclass mode: through an abstract thread parent class and the thread subclasses that specifically respond to the data processing request, the thread subclass serves as the execution unit that finally responds to the request in the technical solution. The thread parent class may be an abstract combination of the thread subclasses.
The thread parent class includes a plurality of thread subclasses, and the thread subclasses can be created according to the thread parent class. The thread subclasses can be called in parallel; different thread subclasses do not influence each other, are relatively independent, and have low coupling. As shown in fig. 5, the thread parent class T1 includes a thread subclass 1, a thread subclass 2, a thread subclass 3, and the like, and the thread subclasses can be extended horizontally according to the business requirements.
Because different thread subclasses are independent from each other and can be called in parallel, a plurality of thread subclasses can respond to the data processing request in parallel, the response speed of the data processing request is improved, and the data processing efficiency is improved.
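The thread parent class and thread subclasses could, for example, be modelled with Java inheritance as sketched below. Extending java.lang.Thread and the placeholder subclass names are assumptions of the sketch; the disclosure only requires that the subclasses be created from the parent, inherit its attributes, and be callable in parallel.

```java
/** Abstract thread parent class whose attributes every thread subclass inherits. */
abstract class BusinessThreadParent extends Thread {
    protected Object threadParameter;   // attribute inherited by every thread subclass

    void accept(Object threadParameter) {
        this.threadParameter = threadParameter;
    }

    @Override
    public void run() {
        respond(threadParameter);
    }

    /** Each thread subclass supplies its own response to the data processing request. */
    protected abstract void respond(Object threadParameter);
}

/** Two illustrative thread subclasses that can be started in parallel. */
class ThreadSubclassOne extends BusinessThreadParent {
    @Override
    protected void respond(Object threadParameter) {
        // first kind of response to the data processing request (placeholder)
    }
}

class ThreadSubclassTwo extends BusinessThreadParent {
    @Override
    protected void respond(Object threadParameter) {
        // second kind of response to the data processing request (placeholder)
    }
}
```

Several such subclasses can each be given their share of the data via accept, started with start(), and joined, so they respond to the request in parallel without affecting one another.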
Referring to fig. 3 and fig. 5, a schematic flow chart of responding to the data processing request according to another embodiment is provided. Step S400, calling a thread corresponding to the thread parameters in the business thread layer and responding to the data processing request, includes:
step S401, transmits the thread parameter to the thread parent class corresponding to the thread parameter.
Since the service thread layer includes a thread subclass and a thread parent class, the thread subclass performs different responses to the data processing request on the basis of the thread parent class, and therefore the thread parameter needs to be transmitted to the thread parent class, which is the thread parent class corresponding to the thread parameter.
In step S402, the thread parent assigns data corresponding to the thread subclass among the thread parameters received by the thread parent according to the data processing method of the thread subclass to the thread subclass.
After receiving the thread parameters, the thread parent class allocates the data corresponding to each thread subclass according to different responses of each thread subclass to the data processing request, so that the thread subclass responds to the data processing request correspondingly.
In step S403, the thread subclass responds to the data processing request according to the thread parameter.
After each thread subclass obtains the thread parameters assigned to each thread subclass, different responses are made to the data processing request according to the obtained thread parameters, and thus the processing of the data processing request is completed.
In one embodiment, the capability integration layer sends the generated thread parameters to a thread parent class of the business thread layer, and the thread parent class distributes the thread parameters to the thread subclasses capable of performing the different responses according to the content of the thread parameters, where the thread parameters at least include: the response identification, or the identification of the thread subclass performing the corresponding response.
Through the parent-class/subclass design of the thread parent class and thread subclasses, the business thread layer likewise performs different responses to data processing requests on the basis of the thread parent class, providing more and richer business-logic execution modes with high reusability.
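Steps S401 to S403 can be pictured with the following hedged Java sketch, in which the thread parent looks up each thread subclass by the identifier carried in the thread parameters and hands it its share of the data. The identifier-based registry and all names are assumptions of the sketch.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Consumer;

/** Illustrative thread parameter naming the thread subclass that should respond. */
record SubclassThreadParameter(String threadSubclassId, Object data) {}

final class ThreadParentDispatcher {
    private final Map<String, Consumer<Object>> threadSubclasses = new ConcurrentHashMap<>();

    /** Registers the responder of one thread subclass under its identifier. */
    void registerSubclass(String threadSubclassId, Consumer<Object> responder) {
        threadSubclasses.put(threadSubclassId, responder);
    }

    /** S402/S403: distribute each parameter to its subclass and let it respond in parallel. */
    void dispatch(List<SubclassThreadParameter> threadParameters) {
        for (SubclassThreadParameter parameter : threadParameters) {
            Consumer<Object> responder = threadSubclasses.get(parameter.threadSubclassId());
            if (responder != null) {
                new Thread(() -> responder.accept(parameter.data())).start();
            }
        }
    }
}
```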
In another embodiment, one task parameter corresponds to at least one parent class of data to be processed, the data to be processed of one parent class includes at least two different subclasses of data to be processed, and one thread parameter corresponds to one subclass of data to be processed. The business thread layer comprises: and different thread parent classes are used for processing the data to be processed of different subclasses of the same parent class.
Similarly, taking the image recognition processing module and the black/white list comparison processing module in the capability integration layer as examples, the thread parameters obtained by the image recognition processing module belong to one parent class of thread parameters, and the thread parameters obtained by the black/white list comparison processing module belong to another parent class of thread parameters.
The thread parameters obtained by the image recognition processing module include an identity identifier or a method name of the feature extraction thread and an identity identifier or a method name of the data forwarding thread, so that data corresponding to the identity identifier or the method name of the feature extraction thread is to-be-processed data of one subclass, and data corresponding to the identity identifier or the method name of the data forwarding thread is to-be-processed data of another subclass. The data to be processed of the two different subclasses belong to the same parent class, namely, the data to be processed belong to different subclasses in the parent class of the thread parameters obtained by the image recognition processing module.
The identity identifier or method name of the feature extraction thread corresponds to the thread parent class for feature extraction, and the identity identifier or method name of the data forwarding thread corresponds to the thread parent class for data forwarding.
The business thread layer comprises a plurality of parallel thread parent classes; each thread parent class is independent, and different thread parent classes do not affect each other, so the thread parent classes also have low coupling. A subclass processing module corresponds to the data to be processed of one subclass within the data to be processed of a parent class, so different thread parent classes correspond to the data to be processed of different subclasses of the same parent class.
For another example, the data to be processed n of the parent class corresponding to the parent processing module 1 includes the data to be processed n1 of one subclass and the data to be processed n2 of another subclass, each corresponding to one subclass processing module. If the business thread layer includes a thread parent class T1 (refer to fig. 5) and a thread parent class T2 (not shown in fig. 5), the thread parent class T1 may correspond to the data to be processed n1 of one subclass, and the thread parent class T2 may correspond to the data to be processed n2 of the other subclass.
The number of thread parent classes is not fixed, and the number of thread parent classes can be adjusted according to actual service requirements, expanded and the like.
In a preferred embodiment of the present disclosure, the data is Internet of Things data in the security field and includes at least one of the following: monitoring videos, snapshot pictures, and structured data.
Fig. 4 is a schematic structural diagram of a data processing apparatus according to another embodiment. The data processing apparatus includes:
and the receiving module is used for receiving the data processing request through the calling interface.
The first calling module is used for calling a service module corresponding to the data processing request in a service layer to obtain task parameters of the data processing request; wherein the service layer comprises: a plurality of service modules which can provide different business services and can be called in parallel.
The second calling module is used for calling the processing module corresponding to the task parameter in the capability integration layer to obtain a thread parameter responding to the data processing request; wherein the capability integration layer comprises: a plurality of processing modules that can provide different data processing methods and that are called in parallel.
The third calling module is used for calling the thread corresponding to the thread parameters in the business thread layer and responding to the data processing request; wherein the business thread layer comprises: a plurality of threads that can be called in parallel. Only some of the above modules are shown in fig. 4; the remaining modules are not shown.
The data processing apparatus further includes: a first capability integration layer, the first capability integration layer comprising: a parent class processing module and a plurality of child class processing modules which can be called in parallel.
The second calling module comprises:
and the first transmission unit is used for transmitting the task parameters to the parent processing module corresponding to the task parameters.
And the parent processing module distributes data corresponding to the subclass processing module in the task parameters received by the parent processing module to the subclass processing module according to a data processing method of the subclass processing module.
And the subclass processing module obtains a thread parameter responding to the data processing request according to the task parameter.
The task parameters correspond to at least one parent class of data to be processed; the data processing apparatus further includes: a second capability integration layer, the second capability integration layer comprising: and different parent processing modules are used for processing the data to be processed of different parent classes.
The data processing apparatus further includes: a first business thread layer, the first business thread layer comprising: a thread parent class, and a plurality of thread subclasses which are created according to the thread parent class, inherit the attributes of the thread parent class, and can be called in parallel.
The third calling module comprises:
and the second transmission unit is used for transmitting the thread parameters to the thread parent class corresponding to the thread parameters.
And the thread parent class distributes the data corresponding to the thread subclass in the thread parameters received by the thread parent class to the thread subclass according to a data processing method of the thread subclass.
And the thread subclass responds to the data processing request according to the thread parameter.
One task parameter corresponds to at least one parent class of data to be processed, the parent class of data to be processed comprises at least two different subclasses of data to be processed, and one thread parameter corresponds to one subclass of data to be processed. The data processing apparatus further includes: a second business thread layer, the second business thread layer comprising: and different thread parent classes are used for processing the data to be processed of different subclasses of the same parent class.
The technical solution of the present application further provides an electronic device, including:
a processor;
a memory storing program instructions that, when executed by the processor, cause the electronic device to perform the method of any of the embodiments described above.
The technical solution of the present application further provides a storage medium storing a program, and when the program is executed by a processor, the method in any one of the embodiments described above is performed. The storage medium comprises a non-transitory storage medium.
Referring to fig. 5, in another embodiment, another data processing method is provided. The method comprises the following steps:
and receiving a data processing request through a calling interface. The service layer comprises a plurality of service modules which can provide different service services and can be called in parallel, and the service modules can be expanded according to actual service requirements. After receiving the data processing request, calling a service module corresponding to the data processing request in the service layer, processing the data processing request through a service provided by the called service module to obtain processing data including task parameters, then performing parameter transmission, and transmitting the task parameters to the capability integration layer.
The capability integration layer comprises a plurality of parent processing modules which can be called in parallel; the service module transmits the task parameters to the corresponding parent processing module. The parent processing modules are independent of each other and do not affect each other, and different parent processing modules correspond to the data to be processed of different parent classes. A parent processing module 1 is shown in fig. 5.
Each parent processing module comprises a plurality of subclass processing modules which can provide different data processing methods and can be called in parallel. Different subclass processing modules are independent of each other and do not affect each other. A subclass processing module 1 and a subclass processing module 2 are shown in fig. 5; the two subclass processing modules provide different processing methods and can be called in parallel to process the data to be processed of different subclasses in the parent processing module 1, and each subclass processing module obtains the corresponding thread parameters and transmits them to the business thread layer.
The business thread layer comprises a plurality of thread parent classes which are independent of each other and do not affect each other, and the subclass processing module transmits the thread parameters to the corresponding thread parent class. Each thread parent class comprises a plurality of thread subclasses which can be called in parallel, and different thread subclasses are independent of each other. A thread parent class T1 is shown in fig. 5, comprising a thread subclass 1, a thread subclass 2, and a thread subclass 3. The three thread subclasses can be called in parallel and respond differently to the data processing request.
After the thread subclasses respond differently to the data processing request, the thread subclasses corresponding to a subclass processing module can be synchronized, for example with the thread synchronization tool CountDownLatch. After synchronization, the thread subclasses corresponding to the same subclass processing module can continue to execute other tasks only once they have all completed their responses to the data processing request. This can be realized with a counter that tracks the total number of thread subclasses corresponding to the same subclass processing module that have started responding to the data processing request and the number that have finished responding. Each time a thread subclass completes its response, the count of thread subclasses still responding is reduced by one; when it reaches zero, all thread subclasses corresponding to that subclass processing module have completed their responses. In this way, the consistency of the data that the same subclass processing module needs to process is guaranteed. Ensuring data consistency also addresses data safety: with the thread synchronization tool CountDownLatch, the next piece of data is processed only after all thread subclasses have completed their business operations, which prevents inaccuracies caused by data being chained together. In addition to the thread counter, a thread-lock mechanism is used in parameter passing; locking the shared data block prevents multiple threads from modifying the data at the same time and thus avoids thread deadlock and waiting.
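A minimal Java sketch of this CountDownLatch-based synchronization and the lock used in parameter passing follows. The responder type and the shared result list are assumptions of the sketch; the disclosure only requires that all thread subclasses of one subclass processing module finish responding before the next piece of data is processed.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.function.Supplier;

final class ThreadSubclassSynchronizer {
    private final List<Object> sharedResults = new ArrayList<>(); // shared data block for parameter passing

    /** Runs all thread-subclass responses in parallel and waits until every one has finished. */
    List<Object> respond(List<Supplier<Object>> threadSubclassResponses) throws InterruptedException {
        CountDownLatch latch = new CountDownLatch(threadSubclassResponses.size()); // one count per subclass
        for (Supplier<Object> response : threadSubclassResponses) {
            new Thread(() -> {
                try {
                    Object result = response.get();       // the thread subclass responds in parallel
                    synchronized (sharedResults) {        // lock only while passing the result back
                        sharedResults.add(result);
                    }
                } finally {
                    latch.countDown();                    // this response is counted as finished
                }
            }).start();
        }
        latch.await();   // the next piece of data may be processed only after all subclasses respond
        synchronized (sharedResults) {
            return new ArrayList<>(sharedResults);
        }
    }
}
```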
Through the above embodiments, the processor can be utilized more efficiently, data processing efficiency is improved, the coupling between components is reduced, and excessive dependence on a particular data processing framework is avoided.
Fig. 6 is a schematic diagram of a data processing method for a specific application scenario according to another embodiment.
Image data processing is taken as an example. An image data processing request is received through the calling interface. The service layer includes a service module which, in this embodiment, may be a service module for image data processing and is configured to provide the business service of image data processing.
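Purely as an illustration of the calling interface in this scenario (the interface and type names below are hypothetical, not defined by the disclosure), the entry point could be sketched as:

// Hypothetical calling interface: external callers submit an image data processing
// request, which is then routed to the image data processing service module.
interface ImageDataProcessingInterface {
    ProcessingResult submit(ImageDataProcessingRequest request);
}

// Minimal request/result placeholders for the sketch.
class ImageDataProcessingRequest {
    byte[] imageData;        // the image to be processed
    String requestedService; // identifier of the desired business service
}

class ProcessingResult {
    boolean success;
    String message;
}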
The capability integration layer comprises at least one parent processing module, and different parent processing modules provide and execute different image processing tasks. The task parameters received by the capability integration layer from the service layer may be any parameter of the image processing task execution process.
The capability integration layer shown in fig. 6 includes two parent processing modules, which may respectively be a target identity recognition capability set and a preset image library comparison capability set. In this case the image processing task includes at least a target identity recognition task and a preset image library comparison task. The target identity recognition task can be further split into a face recognition subtask and/or a human body recognition subtask, and so on. The preset image library comparison task comprises comparing the received image with a first image library and/or a second image library, wherein the first image library is different from the second image library.
The target identity recognition capability set comprises a subclass processing module for face recognition and/or a subclass processing module for human body recognition. The preset image library comparison capability set comprises a subclass processing module for first image library comparison and/or a subclass processing module for second image library comparison.
The business thread layer comprises at least one thread parent class, and different thread parent classes provide different responses to the image data processing request. The business thread layer shown in fig. 6 includes two thread parent classes: a thread parent class for feature extraction and a thread parent class for data forwarding. In this case the response to the image data processing request includes at least a feature extraction response and a data forwarding response. The feature extraction response can be further split into an eye feature extraction sub-response, a nose feature extraction sub-response and a mouth feature extraction sub-response. The data forwarding response can be further split into a sub-response for forwarding data to system A, a sub-response for forwarding data to system B and a sub-response for forwarding data to system C.
The thread parent class for feature extraction comprises a thread subclass for eye feature extraction, a thread subclass for nose feature extraction and a thread subclass for mouth feature extraction. The thread parent class for data forwarding comprises a thread subclass for forwarding data to system A, a thread subclass for forwarding data to system B and a thread subclass for forwarding data to system C.
Through the service module for image data processing, the task parameters for image data processing can be obtained; these include the ID or method name of the target identity recognition capability set and the ID or method name of the preset image library comparison capability set. The ID or method name of the target identity recognition capability set further comprises the ID or method name of the subclass processing module for face recognition and the ID or method name of the subclass processing module for human body recognition. The ID or method name of the preset image library comparison capability set further comprises the ID or method name of the subclass processing module for first image library comparison and the ID or method name of the subclass processing module for second image library comparison.
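A hedged sketch of what such nested task parameters might look like, using hypothetical identifiers rather than any defined by the disclosure:

import java.util.List;
import java.util.Map;

// Hypothetical shape of the task parameters produced by the image data processing
// service module: each capability-set ID (or method name) carries the IDs (or
// method names) of its subclass processing modules.
class ImageTaskParams {
    static final Map<String, List<String>> TASK_PARAMS = Map.of(
        "targetIdentityRecognitionSet",
            List.of("faceRecognitionModule", "humanBodyRecognitionModule"),
        "presetImageLibraryComparisonSet",
            List.of("firstImageLibraryComparisonModule", "secondImageLibraryComparisonModule")
    );
}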
The target identity recognition capability set is determined according to its ID or method name, and the corresponding task parameters are transmitted to the target identity recognition capability set. According to the ID or method name of the subclass processing module for face recognition, the target identity recognition capability set distributes the data corresponding to that module to the subclass processing module for face recognition. According to the ID or method name of the subclass processing module for human body recognition, the target identity recognition capability set distributes the data corresponding to that module to the subclass processing module for human body recognition.
The preset image library comparison capability set is determined according to its ID or method name, and the corresponding task parameters are transmitted to the preset image library comparison capability set. According to the ID or method name of the subclass processing module for first image library comparison, the preset image library comparison capability set distributes the data corresponding to that module to the subclass processing module for first image library comparison. According to the ID or method name of the subclass processing module for second image library comparison, it distributes the data corresponding to that module to the subclass processing module for second image library comparison.
The subclass processing module for face recognition can obtain thread parameters for face recognition, the subclass processing module for human body recognition can obtain thread parameters for human body recognition, the subclass processing module for first image library comparison can obtain thread parameters for first image library comparison, and the subclass processing module for second image library comparison can obtain thread parameters for second image library comparison.
Taking the subclass processing module for face recognition as an example, the thread parameters for face recognition include the ID or method name of the feature extraction thread parent class and the ID or method name of the data forwarding thread parent class. The ID or method name of the feature extraction thread parent class includes the ID or method name of the thread subclass for eye feature extraction, the ID or method name of the thread subclass for nose feature extraction, and the ID or method name of the thread subclass for mouth feature extraction. The ID or method name of the data forwarding thread parent class includes the ID or method name of the thread subclass for forwarding data to system A, the ID or method name of the thread subclass for forwarding data to system B, and the ID or method name of the thread subclass for forwarding data to system C.
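Correspondingly, a hypothetical sketch of the thread parameters produced for face recognition, again with illustrative identifiers only:

import java.util.List;
import java.util.Map;

// Hypothetical shape of the thread parameters built by the face recognition
// subclass processing module: each thread-parent ID (or method name) carries the
// IDs (or method names) of its thread subclasses; the image data travels with them.
class FaceRecognitionThreadParams {
    static final Map<String, List<String>> THREAD_PARAMS = Map.of(
        "featureExtractionThreadParent",
            List.of("eyeFeatureThread", "noseFeatureThread", "mouthFeatureThread"),
        "dataForwardingThreadParent",
            List.of("forwardToSystemA", "forwardToSystemB", "forwardToSystemC")
    );
}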
The feature extraction thread parent class is determined according to its ID or method name, and the corresponding thread parameters are transmitted to the feature extraction thread parent class. The feature extraction thread parent class then transmits the data corresponding to the thread subclass for eye feature extraction to that subclass according to its ID or method name, transmits the data corresponding to the thread subclass for nose feature extraction to that subclass according to its ID or method name, and transmits the data corresponding to the thread subclass for mouth feature extraction to that subclass according to its ID or method name.
The thread subclass serves as the unit that responds to and executes the request; different responses to the image data processing request are made through the thread subclass for eye feature extraction, the thread subclass for nose feature extraction and the thread subclass for mouth feature extraction.
The data forwarding thread parent class is determined according to its ID or method name, and the corresponding thread parameters are transmitted to the data forwarding thread parent class. The data forwarding thread parent class then transmits the data corresponding to the thread subclass for forwarding data to system A to that subclass according to its ID or method name, and likewise transmits the data corresponding to the thread subclasses for forwarding data to system B and to system C according to their respective IDs or method names.
The thread subclass serves as the unit that responds to and executes the request; different responses to the image data processing request are made through the thread subclass for forwarding data to system A, the thread subclass for forwarding data to system B and the thread subclass for forwarding data to system C.
The thread synchronization tool CountDownLatch, which may be implemented as a thread counter or the like, is used to synchronize the thread subclasses and ensure the consistency of the data to be processed by the same subclass processing module; see the description of the thread synchronization tool CountDownLatch in the embodiment above.
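As a hedged, end-to-end sketch of this synchronization in the fig. 6 face recognition flow (the thread-pool size, task bodies and names are all assumptions), the six thread subclasses could be run in parallel and gated by a latch before the next image is processed:

import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Hypothetical end-to-end use of the latch for one image handled by the face
// recognition subclass processing module: six thread subclasses respond in
// parallel, and the next image is dispatched only once all six have finished.
class FaceRecognitionFlow {
    private final ExecutorService pool = Executors.newFixedThreadPool(6);

    void handleOneImage(byte[] image) throws InterruptedException {
        List<Runnable> responses = List.of(
            () -> { /* eye feature extraction on the image */ },
            () -> { /* nose feature extraction on the image */ },
            () -> { /* mouth feature extraction on the image */ },
            () -> { /* forward data to system A */ },
            () -> { /* forward data to system B */ },
            () -> { /* forward data to system C */ }
        );
        CountDownLatch latch = new CountDownLatch(responses.size());
        for (Runnable r : responses) {
            pool.submit(() -> {
                try { r.run(); } finally { latch.countDown(); }
            });
        }
        latch.await(); // all six thread subclasses have completed their responses
        // ... the next image for this subclass processing module may now be processed
    }
}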
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The device embodiments described above are merely illustrative; for example, the division into units is only a logical functional division, and other divisions are possible in actual implementation, such as combining multiple units or components, integrating them into another system, or omitting or not implementing some features. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical or of another form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all the functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may serve separately as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in the form of hardware, or in the form of hardware plus a software functional unit.
In some cases, any two of the above technical features may be combined into a new method solution without conflict.
Those of ordinary skill in the art will understand that: all or part of the steps for implementing the method embodiments may be implemented by hardware related to program instructions, and the program may be stored in a computer readable storage medium, and when executed, the program performs the steps including the method embodiments; and the aforementioned storage medium includes: various media capable of storing program codes, such as a removable Memory device, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, and an optical disk.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (10)

1. A data processing method, comprising:
receiving a data processing request through a calling interface;
calling a service module corresponding to the data processing request in a service layer to obtain task parameters of the data processing request; wherein the service layer comprises: a plurality of service modules which can provide different business services and are called in parallel;
calling a processing module corresponding to the task parameter in the capability integration layer to obtain a thread parameter responding to the data processing request; wherein the capability integration layer comprises: a plurality of processing modules capable of providing different data processing methods and being called in parallel;
calling a thread corresponding to the thread parameter in a business thread layer, and responding to the data processing request; wherein the business thread layer comprises: a plurality of threads that can be called in parallel.
2. The method of claim 1, wherein the capability integration layer comprises: a parent class processing module and a plurality of subclass processing modules which can be called in parallel.
3. The method of claim 2, wherein calling the processing module corresponding to the task parameter in the capability integration layer to obtain a thread parameter responding to the data processing request comprises:
transmitting the task parameters to the parent processing module corresponding to the task parameters;
the parent processing module distributes data corresponding to the subclass processing module in the task parameters received by the parent processing module to the subclass processing module according to a data processing method of the subclass processing module;
and the subclass processing module obtains a thread parameter responding to the data processing request according to the task parameter.
4. The method according to claim 2 or 3, wherein the task parameters correspond to data to be processed of at least one parent class;
the capability integration layer includes: and different parent processing modules are used for processing the data to be processed of different parent classes.
5. The method of claim 1, wherein the business thread layer comprises: the system comprises a thread parent class and a plurality of thread subclasses which are created according to the thread parent class, inherit the property of the thread parent class and can be called in parallel.
6. The method of claim 5, wherein responding to the data processing request by invoking a thread in the business thread layer corresponding to the thread parameter comprises:
transmitting the thread parameter to the thread parent class corresponding to the thread parameter;
the thread parent class distributes the data corresponding to the thread subclass in the thread parameters received by the thread parent class to the thread subclass according to a data processing method of the thread subclass;
and the thread subclass responds to the data processing request according to the thread parameter.
7. The method according to claim 5 or 6, wherein the task parameter corresponds to at least one parent class of data to be processed, the data to be processed of one parent class comprises at least two different subclasses of data to be processed, and one of the thread parameters corresponds to one of the subclasses of data to be processed;
the business thread layer comprises: and different thread parent classes are used for processing the data to be processed of different subclasses of the same parent class.
8. A data processing apparatus, comprising:
the receiving module is used for receiving a data processing request through a calling interface;
the first calling module is used for calling a service module corresponding to the data processing request in a service layer to obtain a task parameter of the data processing request; wherein the service layer comprises: a plurality of service modules which can provide different business services and are called in parallel;
the second calling module is used for calling the processing module corresponding to the task parameter in the capability integration layer to obtain a thread parameter responding to the data processing request; wherein the capability integration layer comprises: a plurality of processing modules capable of providing different data processing methods and being called in parallel;
the third calling module is used for calling the thread corresponding to the thread parameter in the business thread layer and responding to the data processing request; wherein the business thread layer comprises: a plurality of threads that can be called in parallel.
9. An electronic device, comprising:
a processor;
a memory storing program instructions that, when executed by the processor, cause the electronic device to perform the method of any of claims 1-7.
10. A storage medium storing a program which, when executed by a processor, performs the method of any one of claims 1 to 7.
CN202110237243.7A 2021-03-03 2021-03-03 Data processing method and device, electronic equipment and storage medium Pending CN113157437A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110237243.7A CN113157437A (en) 2021-03-03 2021-03-03 Data processing method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113157437A true CN113157437A (en) 2021-07-23

Family

ID=76884079

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110237243.7A Pending CN113157437A (en) 2021-03-03 2021-03-03 Data processing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113157437A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104598212A (en) * 2013-10-30 2015-05-06 上海联影医疗科技有限公司 Image processing method and device based on algorithm library
JP2015108903A (en) * 2013-12-03 2015-06-11 日本電信電話株式会社 Distributed information cooperation system and data operation method therefor and program
CN105975346A (en) * 2016-05-25 2016-09-28 大唐网络有限公司 Method, device and system for scheduling thread resources
US20190129747A1 (en) * 2017-10-30 2019-05-02 EMC IP Holding Company LLC Elastic Scaling Job Thread Pool in a Cloud Event Process Infrastructure
CN111796936A (en) * 2020-06-29 2020-10-20 平安普惠企业管理有限公司 Request processing method and device, electronic equipment and medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113885956A (en) * 2021-09-29 2022-01-04 北京百度网讯科技有限公司 Service deployment method and device, electronic equipment and storage medium
CN113885956B (en) * 2021-09-29 2023-08-29 北京百度网讯科技有限公司 Service deployment method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210723