CN112735115A - Multithreading business processing method and device, server and storage medium - Google Patents

Multithreading business processing method and device, server and storage medium Download PDF

Info

Publication number
CN112735115A
CN112735115A (application CN202011598501.6A)
Authority
CN
China
Prior art keywords
service
executed
threads
thread
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011598501.6A
Other languages
Chinese (zh)
Inventor
费战波
师文佼
王坤明
魏帅
王海豹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SUNTRONT TECH CO LTD
Original Assignee
SUNTRONT TECH CO LTD
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SUNTRONT TECH CO LTD filed Critical SUNTRONT TECH CO LTD
Priority to CN202011598501.6A priority Critical patent/CN112735115A/en
Publication of CN112735115A publication Critical patent/CN112735115A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C19/00 Electric signal transmission systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46 Multiprogramming arrangements
    • G06F9/466 Transaction processing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Telephonic Communication Services (AREA)

Abstract

The application provides a multithreading business processing method, a multithreading business processing device, a server and a storage medium, and relates to the technical field of communications. The method comprises the following steps: when the execution condition of a current service to be executed is met, determining a target execution thread for executing the service to be executed according to the service type of the current service to be executed, the number of currently running threads, and the maximum number of threads the service platform allows to run, wherein the service type represents the service function of the service to be executed; and calling the target execution thread to execute the service to be executed. With this scheme, the service platform can process data of multiple intelligent terminals at the same time. For the execution of each service of each intelligent terminal, the relationship among the threads is allocated reasonably, so that different services are executed cooperatively across the threads and the server performance is optimized. Because each service of each intelligent terminal is executed by the threads, the efficiency with which the service platform executes the services of the intelligent terminals can be effectively improved.

Description

Multithreading business processing method and device, server and storage medium
Technical Field
The present application relates to the field of communications technologies, and in particular, to a method, an apparatus, a server, and a storage medium for processing a multi-thread service.
Background
With the continuous development of urban construction, smart meters such as smart water meters and electricity meters have become indispensable data acquisition devices. As the number of smart meters increases, the demand for meter reading grows accordingly, and realizing data acquisition services with higher efficiency becomes increasingly important.
In the prior art, the meter reading service is implemented as a single-threaded data meter reading service; that is, the execution of a meter reading task, data transmission, data reception, and the like are carried out sequentially by a single thread.
However, this approach results in low service processing efficiency.
Disclosure of Invention
An object of the present application is to provide a method, an apparatus, a server, and a storage medium for processing a multi-thread service, so as to solve the problem of low efficiency of processing a service in the prior art.
In order to achieve the above purpose, the technical solutions adopted in the embodiments of the present application are as follows:
in a first aspect, an embodiment of the present application provides a multithreading service processing method, which is applied to a service platform, where the service platform is used to process data of multiple intelligent terminals, and the method includes:
when the execution condition of the current service to be executed is met, determining a target execution thread for executing the service to be executed according to the service type of the current service to be executed, the number of threads currently running and the maximum number of threads allowed to run by a service platform, wherein the service type is used for representing the service function of the service to be executed;
and calling the target execution thread to execute the service to be executed.
Optionally, the determining, according to the service type of the current service to be executed, the number of threads currently running, and the maximum number of threads allowed to run by the service platform, a target execution thread for executing the service to be executed includes:
judging whether the number of the currently running threads reaches the maximum number of the threads allowed to run;
and if so, determining the target execution thread according to the service type of the current service to be executed and the service type of the service operated by each thread which is operated.
Optionally, the determining the target execution thread according to the service type of the current service to be executed and the service types of the services operated by the running threads includes:
determining the priority of the current service to be executed according to the service type of the current service to be executed;
and if the priority of the current service to be executed is higher than or equal to a preset priority, determining a thread executing a first service in the running threads as the target execution thread according to the service type of the service corresponding to the running threads, wherein the priority of the first service is lower than the priority of the current service to be executed.
Optionally, the method further comprises:
and if the priority of the current service to be executed is lower than the preset priority, taking the first thread which finishes running in the running threads as the target execution thread.
Optionally, the method further comprises:
if the number of the currently running threads does not reach the maximum number of the threads allowed to run, selecting one thread from a plurality of idle threads as the target execution thread.
Optionally, before determining a target execution thread for executing the service to be executed according to the service type of the service to be executed, the number of threads currently running, and the maximum number of threads allowed to run by the service platform, the method further includes:
and determining the maximum thread number allowed to run by the service platform according to the server configuration parameters of the service platform.
Optionally, the determining, according to the server configuration parameter of the service platform, the maximum number of threads allowed to be run by the service platform includes:
and determining the maximum thread number allowed to run by the service platform according to the central processing unit core number and the memory size of the server.
In a second aspect, an embodiment of the present application further provides a multithreading service processing apparatus, which is applied to a service platform, where the service platform is configured to process data of multiple intelligent terminals, and the apparatus includes: a determining module and an executing module;
the determining module is used for determining a target execution thread for executing the service to be executed according to the service type of the service to be executed, the number of threads currently running and the maximum number of threads allowed to run by a service platform when the executing condition of the current service to be executed is met, wherein the service type is used for representing the service function of the service to be executed;
the execution module is used for calling the target execution thread to execute the service to be executed.
Optionally, the determining module is specifically configured to determine whether the currently running thread number reaches the maximum thread number allowed to run; and if so, determining the target execution thread according to the service type of the current service to be executed and the service type of the service operated by each thread which is operated.
Optionally, the determining module is specifically configured to determine the priority of the current service to be executed according to the service type of the current service to be executed; and if the priority of the current service to be executed is higher than or equal to a preset priority, determining a thread executing a first service in the running threads as the target execution thread according to the service type of the service corresponding to the running threads, wherein the priority of the first service is lower than the priority of the current service to be executed.
Optionally, the determining module is further configured to, if the priority of the current service to be executed is lower than the preset priority, take a first thread that ends running in the running threads as the target execution thread.
Optionally, the determining module is specifically configured to, if the number of currently running threads does not reach the maximum number of threads allowed to run, select one thread from a plurality of idle threads as the target execution thread.
Optionally, the determining module is further configured to determine, according to a server configuration parameter of the service platform, a maximum number of threads allowed to be run by the service platform.
Optionally, the determining module is specifically configured to determine the maximum number of threads allowed to run by the service platform according to the number of central processing unit cores and the memory size of the server.
In a third aspect, an embodiment of the present application provides a server, including: a processor and a storage medium storing machine-readable instructions executable by the processor; when the server runs, the processor and the storage medium communicate over a bus, and the processor executes the machine-readable instructions to perform the steps of the multithreading service processing method provided in the first aspect.
In a fourth aspect, an embodiment of the present application provides a storage medium, where a computer program is stored on the storage medium, and when the computer program is executed by a processor, the steps of the multithread service processing method according to the first aspect are performed.
The beneficial effect of this application is:
the application provides a multithreading business processing method, a multithreading business processing device, a server and a storage medium, wherein the method comprises the following steps: when the execution condition of the current service to be executed is met, determining a target execution thread for executing the service to be executed according to the service type of the current service to be executed, the number of threads currently running and the maximum number of threads allowed to run by a service platform, wherein the service type is used for representing the service function of the service to be executed; and calling the target execution thread to execute the service to be executed. In the scheme, the service platform can simultaneously perform data processing on a plurality of intelligent terminals, for the execution of each service of each intelligent terminal, a target execution thread for executing the service to be executed can be determined from a plurality of threads according to the maximum thread number allowed to operate by the service platform, so that the target execution thread is called to execute the service to be executed, different services can be cooperatively executed among the threads by reasonably allocating the relationship among the threads, the performance of the server is optimal, compared with the prior art, each service of each intelligent terminal is sequentially executed through a single thread, each service of each intelligent terminal is concurrently executed through a plurality of threads, and the efficiency of the service platform for executing the service of the intelligent terminal can be effectively improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained from the drawings without inventive effort.
Fig. 1 is an architecture diagram of a multithreading service processing system according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a multithreading service processing method according to an embodiment of the present application;
fig. 3 is a schematic flowchart of another multithreading service processing method according to an embodiment of the present application;
fig. 4 is a schematic flowchart of another multithreading service processing method according to an embodiment of the present application;
fig. 5 is a schematic flowchart of another multithreading service processing method according to an embodiment of the present application;
fig. 6 is a schematic overall flow chart of a multithreading service processing method according to an embodiment of the present application;
fig. 7 is a schematic diagram of a multithreading service processing apparatus according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
In order to make the purpose, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it should be understood that the drawings in the present application are for illustrative and descriptive purposes only and are not used to limit the scope of protection of the present application. Additionally, it should be understood that the schematic drawings are not necessarily drawn to scale. The flowcharts used in this application illustrate operations implemented according to some embodiments of the present application. It should be understood that the operations of the flow diagrams may be performed out of order, and steps without logical context may be performed in reverse order or simultaneously. One skilled in the art, under the guidance of this application, may add one or more other operations to, or remove one or more operations from, the flowchart.
In addition, the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
In order to enable those skilled in the art to use the present disclosure, the following embodiments are given in conjunction with a specific application scenario, the "meter reading service". It will be apparent to those skilled in the art that the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the application. Although the present application is primarily described in the context of meter reading services, it should be understood that this is merely one exemplary embodiment; the present application may be applied to any other scenario. For example, it may also be applied to other intelligent-terminal data processing tasks that can be realized in Internet of things scenarios.
It should be noted that in the embodiments of the present application, the term "comprising" is used to indicate the presence of the features stated hereinafter, but does not exclude the addition of further features.
Fig. 1 is an architecture diagram of a multithreading service processing system according to an embodiment of the present application, and the service processing system may be used to implement the multithreading service processing method of the present application. As shown in fig. 1, the service processing system may include a service platform and a plurality of intelligent terminals. The service platform may be a server, a computer, or other processing equipment. The plurality of intelligent terminals may be of the same type or of different types, and each intelligent terminal may be any terminal device that interacts over a network, such as a water meter, an electricity meter, a natural gas meter, a television, or a projector; the interaction may be realized over networks such as the Internet of things, a cellular mobile network, or a local area network. The plurality of intelligent terminals may be connected to the service platform through the network, and the service platform and each intelligent terminal may exchange instructions so as to collect and store the data of the intelligent terminals. Optionally, different types of services executed in the service platform may be executed in parallel by invoking different threads, so as to implement efficient concurrent execution of the services and improve the efficiency of service execution.
Fig. 2 is a schematic flowchart of a multithreading service processing method according to an embodiment of the present application; the execution subject of the method may be a service platform in the service processing system shown in fig. 1, and the service platform may be a processing device such as a server or a computer. As shown in fig. 2, the method may include:
s201, when the execution condition of the current service to be executed is met, determining a target execution thread for executing the service to be executed according to the service type of the current service to be executed, the number of threads currently running and the maximum number of threads allowed to run by a service platform, wherein the service type is used for representing the service function of the service to be executed.
In the following embodiments of the application, the application of the method to the meter reading service is taken as an example scene, and the implementation of the steps of the method is described.
Optionally, the service to be executed in this embodiment may be any sub-service included in a larger service. Taking the meter reading service as an example of the larger service, the sub-services may include: a timing task detection service, a timing task execution service, a data sending service, a check timeout service, a data receiving service, and a data storage service, and the service to be executed may be any one of the plurality of sub-services.
When the service platform processes the service data of any intelligent terminal, each intelligent terminal can correspond to a plurality of sub-services, and when the service platform executes any sub-service of the intelligent terminal, a target execution thread corresponding to the sub-service can be determined from the plurality of threads, so that the target execution thread is called to execute the sub-service to be executed. At the same time, a plurality of running threads in the service platform can concurrently execute a plurality of sub-services in a plurality of sub-services corresponding to the intelligent terminal, so that high concurrency of service processing is realized.
In general, the service to be executed is not executed continuously at all times; it may start executing when triggered by another service. Taking the meter reading service as an example, after a meter reading task is started, the timing task detection service starts to execute. When a timing time is reached, the timing task detection service can trigger the timing task execution service to start; when the timing task execution service determines that a data instruction needs to be sent, it triggers the data sending service, so that the data sending service sends a data acquisition instruction to the target water meter, and so on. When a service to be executed meets its execution condition, a thread can be called to execute it so as to ensure its normal execution; when the execution of the service ends, the thread can enter a dormant state to wait for other services or for another trigger. In this way, the load of the server can be effectively reduced and server resources can be saved.
The target execution thread for executing the service to be executed can be determined according to the service type of the service to be executed, the number of currently running threads, and the maximum number of threads the service platform allows to run. The service type may refer to the function implemented by the service, for example: a timing service, a data sending service, a data receiving service, and the like. The maximum number of threads the service platform allows to run can be determined in advance according to the configuration of the service platform.
S202, calling a target execution thread to execute the service to be executed.
Optionally, when the execution condition of the service to be executed is satisfied, the target execution thread may be invoked to execute the service to be executed according to the determined target execution thread.
After the execution of the service to be executed is finished, the target execution thread is in an idle state, when other services to be executed meet the execution condition, the target execution thread of other services to be executed is determined continuously through the judging method, and the target execution thread can be different from other threads out of the idle state or the idle thread, so that the reasonable calling among the threads is achieved, and the performance of the server is optimal.
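To make the overall flow of steps S201 and S202 concrete, a minimal sketch in Java is given below. The class, interface, and enum names (ServicePlatform, WorkerThread, ServiceType, determineTargetThread) are illustrative assumptions and do not come from the patent; the selection logic itself is detailed in the later embodiments.

```java
// Illustrative sketch of steps S201/S202 only; names are hypothetical, not taken from the patent.
abstract class ServicePlatform {
    protected final int maxThreads;                 // maximum number of threads allowed to run
    protected ServicePlatform(int maxThreads) { this.maxThreads = maxThreads; }

    /** Called whenever the execution condition of a service to be executed is met. */
    void onExecutionConditionMet(ServiceType type, Runnable service) {
        // S201: determine the target execution thread from the service type,
        // the number of currently running threads and the allowed maximum.
        WorkerThread target = determineTargetThread(type, runningThreadCount(), maxThreads);
        // S202: invoke the target execution thread to execute the service.
        target.execute(service);
    }

    protected abstract WorkerThread determineTargetThread(ServiceType type, int running, int max);
    protected abstract int runningThreadCount();
}

interface WorkerThread { void execute(Runnable service); }

enum ServiceType { TIMING_DETECTION, TIMING_EXECUTION, DATA_SEND, CHECK_TIMEOUT, DATA_RECEIVE, DATA_STORE }
```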
In summary, the multithreading service processing method provided in this embodiment includes: when the execution condition of the service to be executed is met, determining a target execution thread for executing the service to be executed according to the service type of the current service to be executed, the number of currently running threads, and the maximum number of threads the service platform allows to run, wherein the service type represents the service function of the service to be executed; and calling the target execution thread to execute the service to be executed. In this scheme, the service platform can process data of multiple intelligent terminals at the same time. For each service of each intelligent terminal, a target execution thread can be determined from multiple threads according to the maximum number of threads the service platform allows to run, and the target execution thread is then called to execute the service. By reasonably allocating the relationship among the threads, different services can be executed cooperatively across the threads, so that the server performance is optimized. Compared with the prior art, in which each service of each intelligent terminal is executed sequentially by a single thread, each service of each intelligent terminal is here executed concurrently by multiple threads, so the efficiency with which the service platform executes the services of the intelligent terminals can be effectively improved.
Fig. 3 is a schematic flowchart of another multithreading service processing method according to an embodiment of the present application; optionally, in step S201, determining a target execution thread for executing the service to be executed according to the service type of the service to be executed, the number of threads currently running, and the maximum number of threads allowed to run by the service platform, where the determining may include:
s301, judging whether the number of the currently running threads reaches the maximum number of the threads allowed to run.
In an implementation manner, it can be determined whether the number of currently running threads has reached the maximum number of threads allowed to run, that is, whether the number of concurrently running threads has reached its upper limit.
Assuming that the maximum number of threads allowed to run is 3, and there are 3 threads currently executing different services, the number of threads currently running reaches the maximum number of threads allowed to run.
And S302, if so, determining a target execution thread according to the service type of the current service to be executed and the service type of the service operated by each thread in operation.
When the number of threads currently running reaches the maximum number of threads allowed to run, because the number of threads allowed to run concurrently by the service platform has reached the upper limit, the service to be executed cannot be run by starting a new thread, and then the target execution thread may be determined from the threads currently running according to the service type of the service to be executed and the service types of the services run by the threads currently running.
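A minimal sketch of this branch is given below; it covers step S301/S302 here and the idle-thread case of steps S501/S502 described later. ServiceTask, pickByPriority, and the list parameters are assumptions introduced only for illustration.

```java
// Sketch of the branch in S301/S302 (and S501/S502 below): check saturation first,
// then either reuse a running thread or hand the work to an idle one.
import java.util.List;

class ThreadSelector {
    private final int maxThreads;
    ThreadSelector(int maxThreads) { this.maxThreads = maxThreads; }

    Thread selectTarget(ServiceTask toExecute, List<Thread> running, List<Thread> idle) {
        if (running.size() >= maxThreads) {
            // S302: the limit is reached, so the target must come from the running
            // threads, chosen by the service types (priorities) of what they execute.
            return pickByPriority(toExecute, running);
        }
        // Below the limit: any idle thread may execute the service (here simply the first one).
        return idle.get(0);
    }

    private Thread pickByPriority(ServiceTask toExecute, List<Thread> running) {
        return running.get(0);   // placeholder; see the priority sketch further below
    }
}

interface ServiceTask { int priority(); }
```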
Fig. 4 is a schematic flowchart of another multithreading service processing method according to an embodiment of the present application; optionally, in step S302, determining a target execution thread according to the service type of the current service to be executed and the service types of the services run by the running threads, which may include:
s401, determining the priority of the current service to be executed according to the service type of the current service to be executed.
Generally, the multiple sub-services under a larger service have priorities among them, and the priority may represent the importance of each sub-service. For example, core sub-services of the meter reading service, such as data sending, data receiving, and data storage, are more important than services such as the timing service and the timeout detection, so their priorities are relatively higher.
Optionally, the priority of the service to be executed may be determined according to the service type of the service to be executed. Or, the priority of the service to be executed may also be determined according to the service identifier of the service to be executed and the pre-configured priority between the services. The manner of determining the specific priority may not be limiting.
S402, if the priority of the current service to be executed is higher than or equal to the preset priority, determining a thread executing a first service in the running threads as a target execution thread according to the service type of the service corresponding to the running threads, wherein the priority of the first service is lower than that of the current service to be executed.
Optionally, a preset priority may be set, and the to-be-executed service higher than or equal to the preset priority is used as the core service, and the to-be-executed service lower than the preset priority is used as the non-core service.
If the priority of the current service to be executed is higher than or equal to the preset priority, that is, the service to be executed is determined to be a core service, then, because the number of running threads has already reached the maximum number of threads allowed to run, a thread that is executing the first service may be determined as the target execution thread from among the multiple running threads.
In one case, the first service may be a service whose priority is lower than the preset priority; in another case, the first service may be any other service whose priority is lower than that of the current service to be executed.
Optionally, when the services run by the running threads include a service whose priority is lower than the preset priority, that is, a non-core service, the first service may be such a service. In the first situation, the services run by the running threads contain only one non-core service; the thread running that non-core service can then be determined as the target execution thread, the non-core service enters a dormant state, and the target execution thread is preferentially invoked to execute the higher-priority service to be executed. In the second situation, the services run by the running threads contain at least two non-core services; the thread running the non-core service with the lower priority may be determined as the target execution thread according to the priorities of those non-core services, or any one of the threads running the non-core services may be selected as the target execution thread.
Alternatively, when the services run by the running threads do not include any service with a priority lower than the preset priority, but only include services with priorities lower than that of the service to be executed, then, similarly to the above, the thread running the service with the lowest priority may be determined as the target execution thread, or one thread may be selected arbitrarily from the threads running services with priorities lower than that of the service to be executed.
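As an illustration of the preferred choice described above, a minimal sketch is given below. RunningThread, servicePriority, and the numeric priority representation are assumptions made for the example; the real implementation may represent service types and priorities differently.

```java
// Sketch of the choice in S401/S402: when the new service is a core service
// (priority >= the preset threshold), pick a running thread whose current service
// has lower priority, preferring non-core services.
import java.util.Comparator;
import java.util.List;
import java.util.Optional;

class PriorityPicker {
    static Optional<RunningThread> pick(int newPriority, int presetPriority, List<RunningThread> running) {
        if (newPriority < presetPriority) {
            return Optional.empty();   // non-core service: wait for a thread to finish (S403)
        }
        // Prefer a thread running a non-core service (priority below the preset threshold).
        Optional<RunningThread> nonCore = running.stream()
                .filter(t -> t.servicePriority() < presetPriority)
                .min(Comparator.comparingInt(RunningThread::servicePriority));
        if (nonCore.isPresent()) {
            return nonCore;
        }
        // Otherwise fall back to any service whose priority is below the new service's.
        return running.stream()
                .filter(t -> t.servicePriority() < newPriority)
                .min(Comparator.comparingInt(RunningThread::servicePriority));
    }
}

interface RunningThread { int servicePriority(); }
```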
It should be noted that this embodiment only lists a few preferred ways of determining the target execution thread, so as to coordinate the cooperation among the threads and optimize server performance. In practical applications, other ways of determining the target execution thread may exist and are not limited to those described above.
Optionally, the method may further comprise:
and S403, if the priority of the current service to be executed is lower than the preset priority, taking the first thread which finishes running in all running threads as a target execution thread.
Optionally, the above description is provided for a method for determining a target execution thread of a current service to be executed when the priority of the current service to be executed is higher than or equal to a preset priority. In this embodiment, a case that the priority of the current service to be executed is lower than the preset priority will be described.
Optionally, if the priority of the current service to be executed is lower than the preset priority, that is, the current service to be executed is a non-core service, it is not necessary to put other services to sleep and process the service to be executed preferentially as described above; instead, the platform waits on the currently running threads and determines the thread that finishes running first as the target execution thread.
That is, a non-core service queues behind the currently running threads and is executed once any one of the currently running threads finishes.
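One common way to obtain this "first thread to finish" behaviour is a fixed-size worker pool, as sketched below. This is only an illustration of the queueing idea for step S403 and is not stated as the patent's implementation; the class and method names are assumptions.

```java
// Sketch of S403: a non-core service waits until any running worker finishes.
// A fixed-size pool gives this behaviour for free: the submitted task is queued
// and picked up by the first worker whose current work ends.
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

class NonCoreSubmitter {
    private final ExecutorService pool;

    NonCoreSubmitter(int maxThreads) {
        this.pool = Executors.newFixedThreadPool(maxThreads);   // maximum threads allowed to run
    }

    Future<?> submitNonCore(Runnable service) {
        return pool.submit(service);   // executed by the first thread that becomes free
    }
}
```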
Fig. 5 is a schematic flowchart of another multithreading service processing method according to an embodiment of the present application; optionally, the method of the present application may further include:
s501, if the number of the currently running threads does not reach the maximum number of the threads allowed to run.
S502, one thread is selected from the idle threads as a target execution thread.
In some embodiments, the maximum number of threads allowed to run may be greater, such that the number of threads being run does not reach the maximum number of threads allowed to run. At this time, there may be some idle threads, and then, one thread may be directly selected from the idle threads as the target execution thread.
The above describes in detail the determination method of the target execution thread for executing the to-be-executed service through a plurality of embodiments. After the target execution thread is determined, the target execution thread may be invoked to execute the service to be executed.
Optionally, in step S201, before determining a target execution thread for executing the service to be executed according to the service type of the service to be executed, the number of threads currently running, and the maximum number of threads allowed to run by the service platform, the method of the present application may further include: and determining the maximum thread number allowed to run by the service platform according to the server configuration parameters of the service platform.
Optionally, before determining the target execution thread of the service to be executed, the maximum number of threads allowed to be run by the service platform may be determined in advance according to the configuration parameters of the server used by the service platform.
Optionally, determining the maximum number of threads allowed to be run by the service platform according to the server configuration parameter of the service platform may include: and determining the maximum thread number allowed to run by the service platform according to the core number of the central processing unit of the server and the size of the memory.
In some embodiments, the maximum number of threads allowed to run by the service platform may be determined according to the number of CPU cores and the size of the memory of the server.
Typically, the maximum number of threads allowed to run may be twice the number of server CPU cores. Optionally, if the server CPU is single-core, the maximum number of threads allowed to run may be determined to be 2, so 2 threads can run in parallel at the same time, and the target execution thread of the service to be executed, determined by the above method, is invoked from among these 2 threads to execute the service.
If the server CPU has 2 or 4 cores and the server memory is less than 16 GB, the server configuration is considered ordinary, and the maximum number of threads allowed to run may be 8. For services containing many sub-services, the target execution thread of the service to be executed is determined by comparing the number of currently running threads with the maximum number of threads.
If the server CPU has more than 4 cores and the server memory is greater than 16 GB, the server configuration is considered high, and the maximum number of threads allowed to run is larger, which can basically satisfy the parallel execution of the sub-services of most services. For a service to be executed, the target execution thread can then be determined directly from the idle threads to execute it.
Of course, the above-mentioned determination condition for the server configuration level may be adaptively adjusted according to the continuous update of the server, and this embodiment is described only as an example.
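A minimal sketch of such a sizing rule is given below, following the example tiers above. The limit chosen for the high-end case (cores * 4) and the use of the JVM Runtime for the memory check are assumptions; a real deployment might read the physical memory of the server instead.

```java
// Illustrative sketch of the sizing rule described above (2 threads for a single core,
// 8 for an "ordinary" 2-4 core / <16 GB server, a larger limit for a high-end one).
class ThreadLimit {
    static int maxThreadsAllowed() {
        int cores = Runtime.getRuntime().availableProcessors();
        long memoryBytes = Runtime.getRuntime().maxMemory();   // JVM's view of usable memory
        long sixteenGiB = 16L * 1024 * 1024 * 1024;

        if (cores == 1) {
            return 2;                                  // twice the core count for a single core
        } else if (cores <= 4 && memoryBytes < sixteenGiB) {
            return 8;                                  // "ordinary" configuration in the example
        } else {
            return cores * 4;                          // higher configuration: illustrative limit
        }
    }
}
```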
Fig. 6 is a schematic overall flow chart of a multithreading service processing method according to an embodiment of the present application. Optionally, a flow of the multithread service processing method according to the present application will be described below by taking a meter reading service as an example.
For the meter reading service, the corresponding sub-services may include: a timing task detection service, a timing task execution service, a data sending service, a check timeout service, a data receiving service, and a data storage service. The service to be executed may be any one of these sub-services.
A. The timing task detection service is used for executing the following service processes:
step a1, when a meter reading service is started and the timing task detection service meets the execution condition, determining a target execution thread of the timing task detection service by the method and calling the target execution thread to execute the timing task detection service.
Step a2, if it is detected that a timing task has reached its execution time, jump to step a3; otherwise, return to step a1.
Step a3, triggering and starting the timing task execution service.
Step a4, judging whether the data collection service (meter reading service) has stopped; if not, return to step a1 to continue the cyclic detection of timing tasks; if so, jump to step a5.
And a5, finishing the timing task detection service, and enabling the target execution thread to enter a dormant state.
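The loop formed by steps a1 to a5 can be sketched as follows. The helper method names and the polling interval are assumptions introduced for illustration only.

```java
// Minimal sketch of the timing-task detection loop (steps a1-a5).
class TimingTaskDetectionService {
    void run() throws InterruptedException {
        while (!meterReadingStopped()) {            // step a4: loop until the collection service stops
            if (hasTimingTaskDue()) {               // step a2: a timing task reached its execution time
                triggerTimingTaskExecution();       // step a3: trigger the timing task execution service
            }
            Thread.sleep(1_000);                    // step a1: periodic detection
        }
        // step a5: the detection service ends and its thread goes dormant
    }

    boolean meterReadingStopped() { return false; }        // stub for illustration
    boolean hasTimingTaskDue() { return false; }           // stub for illustration
    void triggerTimingTaskExecution() { /* notify service B */ }
}
```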
B. The timing task execution service is used for executing the following service processes:
step b1, receiving the trigger condition of the timing task detection service, meeting the execution condition of the timing task execution service, determining the target execution thread of the timing task execution service, and starting to execute the timing task.
Step b2, detecting whether the timing task needs to send data command, if yes, starting data sending service.
And b3, starting a data transmission service.
And b4, processing different timed task services according to different timed task types.
And b5, after the timing task is executed, starting the data storage service and storing the service record.
And b6, the target execution thread of the timing task execution service starts to sleep and waits for an enable signal.
And b7, if an enable signal is present, jump to step b1; otherwise, jump to step b8.
And b8, ending the timing task execution service.
C. The data transmission service is used for executing the following service processes:
and step c1, receiving the trigger condition of the timing task execution service, meeting the execution condition of the data sending service, determining a target execution thread of the data sending service, and starting to send the data instruction.
And c2, assembling the data frame of the sending instruction according to the protocol analysis module.
And c3, obtaining the target sending object of the sending instruction, wherein the target sending object can be a target table (for example, a target water meter) determined according to the identification of each table.
And c4, sending the data instruction to the target sending object, so that the target sending object returns the data corresponding to the data parameter type contained in the data instruction to the data receiving service, and starting the check timeout service.
And c5, processing the sending result, and recording a timeout result if a timeout is detected in step d2.
And c6, starting data storage service and saving the sending record.
And c7, ending the data transmission service.
D. The check timeout service is used to perform the following service flow:
and d1, receiving the trigger condition of the data sending service, meeting the execution condition of the overtime service, determining the target execution thread of the overtime service, and starting to detect overtime.
And d2, if the instruction sent by the data sending service times out, execute step c5.
Step d3, if the data receiving service (service E) times out while receiving the instruction, execute step e5.
Step d4, the target execution thread checking the timeout service starts sleeping, waiting for the enable signal.
And d5, jumping to step d1 when the enable signal exists, and jumping to step d6 when the enable signal does not exist.
Step d6, the check timeout service ends.
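As a rough illustration of a check timeout service, the sketch below arms a timer per sent instruction with a scheduled executor. The timeout value, the callback names, and the cancel-on-reply convention are assumptions and are not taken from the patent.

```java
// Illustrative sketch of a check-timeout service (flow D).
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;

class CheckTimeoutService {
    private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

    /** Arms a timeout for a sent instruction; onTimeout runs only if nothing cancels it first. */
    ScheduledFuture<?> arm(long timeoutMillis, Runnable onTimeout) {
        return scheduler.schedule(onTimeout, timeoutMillis, TimeUnit.MILLISECONDS);
    }
}
```

In this sketch the data sending service would call arm(...) around step c4 and cancel the returned future when a reply arrives; if the timer expires, onTimeout would record the timeout result as in steps c5 and e5.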
E. The data receiving service is used for executing the following service processes:
and e1, when detecting that the data frame sent by the target sending object (target water meter) is received, satisfying the data receiving service execution condition, and determining the target execution thread of the data receiving service.
And e2, analyzing the data frame.
And e3, obtaining the parsed data frame object (the data of the parameter type to be acquired, sent by the target water meter).
Step e4, start check timeout service.
And e5, processing the receiving result, and recording a timeout result if step e4 detects a timeout.
And e6, starting the data storage service and storing the receiving record.
And e7, if the received data frame requires a reply (when the data to be collected from the target water meter is sent in multiple frames, each received frame needs to be acknowledged), start the data sending service.
And e8, starting a data transmission service.
And e9, ending the data receiving service.
F. The data storage service is used for executing the following service processes:
and f1, when the triggering condition of the data sending service or the data receiving service is received, meeting the execution condition of the data storage service, determining a target execution thread of the data storage service, and starting to store data (wherein the stored data can be data sent by the data receiving service from a plurality of different target water meters, such as water meter flow and the like).
And f2, storing the service process (wherein the service process can be a service calculation executed according to the stored data, for example, calculating the water usage of the whole cell according to the data of each target water meter).
And f3, storing the service result (wherein the service result is the service result corresponding to the service flow).
And f4, saving the service log (all log information in the service execution process).
The corresponding services are invoked and executed through cooperation among the multiple threads: a thread is started when needed and sleeps when not needed, which can effectively reduce the load of the server and save server resources. At the same time, multiple threads run concurrently, which increases the throughput of service processing and improves service processing efficiency.
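The "sleep until triggered" pattern used by the sub-services above can be sketched as follows; the queue-based mechanism is one illustrative way to realize it and is not stated by the patent.

```java
// Sketch of a worker that stays dormant until another service triggers it.
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

class TriggeredWorker implements Runnable {
    private final BlockingQueue<Runnable> triggers = new LinkedBlockingQueue<>();

    void trigger(Runnable service) { triggers.add(service); }   // e.g. service B enabling service C

    @Override public void run() {
        try {
            while (true) {
                Runnable service = triggers.take();  // dormant (blocked) until triggered
                service.run();                       // execute the service, then sleep again
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();      // shut down when interrupted
        }
    }
}
```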
To sum up, the multithreading service processing method provided in the embodiment of the present application includes: when the execution condition of the service to be executed is met, determining a target execution thread for executing the service to be executed according to the service type of the current service to be executed, the number of currently running threads, and the maximum number of threads the service platform allows to run, wherein the service type represents the service function of the service to be executed; and calling the target execution thread to execute the service to be executed. In this scheme, the service platform can process data of multiple intelligent terminals at the same time. For each service of each intelligent terminal, a target execution thread can be determined from multiple threads according to the maximum number of threads the service platform allows to run, and the target execution thread is then called to execute the service. By reasonably allocating the relationship among the threads, different services can be executed cooperatively across the threads, so that the server performance is optimized. Compared with the prior art, in which each service of each intelligent terminal is executed sequentially by a single thread, each service of each intelligent terminal is here executed concurrently by multiple threads, so the efficiency with which the service platform executes the services of the intelligent terminals can be effectively improved.
Specific implementation processes and technical effects of apparatuses, servers, storage media and the like for executing the multithreading service processing method provided by the present application are described below, and are not described in detail below.
Fig. 7 is a schematic diagram of a multithreading service processing apparatus according to an embodiment of the present application, where functions implemented by the multithreading service processing apparatus correspond to steps executed by the foregoing method. The device can be understood as the server or the processor of the server, and can also be understood as a component which is independent of the server or the processor and realizes the functions of the application under the control of the server. The multithreading service processing device is applied to a service platform, the service platform is used for processing data of a plurality of intelligent terminals, as shown in fig. 7, the multithreading service processing device may include: a determination module 710, an execution module 720;
a determining module 710, configured to determine a target execution thread for executing the service to be executed according to the service type of the service to be executed, the currently running thread number, and the maximum thread number allowed to run by the service platform when the execution condition of the current service to be executed is met, where the service type is used to represent a service function of the service to be executed;
and the execution module 720 is configured to invoke the target execution thread to execute the service to be executed.
Optionally, the determining module 710 is specifically configured to determine whether the number of currently running threads reaches the maximum number of threads allowed to run; if yes, determining a target execution thread according to the service type of the current service to be executed and the service type of the service operated by each thread in operation.
Optionally, the determining module 710 is specifically configured to determine a priority of the current service to be executed according to the service type of the current service to be executed; and if the priority of the current service to be executed is higher than or equal to the preset priority, determining a thread executing a first service in the running threads as a target execution thread according to the service type of the service corresponding to the running threads, wherein the priority of the first service is lower than the priority of the current service to be executed.
Optionally, the determining module 710 is further configured to, if the priority of the current service to be executed is lower than a preset priority, take a first thread that ends running in the running threads as a target execution thread.
Optionally, the determining module 710 is specifically configured to, if the number of currently running threads does not reach the maximum number of threads allowed to run, select one thread from the plurality of idle threads as the target execution thread.
Optionally, the determining module 710 is further configured to determine, according to the server configuration parameter of the service platform, a maximum number of threads allowed to be run by the service platform.
Optionally, the determining module 710 is specifically configured to determine the maximum number of threads allowed to run by the service platform according to the number of central processing unit cores and the memory size of the server.
The above-mentioned apparatus is used for executing the method provided by the foregoing embodiment, and the implementation principle and technical effect are similar, which are not described herein again.
These above modules may be one or more integrated circuits configured to implement the above methods, such as: one or more Application Specific Integrated Circuits (ASICs), or one or more microprocessors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs), among others. For another example, when one of the above modules is implemented in the form of a Processing element scheduler code, the Processing element may be a general-purpose processor, such as a Central Processing Unit (CPU) or other processor capable of calling program code. For another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SOC).
The modules may be connected or in communication with each other via a wired or wireless connection. The wired connection may include a metal cable, an optical cable, a hybrid cable, etc., or any combination thereof. The wireless connection may comprise a connection over a LAN, WAN, bluetooth, ZigBee, NFC, or the like, or any combination thereof. Two or more modules may be combined into a single module, and any one module may be divided into two or more units. It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to corresponding processes in the method embodiments, and are not described in detail in this application.
Fig. 8 is a schematic structural diagram of a server according to an embodiment of the present application, where the server may be a computing device with a data processing function.
The server may include: a processor 801 and a memory 802.
The memory 802 is used for storing programs, and the processor 801 calls the programs stored in the memory 802 to execute the above-mentioned method embodiments. The specific implementation and technical effects are similar, and are not described herein again.
The memory 802 stores program code that, when executed by the processor 801, causes the processor 801 to perform the steps of the multithreading service processing method according to the various exemplary embodiments of the present application described above.
The Processor 801 may be a general-purpose Processor, such as a Central Processing Unit (CPU), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware components, and may implement or execute the methods, steps, and logic blocks disclosed in the embodiments of the present Application. A general purpose processor may be a microprocessor or any conventional processor or the like. The steps of a method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware processor, or may be implemented by a combination of hardware and software modules in a processor.
Memory 802, which is a non-volatile computer-readable storage medium, may be used to store non-volatile software programs, non-volatile computer-executable programs, and modules. The Memory may include at least one type of storage medium, for example a flash Memory, a hard disk, a multimedia card, a card-type Memory, a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Programmable Read Only Memory (PROM), a Read Only Memory (ROM), an electrically Erasable Programmable Read Only Memory (EEPROM), a magnetic Memory, a magnetic disk, an optical disk, and so on. The memory may also be any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto. The memory 802 in the embodiments of the present application may also be a circuit or any other device capable of performing a storage function, for storing program instructions and/or data.
Optionally, the present application also provides a program product, such as a computer readable storage medium, comprising a program which, when being executed by a processor, is adapted to carry out the above-mentioned method embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) or a processor (processor) to perform some steps of the methods according to the embodiments of the present application. And the aforementioned storage medium includes: a U disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.

Claims (10)

1. A multithreading business processing method, applied to a service platform, wherein the service platform is configured to process data of a plurality of intelligent terminals, the method comprising the following steps:
when the execution condition of the current service to be executed is met, determining a target execution thread for executing the service to be executed according to the service type of the current service to be executed, the number of threads currently running and the maximum number of threads allowed to run by a service platform, wherein the service type is used for representing the service function of the service to be executed;
and calling the target execution thread to execute the service to be executed.
2. The method according to claim 1, wherein the determining a target execution thread for executing the service to be executed according to the service type of the service to be executed currently, the number of threads currently running, and the maximum number of threads allowed to run by the service platform comprises:
judging whether the number of the currently running threads reaches the maximum number of the threads allowed to run;
and if so, determining the target execution thread according to the service type of the current service to be executed and the service type of the service operated by each thread which is operated.
3. The method according to claim 2, wherein the determining the target execution thread according to the service type of the current service to be executed and the service type of the service executed by each running thread comprises:
determining the priority of the current service to be executed according to the service type of the current service to be executed;
and if the priority of the current service to be executed is higher than or equal to a preset priority, determining a thread executing a first service in the running threads as the target execution thread according to the service type of the service corresponding to the running threads, wherein the priority of the first service is lower than the priority of the current service to be executed.
4. The method of claim 3, further comprising:
and if the priority of the current service to be executed is lower than the preset priority, taking the first thread which finishes running in the running threads as the target execution thread.
5. The method of claim 2, further comprising:
if the number of the currently running threads does not reach the maximum number of threads allowed to run, selecting one thread from a plurality of idle threads as the target execution thread.
6. The method according to any one of claims 1 to 5, wherein before determining the target execution thread for executing the service to be executed according to the service type of the service to be executed currently, the number of threads currently running, and the maximum number of threads allowed to run by the service platform, the method further comprises:
and determining the maximum thread number allowed to run by the service platform according to the server configuration parameters of the service platform.
7. The method according to claim 6, wherein the determining the maximum number of threads allowed to be run by the service platform according to the server configuration parameter of the service platform comprises:
and determining the maximum thread number allowed to run by the service platform according to the central processing unit core number and the memory size of the server.
8. A multithreading business processing device, applied to a service platform, wherein the service platform is configured to process data of a plurality of intelligent terminals, the device comprising: a determining module and an execution module;
the determining module is used for determining a target execution thread for executing the service to be executed according to the service type of the service to be executed, the number of threads currently running and the maximum number of threads allowed to run by a service platform when the execution condition of the service to be executed is met, wherein the service type is used for representing the service function of the service to be executed;
the execution module is used for calling the target execution thread to execute the service to be executed.
9. A server, comprising: a processor, a storage medium and a bus, the storage medium storing program instructions executable by the processor, wherein when the server runs, the processor and the storage medium communicate via the bus, and the processor executes the program instructions to perform the steps of the multithreading business processing method according to any one of claims 1 to 7.
10. A computer-readable storage medium, having stored thereon a computer program which, when executed by a processor, performs the steps of the multithreading business processing method according to any one of claims 1 to 7.
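As an illustration of the sizing step in claims 6 and 7 above, the following Java sketch derives a thread cap from the server's CPU core count and memory size. The concrete rule (two threads per core, bounded by an assumed per-thread memory budget of 64 MB) and the names ThreadCapEstimator and maxThreads are assumptions made for this example only; the claims do not fix a particular formula.

public final class ThreadCapEstimator {

    // Assumed per-thread working-set budget; not specified by the patent.
    private static final long ASSUMED_BYTES_PER_THREAD = 64L * 1024 * 1024;

    private ThreadCapEstimator() { }

    /** Derives a maximum thread count from the server's core count and usable memory. */
    public static int maxThreads(int cpuCores, long usableMemoryBytes) {
        int byCpu = cpuCores * 2;
        int byMemory = (int) Math.max(1, usableMemoryBytes / ASSUMED_BYTES_PER_THREAD);
        return Math.max(1, Math.min(byCpu, byMemory));
    }

    public static void main(String[] args) {
        int cores = Runtime.getRuntime().availableProcessors();
        long memory = Runtime.getRuntime().maxMemory();
        System.out.println("Suggested thread cap: " + maxThreads(cores, memory));
    }
}

A cap computed in this way could then be passed to a selector such as the one sketched after the description as its maxThreads parameter.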
CN202011598501.6A 2020-12-28 2020-12-28 Multithreading business processing method and device, server and storage medium Pending CN112735115A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011598501.6A CN112735115A (en) 2020-12-28 2020-12-28 Multithreading business processing method and device, server and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011598501.6A CN112735115A (en) 2020-12-28 2020-12-28 Multithreading business processing method and device, server and storage medium

Publications (1)

Publication Number Publication Date
CN112735115A true CN112735115A (en) 2021-04-30

Family

ID=75611462

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011598501.6A Pending CN112735115A (en) 2020-12-28 2020-12-28 Multithreading business processing method and device, server and storage medium

Country Status (1)

Country Link
CN (1) CN112735115A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20010066933A (en) * 1999-07-26 2001-07-11 포만 제프리 엘 A method for determining whether to issue a command from a disk controller to a disk drive, a disk controller and a memory media that stores a program
CN101742662A (en) * 2008-11-04 2010-06-16 鼎桥通信技术有限公司 Method for space division of HSDPA in multi-channel system
CN101996099A (en) * 2010-11-17 2011-03-30 山东中创软件工程股份有限公司 Method and system for processing information
CN104216765A (en) * 2014-08-15 2014-12-17 东软集团股份有限公司 Multithreading concurrent service processing method and system
CN104794077A (en) * 2015-04-07 2015-07-22 无锡天脉聚源传媒科技有限公司 Link list storage method and system
CN106603215A (en) * 2017-03-02 2017-04-26 重庆邮电大学 Unfair network channel resource sharing method based on ZigBee
CN108762897A (en) * 2018-04-08 2018-11-06 天芯智能(深圳)股份有限公司 Multitask management process and smartwatch
CN109634724A (en) * 2018-10-16 2019-04-16 深圳壹账通智能科技有限公司 Collecting method, device, equipment and computer storage medium

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113821174A (en) * 2021-09-26 2021-12-21 迈普通信技术股份有限公司 Storage processing method, device, network card equipment and storage medium
CN113821174B (en) * 2021-09-26 2024-03-22 迈普通信技术股份有限公司 Storage processing method, storage processing device, network card equipment and storage medium
CN115474109A (en) * 2022-11-01 2022-12-13 安徽博诺思信息科技有限公司 Electric power system multithread communication method and system based on CAN bus
CN115474109B (en) * 2022-11-01 2023-02-03 安徽博诺思信息科技有限公司 Electric power system multithreading communication method and system based on CAN bus

Similar Documents

Publication Publication Date Title
Nishio et al. Service-oriented heterogeneous resource sharing for optimizing service latency in mobile cloud
US8805322B2 (en) Method, apparatus, and mobile phone for measuring and displaying internet traffic of mobile phone
EP2701074A1 (en) Method, device, and system for performing scheduling in multi-processor core system
CN112735115A (en) Multithreading business processing method and device, server and storage medium
US20150347305A1 (en) Method and apparatus for outputting log information
CN101385000A (en) System and method for multi-processor application support
Qian et al. Jade: An efficient energy-aware computation offloading system with heterogeneous network interface bonding for ad-hoc networked mobile devices
CN111163018B (en) Network equipment and method for reducing transmission delay thereof
CN102325255A (en) Multi-core CPU (central processing unit) video transcoding scheduling method and multi-core CPU video transcoding scheduling system
CN110048879A (en) Micro services register method, device, electronic equipment and computer readable storage medium
CN111045810A (en) Task scheduling processing method and device
CN111767199A (en) Resource management method, device, equipment and system based on batch processing operation
CN110489242B (en) Distributed data computing method, device, terminal equipment and storage medium
CN111984402A (en) Unified scheduling monitoring method and system for thread pool
CN106162569B (en) A kind of the transmitting-receiving conflict processing method and device of Multi-card multi-standby communication terminal
CN111985634A (en) Operation method and device of neural network, computer equipment and storage medium
KR101747113B1 (en) The method for executing cloud computing
CN115334001A (en) Data resource scheduling method and device based on priority relation
CN109062702B (en) Computing resource allocation method, related device and readable storage medium
CN110391952B (en) Performance analysis method, device and equipment
CN116502870B (en) Scheduling policy determination method, device, management terminal and storage medium
CN111277665B (en) 3D application scheduling method, device, terminal and storage medium based on interference measurement
CN114598705B (en) Message load balancing method, device, equipment and medium
CN114860409A (en) Data acquisition method, device and system
CN116775253A (en) Task scheduling method, device and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20210430)