CN106940658B - Task processing method and device based on thread pool - Google Patents

Task processing method and device based on thread pool

Info

Publication number
CN106940658B
Authority
CN
China
Prior art keywords
thread pool
task
sub
thread
pool
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710093555.9A
Other languages
Chinese (zh)
Other versions
CN106940658A (en)
Inventor
鲁可
李微
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN201710093555.9A
Publication of CN106940658A
Application granted
Publication of CN106940658B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46 Multiprogramming arrangements
    • G06F9/48 Program initiating; Program switching, e.g. by interrupt
    • G06F9/4806 Task transfer initiation or dispatching
    • G06F9/4843 Task transfer initiation or dispatching by program, e.g. task dispatcher, supervisor, operating system
    • G06F9/4881 Scheduling strategies for dispatcher, e.g. round robin, multi-level priority queues
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The disclosure provides a task processing method and device based on a thread pool, belonging to the field of computer technology. The method comprises the following steps: receiving a task rejection instruction from a thread pool, obtaining the task rejected by the thread pool according to the task rejection instruction, and processing the task through a sub-thread pool of the thread pool. The method and device can improve the thread-pool-based task processing capability of a computer device.

Description

Task processing method and device based on thread pool
Technical Field
The present disclosure relates to the field of computer application technologies, and in particular, to a task processing method and apparatus based on a thread pool.
Background
With the rapid development of computer application technology, most computer devices use thread pools for task processing: the multi-threaded parallel processing within a thread pool makes full use of the device's resources and improves task processing efficiency.
However, the task processing capacity of each thread pool is limited. After a task is submitted to a thread pool, it may be rejected, for example because the pool's processing capacity has been exceeded; the rejected task is then discarded without being processed, which reduces the task processing capability of the computer device.
Fig. 1 is a diagram illustrating the structure of the thread pool provided by the Java platform for task processing according to an exemplary embodiment. As shown in fig. 1, when a task is submitted to the thread pool, if the number of threads in the thread pool is smaller than corePoolSize (the core size of the thread pool), the thread pool creates a new thread to process the task; if the number of threads is not less than corePoolSize, the task is placed into the task cache queue while the queue is not full, and a new thread is created to process the task when the queue is full but the number of threads is still less than maxPoolSize (the maximum number of threads); however, if the task cache queue is full and the number of threads has reached maxPoolSize, the task is rejected and cannot be processed, which reduces the task processing capability of the computer device.
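As an illustrative aside (not part of the patent text), this rejection behavior can be reproduced with a short Java sketch using the standard ThreadPoolExecutor; the pool sizes and sleep time below are arbitrary demonstration values. With the default AbortPolicy, the overflow task surfaces as a RejectedExecutionException and is simply dropped.
    import java.util.concurrent.*;

    public class RejectionDemo {
        public static void main(String[] args) {
            // corePoolSize = 1, maxPoolSize = 2, task cache queue bounded to a single entry
            ThreadPoolExecutor pool = new ThreadPoolExecutor(
                    1, 2, 60L, TimeUnit.SECONDS,
                    new ArrayBlockingQueue<Runnable>(1));
            Runnable slowTask = () -> {
                try { Thread.sleep(1000); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
            };
            // Submission 1 occupies the core thread, submission 2 fills the queue,
            // submission 3 forces creation of the second (max) thread, and submission 4
            // is rejected because both threads are busy and the queue is full.
            for (int i = 1; i <= 4; i++) {
                try {
                    pool.execute(slowTask);
                    System.out.println("task " + i + " accepted");
                } catch (RejectedExecutionException e) {
                    System.out.println("task " + i + " rejected");  // default AbortPolicy: the task is dropped
                }
            }
            pool.shutdown();
        }
    }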
Therefore, when a current computer device processes tasks based on thread pools, a task that has been rejected by a thread pool cannot be processed, which results in a lower task processing capability.
Disclosure of Invention
In order to solve the technical problem that the task processing capacity based on the thread pool is low in the related art, the disclosure provides a task processing method and device based on the thread pool.
A task processing method based on a thread pool comprises the following steps:
receiving a task rejection instruction of a thread pool;
obtaining the tasks rejected by the thread pool according to the task rejection instruction;
and processing the task through a sub-thread pool of the thread pool.
A task processing device based on a thread pool, comprising:
the rejection receiving module is used for receiving a task rejection instruction of the thread pool;
the task obtaining module is used for obtaining the tasks rejected by the thread pool according to the task rejection instruction;
and the task processing module is used for processing the task through the sub thread pool of the thread pool.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
When task processing is performed based on a thread pool, a task rejection instruction of the thread pool is received, the task rejected by the thread pool is obtained according to the task rejection instruction, and the task is then processed by the sub-thread pool of the thread pool. This prevents the task from going unprocessed because it was rejected by the thread pool; the thread pool and its sub-thread pool process tasks in parallel, which greatly improves the task processing capability of the computer device.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a block diagram illustrating task processing by the thread pool provided by the Java platform, according to an illustrative embodiment;
FIG. 2 is a flowchart illustrating a method for task processing based on thread pools in accordance with an exemplary embodiment;
FIG. 3 is a flow diagram illustrating another method for thread pool based task processing according to the corresponding embodiment of FIG. 2;
FIG. 4 is a flow diagram illustrating another method for thread pool based task processing according to the corresponding embodiment of FIG. 2;
FIG. 5 is a flow diagram illustrating another method for thread pool based task processing according to the corresponding embodiment of FIG. 2;
FIG. 6 is a flowchart illustrating an implementation of step S150 in the task processing method based on thread pools according to the corresponding embodiment of FIG. 2;
FIG. 7 is a diagram illustrating a particular application scenario for thread pool based task processing in accordance with an illustrative embodiment;
FIG. 8 is a block diagram illustrating a task processing device based on thread pools in accordance with an illustrative embodiment;
FIG. 9 is a block diagram of another thread pool based task processing device shown in accordance with the corresponding embodiment of FIG. 8;
FIG. 10 is a block diagram of another thread pool based task processing device according to the corresponding embodiment of FIG. 8;
FIG. 11 is a block diagram of another thread pool based task processing device shown in accordance with the corresponding embodiment of FIG. 8;
FIG. 12 is a block diagram of a task processing module 150 in the thread pool based task processing apparatus according to the corresponding embodiment of FIG. 8;
FIG. 13 is a schematic diagram illustrating a computer device in one implementation environment, according to an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
In one embodiment, the implementation environment of the present disclosure includes a computer device and a task processing module installed on the computer device. The computer device has an independent operating system and an independent operating space, on which software, including software provided by third-party service providers, can be installed; for example, the computer device may be any of various intelligent system processing devices. The task processing module is a hardware module capable of processing tasks based on a thread pool.
FIG. 2 is a flowchart illustrating a method for task processing based on thread pools in accordance with an exemplary embodiment. As shown in fig. 2, the task processing method based on the thread pool may include the following steps.
In step S110, a task rejection instruction of the thread pool is received.
A thread pool is a form of multi-threaded processing in which tasks are added to a task cache queue and automatically executed after a thread is created; after a task is executed, the thread is automatically recycled.
A thread, also called a lightweight process (LWP), is the smallest unit of a program execution flow.
A task rejection instruction is an instruction issued by the thread pool for a task it refuses to accept.
It will be appreciated that each thread pool in a computer device has limited task processing capacity. When the task load of a thread pool exceeds this limit, the thread pool refuses to accept a new task and issues a task rejection instruction for that task.
Thus, after the thread pool issues a task rejection instruction for the task, the computer device receives the task rejection instruction.
In a specific exemplary embodiment, the maximum thread count of thread pool A is maxPoolSize. When the number of threads in thread pool A reaches maxPoolSize and the task cache queue is full, if a new task is still submitted to thread pool A, thread pool A refuses to accept the task and issues a task rejection instruction for that task.
In step S130, a task rejected by the thread pool is obtained according to the task rejection instruction.
It will be appreciated that each task in the computer device has its corresponding identification information.
The task rejection instruction is issued by the thread pool for rejection of a certain task, and the task rejection instruction includes identification information of the task, such as a task sequence number.
Therefore, the rejected task can be found according to the identification information in the task rejection instruction.
In step S150, the task is processed by the sub-thread pool of the thread pool.
A sub-thread pool is a subordinate thread pool in the computer device that belongs to a thread pool.
The sub-thread pool stores a reference to the thread pool to which it belongs and shares that thread pool's task cache queue, so that it can obtain the task cache queue of its parent thread pool and assist the parent thread pool in task processing.
For example, if the computer device has a thread pool A and a sub-thread pool B, where the sub-thread pool B is a sub-thread pool of the thread pool A, then the sub-thread pool B can obtain the task cache queue of the thread pool A.
After receiving the task, the sub-thread pool executes the rejected task by creating a new thread, so that the task is processed.
With this method, after a task is rejected by the thread pool, the task is processed by the sub-thread pool of the thread pool, so the situation in which the task cannot be processed is avoided and the thread-pool-based task processing capability is greatly improved.
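As a concrete, non-limiting illustration of steps S110 to S150 on the Java platform: the standard RejectedExecutionHandler callback can play the role of the task rejection instruction, since the rejected Runnable it receives corresponds to the task obtained in step S130. The class and field names below (SubPoolRejectionHandler, subPool) are illustrative assumptions, not taken from the patent.
    import java.util.concurrent.*;

    // Hypothetical handler: forwards every task rejected by the parent pool to a sub-thread pool.
    public class SubPoolRejectionHandler implements RejectedExecutionHandler {
        private final ExecutorService subPool = Executors.newCachedThreadPool();

        @Override
        public void rejectedExecution(Runnable task, ThreadPoolExecutor parentPool) {
            // The callback itself acts as the task rejection instruction (step S110);
            // the rejected task arrives as the argument (step S130) and is handed to the
            // sub-thread pool for processing (step S150) instead of being discarded.
            subPool.execute(task);
        }
    }
Installing the handler with new ThreadPoolExecutor(core, max, keepAlive, unit, queue, new SubPoolRejectionHandler()) means every overflow task is executed by the sub-thread pool rather than being dropped.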
FIG. 3 is a flowchart illustrating a method for thread pool based task processing, according to an example embodiment. As shown in fig. 3, before step S150 shown in fig. 2 in the corresponding embodiment, the method for processing tasks based on thread pools may further include the following steps.
In step S210, it is determined whether or not a sub thread pool exists in the thread pool, and if no (N), step S230 is executed, and if yes (Y), step S150 is executed.
After the task rejected by the thread pool is obtained, it is judged whether any thread pool belongs to this thread pool, and it is thereby determined whether this thread pool has a sub-thread pool.
Judging whether a thread pool belongs to this thread pool, that is, whether this thread pool has a sub-thread pool, amounts to judging whether this thread pool shares its task cache queue with another thread pool.
When the thread pool shares its task cache queue with another thread pool, task sharing information is added to the thread pool.
Therefore, after the thread pools to which task sharing information has been added are found by searching the task sharing information of each thread pool, the sub-thread pool sharing this thread pool's task cache queue is determined according to the task sharing information.
When the thread pool that rejected the task does not have a sub-thread pool, a sub-thread pool is created for it; when the thread pool that rejected the task already has a sub-thread pool, the task is processed through that sub-thread pool.
In step S230, a child thread pool is created for the thread pool.
A subordinate thread pool, namely a sub-thread pool, is created for the thread pool according to the thread pool's task cache queue, and the thread pool's task cache queue is set to be shared with the sub-thread pool.
When the sub-thread pool is created, task sharing information is added to the thread pool, so that the sub-thread pool can share the task cache queue of the thread pool and monitor the tasks in the thread pool.
In a specific exemplary embodiment, when a thread pool whose task processing capacity has been exceeded receives a task, it issues a task rejection instruction for the task. According to the task rejection instruction, if the thread pool does not have a sub-thread pool, a sub-thread pool is created for it, and the task rejected by the thread pool is then executed by a thread of the sub-thread pool.
With this method, after the thread pool rejects a task, a sub-thread pool is created for the thread pool if it does not already have one, and the rejected task is then processed by the sub-thread pool; this prevents the task from going unprocessed and greatly improves the thread-pool-based task processing capability.
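The lazy creation just described might look like the following sketch (an assumption-laden illustration, not the patented implementation): the sub-thread pool is created on the first rejection and reused for subsequent rejections, and copying the parent pool's sizes is itself an assumed sizing policy.
    import java.util.concurrent.*;

    // Illustrative handler that creates the sub-thread pool only when the first rejection occurs.
    public class LazySubPoolHandler implements RejectedExecutionHandler {
        private ThreadPoolExecutor subPool;  // remains null until the first task is rejected

        @Override
        public synchronized void rejectedExecution(Runnable task, ThreadPoolExecutor parentPool) {
            if (subPool == null) {
                // No sub-thread pool exists yet (step S210 "no"): create one for the thread pool (step S230).
                subPool = new ThreadPoolExecutor(
                        parentPool.getCorePoolSize(), parentPool.getMaximumPoolSize(),
                        60L, TimeUnit.SECONDS, new LinkedBlockingQueue<Runnable>());
            }
            subPool.execute(task);  // step S150: process the rejected task in the sub-thread pool
        }
    }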
FIG. 4 is a flowchart illustrating a method for task processing based on thread pools in accordance with an exemplary embodiment. As shown in fig. 4, the step S150 shown in fig. 2 according to the corresponding embodiment may further include the following steps.
In step S310, it is determined whether the task buffer queue of the sub thread pool is empty, and if yes (Y), step S330 is performed, and if no (N), step S311 is performed.
It is understood that the sub-thread pool is used to assist the thread pool to which it belongs in task processing.
When the task buffer queue of the sub-thread pool is empty, the sub-thread pool is idle, which indicates that the tasks assigned to it have all been executed. To further assist the thread pool to which it belongs, the sub-thread pool extracts a task from that thread pool, thereby reducing the parent thread pool's task load and improving the thread-pool-based task processing capability of the computer device.
When the task buffer queue of the sub-thread pool is not empty, the sub-thread pool continues to process the tasks in its own task buffer queue.
In a specific exemplary embodiment, when the sub-thread pool performs task processing, tasks are extracted from its task cache queue in order and executed by threads of the sub-thread pool; after the tasks in the sub-thread pool's task cache queue have all been executed, tasks are extracted from the task cache queue of the thread pool to which the sub-thread pool belongs and executed by threads of the sub-thread pool, until the task cache queues of both the sub-thread pool and its parent thread pool are empty.
In step S330, a task is extracted from the task buffer queue of the thread pool.
In step S311, the processing of the task in the task buffer queue is continued.
In step S350, a task is executed using threads of the child thread pool.
With this method, the sub-thread pool assists the thread pool to which it belongs in task processing: when the sub-thread pool's task cache queue is empty, a task is extracted from the parent thread pool and executed by a thread of the sub-thread pool. Multiple thread pools thus process tasks in parallel, which speeds up task processing and improves the task processing capability of the thread pools.
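A worker loop along the following lines could realize steps S310 to S350, under the assumption that the sub-thread pool has direct references to both task cache queues (two BlockingQueue<Runnable> instances); the 100 ms polling timeout is an arbitrary illustrative choice.
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.TimeUnit;

    // Illustrative worker of the sub-thread pool: drains its own queue first,
    // and only takes from the parent pool's queue when its own queue is empty.
    public class SubPoolWorker implements Runnable {
        private final BlockingQueue<Runnable> subQueue;
        private final BlockingQueue<Runnable> parentQueue;

        public SubPoolWorker(BlockingQueue<Runnable> subQueue, BlockingQueue<Runnable> parentQueue) {
            this.subQueue = subQueue;
            this.parentQueue = parentQueue;
        }

        @Override
        public void run() {
            try {
                while (!Thread.currentThread().isInterrupted()) {
                    Runnable task = subQueue.poll(100, TimeUnit.MILLISECONDS);  // S310/S311: own queue first
                    if (task == null) {
                        task = parentQueue.poll();  // S330: sub queue is empty, extract from the parent pool
                    }
                    if (task != null) {
                        task.run();  // S350: execute the task on a sub-thread-pool thread
                    }
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();  // exit cleanly when the sub pool is destroyed
            }
        }
    }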
FIG. 5 is a flow diagram illustrating another method for thread pool based task processing in accordance with an illustrative embodiment. As shown in fig. 5, the step S150 shown in fig. 2 according to the corresponding embodiment may further include the following steps.
In step S410, task buffer queue detection is performed on the thread pool and the sub-thread pool of the thread pool, respectively.
The task buffer queue detection is to detect whether there are still unexecuted tasks in the task buffer queue.
In step S430, when the task buffer queues in the thread pool and the sub-thread pools in the thread pool are empty, the sub-thread pool in the thread pool is destroyed.
It can be understood that when the task buffer queues of both the thread pool and its sub-thread pool are empty, the thread pool and its sub-thread pool have finished processing their tasks; both are idle and no task is being executed.
If the sub-thread pool were kept alive while both task buffer queues are empty, system resources would be wasted. At this point, even if a new task arrives, the thread pool alone can process it, so destroying the sub-thread pool does not affect the task processing capability, while it does avoid wasting system resources.
With this method, when the task buffer queues of the thread pool and its sub-thread pool are both empty, the sub-thread pool of the thread pool is destroyed, so that system resources are not wasted while task processing is unaffected.
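With java.util.concurrent, the destruction condition could be checked periodically along the following lines; this is a sketch only, and the polling approach, the extra getActiveCount() guard, and the use of shutdown() rather than shutdownNow() are assumptions rather than details taken from the patent.
    import java.util.concurrent.*;

    // Illustrative housekeeping task: destroys the sub-thread pool once both task cache queues are empty.
    public class SubPoolReaper implements Runnable {
        private final ThreadPoolExecutor parentPool;
        private final ThreadPoolExecutor subPool;

        public SubPoolReaper(ThreadPoolExecutor parentPool, ThreadPoolExecutor subPool) {
            this.parentPool = parentPool;
            this.subPool = subPool;
        }

        @Override
        public void run() {
            // Step S410: detect both task cache queues; step S430: destroy the sub pool when both are empty.
            if (parentPool.getQueue().isEmpty()
                    && subPool.getQueue().isEmpty()
                    && subPool.getActiveCount() == 0) {
                subPool.shutdown();  // releases the sub pool's threads so system resources are not wasted
            }
        }
    }
Such a reaper could, for example, be scheduled with a ScheduledExecutorService via scheduler.scheduleAtFixedRate(reaper, 1, 1, TimeUnit.SECONDS).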
Fig. 6 is a depiction of details of step S150 in fig. 2, shown in accordance with an exemplary embodiment. The step S150 may include the following steps.
In step S151, the task is placed in the task buffer queue of the sub-thread pool.
When a thread pool processes a task, the task is first placed into the thread pool's task cache queue; when the number of threads in the thread pool has not reached the thread capacity, a new thread is created to execute the tasks in the task cache queue.
In the same way, after receiving the new task, the sub-thread pool places the task into its own task cache queue so that it can be executed once the sub-thread pool creates a thread.
In step S153, it is determined whether the number of threads in the sub-thread pool has reached the preset thread capacity; if no (N), step S155 is executed, and if yes (Y), the tasks in the task cache queue of the sub-thread pool are not processed for the time being.
It should be noted that every thread pool has a preset thread capacity. Controlling the number of threads in a thread pool avoids a system crash caused by excessive threads occupying system resources.
When the number of threads in the sub-thread pool has not reached the preset thread capacity, a new thread is created in the sub-thread pool to process the tasks in the task cache queue; when the number of threads has reached the preset thread capacity, the tasks in the task cache queue are processed after the currently executing tasks finish.
In step S155, the task in the task buffer queue is executed by the thread of the sub-thread pool.
With this method, when the number of threads in the sub-thread pool has not reached the preset thread capacity, threads of the sub-thread pool are used to execute the tasks in the task cache queue; this avoids the system crash that could result from continuing to create threads, and thus occupying excessive system resources, after the number of threads in the sub-thread pool has reached the preset thread capacity.
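If the sub-thread pool were hand-rolled rather than backed by ThreadPoolExecutor, the capacity check of steps S151 to S155 might look like the sketch below; the class name, field names, and the decision to let idle workers block on the queue are assumptions made for illustration.
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;
    import java.util.concurrent.atomic.AtomicInteger;

    // Minimal hand-rolled sub-thread pool illustrating the preset thread capacity check.
    public class SimpleSubPool {
        private final int presetThreadCapacity;
        private final AtomicInteger threadCount = new AtomicInteger(0);
        private final BlockingQueue<Runnable> taskCacheQueue = new LinkedBlockingQueue<>();

        public SimpleSubPool(int presetThreadCapacity) {
            this.presetThreadCapacity = presetThreadCapacity;
        }

        public void submit(Runnable task) {
            taskCacheQueue.offer(task);  // S151: place the task into the sub pool's task cache queue
            int count = threadCount.get();
            // S153: only create a new worker while the thread count is below the preset capacity;
            // otherwise the queued task waits until a running worker becomes free.
            if (count < presetThreadCapacity && threadCount.compareAndSet(count, count + 1)) {
                new Thread(this::workerLoop).start();  // S155: a sub-pool thread executes queued tasks
            }
        }

        private void workerLoop() {
            try {
                while (true) {
                    taskCacheQueue.take().run();  // execute queued tasks until the worker is interrupted
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
    }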
The task processing method based on the thread pool described above is explained in detail below with reference to a specific application scenario. The method runs in a computer device, as shown in fig. 7.
A task is submitted to the parent thread pool.
It is judged whether the parent thread pool has reached its maximum processing capacity. If not, the task is processed by the parent thread pool; if the parent thread pool has reached its maximum processing capacity, it issues a task rejection instruction for the task, and the rejected task is found according to the task rejection instruction.
It is judged whether the parent thread pool has a sub-thread pool. If it does, the task is placed into the task cache queue of the sub-thread pool; if it does not, a sub-thread pool is created for the parent thread pool and the task is placed into the sub-thread pool's task cache queue.
The tasks in the task cache queue are executed by the threads of the sub-thread pool.
It is judged whether the task cache queue of the sub-thread pool is empty. When it is empty, it is judged whether the task cache queue of the parent thread pool is empty. When the parent thread pool's task cache queue is not empty, a task is extracted from it and placed into the sub-thread pool's task cache queue.
When the task cache queues of both the parent thread pool and its sub-thread pool are empty, the sub-thread pool of the parent thread pool is destroyed.
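Tying the pieces together, the scenario of fig. 7 could be exercised on the Java platform roughly as follows. This usage sketch reuses the hypothetical SubPoolRejectionHandler from above; the pool sizes, queue capacity, and task count are arbitrary.
    import java.util.concurrent.*;

    public class ScenarioDemo {
        public static void main(String[] args) throws InterruptedException {
            // Parent pool: 2 core threads, 2 max threads, task cache queue of 2 entries.
            // Overflow tasks are handed to the sub-thread pool by the rejection handler.
            ThreadPoolExecutor parentPool = new ThreadPoolExecutor(
                    2, 2, 60L, TimeUnit.SECONDS,
                    new ArrayBlockingQueue<Runnable>(2),
                    new SubPoolRejectionHandler());

            for (int i = 1; i <= 10; i++) {
                final int id = i;
                parentPool.execute(() -> {
                    try { Thread.sleep(200); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
                    System.out.println("task " + id + " done on " + Thread.currentThread().getName());
                });
            }
            parentPool.shutdown();
            parentPool.awaitTermination(5, TimeUnit.SECONDS);
            // (a complete program would also shut down the handler's sub-thread pool)
        }
    }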
The following is an embodiment of the apparatus of the present disclosure, which may be used to execute the embodiment of the task processing method based on the thread pool. For details not disclosed in the embodiments of the apparatus of the present disclosure, please refer to the embodiments of the task processing method based on thread pool of the present disclosure.
FIG. 8 is a block diagram illustrating a thread-pool-based task processing device, which includes, but is not limited to: a rejection receiving module 110, a task obtaining module 130, and a task processing module 150.
A rejection receiving module 110, configured to receive a task rejection instruction of the thread pool;
a task obtaining module 130, configured to obtain, according to the task rejection instruction, a task rejected by the thread pool;
and the task processing module 150 is configured to process the task through a sub thread pool of the thread pool.
The implementation process of the functions and actions of each module in the above device is specifically described in the implementation process of the corresponding step in the above task processing method based on the thread pool, and is not described herein again.
Optionally, as shown in fig. 9, the task processing apparatus based on a thread pool shown in fig. 8 further includes but is not limited to: a sub-thread pool judging module 210 and a sub-thread pool creating module 230.
A sub-thread pool judging module 210, configured to judge whether a sub-thread pool exists in the thread pool;
a sub-thread pool creating module 230, configured to create a sub-thread pool for the thread pool when the thread pool does not exist.
Optionally, as shown in fig. 10, the task processing device based on the thread pool shown in fig. 8 further includes but is not limited to: a task buffer queue determining module 310, a task extracting module 330, and a task executing module 350.
A task buffer queue determining module 310, configured to determine whether a task buffer queue of the sub-thread pool is empty;
the task extracting module 330 is configured to extract a task from a task buffer queue of the thread pool when the task buffer queue of the sub-thread pool is empty;
and a task execution module 350, configured to execute the task by using the thread of the sub-thread pool.
Optionally, as shown in fig. 11, the task processing device based on the thread pool shown in fig. 8 further includes but is not limited to: a task buffer queue detection module 410 and a sub thread pool destruction module 430.
A task buffer queue detection module 410, configured to perform task buffer queue detection on the thread pool and the sub-thread pool of the thread pool respectively;
and the sub-thread pool destroying module 430 is configured to destroy the sub-thread pool of the thread pool when both the task cache queues in the thread pool and the sub-thread pool of the thread pool are empty.
Optionally, as shown in fig. 12, the task processing module 150 in fig. 8 includes but is not limited to: a task placing unit 151, a thread number judging unit 153, and a task executing unit 155.
A task placing unit 151, configured to place the task into a task cache queue of the sub-thread pool;
a thread number judging unit 153, configured to judge whether the thread number of the sub-thread pool reaches a preset thread capacity;
and the task execution unit 155 is configured to create a new thread through the sub-thread pool to execute the task in the task cache queue when the number of threads in the sub-thread pool does not reach a preset thread capacity.
Fig. 13 is a block diagram illustrating a terminal 100 according to an example embodiment. The terminal 100 may be applied to a computer device in the above-described implementation environment.
Referring to fig. 13, the terminal 100 may include one or more of the following components: a processing component 101, a memory 102, a power component 103, a multimedia component 104, an audio component 105, a sensor component 107 and a communication component 108. The above components are not all necessary, and the terminal 100 may add other components or reduce some components according to its own functional requirements, which is not limited in this embodiment.
The processing component 101 generally controls overall operation of the terminal 100, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 101 may include one or more processors 109 to execute instructions to perform all or a portion of the above-described operations. Further, the processing component 101 may include one or more modules that facilitate interaction between the processing component 101 and other components. For example, the processing component 101 may include a multimedia module to facilitate interaction between the multimedia component 104 and the processing component 101.
The memory 102 is configured to store various types of data to support operations at the terminal 100. Examples of such data include instructions for any application or method operating on the terminal 100. The Memory 102 may be implemented by any type of volatile or non-volatile Memory device or combination thereof, such as an SRAM (Static Random Access Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), an EPROM (Erasable Programmable Read-Only Memory), a PROM (Programmable Read-Only Memory), a ROM (Read-Only Memory), a magnetic Memory, a flash Memory, a magnetic disk, or an optical disk. Also stored in memory 102 are one or more modules configured to be executed by the one or more processors 109 to perform all or a portion of the steps of any of the methods shown in fig. 2, 3, 4, 5, and 6.
The power supply component 103 provides power to the various components of the terminal 100. The power components 103 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the terminal 100.
The multimedia component 104 includes a screen providing an output interface between the terminal 100 and the user. In some embodiments, the screen may include an LCD (Liquid Crystal Display) and a TP (Touch Panel). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The audio component 105 is configured to output and/or input audio signals. For example, the audio component 105 includes a microphone configured to receive external audio signals when the terminal 100 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 102 or transmitted via the communication component 108. In some embodiments, audio component 105 also includes a speaker for outputting audio signals.
The sensor assembly 107 includes one or more sensors for providing various aspects of state assessment for the terminal 100. For example, the sensor assembly 107 can detect an open/close state of the terminal 100, relative positioning of the components, a change in position of the terminal 100 or a component of the terminal 100, and a change in temperature of the terminal 100. In some embodiments, the sensor assembly 107 may also include a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 108 is configured to facilitate communications between the terminal 100 and other devices in a wired or wireless manner. The terminal 100 may access a WIreless network based on a communication standard, such as WiFi (WIreless-Fidelity), 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 108 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the Communication component 108 further includes a Near Field Communication (NFC) module to facilitate short-range Communication. For example, the NFC module may be implemented based on an RFID (Radio Frequency Identification) technology, an IrDA (Infrared data association) technology, an UWB (Ultra-Wideband) technology, a BT (Bluetooth) technology, and other technologies.
In an exemplary embodiment, the terminal 100 may be implemented by one or more ASICs (Application Specific Integrated circuits), DSPs (Digital Signal processors), PLDs (Programmable Logic devices), FPGAs (Field-Programmable gate arrays), controllers, micro-controllers, microprocessors, or other electronic components for performing the above-described task Processing method based on the thread pool.
The specific manner in which the processor of the terminal in this embodiment performs operations has been described in detail in the embodiment related to the task processing method based on the thread pool, and will not be elaborated here.
In an exemplary embodiment, a storage medium is also provided, which is a computer-readable storage medium, for example a transitory or non-transitory computer-readable storage medium including instructions. The storage medium is, for example, the memory 102 including instructions executable by the processor 109 of the terminal 100 to perform the above-described thread-pool-based task processing method.
It will be understood that the invention is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (9)

1. A task processing method based on a thread pool is characterized by comprising the following steps:
receiving a task rejection instruction which is generated after the number of threads in a thread pool reaches the maximum number of threads and a task cache queue of the thread pool is full;
obtaining the tasks rejected by the thread pool according to the task rejection instruction;
judging whether a sub-thread pool exists in the thread pool, and if not, creating the sub-thread pool for the thread pool; the sub thread pool is used for assisting the thread pool in performing task processing;
processing the task through a sub-thread pool of the thread pool;
respectively carrying out task cache queue detection on the thread pool and the sub-thread pool of the thread pool;
and destroying the sub-thread pool of the thread pool when the task buffer queues in the thread pool and the sub-thread pool of the thread pool are empty.
2. The method of claim 1, further comprising:
and if the thread pool has a sub-thread pool, executing the step of processing the task through the sub-thread pool of the thread pool.
3. The method of claim 1, wherein after the step of processing the task with the sub-thread pool of the thread pool, the method further comprises:
judging whether a task cache queue of the sub-thread pool is empty or not, and if so,
Extracting tasks from a task cache queue of the thread pool;
and executing the task by adopting the threads of the sub-thread pool.
4. The method of claim 1, wherein processing the task with a sub-thread pool of the thread pool comprises:
placing the task into a task cache queue of the sub-thread pool;
judging whether the number of threads in the sub-thread pool reaches the preset thread capacity or not, and if not,
And creating a new thread through the sub-thread pool to execute the task in the task cache queue.
5. A task processing apparatus based on a thread pool, the apparatus comprising:
a rejection receiving module, configured to receive a task rejection instruction generated after the number of threads in the thread pool reaches a maximum number of threads and a task cache queue of the thread pool is full;
the task obtaining module is used for obtaining the tasks rejected by the thread pool according to the task rejection instruction;
the sub-thread pool judging module is used for judging whether the thread pool has a sub-thread pool or not;
the sub-thread pool creating module is used for creating a sub-thread pool for the thread pool when the thread pool does not have a sub-thread pool; the sub-thread pool is used for assisting the thread pool in performing task processing;
the task processing module is used for processing the tasks through the sub thread pools of the thread pools;
the task cache queue detection module is used for respectively detecting the task cache queues of the thread pool and the sub-thread pools of the thread pool;
and the sub-thread pool destroying module is used for destroying the sub-thread pool of the thread pool when the task cache queues in the thread pool and the sub-thread pool of the thread pool are empty.
6. The apparatus of claim 5, further comprising:
the task cache queue judging module is used for judging whether a task cache queue of the sub thread pool is empty or not;
the task extraction module is used for extracting tasks from the task buffer queues of the thread pool when the task buffer queues of the sub-thread pool are empty;
and the task execution module is used for executing the task by adopting the threads of the sub-thread pool.
7. The apparatus of claim 5, wherein the task processing module comprises:
the task placing unit is used for placing the tasks into a task cache queue of the sub-thread pool;
the thread quantity judging unit is used for judging whether the thread quantity of the sub-thread pool reaches the preset thread capacity or not;
and the task execution unit is used for creating a new thread through the sub-thread pool to execute the task in the task cache queue when the number of the threads in the sub-thread pool does not reach the preset thread capacity.
8. A computer device, characterized in that the computer device comprises:
a memory for storing computer instructions;
a processor for executing the computer instructions to cause the computer device to implement the task processing method of any of claims 1-4.
9. A storage medium characterized in that it stores instructions which, when executed by a processor of a computer device, implement a task processing method according to any one of claims 1 to 4.
CN201710093555.9A 2017-02-21 2017-02-21 Task processing method and device based on thread pool Active CN106940658B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710093555.9A CN106940658B (en) 2017-02-21 2017-02-21 Task processing method and device based on thread pool

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710093555.9A CN106940658B (en) 2017-02-21 2017-02-21 Task processing method and device based on thread pool

Publications (2)

Publication Number Publication Date
CN106940658A CN106940658A (en) 2017-07-11
CN106940658B true CN106940658B (en) 2022-09-09

Family

ID=59468660

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710093555.9A Active CN106940658B (en) 2017-02-21 2017-02-21 Task processing method and device based on thread pool

Country Status (1)

Country Link
CN (1) CN106940658B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107741922A (en) * 2017-10-18 2018-02-27 山东浪潮通软信息科技有限公司 A kind of operational formula treating method and apparatus
CN108846632A (en) * 2018-05-28 2018-11-20 浙江口碑网络技术有限公司 Thread processing method and device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1588316A (en) * 2004-06-29 2005-03-02 北京大学 Property optimizing method for applying server
US7237242B2 (en) * 2002-12-31 2007-06-26 International Business Machines Corporation Dynamic thread pool tuning techniques
CN103455377A (en) * 2013-08-06 2013-12-18 北京京东尚科信息技术有限公司 System and method for managing business thread pool
CN105159768A (en) * 2015-09-09 2015-12-16 浪潮集团有限公司 Task management method and cloud data center management platform
CN106095590A (en) * 2016-07-21 2016-11-09 联动优势科技有限公司 A kind of method for allocating tasks based on thread pool and device
CN106155803A (en) * 2015-04-07 2016-11-23 北大方正集团有限公司 A kind of thread pool sharing method based on semaphore and system


Also Published As

Publication number Publication date
CN106940658A (en) 2017-07-11


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant