CN113823014A - Message pushing method based on access control processor and access control system

Message pushing method based on access control processor and access control system

Info

Publication number
CN113823014A
CN113823014A
Authority
CN
China
Prior art keywords
access control
message
event information
access
queue
Prior art date
Legal status
Pending
Application number
CN202110974630.9A
Other languages
Chinese (zh)
Inventor
何猛
莫明锋
陈荣
简智君
李锦华
郭军
胡运龙
胡远航
李大乐
Current Assignee
Ralid Information System Co ltd
Original Assignee
Ralid Information System Co ltd
Priority date
Filing date
Publication date
Application filed by Ralid Information System Co ltd
Priority to CN202110974630.9A
Publication of CN113823014A
Legal status: Pending (current)

Classifications

    • G - PHYSICS
    • G07 - CHECKING-DEVICES
    • G07C - TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00 - Individual registration on entry or exit
    • G07C9/00174 - Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys

Abstract

The invention provides a message pushing method based on an access control processor, and an access control system. The message pushing method comprises the following steps: S101: obtaining a parallel value of the access control processor according to the number of cores of the access control processor and a preset pressure value, and generating message cache queues based on the parallel value, wherein each access controller is mapped to one message cache queue; S102: receiving event information sent by an access controller, obtaining the message cache queue mapped to that access controller, and pushing the event information to that message cache queue. The invention avoids the stalling that occurs when threads keep increasing and exceed the processing capacity of the access control processor, can control a large number of access controllers without adding new equipment, reduces installation and operating costs, and improves the user experience.

Description

Message pushing method based on access control processor and access control system
Technical Field
The invention relates to the field of access control management, and in particular to a message pushing method based on an access control processor, and an access control system.
Background
An access control system (ACS) manages the entry permissions of doors in the field of intelligent buildings; in a broad sense, a "door" includes all kinds of passable passages, such as doors for people and gates for vehicles. Access control therefore also covers vehicle access: in parking-lot management, vehicle access control is an important means of vehicle management whose main purpose is to manage vehicle entry and exit permissions rather than to collect parking fees. An access control security management system is a modern security management system that integrates microcomputer-based automatic identification technology with modern security management measures, involves many technologies such as electronics, mechanics, optics, computer technology, communication technology, and biometrics, and is an effective measure for managing the security of entrances to important departments. It is suitable for such departments as banks, hotels, parking-lot management, machine rooms, ordnance depots, confidential rooms, offices, intelligent residential districts, and factories. The access control system has gone beyond simple doorway and key management and has gradually developed into a complete access management system, playing a major role in administrative work such as workplace security and personnel attendance management.
With the development of science and technology, access controllers that control access management devices such as doors and gates are widely used in daily life. To manage access controllers in a unified way, a single access control processor is used to control a large number of access controllers. These access controllers send the events they generate to the access control processor in real time, and the access control processor pushes each controller's events to message queues and processes them in parallel. If this parallelism is not limited according to the number of controllers, then the more controllers there are, the more threads the access control processor needs to open, which easily exceeds the processor's own processing capacity, causes it to stall, and prevents it from working normally. To solve this problem, the prior art limits the number of connected access controllers, increases the number of access control processors, or uses a high-performance access control processor; these approaches increase the installation and operating costs of the access control system and degrade the user experience.
Disclosure of Invention
To overcome the defects of the prior art, the invention provides a message pushing method based on an access control processor, and an access control system. A parallel value is obtained according to the number of cores of the access control processor and a preset pressure value, and message cache queues are generated from the parallel value, so that each access controller is mapped to a fixed message cache queue and its event information is pushed to that queue for processing. This avoids the stalling caused when threads keep increasing and exceed the processing capacity of the access control processor, allows a large number of access controllers to be controlled without adding new equipment, reduces installation and operating costs, and improves the user experience.
To solve the above problems, the invention adopts the following technical solution: a message pushing method based on an access control processor, comprising the following steps: S101: obtaining a parallel value of the access control processor according to the number of cores of the access control processor and a preset pressure value, and generating message cache queues based on the parallel value, wherein each access controller is mapped to one message cache queue; S102: receiving event information sent by an access controller, obtaining the message cache queue mapped to that access controller, and pushing the event information to that message cache queue.
Further, the step of obtaining the parallel value of the access control processor according to the number of cores of the access control processor and the preset pressure value specifically comprises: obtaining the parallel value as the product of the number of cores and the preset pressure value.
Further, the parallel value is equal to the total number of message cache queues.
Further, the step of generating the message cache queues based on the parallel value specifically comprises: controlling the buffer area to generate the message cache queues by mapping according to the parallel value, wherein each message cache queue is processed by a different thread.
Further, after the step of pushing the event information to the message cache queue, the method further comprises: consuming the event information in the order in which it was pushed to the message cache queue, and sending the consumed event information to the next processing flow.
Based on the same inventive concept, the invention also provides an access control system comprising an access control processor and a plurality of access controllers, the access control processor being in communication connection with the access controllers. Through the access control processor, the access control system implements the following message pushing method based on the access control processor: S201: obtaining a parallel value of the access control processor according to the number of cores of the access control processor and a preset pressure value, and generating message cache queues based on the parallel value, wherein each access controller is mapped to one message cache queue; S202: receiving event information sent by an access controller, obtaining the message cache queue mapped to that access controller, and pushing the event information to that message cache queue.
Further, the step of obtaining the parallel value of the access control processor according to the number of cores of the access control processor and the preset pressure value specifically comprises: obtaining the parallel value as the product of the number of cores and the preset pressure value.
Further, the parallel value is equal to the total number of message cache queues.
Further, the step of generating the message cache queues based on the parallel value specifically comprises: controlling the buffer area to generate the message cache queues by mapping according to the parallel value, wherein each message cache queue is processed by a different thread.
Further, after the step of pushing the event information to the message cache queue, the method further comprises: consuming the event information in the order in which it was pushed to the message cache queue, and sending the consumed event information to the next processing flow.
Compared with the prior art, the invention has the following beneficial effects: a parallel value is obtained according to the number of cores of the access control processor and a preset pressure value, message cache queues are generated from the parallel value, each access controller is mapped to a fixed message cache queue, and the event information of that access controller is pushed to that queue for processing. This avoids the stalling caused when threads keep increasing and exceed the processing capacity of the access control processor, allows a large number of access controllers to be controlled without adding new equipment, reduces installation and operating costs, and improves the user experience.
Drawings
FIG. 1 is a flowchart of an embodiment of a message pushing method based on an access control processor according to the present invention;
FIG. 2 is a flowchart illustrating a message pushing method based on an access control processor according to another embodiment of the present invention;
FIG. 3 is a block diagram of an embodiment of an access control system of the present invention;
FIG. 4 is a structural diagram of an embodiment of a message pushing method based on an access control processor as executed by an access control system according to the present invention.
Detailed Description
The present invention will be further described with reference to the accompanying drawings and the detailed description. It should be noted that, provided there is no conflict, the embodiments or technical features described below can be combined to form new embodiments.
Referring to FIGS. 1-2, FIG. 1 is a flowchart illustrating a message pushing method based on an access control processor according to an embodiment of the present invention, and FIG. 2 is a flowchart of another embodiment of the message pushing method based on an access control processor according to the present invention. The message pushing method based on the access control processor of the invention is described in detail with reference to FIGS. 1-2.
In this embodiment, the message pushing method based on the access control processor includes:
S101: obtaining a parallel value of the access control processor according to the number of cores of the access control processor and a preset pressure value, and generating message cache queues based on the parallel value, wherein each access controller is mapped to one message cache queue.
In this embodiment, the step of obtaining the parallel value of the access control processor according to the number of cores of the access control processor and the preset pressure value specifically includes: obtaining the parallel value as the product of the number of cores and the preset pressure value. The specific preset pressure value can be set according to the performance and actual conditions of the access control processor and is not limited here.
In a specific embodiment, the parallel value equals the number of cores multiplied by the preset pressure value.
In this embodiment, the access control processor is a CPU; in other embodiments, the access control processor may also be an SoC, a DSP, an MCU, or another intelligent chip capable of processing information concurrently.
In this embodiment, the parallel value is equal to the total number of message cache queues. The step of generating the message cache queues based on the parallel value specifically includes: controlling the buffer area to generate the message cache queues by mapping according to the parallel value, with each message cache queue processed by a different thread. The message cache queues are placed in a thread-safe concurrent dictionary.
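For illustration only, a minimal Java sketch of this set-up follows. The patent's own program is published only as images (see below), so the class name `QueueSetup`, the pressure value of 2, and the choice of `LinkedBlockingQueue` and `ConcurrentHashMap` to stand in for the message cache queues and the thread-safe concurrent dictionary are assumptions, not the patent's actual code.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.LinkedBlockingQueue;

/** Illustrative set-up of the message cache queues and the concurrent dictionary. */
public class QueueSetup {
    public static void main(String[] args) {
        int cores = Runtime.getRuntime().availableProcessors();
        int pressureValue = 2;                      // assumed value; configured per processor performance
        int parallelValue = cores * pressureValue;  // parallel value = core count x preset pressure value

        // The total number of message cache queues equals the parallel value.
        List<BlockingQueue<String>> queues = new ArrayList<>();
        for (int i = 0; i < parallelValue; i++) {
            queues.add(new LinkedBlockingQueue<>());
        }

        // Thread-safe "concurrent dictionary": access controller id -> its fixed message cache queue.
        Map<String, BlockingQueue<String>> controllerToQueue = new ConcurrentHashMap<>();

        System.out.println("created " + queues.size() + " message cache queues; dictionary empty: "
                + controllerToQueue.isEmpty());
    }
}
```

Because the queue count is fixed when the processor starts, the number of consumer threads (one per queue, sketched further below) does not grow with the number of connected access controllers.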
To control the access controllers in parallel while keeping the event information of any single controller in a consistent time order, each access controller is mapped to a fixed message cache queue.
S102: receiving event information sent by an access controller, obtaining the message cache queue mapped to that access controller, and pushing the event information to that message cache queue.
In this embodiment, the access control processor and the access controllers may be connected through the Internet of Things, 3G, 4G, WiFi, or other communication methods. When an access controller recognizes an event such as a password entry or a card swipe, it sends the corresponding event information to the access control processor by wireless or wired transmission.
After receiving the event information, the access control processor identifies the access controller that sent it, obtains the message cache queue mapped to that controller from the concurrent dictionary, and pushes the event information to that queue.
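A hedged sketch of this dispatch step is given below. `EventDispatcher`, `onEvent`, and the hash-based binding on first contact are illustrative assumptions (the embodiment above establishes the mapping when controllers come online), and event information is represented as plain strings for brevity.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.BlockingQueue;

/** Illustrative dispatch: push incoming event information to the queue fixed to its controller. */
class EventDispatcher {
    private final Map<String, BlockingQueue<String>> controllerToQueue; // the concurrent dictionary
    private final List<BlockingQueue<String>> queues;                   // all message cache queues

    EventDispatcher(Map<String, BlockingQueue<String>> dictionary, List<BlockingQueue<String>> queues) {
        this.controllerToQueue = dictionary;
        this.queues = queues;
    }

    /** Called by the event-receiving thread for each piece of event information. */
    void onEvent(String controllerId, String eventInfo) {
        // Look up the message cache queue mapped to this controller; bind one on first contact.
        BlockingQueue<String> queue = controllerToQueue.computeIfAbsent(
                controllerId,
                id -> queues.get(Math.floorMod(id.hashCode(), queues.size())));
        queue.offer(eventInfo); // events from the same controller always land in the same queue
    }
}
```

Keeping the lookup in a concurrent map means the event-receiving thread never blocks on dispatch, while the fixed binding preserves per-controller ordering.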
In other embodiments, the access controllers are not mapped to fixed message cache queues. Instead, after the event information is received, it is tagged with its access controller and reception time, the number of event information items in each message cache queue is obtained, and the event information is pushed to the message cache queue with the fewest items, or to one whose count is below a preset value, so that event information is distributed evenly.
In this embodiment, after the step of pushing the event information to the message cache queue, the method further includes: consuming the event information in the order in which it was pushed to the message cache queue, and sending the consumed event information to the next processing flow.
In a specific embodiment, after processing a piece of event information, the cache-consumption thread of the access control processor monitors data changes in its message cache queue in real time and processes new event information as soon as it is pushed to the queue.
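A minimal sketch of such a cache-consumption thread follows, assuming each message cache queue is a `BlockingQueue` and using a `Consumer` callback to stand in for "the next processing flow"; both names are illustrative.

```java
import java.util.concurrent.BlockingQueue;
import java.util.function.Consumer;

/** Illustrative cache-consumption thread: takes event information in FIFO order and forwards it. */
class QueueConsumer implements Runnable {
    private final BlockingQueue<String> queue;          // one message cache queue
    private final Consumer<String> nextProcessingFlow;  // whatever handles the event downstream

    QueueConsumer(BlockingQueue<String> queue, Consumer<String> nextProcessingFlow) {
        this.queue = queue;
        this.nextProcessingFlow = nextProcessingFlow;
    }

    @Override
    public void run() {
        try {
            while (!Thread.currentThread().isInterrupted()) {
                // take() blocks until new event information is pushed, so the thread reacts
                // immediately without polling and consumes events in the order they were pushed.
                String eventInfo = queue.take();
                nextProcessingFlow.accept(eventInfo);
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();         // allow orderly shutdown
        }
    }
}
```

With one such thread per queue, the total number of consumer threads stays at the parallel value regardless of how many controllers connect, which is the property the embodiment relies on to avoid exceeding the processor's capacity.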
In the above embodiment, the access control processor also monitors the addition and removal of access controllers. If an access controller is found to have disconnected from the access control processor or to have been shut down, the mapping between that controller and its message cache queue is removed. If a new access controller is added, the number of access controllers mapped to each message cache queue is obtained, and the new controller is mapped to the message cache queue with the fewest mapped controllers, or to one whose count is below a preset value, so that the number of controllers mapped to each queue is balanced.
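The sketch below illustrates this mapping maintenance under the same assumptions as the earlier snippets; `MappingManager`, `onControllerAdded`, and `onControllerRemoved` are illustrative names, and the least-loaded queue is found by simply counting the current mappings.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ConcurrentHashMap;

/** Illustrative maintenance of the controller-to-queue mapping as controllers come and go. */
class MappingManager {
    private final Map<String, BlockingQueue<String>> controllerToQueue = new ConcurrentHashMap<>();
    private final List<BlockingQueue<String>> queues; // assumed non-empty

    MappingManager(List<BlockingQueue<String>> queues) {
        this.queues = queues;
    }

    /** New controller online: map it to the queue with the fewest mapped controllers. */
    synchronized void onControllerAdded(String controllerId) {
        BlockingQueue<String> leastLoaded = queues.get(0);
        long best = Long.MAX_VALUE;
        for (BlockingQueue<String> q : queues) {
            long mapped = controllerToQueue.values().stream().filter(v -> v == q).count();
            if (mapped < best) {
                best = mapped;
                leastLoaded = q;
            }
        }
        controllerToQueue.put(controllerId, leastLoaded);
    }

    /** Controller disconnected or shut down: remove its mapping. */
    synchronized void onControllerRemoved(String controllerId) {
        controllerToQueue.remove(controllerId);
    }
}
```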
In other embodiments, when the number of access controllers mapped to one or more message cache queues falls below a preset value, the access controllers may be redistributed across the message cache queues.
The message pushing method is further explained below with reference to a related execution program of the message pushing method based on the access control processor.
[The execution program is reproduced in the original publication only as images (Figures BDA0003227208580000041 through BDA0003227208580000121) and is not available as text.]
The design idea of the program is as follows (an illustrative sketch that wires these steps together appears after the list):
1. when the program starts, obtain the number of CPU cores and the configured pressure value;
2. calculate the parallel value (number of CPU cores multiplied by the pressure value) and create threads in a loop according to the parallel value;
3. each thread generates a thread-safe message cache queue and consumes it automatically, and the queue is cached in a thread-safe concurrent dictionary so that it can be mapped to access controllers;
4. start a thread that monitors the controllers (in practice this thread is started in another process when the program starts) and register its callback event, so that when access controllers are added or removed, the mapping between the controllers and the concurrent dictionary of step 3 is updated;
5. when the event-receiving thread receives a controller event, it automatically enqueues the event to the message cache queue associated with the access controller that produced it;
6. after detecting a message in the queue, the queue-processing thread actively consumes it and sends it to the next processing flow.
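Since the execution program referenced above is reproduced in the original publication only as images, the following self-contained Java sketch wires the six steps together end to end. The pressure value of 2, the string-typed events, and the simulated controller are illustrative assumptions, not the patent's actual program.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.LinkedBlockingQueue;

/** End-to-end illustration of the six design steps; not the patent's actual program. */
public class AccessEventPipeline {
    public static void main(String[] args) throws InterruptedException {
        // Steps 1-2: read the core count and the configured pressure value, compute the parallel value.
        int cores = Runtime.getRuntime().availableProcessors();
        int pressureValue = 2;                      // assumed configuration value
        int parallelValue = cores * pressureValue;

        // Step 3: one thread-safe message cache queue per thread, cached in a concurrent dictionary.
        List<BlockingQueue<String>> queues = new ArrayList<>();
        Map<String, BlockingQueue<String>> controllerToQueue = new ConcurrentHashMap<>();
        for (int i = 0; i < parallelValue; i++) {
            BlockingQueue<String> queue = new LinkedBlockingQueue<>();
            queues.add(queue);
            // Step 6: each queue-processing thread consumes its queue and forwards to the next flow.
            Thread worker = new Thread(() -> {
                try {
                    while (true) {
                        String eventInfo = queue.take();
                        System.out.println("next processing flow <- " + eventInfo);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }, "queue-worker-" + i);
            worker.setDaemon(true);
            worker.start();
        }

        // Step 4: when a controller comes online, map it to a fixed queue (simplest choice shown here).
        String controllerId = "controller-001";
        controllerToQueue.put(controllerId,
                queues.get(Math.floorMod(controllerId.hashCode(), queues.size())));

        // Step 5: the event-receiving thread enqueues each event to its controller's queue.
        controllerToQueue.get(controllerId).offer("card-swipe event from " + controllerId);

        Thread.sleep(200);                          // give the daemon worker time to print
    }
}
```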
Beneficial effects: according to the message pushing method based on the access control processor, a parallel value is obtained according to the number of cores of the access control processor and a preset pressure value, message cache queues are generated from the parallel value, each access controller is mapped to a fixed message cache queue, and the event information of that access controller is pushed to that queue for processing. This avoids the stalling caused when threads keep increasing and exceed the processing capacity of the access control processor, allows a large number of access controllers to be controlled without adding new equipment, reduces installation and operating costs, and improves the user experience.
Based on the same inventive concept, the present invention further provides an access control system. Referring to FIG. 3 and FIG. 4, FIG. 3 is a structural diagram of an embodiment of the access control system of the present invention, and FIG. 4 is a structural diagram of an embodiment of a message pushing method based on an access control processor as executed by the access control system. The access control system of the present invention is described below with reference to FIGS. 3 and 4.
In this embodiment, the access control system includes an access control processor and a plurality of access controllers, the access control processor is in communication connection with the access controllers, and the access control system implements, through the access control processor, the message pushing method based on the access control processor described below.
S201: obtaining a parallel value of the access control processor according to the number of cores of the access control processor and a preset pressure value, and generating message cache queues based on the parallel value, wherein each access controller is mapped to one message cache queue.
In this embodiment, the step of obtaining the parallel value of the access control processor according to the number of cores of the access control processor and the preset pressure value specifically includes: obtaining the parallel value as the product of the number of cores and the preset pressure value. The specific preset pressure value can be set according to the performance and actual conditions of the access control processor and is not limited here.
In a specific embodiment, the parallel value equals the number of cores multiplied by the preset pressure value.
In this embodiment, the access control processor is a CPU; in other embodiments, the access control processor may also be an SoC, a DSP, an MCU, or another intelligent chip capable of processing information concurrently.
In this embodiment, the parallel value is equal to the total number of message cache queues. The step of generating the message cache queues based on the parallel value specifically includes: controlling the buffer area to generate the message cache queues by mapping according to the parallel value, with each message cache queue processed by a different thread. The message cache queues are placed in a thread-safe concurrent dictionary.
To control the access controllers in parallel while keeping the event information of any single controller in a consistent time order, each access controller is mapped to a fixed message cache queue.
S202: receiving event information sent by an access controller, obtaining the message cache queue mapped to that access controller, and pushing the event information to that message cache queue.
In this embodiment, the access control processor and the access controllers may be connected through the Internet of Things, 3G, 4G, WiFi, or other communication methods. When an access controller recognizes an event such as a password entry or a card swipe, it sends the corresponding event information to the access control processor by wireless or wired transmission.
After receiving the event information, the access control processor identifies the access controller that sent it, obtains the message cache queue mapped to that controller from the concurrent dictionary, and pushes the event information to that queue.
In other embodiments, the access controllers are not mapped to fixed message cache queues. Instead, after the event information is received, it is tagged with its access controller and reception time, the number of event information items in each message cache queue is obtained, and the event information is pushed to the message cache queue with the fewest items, or to one whose count is below a preset value, so that event information is distributed evenly.
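A minimal sketch of this alternative, load-balanced dispatch follows, assuming each piece of event information is tagged with its controller id and reception time; the `TaggedEvent` record and the `BalancedDispatcher` name are illustrative.

```java
import java.time.Instant;
import java.util.List;
import java.util.concurrent.BlockingQueue;

/** Illustrative alternative dispatch: no fixed mapping, push to the least-loaded queue. */
class BalancedDispatcher {
    /** Event information tagged with its source controller and reception time. */
    record TaggedEvent(String controllerId, Instant receivedAt, String payload) { }

    private final List<BlockingQueue<TaggedEvent>> queues; // assumed non-empty

    BalancedDispatcher(List<BlockingQueue<TaggedEvent>> queues) {
        this.queues = queues;
    }

    void onEvent(String controllerId, String payload) {
        TaggedEvent event = new TaggedEvent(controllerId, Instant.now(), payload);
        // Pick the message cache queue currently holding the fewest event information items.
        BlockingQueue<TaggedEvent> leastLoaded = queues.get(0);
        for (BlockingQueue<TaggedEvent> q : queues) {
            if (q.size() < leastLoaded.size()) {
                leastLoaded = q;
            }
        }
        leastLoaded.offer(event);
    }
}
```

This variant spreads event information more evenly across queues, but unlike the fixed mapping of the main embodiment it no longer guarantees that events from the same controller are consumed in arrival order.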
In this embodiment, after the step of pushing the event information to the message cache queue, the method further includes: consuming the event information in the order in which it was pushed to the message cache queue, and sending the consumed event information to the next processing flow.
In a specific embodiment, after processing a piece of event information, the cache-consumption thread of the access control processor monitors data changes in its message cache queue in real time and processes new event information as soon as it is pushed to the queue.
In the above embodiment, the access control processor also monitors the addition and removal of access controllers. If an access controller is found to have disconnected from the access control processor or to have been shut down, the mapping between that controller and its message cache queue is removed. If a new access controller is added, the number of access controllers mapped to each message cache queue is obtained, and the new controller is mapped to the message cache queue with the fewest mapped controllers, or to one whose count is below a preset value, so that the number of controllers mapped to each queue is balanced.
In other embodiments, when the number of access controllers mapped to one or more message cache queues falls below a preset value, the access controllers may be redistributed across the message cache queues.
The message pushing method is further explained below with reference to a related execution program of the message pushing method based on the access control processor.
[The execution program is reproduced in the original publication only as images (Figures BDA0003227208580000141 through BDA0003227208580000211) and is not available as text.]
The design idea of the program is as follows:
1. when the program starts, obtain the number of CPU cores and the configured pressure value;
2. calculate the parallel value (number of CPU cores multiplied by the pressure value) and create threads in a loop according to the parallel value;
3. each thread generates a thread-safe message cache queue and consumes it automatically, and the queue is cached in a thread-safe concurrent dictionary so that it can be mapped to access controllers;
4. start a thread that monitors the controllers (in practice this thread is started in another process when the program starts) and register its callback event, so that when access controllers are added or removed, the mapping between the controllers and the concurrent dictionary of step 3 is updated;
5. when the event-receiving thread receives a controller event, it automatically enqueues the event to the message cache queue associated with the access controller that produced it;
6. after detecting a message in the queue, the queue-processing thread actively consumes it and sends it to the next processing flow.
In the embodiments provided by the present invention, it should be understood that the disclosed apparatus/terminal and method may be implemented in other ways. For example, the apparatus/terminal embodiments described above are merely illustrative; the division into modules or units is only a logical division, and other divisions are possible in actual implementation, for example, multiple units or components may be combined or integrated into another storage device, or some features may be omitted or not executed. In addition, the coupling, direct coupling, or communication connection shown or discussed between components may be implemented through certain interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
If the integrated module is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the methods in the above embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium and which, when executed, implements the steps of the above embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable medium may include any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and the like. It should be noted that the contents of the computer-readable medium may be adjusted as required by legislation and patent practice in the relevant jurisdiction; for example, in some jurisdictions, the computer-readable medium does not include electrical carrier signals or telecommunication signals according to legislation and patent practice.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A message pushing method based on an access control processor, characterized by comprising the following steps:
S101: obtaining a parallel value of the access control processor according to the number of cores of the access control processor and a preset pressure value, and generating message cache queues based on the parallel value, wherein each access controller is mapped to one message cache queue;
S102: receiving event information sent by an access controller, obtaining the message cache queue mapped to that access controller, and pushing the event information to that message cache queue.
2. The message pushing method based on an access control processor according to claim 1, wherein the step of obtaining the parallel value of the access control processor according to the number of cores of the access control processor and the preset pressure value specifically comprises:
obtaining the parallel value as the product of the number of cores and the preset pressure value.
3. The message pushing method based on an access control processor according to claim 1, wherein the parallel value is equal to the total number of message cache queues.
4. The message pushing method based on an access control processor according to claim 1, wherein the step of generating the message cache queues based on the parallel value specifically comprises:
controlling the buffer area to generate the message cache queues by mapping according to the parallel value, wherein each message cache queue is processed by a different thread.
5. The message pushing method based on an access control processor according to claim 1, wherein after the step of pushing the event information to the message cache queue, the method further comprises:
consuming the event information in the order in which it was pushed to the message cache queue, and sending the consumed event information to the next processing flow.
6. An access control system, characterized in that the access control system comprises an access control processor and a plurality of access controllers, the access control processor is in communication connection with the access controllers, and the access control system implements, through the access control processor, the following message pushing method based on the access control processor:
S201: obtaining a parallel value of the access control processor according to the number of cores of the access control processor and a preset pressure value, and generating message cache queues based on the parallel value, wherein each access controller is mapped to one message cache queue;
S202: receiving event information sent by an access controller, obtaining the message cache queue mapped to that access controller, and pushing the event information to that message cache queue.
7. The access control system according to claim 6, wherein the step of obtaining the parallel value of the access control processor according to the number of cores of the access control processor and the preset pressure value specifically comprises:
obtaining the parallel value as the product of the number of cores and the preset pressure value.
8. The access control system according to claim 6, wherein the parallel value is equal to the total number of message cache queues.
9. The access control system according to claim 6, wherein the step of generating the message cache queues based on the parallel value specifically comprises:
controlling the buffer area to generate the message cache queues by mapping according to the parallel value, wherein each message cache queue is processed by a different thread.
10. The access control system according to claim 6, wherein after the step of pushing the event information to the message cache queue, the method further comprises:
consuming the event information in the order in which it was pushed to the message cache queue, and sending the consumed event information to the next processing flow.
CN202110974630.9A (filed 2021-08-24, priority date 2021-08-24) Message pushing method based on access control processor and access control system, published as CN113823014A, Pending

Priority Applications (1)

CN202110974630.9A (priority date 2021-08-24, filing date 2021-08-24): Message pushing method based on access control processor and access control system, published as CN113823014A

Applications Claiming Priority (1)

CN202110974630.9A (priority date 2021-08-24, filing date 2021-08-24): Message pushing method based on access control processor and access control system, published as CN113823014A

Publications (1)

Publication Number Publication Date
CN113823014A true CN113823014A (en) 2021-12-21

Family

ID=78913531

Family Applications (1)

CN202110974630.9A (priority date 2021-08-24, filing date 2021-08-24): Message pushing method based on access control processor and access control system, Pending

Country Status (1)

Country Link
CN (1) CN113823014A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160110223A1 (en) * 2011-09-20 2016-04-21 Intel Corporation Multi-threaded queuing system for pattern matching
CN108021434A (en) * 2017-12-06 2018-05-11 浪潮软件集团有限公司 Data processing apparatus, method of processing data thereof, medium, and storage controller
CN111614577A (en) * 2020-05-11 2020-09-01 湖南智领通信科技有限公司 Multi-communication trust service management method and device and computer equipment
CN111813805A (en) * 2019-04-12 2020-10-23 中国移动通信集团河南有限公司 Data processing method and device
CN111930530A (en) * 2020-06-24 2020-11-13 山东浪潮通软信息科技有限公司 Equipment message processing method, device and medium based on Internet of things



Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20211221)