CN111144734A - Cache-based fair dispatching method and device and computer readable storage medium - Google Patents

Cache-based fair dispatching method and device and computer readable storage medium

Info

Publication number
CN111144734A
CN111144734A (application CN201911341919.6A)
Authority
CN
China
Prior art keywords
dispatching
cache
personnel
scoring
request
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911341919.6A
Other languages
Chinese (zh)
Other versions
CN111144734B (en)
Inventor
李志伟 (Li Zhiwei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Property and Casualty Insurance Company of China Ltd
Original Assignee
Ping An Property and Casualty Insurance Company of China Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Property and Casualty Insurance Company of China Ltd filed Critical Ping An Property and Casualty Insurance Company of China Ltd
Priority to CN201911341919.6A priority Critical patent/CN111144734B/en
Publication of CN111144734A publication Critical patent/CN111144734A/en
Application granted granted Critical
Publication of CN111144734B publication Critical patent/CN111144734B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0631Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G06Q10/06311Scheduling, planning or task assignment for a person or group
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0639Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06393Score-carding, benchmarking or key performance indicator [KPI] analysis
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y04INFORMATION OR COMMUNICATION TECHNOLOGIES HAVING AN IMPACT ON OTHER TECHNOLOGY AREAS
    • Y04SSYSTEMS INTEGRATING TECHNOLOGIES RELATED TO POWER NETWORK OPERATION, COMMUNICATION OR INFORMATION TECHNOLOGIES FOR IMPROVING THE ELECTRICAL POWER GENERATION, TRANSMISSION, DISTRIBUTION, MANAGEMENT OR USAGE, i.e. SMART GRIDS
    • Y04S10/00Systems supporting electrical power generation, transmission or distribution
    • Y04S10/50Systems or methods supporting the power network operation or management, involving a certain degree of interaction with the load-side end user applications

Landscapes

  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Engineering & Computer Science (AREA)
  • Strategic Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Economics (AREA)
  • Development Economics (AREA)
  • Educational Administration (AREA)
  • Operations Research (AREA)
  • Marketing (AREA)
  • Game Theory and Decision Science (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Financial Or Insurance-Related Operations Such As Payment And Settlement (AREA)

Abstract

The invention relates to a cache-based fair dispatching method, which comprises the following steps: receiving historical dispatching records and classifying them to obtain a dispatching category set and an original personnel information set; classifying and scoring the information items of the original personnel information set to obtain a personnel information scoring set; setting flag pointers of a personnel cache queue according to the dispatching category set and storing the experienced-personnel information set into the personnel cache queue according to the flag pointers; and receiving a dispatching request input by a user, extracting the name of a dispatching person from the personnel cache queue according to a pre-constructed dispatching genetic model and a preset cache dispatching rule, and completing the dispatching request. The invention also provides a cache-based fair dispatching device and a computer readable storage medium. The invention can realize an efficient, cache-based fair dispatching function.

Description

Cache-based fair dispatching method and device and computer readable storage medium
Technical Field
The invention relates to the technical field of computers, in particular to a fair dispatching method and device based on cache and a computer readable storage medium.
Background
With technological progress, ever higher dispatching service quality is required. For example, after a customer requests dispatching by telephone, the dispatching system should respond to the customer's request immediately. However, because the dispatching steps are complicated and the number of dispatching requests keeps growing, most dispatching systems, which store dispatching personnel and dispatching requests in a dispatching platform and a database and compute assignments in real time, suffer from delayed response and increased computational load when dispatching. An efficient dispatching method is therefore urgently needed.
Disclosure of Invention
The invention provides a cache-based fair dispatching method and device and a computer readable storage medium, and mainly aims to dispatch personnel reasonably based on caching technology.
In order to achieve the above object, the present invention provides a fair dispatch method based on cache, which comprises:
receiving historical dispatching records, extracting an original personnel information set based on the historical dispatching records, classifying information items of the original personnel information set to obtain a standard personnel information set, and classifying the historical dispatching records to obtain a dispatching classification set;
scoring the standard personnel information set according to a pre-constructed scoring rule to obtain a personnel information scoring set;
dividing the standard staff information set based on the working experience and the staff information scoring set to obtain a working experience staff information set;
setting a mark pointer of a personnel cache queue according to the dispatching class set, and storing the experienced personnel information set into the personnel cache queue according to the mark pointer;
receiving a dispatching request input by a user, calculating according to a pre-constructed dispatching genetic model to obtain a dispatching person of the dispatching request, selecting a mark pointer corresponding to the dispatching request from the person cache queue, and extracting a name of the dispatching person from the person cache queue according to a preset cache dispatching rule according to the mark pointer corresponding to the dispatching request and the dispatching person to complete the dispatching request.
Optionally, the scoring of the standard personnel information set based on the pre-constructed scoring rule to obtain the personnel information scoring set comprises:
performing experience scoring on the standard personnel information set based on the information items to obtain experience scoring sets;
and optimizing the experience scoring set based on a pre-constructed optimized scoring rule to obtain the personnel information scoring set.
Optionally, the calculating, according to the pre-constructed dispatching genetic model, of the dispatching personnel of the dispatching request comprises:
calculating the minimum delay time of the dispatching request according to the number of the service links of the dispatching request;
calculating the fitness of the dispatching request by using the minimum delay time;
calculating to obtain a dispatching probability set of the dispatching request according to the fitness;
and selecting the dispatching personnel corresponding to the largest dispatching probability in the dispatching probability set.
Optionally, the method for calculating the minimum delay time includes:
C = Σ_n [x_n·t_n + (1 − x_n)·d_n]

wherein C is the minimum delay time, the sum runs over the n business links of the dispatching request input by the user, t_n represents the actual processing time of each business link, x_n takes the value 0 or 1 according to whether the business link has a problem, and d_n indicates the standard processing time of the business link.
Optionally, the fitness calculation method includes:
f_j = 1/C_j + Δ_j

wherein f_j represents the fitness, j represents the number of the dispatching request input by the user, C_j is the minimum delay time of request j, and Δ_j represents a fitness correction value.
the calculation method of the dispatching probability set comprises the following steps:
P_j = f_j / Σ_{i=1}^{m} f_i

wherein P_j represents the dispatching probability, and m represents the number of dispatching classifications in the dispatching classification set.
In addition, in order to achieve the above object, the present invention further provides a fair cache-based dispatch apparatus, including a memory and a processor, where the memory stores a fair cache-based dispatch program operable on the processor, and when the fair cache-based dispatch program is executed by the processor, the apparatus implements the following steps:
receiving historical dispatching records, extracting an original personnel information set based on the historical dispatching records, classifying information items of the original personnel information set to obtain a standard personnel information set, and classifying the historical dispatching records to obtain a dispatching classification set;
scoring the standard personnel information set according to a pre-constructed scoring rule to obtain a personnel information scoring set;
dividing the standard staff information set based on the working experience and the staff information scoring set to obtain a working experience staff information set;
setting a mark pointer of a personnel cache queue according to the dispatching class set, and storing the experienced personnel information set into the personnel cache queue according to the mark pointer;
receiving a dispatching request input by a user, calculating according to a pre-constructed dispatching genetic model to obtain a dispatching person of the dispatching request, selecting a mark pointer corresponding to the dispatching request from the person cache queue, and extracting a name of the dispatching person from the person cache queue according to a preset cache dispatching rule according to the mark pointer corresponding to the dispatching request and the dispatching person to complete the dispatching request.
Optionally, the scoring of the standard personnel information set based on the pre-constructed scoring rule to obtain the personnel information scoring set comprises:
performing experience scoring on the standard personnel information set based on the information items to obtain experience scoring sets;
and optimizing the experience scoring set based on a pre-constructed optimized scoring rule to obtain the personnel information scoring set.
Optionally, the calculating, according to the pre-constructed dispatching genetic model, of the dispatching personnel of the dispatching request comprises:
calculating the minimum delay time of the dispatching request according to the number of the service links of the dispatching request;
calculating the fitness of the dispatching request by using the minimum delay time;
calculating to obtain a dispatching probability set of the dispatching request according to the fitness;
and selecting the dispatching personnel corresponding to the largest dispatching probability in the dispatching probability set.
Optionally, the method for calculating the minimum delay time includes:
C = Σ_n [x_n·t_n + (1 − x_n)·d_n]

wherein C is the minimum delay time, the sum runs over the n business links of the dispatching request input by the user, t_n represents the actual processing time of each business link, x_n takes the value 0 or 1 according to whether the business link has a problem, and d_n indicates the standard processing time of the business link.
In addition, to achieve the above object, the present invention also provides a computer readable storage medium having a cache-based fair dispatching program stored thereon, where the cache-based fair dispatching program is executable by one or more processors to implement the steps of the cache-based fair dispatching method as described above.
According to the invention, the personnel information set is scored according to the scoring rule to obtain the personnel information scoring set, and the scoring specification yields a fast and simple ranking. Further, the dispatching probability of the dispatching request input by the user is calculated according to the pre-constructed dispatching genetic model and the dispatching category set to obtain the dispatching classification result; the whole dispatching probability calculation is fast and does not occupy large computing resources. Therefore, the cache-based fair dispatching method and device and the computer readable storage medium of the invention can realize efficient dispatching and assignment.
Drawings
Fig. 1 is a flowchart illustrating a fair dispatch method based on a cache according to an embodiment of the present invention;
fig. 2 is a schematic diagram of an internal structure of a fair dispatch apparatus based on a cache according to an embodiment of the present invention;
fig. 3 is a block diagram illustrating a cache-based fair dispatch procedure in the cache-based fair dispatch apparatus according to an embodiment of the present invention.
Fig. 4 is a schematic diagram illustrating task assignment in a cache-based fair dispatching method according to an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The invention provides a cache-based fair dispatching method. Fig. 1 is a schematic flow chart of a fair dispatch method based on a cache according to an embodiment of the present invention. The method may be performed by an apparatus, which may be implemented by software and/or hardware.
In this embodiment, the fair dispatching method based on cache includes:
s1, receiving historical dispatching records, extracting an original personnel information set based on the historical dispatching records, classifying information items of the original personnel information set to obtain a standard personnel information set, and classifying the historical dispatching records to obtain a dispatching classification set.
Preferably, the historical dispatching records are the personnel allocation records of dispatching performed before the current time. For example, in an insurance claim case, a complete claim-processing flow comprises 7 claim flows: survey, planning, investigation, accountability, loss assessment, claim verification and recheck. The original personnel information set extracted from the historical dispatching records of insurance claim cases refers to information such as the names, positions, working years, response times to the insured event and per-flow processing times of the task handlers in all claim cases previously processed by the insurance company.
Preferably, information items are classified on the basis of the different historical dispatching records. For example, the 7 flows of an insurance claim case each involve related personnel; according to the different flows, the names, positions and working years of the personnel in the 7 flows, together with the result information of the case (information items such as response time, customer satisfaction and created profit), are classified to obtain the different standard personnel information sets of the 7 flows.
Preferably, the historical dispatching records are classified by the field to which they belong to obtain the dispatching category set. For example, insurance claim cases can be classified into 4 categories, namely property insurance, personal insurance, liability insurance and credit insurance, and each category comprises a plurality of standard personnel information sets of the corresponding type. For example, historical dispatching record A is divided into the liability insurance category, and record A was completed by 7 flow handlers (i.e., members of the standard personnel information set).
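The classification in S1 can be sketched as follows; the record layout and field names are illustrative assumptions, not taken from the patent text:

```python
from collections import defaultdict

# Hypothetical record layout: (record_id, category, handlers).
history = [
    ("A", "liability", ["survey", "planning"]),
    ("B", "property",  ["survey"]),
    ("C", "liability", ["investigation"]),
]

def classify(history):
    """Group historical dispatching records into a dispatching category set."""
    categories = defaultdict(list)
    for record_id, category, _handlers in history:
        categories[category].append(record_id)
    return dict(categories)

print(classify(history))  # {'liability': ['A', 'C'], 'property': ['B']}
```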
And S2, scoring the standard personnel information set according to the pre-constructed scoring rule to obtain a personnel information scoring set.
Preferably, the pre-constructed scoring rule is obtained from experienced practitioners or by fitting with machine-learning methods such as gradient regression, according to the industry of the historical dispatching records.
Preferably, the scoring of the standard personnel information set based on the pre-constructed scoring rule to obtain the personnel information scoring set includes: performing experience scoring on the standard personnel information set based on the information items to obtain an experience scoring set, and optimizing the experience scoring set based on a pre-constructed optimized scoring rule to obtain the personnel information scoring set.
As in the insurance claim case: if the response time to the insured event is within 1 hour, the scoring rule gives 3 points; more than 1 hour but within 2 hours, 2 points; more than 2 hours, 1 point. If the "customer satisfaction" item is "dissatisfied", it is quantized to 1 point; "satisfied", 2 points; "very satisfied", 3 points. For the created-profit item, when the obtained profit is less than 20% of the insured amount, the scoring rule quantizes it to 1 point; at least 20% but less than 60%, 2 points; 60% or more, 3 points.
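The quantization above can be sketched as follows; the function names and the exact threshold boundaries are illustrative readings of the example:

```python
def score_response_time(hours):
    """Quantize the response time to the insured event (illustrative thresholds)."""
    if hours <= 1:
        return 3
    if hours <= 2:
        return 2
    return 1

def score_satisfaction(level):
    """Quantize the 'customer satisfaction' item."""
    return {"dissatisfied": 1, "satisfied": 2, "very satisfied": 3}[level]

def score_profit(ratio):
    """Quantize created profit, given as a fraction of the insured amount."""
    if ratio < 0.2:
        return 1
    if ratio < 0.6:
        return 2
    return 3

print(score_response_time(0.5), score_satisfaction("satisfied"), score_profit(0.4))  # 3 2 2
```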
Further, the experience scoring set obtained over a plurality of information items is optimized based on the pre-constructed optimized scoring rule. For example, in the insurance claim case, the optimized score m over the information items "response time" T, "customer satisfaction" F and "created profit" P can adopt the arithmetic mean of the quantized scores, namely:

m = (T + F + P) / 3
preferably, the personnel information corresponding to the 7 processes extracted from a certain insurance claim case is: surveying: zhang Xiaohong business manager for 12 years; setting up a plan: wang Lei ordinary people for 3 years; and (3) investigation: the Li ordinary people are 7 years old; nuclear power: hostwork business prison 17 years; loss assessment: common people of Xiao Qiong are 9 years old; and (4) check and claim: putting aside the wall service manager for 12 years; rechecking: gong Yijie business manager for 10 years. Whereby the result information is: the time of taking out the insurance: 30 minutes; customer satisfaction: satisfying; the benefits are created: 40%, the result information quantization is divided into: the time of taking out the insurance: 3 points, customer satisfaction: 2, creating benefits: score 2, based on the optimized scoring rule, is obtained as: (3+2+2)/3 ═ 2.33.
And S3, dividing the standard personnel information set based on working experience to obtain the experienced-personnel information set.
In a preferred embodiment of the present invention, the division by working experience means dividing the standard personnel information set according to the personnel's working years and position. For example, in the insurance claim case, personnel at business-director level with a personnel information score of 2.5 or more are divided into one experience level; personnel with average working years of 10 or more and a score of 2 or more into another level; personnel with average working years of 3 or more and a score of 1.6 or more into another level; and so on, thereby obtaining a plurality of experienced-personnel information sets.
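A minimal sketch of this division, assuming the three threshold tiers read from the example (the tier names are illustrative):

```python
def experience_level(position, avg_years, score):
    """Assign an experience tier from position, average working years and score."""
    if position == "business director" and score >= 2.5:
        return "level-1"
    if avg_years >= 10 and score >= 2.0:
        return "level-2"
    if avg_years >= 3 and score >= 1.6:
        return "level-3"
    return "unranked"

print(experience_level("business manager", 12, 2.33))  # level-2
```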
And S4, setting a mark pointer of a staff buffer queue according to the dispatching class set, and storing the experienced staff information set into the staff buffer queue according to the mark pointer.
In a preferred embodiment of the present invention, for example, in the insurance industry, the dispatching category set can be divided into 4 categories, namely property insurance, personal insurance, liability insurance and credit insurance, and the personnel cache queue is divided into 4 task queues according to these 4 categories, each queue being identified by a flag pointer. Preferably, each flag pointer may designate a priority queue, e.g., the task queue storing the property insurance category is set to 00, the personal insurance category to 01, the liability insurance category to 10, and the credit insurance category to 11.
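The flag-pointer queues can be sketched with one FIFO queue per category; the queue API shown is ordinary Python, not the patent's implementation:

```python
from collections import deque

# Flag pointers mapping each dispatching category to its priority queue.
FLAGS = {"property": "00", "personal": "01", "liability": "10", "credit": "11"}
queues = {flag: deque() for flag in FLAGS.values()}

def store(category, person):
    """Store an experienced person into the cache queue named by the flag pointer."""
    queues[FLAGS[category]].append(person)

store("property", "Zhang Xiaohong")
store("property", "Wang Lei")
print(list(queues["00"]))  # ['Zhang Xiaohong', 'Wang Lei']
```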
S5, receiving a dispatching request input by a user, calculating according to a pre-constructed dispatching genetic model to obtain a dispatching personnel of the dispatching request, selecting a mark pointer corresponding to the dispatching request from the personnel cache queue, extracting a name of the dispatching personnel from the personnel cache queue according to a preset cache dispatching rule according to the mark pointer corresponding to the dispatching request and the dispatching personnel, and completing the dispatching request.
Preferably, the calculating of the dispatching probability of the dispatching request input by the user based on the pre-constructed dispatching genetic model and the dispatching category set to obtain a dispatching classification result includes: calculating the minimum delay time of the dispatching request according to the number of business links of the dispatching request; calculating the fitness of the dispatching request using the minimum delay time; calculating a dispatching probability set of the dispatching request according to the fitness; and selecting the dispatching category corresponding to the maximum dispatching probability in the dispatching probability set to obtain the dispatching personnel.
Preferably, the method for calculating the minimum delay time includes:
C = Σ_n [x_n·t_n + (1 − x_n)·d_n]

wherein the sum runs over the n business links of the dispatching request input by the user, t_n represents the actual processing time of each business link, x_n is 0 if the business link has a problem and 1 if it does not, and d_n indicates the standard processing time of the business link.
Preferably, the fitness calculation method includes:
f_j = 1/C_j + Δ_j

wherein f_j represents the fitness, j represents the number of the dispatching request input by the user, C_j is the minimum delay time of request j, and Δ_j represents a fitness correction value.
Preferably, the method for calculating the dispatch probability comprises:
P_j = f_j / Σ_{i=1}^{m} f_i

where P_j is the dispatching probability and m represents the number of dispatching classifications described in S1.
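Assuming the common roulette-wheel form with fitness taken as the reciprocal of the minimum delay time (the patent's exact fitness and correction term appear only as formula images), the probability calculation can be sketched as:

```python
def dispatch_probabilities(delays):
    """Roulette-wheel selection: P_j = f_j / sum(f), with fitness assumed 1/C_j."""
    fitness = [1.0 / c for c in delays]
    total = sum(fitness)
    return [f / total for f in fitness]

probs = dispatch_probabilities([2.0, 4.0, 4.0])
print(probs)  # [0.5, 0.25, 0.25]
```

The request (or category) with the largest probability is then selected, matching the "maximum dispatching probability" step above.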
In a preferred embodiment of the present invention, for example, in an insurance claim case, a user dispatching request B is received and classified into the property insurance class. According to the sequence of the property-insurance business flows, namely survey → planning → investigation → accountability → loss assessment → claim verification → recheck, the flow handlers are popped in turn from the priority queue of the property insurance class under the first-in first-out preset cache dispatching rule to perform the dispatching tasks, as follows:
the demand of property insurance → the matching flag is 01 → the priority queue A2 is selected. The whole flow is shown in figure 4.
In the first step, the "survey" flow issues a POP command, and "Zhang Xiaohong, business manager, 12 years" is popped, indicating that business manager Zhang Xiaohong is responsible for the "survey" flow of the property insurance class. After the "survey" flow finishes, the POP command next pops "Wang Lei, ordinary staff, 3 years", assigning Wang Lei to the second-step "planning" work, and the remaining flows proceed by analogy until the claim settlement service is finished.
Further, after a certain claim settlement service is completed, all personnel in the corresponding priority queue have been popped by POP commands, and the queue, restored to an empty state, is ready to process the next claim settlement service.
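The first-in first-out popping of flow handlers can be sketched as follows; handler names follow the example above, with placeholders where the translation is unclear:

```python
from collections import deque

# Priority queue A2 for the property-insurance class.
a2 = deque(["Zhang Xiaohong", "Wang Lei", "Li", "accountability handler",
            "Xiao Qiong", "claim-verification handler", "Gong Yijie"])
flows = ["survey", "planning", "investigation", "accountability",
         "loss assessment", "claim verification", "recheck"]

# First-in first-out: each flow POPs the next handler from the queue.
assignments = {flow: a2.popleft() for flow in flows}
print(assignments["survey"])  # Zhang Xiaohong
print(len(a2))                # 0  (queue restored to empty after the claim)
```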
The invention also provides a fair dispatching device based on the cache. Fig. 2 is a schematic diagram illustrating an internal structure of a fair cache-based dispatching apparatus according to an embodiment of the present invention.
In this embodiment, the fair cache-based dispatching device 1 may be a PC (Personal Computer), a terminal device such as a smart phone, a tablet Computer, or a mobile Computer, or may be a server. The cache-based fair dispatching apparatus 1 at least comprises a memory 11, a processor 12, a communication bus 13, and a network interface 14.
The memory 11 includes at least one type of readable storage medium, which includes a flash memory, a hard disk, a multimedia card, a card type memory (e.g., SD or DX memory, etc.), a magnetic memory, a magnetic disk, an optical disk, and the like. The memory 11 may in some embodiments be an internal storage unit of the cache-based fair dispatching apparatus 1, such as a hard disk of the cache-based fair dispatching apparatus 1. The memory 11 may also be an external storage device of the cache-based fair dispatch apparatus 1 in other embodiments, such as a plug-in hard disk, a Smart Memory Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, provided on the cache-based fair dispatch apparatus 1. Further, the memory 11 may also include both an internal storage unit of the cache-based fair dispatching apparatus 1 and an external storage device. The memory 11 may be used not only to store application software installed in the cache-based fair dispatching apparatus 1 and various types of data, such as the code of the cache-based fair dispatching program 01, but also to temporarily store data that has been output or is to be output.
Processor 12, which in some embodiments may be a Central Processing Unit (CPU), controller, microcontroller, microprocessor or other data Processing chip, is configured to execute program code stored in memory 11 or process data, such as executing fair cache-based dispatch program 01.
The communication bus 13 is used to realize connection communication between these components.
The network interface 14 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface), typically used to establish a communication link between the apparatus 1 and other electronic devices.
Optionally, the apparatus 1 may further comprise a user interface, which may comprise a Display (Display), an input unit such as a Keyboard (Keyboard), and optionally a standard wired interface, a wireless interface. Alternatively, in some embodiments, the display may be an LED display, a liquid crystal display, a touch-sensitive liquid crystal display, an OLED (Organic Light-Emitting Diode) touch device, or the like. The display, which may also be referred to as a display screen or display unit, is suitable for displaying information processed in the cache-based fair dispatching apparatus 1 and for displaying a visual user interface.
While fig. 2 shows only the cache-based fair dispatching apparatus 1 having components 11-14 and the cache-based fair dispatching program 01, those skilled in the art will appreciate that the structure shown in fig. 2 does not constitute a limitation of the cache-based fair dispatching apparatus 1, which may include fewer or more components than shown, combine some components, or arrange the components differently.
In the embodiment of the apparatus 1 shown in fig. 2, a cache-based fair dispatch program 01 is stored in the memory 11; when the processor 12 executes the cache-based fair dispatch program 01 stored in the memory 11, the following steps are implemented:
the method comprises the steps of firstly, receiving historical dispatching records, extracting an original personnel information set based on the historical dispatching records, classifying information items of the original personnel information set to obtain a standard personnel information set, and classifying the historical dispatching records to obtain a dispatching classification set.
Preferably, the historical dispatching records are the personnel allocation records of dispatches made before the current time. For example, in an insurance claim case, a complete claim-processing flow comprises 7 claim processes: survey, case filing, investigation, accountability, loss assessment, claim verification and recheck. The original personnel information set extracted from the historical dispatching records of insurance claim cases comprises, for each task handler in every claim case the insurance company has processed, information such as name, duty, working years and the processing time of each process.
Preferably, the information-entry classification is performed according to the different historical dispatching records. For example, the 7 processes of an insurance claim case each involve related personnel; the entries are classified by process according to the name, duty and working years of the personnel in each of the 7 processes, together with the result information of the case (information entries such as the risk response time, customer satisfaction and created benefit), to obtain a standard personnel information set for each of the 7 processes.
Preferably, the historical dispatching records are classified into the dispatching class sets according to the field to which each record belongs. For example, insurance claim cases can be classified into 4 types, namely property insurance, personal insurance, responsibility insurance and credit insurance, and each type contains a number of standard personnel information sets of the corresponding type. For instance, a historical dispatching record A may be assigned to the responsibility insurance class, record A having been completed by the handlers of its 7 processes (i.e., the members of the standard personnel information set).
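The classification of historical dispatching records into a dispatching class set can be sketched as follows. The category labels, record structure and function name are illustrative assumptions for this sketch, not part of the patented method.

```python
# Illustrative sketch: group historical dispatch records into dispatch classes
# by the field each record belongs to.  Labels and fields are assumptions.
CATEGORIES = ("property", "personal", "responsibility", "credit")

def classify_records(records):
    """Return a dict mapping each dispatch class to its list of records."""
    classes = {c: [] for c in CATEGORIES}
    for rec in records:
        field = rec.get("field")
        if field in classes:
            classes[field].append(rec)
    return classes

history = [
    {"id": "A", "field": "responsibility"},
    {"id": "B", "field": "property"},
]
by_class = classify_records(history)  # record A lands in the responsibility class
```

Each class then holds the standard personnel information sets of the handlers who completed records of that type.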
Step two: scoring the standard personnel information set according to a pre-constructed scoring rule to obtain a personnel information scoring set.
Preferably, the pre-constructed scoring rule is formulated by experienced practitioners in the industry of the historical dispatching records, or fitted with machine-learning methods such as gradient regression.
Preferably, scoring the standard personnel information set based on the pre-constructed scoring rule to obtain the personnel information scoring set comprises: performing experience scoring on the standard personnel information set based on the information entries to obtain an experience scoring set, and optimizing the experience scoring set based on a pre-constructed optimization scoring rule to obtain the personnel information scoring set.
As in the insurance claim case: for the risk response time, the scoring rule quantizes a time of no more than 1 hour to 3 points, more than 1 hour but no more than 2 hours to 2 points, and more than 2 hours to 1 point; for customer satisfaction, "dissatisfied" is quantized to 1 point, "satisfied" to 2 points and "very satisfied" to 3 points; for created benefit, a profit of less than 20% of the insured amount is quantized to 1 point, 20% to 60% to 2 points, and more than 60% to 3 points.
Further, when several information entries are scored, the experience scoring set is optimized based on the pre-constructed optimization scoring rule. For example, in the insurance claim case, the optimization scoring rule for the personnel information score m over the entries risk response time T, customer satisfaction F and created benefit P can take the arithmetic mean of the quantized scores, namely:

m = (T + F + P) / 3
Preferably, the personnel information corresponding to the 7 processes extracted from a certain insurance claim case is: survey: Zhang Xiaohong, business manager, 12 years; case filing: Wang Lei, ordinary staff, 3 years; investigation: Li, ordinary staff, 7 years; accountability: a business supervisor, 17 years; loss assessment: Xiao Qiong, ordinary staff, 9 years; claim verification: a business manager, 12 years; recheck: Gong Yijie, business manager, 10 years. The result information is: risk response time: 30 minutes; customer satisfaction: satisfied; created benefit: 40%. The quantized result scores are: risk response time: 3 points; customer satisfaction: 2 points; created benefit: 2 points. Based on the optimization scoring rule, the personnel information score is (3 + 2 + 2) / 3 ≈ 2.33.
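The quantization rules and the arithmetic-mean optimization of the worked example above can be sketched as follows. The thresholds mirror the example values in the text; the function names are illustrative assumptions.

```python
def quantize_risk_time(minutes):
    # <= 1 hour -> 3 points, <= 2 hours -> 2 points, otherwise 1 point
    if minutes <= 60:
        return 3
    if minutes <= 120:
        return 2
    return 1

SATISFACTION_POINTS = {"dissatisfied": 1, "satisfied": 2, "very satisfied": 3}

def quantize_benefit(ratio):
    # profit as a fraction of the insured amount: <20% -> 1, 20-60% -> 2, else 3
    if ratio < 0.2:
        return 1
    if ratio < 0.6:
        return 2
    return 3

def personnel_info_score(minutes, satisfaction, benefit_ratio):
    """Arithmetic mean of the three quantized entry scores: m = (T + F + P) / 3."""
    t = quantize_risk_time(minutes)
    f = SATISFACTION_POINTS[satisfaction]
    p = quantize_benefit(benefit_ratio)
    return (t + f + p) / 3

score = personnel_info_score(30, "satisfied", 0.40)  # (3 + 2 + 2) / 3 ≈ 2.33
```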
Step three: dividing the standard personnel information set based on working experience and the personnel information scoring set to obtain working-experience personnel information sets.
In a preferred embodiment of the present invention, the division by working experience partitions the standard personnel information set according to working years, position and personnel information score. For example, in the insurance claim case, personnel at supervisor level and above with a personnel information score of at least 2.5 form one working-experience level; personnel with an average working time of at least 10 years and a score of at least 2 form another; personnel with an average working time of at least 3 years and a score of at least 1.6 form another; and so on, thereby obtaining a plurality of working-experience personnel information sets.
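The experience division can be sketched as a simple threshold function. The thresholds follow the example values in the text; the level names and the `position` parameter are illustrative assumptions.

```python
def experience_level(position, avg_years, score):
    """Assign a working-experience level from position, average working years
    and personnel information score.  Level names are illustrative."""
    if position == "supervisor" and score >= 2.5:
        return "level-1"
    if avg_years >= 10 and score >= 2:
        return "level-2"
    if avg_years >= 3 and score >= 1.6:
        return "level-3"
    return "unclassified"

# e.g. a business manager with 12 years and a score of 2.33 falls in level-2
level = experience_level("manager", 12, 2.33)
```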
Step four: setting flag pointers of personnel cache queues according to the dispatching class set, and storing the working-experience personnel information sets into the personnel cache queues according to the flag pointers.
In a preferred embodiment of the present invention, for example, in the insurance industry, the dispatching class set can be divided into 4 classes, namely property insurance, life insurance, responsibility insurance and credit insurance, and the personnel cache queue is divided into 4 task queues accordingly, each queue identified by a flag pointer. Preferably, the flag pointers can index priority queues, e.g., the flag 00 identifies the task queue storing the property insurance class, 01 the life insurance class, 10 the responsibility insurance class, and 11 the credit insurance class.
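The flag-pointer-to-queue mapping can be sketched with one FIFO queue per dispatch class. The flags mirror the example in the text; the function name and stored strings are illustrative assumptions.

```python
from collections import deque

# Hypothetical flag pointers, each indexing one FIFO task queue per class.
FLAGS = {"property": "00", "life": "01", "responsibility": "10", "credit": "11"}
queues = {flag: deque() for flag in FLAGS.values()}

def store_personnel(category, person):
    """Store a working-experience personnel entry in the queue its flag points to."""
    queues[FLAGS[category]].append(person)

store_personnel("property", "Zhang Xiaohong, business manager, 12 years")
store_personnel("property", "Wang Lei, ordinary staff, 3 years")
first_out = queues["00"].popleft()  # FIFO: Zhang Xiaohong is popped first
```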
Step five: receiving a dispatching request input by a user, calculating the dispatching personnel of the dispatching request according to a pre-constructed dispatching genetic model, selecting the flag pointer corresponding to the dispatching request from the personnel cache queues, and, according to that flag pointer and the dispatching personnel, extracting the names of the dispatching personnel from the personnel cache queue under a preset cache dispatching rule to complete the dispatching request.
Preferably, calculating the dispatching probability of the user's dispatching request based on the pre-constructed dispatching genetic model and the dispatching class set to obtain a dispatching classification result comprises: calculating the minimum delay time of the dispatching request according to the number of its business links, calculating the fitness of the dispatching request from the minimum delay time, deriving a dispatching probability set of the dispatching request from the fitness, and selecting the dispatching class corresponding to the largest dispatching probability in the set to obtain the dispatching personnel.
Preferably, the minimum delay time is calculated as:

C = Σ_{n=1}^{N} x_n (t_n − d_n)

wherein C is the minimum delay time, N is the number of business links of the dispatching request input by the user, t_n represents the actual processing time of business link n, x_n takes the value 0 or 1 and indicates whether business link n has a problem (0 if it has a problem, 1 if it does not), and d_n represents the standard processing time of business link n.
Preferably, the fitness is calculated as:

f_j = 1 / C_j + Δf_j

wherein f_j represents the fitness, j represents the number of the dispatching request input by the user, C_j is the minimum delay time of request j, and Δf_j represents the fitness correction value.
Preferably, the dispatching probability is calculated as:

P_j = f_j / Σ_{k=1}^{m} f_k

wherein P_j is the dispatching probability and m represents the number of dispatching classifications described in step one (S1).
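The delay, fitness and roulette-style probability computation can be sketched as below. The exact forms of the formulas are reconstructed from the surrounding definitions and should be treated as assumptions; the small epsilon guard and the example link tuples are likewise illustrative.

```python
def min_delay(links):
    """links: list of (t_n, d_n, x_n) per business link, where t_n is the
    actual processing time, d_n the standard time and x_n a 0/1 indicator.
    Assumed form: C = sum of x_n * (t_n - d_n)."""
    return sum(x * (t - d) for t, d, x in links)

def fitness(delay, correction=0.0, eps=1e-9):
    # assumed form: f_j = 1 / C_j + correction; eps guards a zero delay
    return 1.0 / (delay + eps) + correction

def dispatch_probabilities(fitnesses):
    # P_j = f_j / sum of all fitnesses; the class with max P_j is selected
    total = sum(fitnesses)
    return [f / total for f in fitnesses]

delays = [min_delay([(5, 3, 1), (4, 4, 1)]),   # class 0: total delay 2
          min_delay([(9, 3, 1), (6, 4, 1)])]   # class 1: total delay 8
probs = dispatch_probabilities([fitness(c) for c in delays])
best_class = probs.index(max(probs))           # lower delay -> higher probability
```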
In a preferred embodiment of the present invention, for example, in an insurance claim case, a user dispatching request B is received and classified into the property insurance class. Following the sequence of the property insurance business processes, survey → case filing → investigation → accountability → loss assessment → claim verification → recheck, the process handlers are popped in turn from the priority queue of the property insurance class under the first-in-first-out preset cache dispatching rule to perform the dispatching tasks, as follows:
the demand of property insurance → the matching flag is 01 → the priority queue A2 is selected. The whole flow is shown in figure 4.
In the first step, the "survey" process issues a POP command and "Zhang Xiaohong, business manager, 12 years" is popped, indicating that business manager Zhang Xiaohong is responsible for the "survey" process of the property insurance case. After the "survey" process finishes, the next POP command pops "Wang Lei, ordinary staff, 3 years" and assigns Wang Lei to the second-step "case filing" work, and so on for the remaining processes until the claim service is completed.
Further, after a claim service is completed, all personnel in the corresponding priority queue have been popped by POP commands, so the queue is restored to an empty state, ready for the next claim service.
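The pop-per-process flow above can be sketched end to end. The process names follow the text; the handler placeholders are illustrative assumptions.

```python
from collections import deque

PROCESSES = ["survey", "case filing", "investigation", "accountability",
             "loss assessment", "claim verification", "recheck"]

# One FIFO priority queue for the matched class, preloaded with 7 handlers.
queue = deque(f"handler_{i}" for i in range(1, 8))

assignments = {}
for step in PROCESSES:
    assignments[step] = queue.popleft()  # POP the next handler for this step

# After the claim service completes, the queue is empty and ready to be
# reloaded for the next case.
```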
Alternatively, in other embodiments, the cache-based fair dispatch program may be further divided into one or more modules, and the one or more modules are stored in the memory 11 and executed by one or more processors (in this embodiment, the processor 12) to implement the present invention.
For example, referring to fig. 3, a schematic diagram of a program module of a fair cache-based dispatch program in an embodiment of the fair cache-based dispatch apparatus of the present invention is shown, in this embodiment, the fair cache-based dispatch program may be divided into a data receiving and processing module 10, a dispatch personnel scoring module 20, a dispatch cache module 30, and a dispatch result output module 40, which exemplarily:
the data receiving and processing module 10 is configured to: receiving historical dispatching records, extracting an original personnel information set based on the historical dispatching records, classifying information items of the original personnel information set to obtain a standard personnel information set, and classifying the historical dispatching records to obtain a dispatching classification set.
The worker dispatching scoring module 20 is configured to: score the standard personnel information set based on a pre-constructed scoring rule to obtain a personnel information scoring set.
The dispatch cache module 30 is configured to: set flag pointers of personnel cache queues according to the dispatching class set, and store the working-experience personnel information sets into the personnel cache queues according to the flag pointers.
The dispatch result output module 40 is configured to: receive a dispatching request input by a user, calculate the dispatching personnel of the dispatching request according to a pre-constructed dispatching genetic model, select the flag pointer corresponding to the dispatching request from the personnel cache queues, and, according to that flag pointer and the dispatching personnel, extract the names of the dispatching personnel from the personnel cache queue under a preset cache dispatching rule to complete the dispatching request.
The functions or operation steps of the data receiving and processing module 10, the dispatching personnel scoring module 20, the dispatching cache module 30, the dispatching result output module 40 and other program modules when executed are substantially the same as those of the above embodiments, and are not repeated herein.
Furthermore, an embodiment of the present invention further provides a computer-readable storage medium, where a cache-based fair dispatch program is stored on the computer-readable storage medium, where the cache-based fair dispatch program is executable by one or more processors to implement the following operations:
receiving historical dispatching records, extracting an original personnel information set based on the historical dispatching records, classifying information items of the original personnel information set to obtain a standard personnel information set, and classifying the historical dispatching records to obtain a dispatching classification set.
And scoring the standard personnel information set based on a pre-constructed scoring rule to obtain a personnel information scoring set.
And setting flag pointers of personnel cache queues according to the dispatching class set, and storing the working-experience personnel information sets into the personnel cache queues according to the flag pointers.
Receiving a dispatching request input by a user, calculating the dispatching personnel of the dispatching request according to a pre-constructed dispatching genetic model, selecting the flag pointer corresponding to the dispatching request from the personnel cache queues, and, according to that flag pointer and the dispatching personnel, extracting the names of the dispatching personnel from the personnel cache queue under a preset cache dispatching rule to complete the dispatching request.
It should be noted that the above-mentioned numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments. The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, apparatus, article, or method that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, apparatus, article, or method. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of other like elements in the process, apparatus, article, or method that includes that element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) as described above and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A fair dispatching method based on cache is characterized by comprising the following steps:
receiving historical dispatching records, extracting an original personnel information set based on the historical dispatching records, classifying information items of the original personnel information set to obtain a standard personnel information set, and classifying the historical dispatching records to obtain a dispatching classification set;
scoring the standard personnel information set according to a pre-constructed scoring rule to obtain a personnel information scoring set;
dividing the standard personnel information set based on working experience and the personnel information scoring set to obtain working-experience personnel information sets;
setting flag pointers of personnel cache queues according to the dispatching class set, and storing the working-experience personnel information sets into the personnel cache queues according to the flag pointers;
receiving a dispatching request input by a user, calculating the dispatching personnel of the dispatching request according to a pre-constructed dispatching genetic model, selecting the flag pointer corresponding to the dispatching request from the personnel cache queues, and, according to that flag pointer and the dispatching personnel, extracting the names of the dispatching personnel from the personnel cache queue under a preset cache dispatching rule to complete the dispatching request.
2. The cache-based fair dispatching method according to claim 1, wherein scoring the standard personnel information set based on a pre-constructed scoring rule to obtain a personnel information scoring set comprises:
performing experience scoring on the standard personnel information set based on the information entries to obtain an experience scoring set;
and optimizing the experience scoring set based on a pre-constructed optimization scoring rule to obtain the personnel information scoring set.
3. The cache-based fair dispatching method of claim 1 or 2, wherein calculating dispatching personnel of the dispatching request according to a pre-constructed dispatching genetic model comprises:
calculating the minimum delay time of the dispatching request according to the number of the service links of the dispatching request;
calculating the fitness of the dispatching request by using the minimum delay time;
calculating to obtain a dispatching probability set of the dispatching request according to the fitness;
and selecting the dispatching personnel corresponding to the largest dispatching probability in the dispatching probability set.
4. The cache-based fair dispatching method according to claim 3, wherein the minimum delay time is calculated as:

C = Σ_{n=1}^{N} x_n (t_n − d_n)

wherein C is the minimum delay time, N is the number of business links of the dispatching request input by the user, t_n represents the actual processing time of each business link, x_n takes the value 0 or 1 and represents whether each business link has a problem, and d_n represents the standard processing time of the business link.
5. The cache-based fair dispatching method according to claim 3, wherein the fitness is calculated as:

f_j = 1 / C_j + Δf_j

wherein f_j represents the fitness, j represents the number of the dispatching request input by the user, and Δf_j represents the fitness correction value;

and the dispatching probability set is calculated as:

P_j = f_j / Σ_{k=1}^{m} f_k

wherein P_j represents the dispatching probability and m represents the number of dispatching classifications in the dispatching classification set.
6. A cache-based fair dispatch apparatus, comprising a memory and a processor, wherein the memory has stored thereon a cache-based fair dispatch program operable on the processor, the cache-based fair dispatch program when executed by the processor performs the steps of:
receiving historical dispatching records, extracting an original personnel information set based on the historical dispatching records, classifying information items of the original personnel information set to obtain a standard personnel information set, and classifying the historical dispatching records to obtain a dispatching classification set;
scoring the standard personnel information set according to a pre-constructed scoring rule to obtain a personnel information scoring set;
dividing the standard personnel information set based on working experience and the personnel information scoring set to obtain working-experience personnel information sets;
setting flag pointers of personnel cache queues according to the dispatching class set, and storing the working-experience personnel information sets into the personnel cache queues according to the flag pointers;
receiving a dispatching request input by a user, calculating the dispatching personnel of the dispatching request according to a pre-constructed dispatching genetic model, selecting the flag pointer corresponding to the dispatching request from the personnel cache queues, and, according to that flag pointer and the dispatching personnel, extracting the names of the dispatching personnel from the personnel cache queue under a preset cache dispatching rule to complete the dispatching request.
7. The cache-based fair dispatching apparatus of claim 6, wherein scoring the standard personnel information set based on a pre-constructed scoring rule to obtain a personnel information scoring set comprises:
performing experience scoring on the standard personnel information set based on the information entries to obtain an experience scoring set;
and optimizing the experience scoring set based on a pre-constructed optimization scoring rule to obtain the personnel information scoring set.
8. The fair cache-based dispatch device of claim 6 or 7, wherein the computing of the dispatch personnel of the dispatch request based on a pre-built dispatch genetic model comprises:
calculating the minimum delay time of the dispatching request according to the number of the service links of the dispatching request;
calculating the fitness of the dispatching request by using the minimum delay time;
calculating to obtain a dispatching probability set of the dispatching request according to the fitness;
and selecting the dispatching personnel corresponding to the largest dispatching probability in the dispatching probability set.
9. The cache-based fair dispatching apparatus of claim 8, wherein the minimum delay time is calculated as:

C = Σ_{n=1}^{N} x_n (t_n − d_n)

wherein C is the minimum delay time, N is the number of business links of the dispatching request input by the user, t_n represents the actual processing time of each business link, x_n takes the value 0 or 1 and represents whether each business link has a problem, and d_n represents the standard processing time of the business link.
10. A computer-readable storage medium having stored thereon a cache-based fair dispatch program executable by one or more processors to perform the steps of the cache-based fair dispatch method of any one of claims 1 to 5.
CN201911341919.6A 2019-12-23 2019-12-23 Cache-based fair dispatch method and device and computer readable storage medium Active CN111144734B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911341919.6A CN111144734B (en) 2019-12-23 2019-12-23 Cache-based fair dispatch method and device and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911341919.6A CN111144734B (en) 2019-12-23 2019-12-23 Cache-based fair dispatch method and device and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN111144734A true CN111144734A (en) 2020-05-12
CN111144734B CN111144734B (en) 2023-06-02

Family

ID=70519498

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911341919.6A Active CN111144734B (en) 2019-12-23 2019-12-23 Cache-based fair dispatch method and device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN111144734B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107067163A (en) * 2017-03-24 2017-08-18 青岛海信网络科技股份有限公司 A kind of breakdown maintenance work dispatching method and device
CN109472452A (en) * 2018-10-11 2019-03-15 平安科技(深圳)有限公司 Intelligent worker assigning method, apparatus, computer equipment and storage medium
CN109858724A (en) * 2018-11-08 2019-06-07 中国平安财产保险股份有限公司 Intelligent worker assigning method, apparatus and computer equipment based on data analysis
CN110334917A (en) * 2019-06-17 2019-10-15 悟空财税服务有限公司 A kind of smart client relationship management method and system
CN110363402A (en) * 2019-06-26 2019-10-22 同济大学 A kind of factory personnel dispatching method based on grouping strategy


Also Published As

Publication number Publication date
CN111144734B (en) 2023-06-02

Similar Documents

Publication Publication Date Title
WO2019056710A1 (en) Supplier recommendation method and apparatus, and computer readable storage medium
CN111552870A (en) Object recommendation method, electronic device and storage medium
CN109359798A (en) Method for allocating tasks, device and storage medium
US20160217383A1 (en) Method and apparatus for forecasting characteristic information change
CN109816321A (en) A kind of service management, device, equipment and computer readable storage medium
CN110852785B (en) User grading method, device and computer readable storage medium
CN112579621B (en) Data display method and device, electronic equipment and computer storage medium
CN111652282B (en) Big data-based user preference analysis method and device and electronic equipment
CN113919738A (en) Business handling window distribution method and device, electronic equipment and readable storage medium
CN115936895A (en) Risk assessment method, device and equipment based on artificial intelligence and storage medium
CN115186151A (en) Resume screening method, device, equipment and storage medium
CN113344415A (en) Deep neural network-based service distribution method, device, equipment and medium
WO2021139276A1 (en) Automatic operation and maintenance method and device for platform databases, and computer readable storage medium
CN112948705A (en) Intelligent matching method, device and medium based on policy big data
CN111652471A (en) List distribution control method and device, electronic equipment and storage medium
CN111144734A (en) Cache-based fair dispatching method and device and computer readable storage medium
CN111159355A (en) Customer complaint order processing method and device
KR101603977B1 (en) Prediction system for business ordering possibillity
CN113435746B (en) User workload scoring method and device, electronic equipment and storage medium
CN115689143A (en) Work order assignment method, work order assignment device, electronic device and medium
US20050102157A1 (en) Project managing system, project managing method and project managing program
CN109886819B (en) Method for predicting insurance payment expenditure, electronic device and storage medium
CN114581130A (en) Bank website number assigning method and device based on customer portrait and storage medium
CN111160637A (en) Intelligent manpower distribution method and device and computer readable storage medium
CN109857501A (en) A kind of page display method of APP, device, storage medium and server

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant