GB2481820A - Parallel workflow engine for classified data elements in workflow - Google Patents
Parallel workflow engine for classified data elements in workflow
- Publication number
- GB2481820A GB2481820A GB1011428.8A GB201011428A GB2481820A GB 2481820 A GB2481820 A GB 2481820A GB 201011428 A GB201011428 A GB 201011428A GB 2481820 A GB2481820 A GB 2481820A
- Authority
- GB
- United Kingdom
- Prior art keywords
- data
- workflow
- data processing
- case
- processing system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0633—Workflow analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q40/00—Finance; Insurance; Tax strategies; Processing of corporate or income taxes
- G06Q40/02—Banking, e.g. interest calculation or account maintenance
Landscapes
- Business, Economics & Management (AREA)
- Engineering & Computer Science (AREA)
- Human Resources & Organizations (AREA)
- Economics (AREA)
- Strategic Management (AREA)
- Entrepreneurship & Innovation (AREA)
- Physics & Mathematics (AREA)
- Development Economics (AREA)
- Theoretical Computer Science (AREA)
- Marketing (AREA)
- General Physics & Mathematics (AREA)
- General Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Tourism & Hospitality (AREA)
- Quality & Reliability (AREA)
- Operations Research (AREA)
- Game Theory and Decision Science (AREA)
- Accounting & Taxation (AREA)
- Finance (AREA)
- Technology Law (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
A workflow system comprises means for grouping data elements into a single case structure for processing; means for classifying and dividing data within the case structure into one or more workflows; and means for managing the status of activities of the workflows within the case structure such that individual data elements are processed simultaneously in parallel by a workflow engine. The workflow may be for an anti-money laundering, arrears management, security perfection, customs, banking or title deeds system where scoring is used to evaluate risk in transactions. Parallel workflows allow real-time, flexible rating of risks. Each data processing layer includes a start point and an end point indicative of a desired outcome according to a combination of pre-defined criteria, and may have sequential and parallel sub-layers.
Description
Data Processing System and Method
Field of the Invention
The invention relates to a data processing system and method. In particular the invention relates to a data processing system architecture for managing multiple data processes.
Background to the Invention
Traditionally, banking systems have been built around case-based workflow or entity-based workflow. To illustrate the point, a simplified scenario can be created around a customer who applies for a top-up home loan to assist with repayment difficulties. Typically, the bank will open a "Home Loan Case" with a view to approving or declining the loan by assessing the level of risk involved in facilitating the loan. Details of the customer, the facility type, the facility amount requested, details of the property and other collateral details are entered into this case. The case will then progress through the various stages of data capture, approval, documentation and post-loan processing.
However, a problem with these types of systems is that, in reality, the situation tends to be complicated and difficult to implement. For example: * the customer may require processing through an Anti-Money Laundering workflow.
* the customer accounts may be going through arrears management workflow.
* the security will go through a Security Perfection workflow.
* title deeds of a property may go through an Account Trust Receivable (ATR) workflow.
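The scenario above can be sketched in code. The following is a minimal illustration (all class names, stage names and workflow names are invented for the example, not prescribed by the patent) of a single case whose member entities each carry their own independent workflow alongside the case-level workflow:

```python
# Sketch: one case, several entities, each with an independent workflow.
from dataclasses import dataclass, field

@dataclass
class Workflow:
    name: str
    stages: list          # ordered stage names
    current: int = 0      # index of the active stage

    def advance(self):
        if self.current < len(self.stages) - 1:
            self.current += 1

    @property
    def stage(self):
        return self.stages[self.current]

@dataclass
class Entity:
    kind: str             # e.g. "Customer", "Account", "Security"
    workflow: Workflow

@dataclass
class Case:
    workflow: Workflow                      # the case-level workflow
    entities: list = field(default_factory=list)

# One home-loan case; each entity progresses independently of the others.
case = Case(
    workflow=Workflow("HomeLoan", ["capture", "approval", "documentation", "post-loan"]),
    entities=[
        Entity("Customer", Workflow("AML", ["identify", "screen", "clear"])),
        Entity("Account",  Workflow("Arrears", ["detect", "contact", "restructure"])),
        Entity("Security", Workflow("Perfection", ["valuation", "registration"])),
    ],
)
case.entities[0].workflow.advance()         # the AML workflow moves on...
assert case.entities[0].workflow.stage == "screen"
assert case.workflow.stage == "capture"     # ...while the case itself has not
```

The point of the sketch is that advancing one entity's workflow leaves both the case workflow and the other entities' workflows untouched.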
Prior art examples of such early warning risk indicator systems are described e.g. in US 6,202,053 B1, US 6,311,169 B2, and the Journal of Commercial Lending, June 1995, pages to 16, "How the RMA/Fair, Isaac credit-scoring model was built" by Latimer Asch.
US 6,202,053 B1 describes how, to assess the credit risk of an individual, a financial institution will develop a score for each credit applicant based on certain information. The applicant receives points for each item of information analysed by the financial institution. The amount of points awarded for each item, the items actually analysed, and the score necessary for approval may vary. The score awarded is used to evaluate a risk involved in performing a certain transaction. In other words, the decision to approve or deny an applicant's request for e.g. a bank card or another type of transaction is based on a scoring system. The scoring system used to evaluate each applicant and the minimum score required for approval was applied uniformly by a financial institution to all its applicants. The use of such a scoring system for evaluating a risk involved with a transaction is rather superficial and could be made more secure by either monitoring the financial behaviour of the approved client after approval or increasing the score required for approval. The first alternative would require an increase in costs and effort, whereas the second alternative might lead to unnecessarily declining a large number of clients.
Therefore, US 6,202,053 B1 proposes to develop a segmentation tree, building a client score card for each segment, grouping clients into sub-populations corresponding to each segment, and applying the client score card to the applicants within the corresponding segment. Using an automated system to implement the generation of the client score cards and scoring the applications further lowers the costs and effort of assessing a risk involved with a transaction.
The general background of the RMA/Fair, Isaac credit-scoring model is described in the above mentioned article by Latimer Asch. The model is suitable for e.g. reducing the time spent processing small business loan applications using an automated solution which is based on a pooled-data score card. A scorecard is a tool used to calculate the risk associated with a credit application. It calculates the credit risk based on multiple items of information called characteristics. Characteristics can come from several sources, including the credit application and consumer and business credit reports. Each characteristic is divided into two or more possible responses known as attributes. A numerical score is associated with each attribute, so for any credit application, the numerical attribute values for all characteristics can be added together to provide a total score. Scoring, in principle, uses the same data a loan officer uses in his or her judgmental, or non-scoring, decision process. But scoring is faster, more objective, and more consistent. With the current regulatory pressure to provide more small business loans, prospective lenders need efficient, time-saving, cost-cutting tools. With credit-scoring, a lender can increase the number of approved applications without increasing risk, time, or other resources.
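The characteristic/attribute scheme just described can be expressed in a few lines. The following is an illustrative sketch; all characteristic names, point values and the approval cut-off are invented for the example and do not come from the RMA/Fair, Isaac model itself:

```python
# Illustrative scorecard: each characteristic maps an applicant's attribute
# to a point value, and the points are summed into a total score.
SCORECARD = {
    "years_in_business": {"<2": 5, "2-5": 15, ">5": 25},
    "credit_history":    {"poor": 0, "fair": 10, "good": 30},
    "owner_experience":  {"low": 5, "high": 20},
}

def total_score(applicant: dict) -> int:
    """Sum the points for the attribute the applicant has per characteristic."""
    return sum(SCORECARD[ch][attr] for ch, attr in applicant.items())

applicant = {"years_in_business": "2-5",
             "credit_history": "good",
             "owner_experience": "high"}
score = total_score(applicant)   # 15 + 30 + 20
assert score == 65
approved = score >= 60           # hypothetical approval cut-off
```

This is exactly the additive structure the article describes: one point value per attribute, summed across characteristics into a single total compared against a threshold.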
The scoring system described above, although performed automatically in a data processing system and able to handle relatively large amounts of data, has proved to be not very precise and cannot determine risks in real time. Some of the problems of the known scoring systems are that they are not flexible, they cannot take into account historical data, they are limited in the type of information which is taken into account and they have limited reporting possibilities.
Traditional systems, as described above, are limited in how they can cater for the level of complexity required. The flaws and the problems with existing systems have been exposed with the recent financial fallout with banking institutions, especially in Ireland, where banking systems and protocols have clearly failed.
There is therefore a need to provide a data processing system to overcome the above mentioned problems.
Summary of the Invention
According to the invention there is provided, as set out in the appended claims, a data processing system comprising: a plurality of data processing terminals operated by respective users, said data processing terminals connected to a network; means for grouping a plurality of data elements into a single case data structure for processing; means for classifying and division of said data elements within the case structure into one or more workflows; and means for managing the status of activities of the workflows within the case structure such that individual data elements are processed simultaneously in parallel by a workflow engine, said workflow engine comprising a plurality of data processing layers adapted to be processed in parallel in any one or all of the data processing terminals, each layer comprising a layer start point, which may be any of a case, entity or workflow data structure, as well as a layer end point, which may be any of a case, entity or workflow data structure, wherein the end point is representative of a desired data processing outcome according to a combination of pre-defined criteria.
A key benefit of this engine is that it allows individual system entities (e.g. Customer, Account, Document, Security etc) to have their own independent workflows while simultaneously being part of a wider case based workflow.
This specific feature allows the system to precisely mimic complex real-life processes which are typically far more interactive than existing linear workflow engines can handle.
Advantageously, according to the present invention, there is provided an improved system for the provision of data over a network. In particular, the method is carried out within an existing system management environment, preferably running on a computer/network system of a business. The computer system may be or include an existing, running computer system. The system may be a custom system.
Hence, the subject-matter of the present invention allows data to be exchanged more efficiently between different entities, such as different computers, workstations, personal computers, etc. Further, the present invention allows more efficient visualization, i.e. the display of data on e.g. a remote computer, such as a client computer, a client terminal, etc. Therefore, the system according to the present invention, among other things, aims at providing a method which is adapted to serve, assist or replace activities of different kinds, such as the selection and provision of data, and particularly allows data to be displayed in a more efficient way. Further, even more advantageously, the system according to the present invention allows a user to achieve his objective in a more efficient manner, since the user is preferably automatically provided with data. In particular, the provided data is pre-selected, preferably according to the business context, and more particularly according to attributes/attribute data of the user.
Thereby, the amount of selection data presented to the user is decreased, which means that the selection process is more efficient. Accordingly, the Graphical User Interface (GUI) is preferably improved by displaying data relevant to the user. In particular, unnecessary data is preferably not displayed, thus keeping the GUI as simple as possible.
Thereby, the visual presentation and design of the GUI is improved. Also the data can be appreciated by the user much better, since preferably only essential data is displayed.
Thereby, it is easier for the user to view the data.
In one embodiment each data processing layer data stores a workflow data structure, each workflow data structure comprises data defining any of a workflow layer, a workflow start point, a workflow end point, a workflow layer stage, a data processing sub-layer, a sequential sub-layer stage and a parallel sub-layer stage.
In one embodiment each case data structure comprises a register of workflow data structures and entity data structures combined into discrete, project-specific, temporary or permanent relationships, each of which is representative of an entire workflow layer including any sub-layer.
In one embodiment there is provided means for selecting at least one or a plurality of risk factors by the workflow engine, and the output is compared against same for compliance; if the compliance check is positive, then the workflow engine is adapted to check whether another risk factor needs to be considered, such that the data processing flow loops until such time as all risk factors associated with the workflow, entity or case data updates have been processed.
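The compliance loop of this embodiment can be sketched as follows. This is an illustrative assumption of how such a loop might look; the risk-factor names, predicates and thresholds are all invented for the example:

```python
# Sketch of the looping compliance check: each selected risk factor is
# compared against the workflow output in turn, until all factors have
# been processed or one of them fails.
def check_compliance(output: dict, risk_factors: list) -> bool:
    """risk_factors: list of (name, predicate) pairs; each predicate
    receives the workflow output and returns True when compliant."""
    pending = list(risk_factors)
    while pending:                       # loop until all factors processed
        name, predicate = pending.pop(0)
        if not predicate(output):
            return False                 # non-compliant: stop the flow
    return True                          # every risk factor passed

factors = [
    ("ltv_below_80", lambda o: o["loan"] / o["collateral_value"] <= 0.8),
    ("score_floor",  lambda o: o["score"] >= 60),
]
output = {"loan": 70_000, "collateral_value": 100_000, "score": 72}
assert check_compliance(output, factors) is True
```

A failing factor short-circuits the loop, which matches the described behaviour of only checking the next risk factor when the previous compliance check is positive.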
In one embodiment there is provided an integrating data processing module adapted to perform formatting, addressing and translation data processing tasks for routing case, entity and workflow data to and from local and remote databases.
In one embodiment said integrating module channels data to and from a working module, which includes the workflow engine and the pre-defined rules, processing thresholds, user-defined sequences and data structure relationships.
In one embodiment the system receives and stores data feeds and external variables and broadcasts same to the workflow engine for updating said database.
In one embodiment said data feeds and external variables comprise any of real-time, deferred or projected financial data streams, data variables representative of market status and/or risk, business status and/or risk, real estate status and/or risk, commercial or other status and/or risk.
In one embodiment each data layer stores workflow processing thresholds, risk factors embodied as pre-defined rules, user defined processing sequences and data structure relationships which correspond essentially to a register of workflow data structures combined into discrete temporary or permanent relationships, each of which is representative of systemic constraints applicable to any case.
In one embodiment there is provided a calculator module adapted with means to perform arithmetic calculus, and configured to link operations with the case, entity or workflow currently selected for input or consultation, and integrate constraints and thresholds in order to generate alerts.
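A minimal sketch of such a calculator module's alert behaviour follows, assuming (as an illustration only) that constraints are expressed as per-metric (minimum, maximum) bounds; the metric names and values are invented:

```python
# Sketch: computed values are checked against configured thresholds and an
# alert is generated for every metric that breaches its bounds.
def evaluate(values: dict, thresholds: dict) -> list:
    """Return alert messages for every metric outside its (lo, hi) bounds."""
    alerts = []
    for metric, (lo, hi) in thresholds.items():
        v = values[metric]
        if not (lo <= v <= hi):
            alerts.append(f"{metric}={v} outside [{lo}, {hi}]")
    return alerts

thresholds = {"ltv": (0.0, 0.8), "dsr": (0.0, 0.35)}
values = {"ltv": 70_000 / 100_000,   # loan-to-value: within bounds
          "dsr": 0.4}                # debt-service ratio: breaches bounds
alerts = evaluate(values, thresholds)
assert alerts == ["dsr=0.4 outside [0.0, 0.35]"]
```

In the described system the alerts would be linked to the case, entity or workflow currently selected for input or consultation rather than returned as a plain list.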
In one embodiment there is provided a task list module adapted with means to allow a user to consult workflows associated with cases, in aggregate or granular manner, to manage overall caseload.
In one embodiment there is provided a bulk-update valuation module based on a set of parameters.
In one embodiment there is provided an automated update module adapted to interrupt ongoing workflow layers and/or sub-layers, to alert a user that a particular threshold is no longer being met and that action should be taken to stop the process.
There is also provided a computer program comprising program instructions for causing a computer to carry out the above method, which may be embodied on a record medium, carrier signal or read-only memory.
Brief Description of the Drawings
The invention will be more clearly understood from the following description of an embodiment thereof, given by way of example only, with reference to the accompanying drawings, in which:
Figure 1 illustrates a network environment in which a system according to the invention is embodied;
Figure 2 illustrates an embodiment of a state machine according to the present invention;
Figure 3 illustrates an architecture of the state machine shown in Figure 2 in an environment as shown in Figure 1;
Figure 4 illustrates details of data structures processed by the architecture shown in Figure 3;
Figure 5 illustrates relationships between data structures shown in Figure 4;
Figure 6 details data processing steps of the system shown in Figures 1 to 5, including steps of updating data structures and data structure relationships;
Figure 7 further details the data structure updating step of Figure 6, including a step of raising an alert;
Figure 8 further details the data structure relationship updating step of Figure 6, including a step of receiving an alert; and
Figure 9 illustrates details of a graphical user interface of the system shown in Figures 1 to 8.
Detailed Description of the Drawings
Referring now to the figures and initially Figure 1, there is shown a network environment in which a system according to the invention is embodied.
The environment includes a plurality of data processing terminals 10, 20, 30, 40 operated by respective users 11, 21, 31 and 41. The data processing terminals are connected to a network 50, for instance the World Wide Web or Internet through respective network connection means 12, 22, 32, 42.
Each data processing terminal 10, 20, 30 and 40 includes at least data processing means, specifically a microprocessor connected with data storage means and network interfacing means; user input means, specifically an alphanumerical input device and optionally a pointing device; and display means to facilitate input, output and interaction of the respective user with the data processing terminal.
There is therefore the scope within the network environment shown in Figure 1 for any one networked data processing terminal to broadcast data to and receive data from any other networked data processing terminal.
Referring now to Figure 2, there is shown a block diagram of a state machine embodying the system according to the present invention.
The state machine herein shall be referred to as a "workflow engine", although this denomination is not intended to limit the scope of the present disclosure.
The workflow engine 60 implements a layered data processing methodology, according to which individual data structures, hereinafter referred to as cases, entities and workflows, are processed according to a combination of pre-defined rules, processing thresholds, user-defined sequences and data structure relationships. Accordingly, means are provided for grouping a plurality of data structures, or elements, into a single case structure for processing, for classifying and division of data within the case structure into one or more workflows, and for managing the status of activities of the workflows within the case structure, such that individual data elements are processed simultaneously in parallel by a workflow engine.
A plurality of data processing layers 70 are processed in parallel in any one or all of data processing terminals 10, 20, 30 and 40. Each layer 70 has a layer start point 71, which may be any of a case, entity or workflow data structure, as well as a layer end point 72, which again may be of any of a case, entity or workflow data structure, wherein the end point 72 is representative of a desired data processing outcome according to a combination of pre-defined rules, processing thresholds, user-defined sequences and data structure relationships.
Between the layer start point 71 and the layer end point 72, the layer 70 may include any number of intermediary workflow stages 70n, as required by the data processing necessary to achieve the outcome. Within each workflow stage 70n, including start and end points 71, 72, each point or stage may itself include at least one data processing sub-layer 80.
Whilst each data processing layer 70 is essentially sequential, wherein a next workflow stage 70n is processed only upon receiving the output of a preceding workflow stage 70n-1, a data processing sub-layer 80 may itself comprise a combination of sequential 81 and parallel 82, 83 data processing sub-layer stages 80n, again according to a combination of pre-defined rules, processing thresholds, user-defined sequences and data structure relationships.
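The sequential-layer/parallel-sub-layer processing described above can be sketched as follows. This is an illustrative model only: the stage functions, the list-of-callables convention for a parallel sub-layer, and the summing combiner are all assumptions made for the example:

```python
# Sketch: workflow stages run strictly in sequence, each feeding the next,
# while a sub-layer's stages may run in parallel on the same input.
from concurrent.futures import ThreadPoolExecutor

def run_layer(start_value, stages):
    """stages: a callable is a sequential stage; a list of callables is a
    sub-layer whose members run in parallel on the same input value."""
    value = start_value
    for stage in stages:
        if isinstance(stage, list):                  # parallel sub-layer 82, 83
            with ThreadPoolExecutor() as pool:
                results = list(pool.map(lambda f: f(value), stage))
            value = sum(results)                     # combine the sub-results
        else:                                        # sequential stage 81
            value = stage(value)
    return value

layer = [
    lambda v: v + 1,                      # sequential stage
    [lambda v: v * 2, lambda v: v * 3],   # two parallel sub-layer stages
    lambda v: v - 5,                      # next stage sees the combined output
]
assert run_layer(1, layer) == 5           # (1+1)=2; 2*2 + 2*3 = 10; 10-5 = 5
```

Note how the next sequential stage only runs once all parallel sub-layer stages have produced their outputs, mirroring the rule that a next workflow stage 70n is processed only upon receiving the output of the preceding stage.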
With reference to Figure 3, an architecture of the state machine 60 is shown in the environment of Figure 1, wherein the workflow engine 60 configures data processing terminal for processing data according to a set of rules which will be further detailed herein below, and wherein it is represented as an integral part of an architecture 90, including a plurality of databases both local 100, 110 and remote 120 respectively at data processing terminal 20, 130 respectively at data processing terminal 30, and 140 at data processing terminal 40.
The architecture firstly includes an integrating data processing module 160, which performs the relevant formatting, addressing and translation data processing tasks for routing case, entity and workflow data to and from the local and remote databases. The integrating module 160 therefore includes at least one Application Programmers Interface (API) 161, an Internet interface (WEB) 162 and a server, for example a SQL Server Integration Services (SSIS) server 163.
The integrating module 160 channels data to and from the working module 170, which includes the workflow engine 60 and the pre-defined rules, processing thresholds, user-defined sequences and data structure relationships 170n.
The working module 170 channels processed data to a user interface 180, which will be further detailed herein below and which is output to the display means of the data processing terminal 10, in order to facilitate interaction of the user 11 with the architecture.
Details of data structures processed by the architecture and the workflow engine 60 thereof are shown in Figure 4, which are stored in the plurality of databases 100, 110, 120, 130 and 140. As explained, in the example the databases are maintained at respective, network-connected terminals, and each will be described hereinafter as storing a respective, specific type of data structure for purposes of facilitating comprehension of the system according to the invention. It will be readily understood by the skilled person, however, that the databases may be fewer in number, may all be stored at a same terminal, and/or that the storage of data structure and data structure type per database may be performed according to different rules or logic.
Database 100 stores workflow data structures 200. Each workflow data structure 200 comprises data defining any of a workflow layer 70, a workflow start point 71, a workflow end point 72, a workflow layer stage 70n, a data processing sub-layer 80, a sequential 81 sub-layer stage 80n or a parallel sub-layer stage 80n. Database 100 essentially stores the data structures required to progress the status of activities within the system.
Database 110 stores entity data structures. Each entity data structure comprises data defining any of a legal or individual entity 210, for instance the personal details of an individual, or data defining any further entities associated with one or a plurality of legal or individual entity 210, such as current balances and exposures 211, arrears and recoveries 212, credit applications 213, credit agreements 214 and amendments 215 to same, collaterals 216 and valuations 217 of same and the like. Database 110 essentially stores the most granular data structures within the system, which are subjected to and/or are part of workflow processes, and representative of the classification and division of data within the system.
Database 120 stores case data structures 220. Each case data structure 220 is essentially a register of workflow data structures 200n and entity data structures 210n combined into discrete, project-specific, temporary or permanent relationships, each of which is representative of an entire workflow layer 70 including any sub-layer 80, to achieve a particular outcome, for instance for a legal entity to apply for, negotiate, obtain and fulfil a credit agreement to term, and all of the workflow data structures required to achieve the outcome.
Database 130 receives and stores data feeds and external variables 230 and broadcasts same to the workflow engine for updating databases 100, 110, 120 and 140. Data feeds and external variables 230 comprise any of real-time, deferred or projected financial data streams 231, for instance stock market indexes and libor rates, as well as any further data variables representative of market status and/or risk 232, business status and/or risk 233, real estate status and/or risk 234, commercial or other status and/or risk 235. Database 130 essentially receives a stream of external factor data which may or may not bias the processing of workflow layers 70, and thus forwards it to the workflow engine 60 in connection with same.
Database 140 stores workflow processing thresholds 240, risk factors embodied as pre-defined rules 250, user-or business-defined processing sequences 260 and data structure relationships 270 which, in this database, correspond essentially to a register of workflow data structures 200 combined into discrete, temporary or permanent relationships, each of which is representative of statutory or systemic constraints applicable to any case 220, for instance regulator and compliance targets at a portfolio level, internal portfolio targets, solicitor exposure and the like.
This combination of data structures allows the system to precisely mimic complex real-life business processes, which are typically far more interactive than existing linear workflow engines can handle. For this purpose, user- and/or system-defined relationships exist between most data structures across the databases within the system, examples of which are illustrated in Figure 5.
The example is based on a customer requesting a financial loan, initiating a new case 220 in database 120, which will register and inventory all the associations between workflow and entity data structures as explained below.
The customer entity 210 in database 110 is initially associated with a relevant loan application workflow 200 in database 100, comprising a start "application" point 71 and an end "offer" point 72, wherein the start "application" point 71 is associated with an application processing workflow 200 likewise in database 100, comprising one or a plurality of sub-layer stages 80n.
The application is subjected to the existence and value of a collateral in the first place, whereby the customer's existing collateral 216 and its value 217 in database 110 is associated with the loan application workflow 200 in database 100. This allows the application process to be initiated, and the customer entity 210 to be associated with a credit application structure 213 in database 110, and the credit application structure 213 to be associated with the application processing workflow 200 in database 100.
Upon completion of the application processing workflow 200 by the system, a credit agreement 214 representative of the loan offer in database 110 is associated with the end "offer" point 72 of the loan application workflow 200 in database 100 and physically sent out to the customer according to the customer particulars 210.
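The association register kept by the case in this example can be sketched as a simple list of entity-workflow-point triples. The identifier strings below are illustrative labels built from the reference numerals in the text, not an actual storage format prescribed by the patent:

```python
# Sketch: a case registers which entity is linked to which workflow,
# and at which point of that workflow (Figure 5).
case_register = []   # list of (entity, workflow, point) association triples

def associate(entity, workflow, point):
    case_register.append((entity, workflow, point))

# Mirror the loan-application example above.
associate("customer:210",   "loan_application:200",        "start:application")
associate("collateral:216", "loan_application:200",        "start:application")
associate("credit_app:213", "application_processing:200",  "sub-layer")
associate("agreement:214",  "loan_application:200",        "end:offer")

# Query: everything attached to the loan application workflow.
linked = [e for e, w, p in case_register if w == "loan_application:200"]
assert linked == ["customer:210", "collateral:216", "agreement:214"]
```

Because the register is case-level, the same query can be asked the other way round (all workflows attached to one entity), which is what lets an entity hold several independent workflows at once.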
The processing steps according to which the workflow engine processes case, entity and workflow data elements are detailed in Figure 6.
At step 601, the data processing terminal 10 is switched on by the user 11. At step 602, instructions configuring the data processing terminal 10 to embody substantially the architecture 90, including connectivity to remote databases 120, 130 and 140, are loaded in the terminal storage means.
The loading step 602 may be performed by reading instructions from a local medium or by obtaining instructions from a remote terminal across the network 50.
Upon completion of the loading step 602, the workflow engine is initialised and the user interface 180 is output to the display means at step 603, at which stage the workflow engine may process case, entity or workflow data according to local or remote input.
A first question is asked at step 604, as to whether local user input has been received. If the question is answered positively, the workflow engine 60 next checks whether the input has been received in connection with an existing case, entity or workflow data structure at step 605. If the check is negative, then the workflow engine creates a corresponding case, entity or workflow data structure at step 606, according to the input received at step 604.
Alternatively, if the question of step 604 is answered negatively, then a further question is asked at step 607, as to whether the integrating module 160 has received remote data input. If the question of step 605 is answered positively, then the data processing flow proceeds to answer question 607 negatively, whereby the workflow engine checks whether the input has been received in connection with an existing case, entity or workflow data structure relationship at step 608.
If the check is negative, then the workflow engine defines a corresponding case, entity or workflow data structure relationship at step 609, according to the input received at step 604. If the check is positive, however, then the local user input is read at the next step 610, which may for instance relate to a data update in connection with an existing case, entity, workflow, pre-defined rule, processing threshold, user-defined sequence or data structure relationship.
At step 611, the input data is processed by the workflow engine 60 in order to update the case, entity or workflow data structure. The input data is either the local input data read at step 610, or remote input data received by terminal 10 across network 50, which causes the data processing flow to answer question 607 positively.
At step 612, the updated case, entity or workflow data structure is processed by the workflow engine 60 in order to update the case, entity or workflow data structure relationships. At the next step 613, the user interface 180 is updated, so that the user 11 may either verify that the correct data has been input, or that the new case, entity or workflow data structure has been created, or that the new case, entity or workflow data structure relationship has been defined, and remain informed of any external factors embodied by the remote data of step 607.
At step 614, a further question is asked as to whether further input is required or has been received which, if answered positively, returns the data processing flow to the question of step 604. Alternatively, the question of step 614 is answered negatively, whereby a last question is asked at step 615, as to whether the workflow engine 60 and, generally, the architecture 90 should now be interrupted.
If the question of step 615 is answered negatively, then again control returns to the question of step 604.
Alternatively, the question of step 615 is answered positively, whereby instructions are unloaded from the storage means at step 616 and the terminal may be stopped or switched off at step 617.
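The control flow of steps 604 to 617 may be sketched as a simple input-processing loop. The sketch below is illustrative only: the class, method and field names are hypothetical placeholders for the behaviour described in the text, not an implementation of the claimed system.

```python
# Minimal, illustrative sketch of the input-processing loop of steps 604 to 617.
# All names are hypothetical; the real system's data structures are not specified here.

class WorkflowEngine:
    def __init__(self, inputs):
        self.inputs = list(inputs)    # queued (kind, key, payload) tuples (steps 604/607)
        self.relationships = {}       # key -> related data structures (steps 608/609)
        self.structures = {}          # case, entity and workflow data structures
        self.ui_updates = 0           # count of user interface refreshes (step 613)

    def run(self):
        while self.inputs:                            # steps 604/607: input received?
            kind, key, value = self.inputs.pop(0)     # step 610: read the input
            if key not in self.relationships:         # step 608: existing relationship?
                self.relationships[key] = []          # step 609: define a new relationship
            self.structures[key] = value              # step 611: update the data structure
            for related in self.relationships[key]:   # step 612: propagate to relations
                self.structures[related] = value
            self.ui_updates += 1                      # step 613: refresh the user interface
        # step 614 is answered negatively once the queue is empty; step 615
        # (interrupting the engine) is left to the caller in this sketch.
```

For example, two successive inputs for the same case would define the relationship once and refresh the interface twice, with the later input overwriting the earlier value.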
The data processing system resulting from recursively performing steps 604 to 615, in relation to every workflow stage 70n and data processing sub-layer 80n, embodies a combination of tried and tested business techniques of: (i) Case Management - the grouping of data elements into a single case for processing; (ii) Entity Management - the classification and division of data within a system; and (iii) Workflow Management - managing the status of activities within the system.
The data processing system of the present invention is such that not only can the case have a workflow 70n but also every entity within the case may simultaneously have their own workflow 70n. The impact of this in system terms is significant in how it opens up business possibilities.
The workflow engine 60 is capable of managing these multiple parallel workflows 70n, 80n with ease, allowing banks to establish a holistic perspective of their relationships with the customer and ensuring that they remain compliant with increasing regulatory and risk management demands.
While the above example is relatively simple, the complexity that occurs in commercial and corporate banking is significantly greater, and the corresponding benefits of the system grow accordingly. Each individual entity within a workflow stage 70n can contain additional independent sub-workflows 80n in an associated one-to-many fashion. The workflow engine 60 allows end-to-end collateral management and loan provisioning processes to have fully automated tasks created and scheduled, with items being directed towards the required and configurable user skill sets in a fully auditable system.
The case, entity or workflow data structure updating step 611 is further detailed in Figure 7.
At step 701, a first question is asked as to whether the input data corresponds to a workflow data structure update.
If the question of step 701 is answered positively, then at step 702, the relevant workflow data structure is fetched by the workflow engine 60 from the relevant database 100 with a corresponding call to the integrating module 160.
Alternatively, the question of step 701 is answered negatively, whereby a further question is asked at step 703, as to whether the input data corresponds to an entity data structure update. If the question of step 703 is answered positively, then at step 704, the relevant entity data structure is fetched by the workflow engine 60 from the relevant database 110 with a corresponding call to the integrating module 160.
Alternatively, the question of step 703 is answered negatively, whereby a further question is asked at step 705, as to whether the input data corresponds to a case data structure update. If the question of step 705 is answered positively, then at step 706, the relevant case data structure is fetched by the workflow engine 60 from the relevant database 120 with a corresponding call to the integrating module 160.
Alternatively, the question of step 705 is answered negatively, signifying that the local or remote data is not applicable to an existing or new case, entity or workflow data structure and requires either further user input or an automatic update according to a predefined rule, whereby an alert is triggered at step 707.
If a workflow, entity or case data structure is fetched according to steps 702, 704 or 706 respectively, then at step 708 the workflow engine 60, specifically the layer 70 or sub-layer 80, processes the input data.
The updating of the relevant database 100, 110 or 120 at step 712 further to the processing of step 708 is dependent upon checking compliance of the data update against at least one, but preferably a plurality of risk factors embodied as pre-defined rules and/or processing thresholds stored in database 140.
Thus, at step 709 at least one or the first of a plurality of risk factors is selected by the workflow engine 60 and the output of step 708 is compared against same for compliance at step 710. If the compliance check of step 710 is positive, then the workflow engine 60 next checks at step 711, whether another risk factor needs to be considered and control returns to step 709 in the affirmative, the data processing flow looping through steps 709 to 711 until such time as all risk factors associated with the workflow, entity or case data updates have been considered.
If the compliance check of step 710 is negative however, signifying that the case, entity or workflow data update has failed a risk threshold and requires either further user input or an automatic update according to a predefined rule, an alert is again triggered at step 707.
When the last risk factor has been considered and the data update compliance checked, then the question of step 711 is answered negatively and the relevant database 100, 110 or 120 may then be updated at step 712.
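Taken together, steps 708 to 712 amount to a guarded write: the update reaches the database only after every applicable risk factor passes. A minimal sketch, with hypothetical names, assuming each risk factor is expressed as a predicate function:

```python
# Illustrative sketch of the compliance loop of steps 709 to 712.
# Function and parameter names are assumptions for the example only.

def apply_update(update_value, risk_factors, database, key):
    """Check the update against every risk factor before writing it."""
    for check in risk_factors:            # step 709: select the next risk factor
        if not check(update_value):       # step 710: compliance check
            return "alert"                # step 707: threshold failed, trigger alert
    database[key] = update_value          # step 712: all checks passed, update database
    return "updated"
```

For instance, a balance update could be checked against an upper exposure limit and a lower bound before being committed.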
The case, entity or workflow data structure relationship updating step 612 is further detailed in Figure 8.
After updating the relevant data structure itself, and with reference to the inter-relationships upon which the workflow engine 60 relies to achieve the main benefit of the present invention, the workflow engine preferably propagates the effect of the data update across any data structures having an existing relationship with the data structure last updated at step 611.
Accordingly, at step 801 the related data structures are aggregated by the workflow engine 60 from the relevant database 100 with a corresponding call to the integrating module 160. At the next step 802, the relationship map linking the related data structures is parsed by the workflow engine 60 and a question is asked at step 803, as to whether the data update of step 611 corresponds to a new case, entity or workflow relationship defined at step 609.
In the affirmative, the relationship map is updated at step 804 with particulars of the new relationship, then the data update is propagated at step 805 to all related data structures according to the map. If the question of step 803 is answered negatively however, then there is no requirement to update the relationship map and control proceeds directly to the propagating of step 805.
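The relationship handling of steps 801 to 805 can be sketched as follows; the map and record layouts are assumptions made for illustration only, not the claimed data structures:

```python
# Illustrative sketch of steps 801 to 805: parse the relationship map, extend
# it for a newly defined relationship, and propagate the update to all relations.
# The dict-of-sets map layout and field names are invented for this example.

def propagate_update(structures, relationship_map, updated_key, new_relation=None):
    if new_relation is not None:                       # step 803: new relationship?
        relationship_map.setdefault(updated_key, set()).add(new_relation)  # step 804
    for related_key in relationship_map.get(updated_key, set()):  # steps 801/802
        # step 805: propagate the update to each related data structure
        structures[related_key]["last_update_from"] = updated_key
    return structures
```

A case update would thus be visible from every entity and workflow structure mapped to that case, whether the relationship pre-existed or was just defined.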
The workflow manager 60, when recursively processing steps 604 to 615, and by extension recursively processing steps 701 to 712 and 801 to 805 for every case, workflow and entity data element, monitors a customer's financial position by tracking the following key variables in real time:
* Current Balances, Exposures, Arrears and Recoveries
* Credit Applications or Amendments to existing credit agreements
* Collateral Valuations (based on internal and external data feeds)
* Regulator and Compliance Targets at a Portfolio Level
* Internal Portfolio Targets
* Solicitor Exposure
The workflow manager 60 effectively tracks all of the above inputs simultaneously and determines a customer's overall health, expressed in a data structure conforming to pre-defined processing parameters and thresholds. In particular, the system triggers alerts 707 if any thresholds have been breached. By extension, this granular level of control also allows the financial institution to track the overall portfolio position and ensure ongoing regulator compliance.
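As a minimal sketch of the threshold monitoring described above, a health check might compare each tracked variable against its pre-defined threshold and return the breaches that would trigger an alert 707. The variable names, threshold values and data layout below are invented for the example:

```python
# Illustrative health check: compare tracked variables against pre-defined
# thresholds (database 140) and report the breaches that would trigger alerts.
# An empty result means the customer position is compliant.

def customer_health(variables, thresholds):
    """Return the names of all tracked variables exceeding their threshold."""
    return [name for name, value in variables.items()
            if name in thresholds and value > thresholds[name]]
```

The same check applied at the portfolio level, over aggregated variables, would support the regulator-compliance tracking described above.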
A functional illustration of a graphical user interface for the above system is shown in Figure 9, which facilitates user interaction with the complex set of relationships detailed hereinbefore. The graphical user interface 180 is preferably embodied as a set of instructions configuring a browser program 901, to ensure interoperability and platform-agnosticism.
The instructions preferably comprise a security API 902 to ensure that the data exchanged over the network between the terminal processing the instructions and remote terminals, which is sensitive and confidential by nature, cannot be easily intercepted. The instructions next comprise a database API 903, which corresponds functionally to the integrating data processing module 160 previously described.
User-interactive features of the user interface 180 include a plurality of fast-access modules, briefly comprising a data entry module 904, a calculator module 905, a task list module 906, a reporting module 907 and a dashboard module 908.
Upon activation, the data entry module 904 allows the user to input and update case, workflow and entity data structures at step 604, which will be propagated as previously described.
The calculator module 905 allows the user to perform conventional arithmetic calculus, however it may usefully be configured to link operations with the case, entity or workflow currently selected for input or consultation, and integrate constraints and thresholds 240, 250, 260 in order to generate alerts 707.
The task list module 906 allows the user to consult workflows associated with cases, in aggregate or granular manner, for instance to manage overall caseload.
The reporting module 907 allows the user to query any or all of databases 100 to 140 for data stored therein according to reporting parameters, from a per data-structure basis querying the data at its most granular level, to a total database contents basis querying the data at the most holistic level permissible.
The dashboard module 908 is preferably user-configurable on a discrete basis, and allows each distinct user of the system to select cases, workflows, entities, tasks and reports of current and/or periodical interest, in a summary and synthesised manner.
With this system, as well as allowing the user to complete a full data capture from start to finish as relevant data structures 200 to 260, each stage of the process can be subjected to a unique set of rules to calculate and score an individual based on the data capture, whereby the determination of whether the individual is eligible for the loan may be made on request. This calculation is based on the predefined intelligent web of relationships between the predefined entities.
For example, in order to calculate whether a particular deal should be approved, the decision is based on the mapping of relationships between the main entities for that business process, including security, exposure, financials, facilities, history, arrears and credit score.
A useful part of this system is also the ability to bulk-update valuations 217 based on a set of parameters. For example, an update could be issued to devalue all agricultural securities in a geographical location by a set percentage, based on an external factor 230 to 235. This automated update can then interrupt the ongoing workflow layers 70n and/or sub-layers 80n, to alert 707 the user that a particular threshold 240, 250 is no longer being met, and that action should be taken to decline the application.
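A bulk devaluation of the kind described above might be sketched as follows, assuming a flat list of security records; the field names and percentage convention are illustrative only:

```python
# Illustrative sketch of the bulk-update of valuations 217: devalue all
# securities matching the given parameters by a set percentage, and collect
# alerts 707 for any that fall below their threshold 240, 250.
# The record layout is an assumption for the example.

def bulk_devalue(securities, sec_type, location, percentage, threshold):
    alerts = []
    for sec in securities:
        if sec["type"] == sec_type and sec["location"] == location:
            sec["valuation"] *= (1 - percentage / 100.0)  # apply the devaluation
            if sec["valuation"] < threshold:              # threshold no longer met
                alerts.append(sec["id"])                  # trigger alert 707
    return alerts
```

Devaluing all agricultural securities in one region by 20%, for instance, would return the identifiers of any whose valuation drops below the configured threshold, prompting the user to act on the affected applications.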
The embodiments in the invention described with reference to the drawings comprise a computer apparatus and/or processes performed in a computer apparatus. However, the invention also extends to computer programs, particularly computer programs stored on or in a carrier adapted to bring the invention into practice. The program may be in the form of source code, object code, or a code intermediate source and object code, such as in partially compiled form or in any other form suitable for use in the implementation of the method according to the invention.
The carrier may comprise a storage medium such as ROM, e.g. CD ROM, or magnetic recording medium, e.g. a floppy disk or hard disk. The carrier may be an electrical or optical signal which may be transmitted via an electrical or an optical cable or by radio or other means.
In the specification the terms "comprise, comprises, comprised and comprising" or any variation thereof and the terms "include, includes, included and including" or any variation thereof are considered to be totally interchangeable and they should all be afforded the widest possible interpretation and vice versa.
The invention is not limited to the embodiments hereinbefore described but may be varied in both construction and detail.
Claims (14)
- Claims 1. A data processing system comprising: a plurality of data processing terminals operated by respective users, said data processing terminals connected to a network; means for grouping a plurality of data elements into a single case data structure for processing; means for classifying and dividing said data elements within the case structure into one or more workflows; and means for managing the status of activities of the workflows within the case structure such that individual data elements are processed simultaneously in parallel by a workflow engine, said workflow engine comprising a plurality of data processing layers adapted to be processed in parallel in any one or all of said data processing terminals, each layer comprising a layer start point, which may be any of a case, entity or workflow data structure, as well as a layer end point, which may be any of a case, entity or workflow data structure, wherein the end point is representative of a desired data processing outcome according to a combination of pre-defined criteria.
- 2. The data processing system as claimed in claim 1 wherein each data processing layer stores a workflow data structure, each workflow data structure comprising data defining any of a workflow layer, a workflow start point, a workflow end point, a workflow layer stage, a data processing sub-layer, a sequential sub-layer stage and a parallel sub-layer stage.
- 3. The data processing system as claimed in claim 2 wherein each case data structure comprises a register of workflow data structures and entity data structures combined into discrete, project-specific, temporary or permanent relationships, each of which is representative of an entire workflow layer including any sub-layer.
- 4. The data processing system as claimed in any preceding claim comprising means for selecting at least one or a plurality of risk factors by the workflow engine, wherein the output is compared against same for compliance; if the compliance check is positive, the workflow engine is adapted to check whether another risk factor needs to be considered, such that the data processing flow loops until such time as all risk factors associated with the workflow, entity or case data updates have been processed.
- 5. The data processing system as claimed in any preceding claim comprising an integrating data processing module adapted to perform formatting, addressing and translation data processing tasks for routing case, entity and workflow data to and from local and remote databases.
- 6. The data processing system as claimed in claim 5 wherein said integrating module channels data to and from a working module, which includes the workflow engine and the pre-defined rules, processing thresholds, user-defined sequences and data structure relationships.
- 7. The data processing system as claimed in any preceding claim wherein the system receives and stores data feeds and external variables and broadcasts same to the workflow engine for updating said database.
- 8. The data processing system as claimed in claim 7 wherein said data feeds and external variables comprise any of real-time, deferred or projected financial data streams, data variables representative of market status and/or risk, business status and/or risk, real estate status and/or risk, commercial or other status and/or risk.
- 9. The data processing system as claimed in any preceding claim wherein each data layer stores workflow processing thresholds, risk factors embodied as pre-defined rules, user-defined processing sequences and data structure relationships which correspond essentially to a register of workflow data structures combined into discrete, temporary or permanent relationships, each of which is representative of systemic constraints applicable to any case.
- 10. The data processing system as claimed in any preceding claim comprising a calculator module adapted with means to perform arithmetic calculus, and configured to link operations with the case, entity or workflow currently selected for input or consultation, and integrate constraints and thresholds in order to generate alerts.
- 11. The data processing system as claimed in any preceding claim comprising a task list module adapted with means to allow a user to consult workflows associated with cases, in aggregate or granular manner, to manage overall caseload.
- 12. The data processing system as claimed in any preceding claim comprising a bulk-update valuation module based on a set of parameters.
- 13. The data processing system as claimed in any preceding claim comprising an automated update module adapted to interrupt ongoing workflow layers and/or sub-layers, to alert a user that a particular threshold is no longer being met, and means for stopping said processing.
- 14. A data processing system as substantially hereinbefore described with reference to the accompanying description and/or drawings.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1011428.8A GB2481820A (en) | 2010-07-07 | 2010-07-07 | Parallel workflow engine for classified data elements in workflow |
Publications (2)
Publication Number | Publication Date |
---|---|
GB201011428D0 GB201011428D0 (en) | 2010-08-25 |
GB2481820A true GB2481820A (en) | 2012-01-11 |
Family
ID=42712033
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1011428.8A Withdrawn GB2481820A (en) | 2010-07-07 | 2010-07-07 | Parallel workflow engine for classified data elements in workflow |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2481820A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10235105B1 (en) | 2018-02-27 | 2019-03-19 | Ricoh Company, Ltd. | Microservice architecture workflow management mechanism |
CN109976725A (en) * | 2019-03-20 | 2019-07-05 | 中信梧桐港供应链管理有限公司 | A kind of process program development approach and device based on lightweight flow engine |
US10650374B1 (en) | 2015-10-22 | 2020-05-12 | Amdocs Development Limited | System, method, and computer program for implementing high performance digital wallets |
Non-Patent Citations (1)
Title |
---|
None * |
Also Published As
Publication number | Publication date |
---|---|
GB201011428D0 (en) | 2010-08-25 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |