IE20100417U1 - Data processing system and method - Google Patents

Data processing system and method

Info

Publication number
IE20100417U1
Authority
IE
Ireland
Prior art keywords
data
workflow
data processing
layer
case
Application number
IE2010/0417A
Other versions
IES85685Y1 (en)
Inventor
Purdy John
Murphy Marc
Kenny Mark
Original Assignee
Fenergo Ip Ltd
Application filed by Fenergo Ip Ltd
Publication of IES85685Y1
Publication of IE20100417U1


Abstract

The invention provides a data processing system comprising means for grouping a plurality of data elements into a single case structure for processing; means for classifying and division of data within the case structure into one or more workflows; and means for managing the status of activities of the workflows within the case structure such that individual data elements are processed simultaneously in parallel by a workflow engine.

Description

Field of the Invention

The invention relates to a data processing system and method. In particular, the invention relates to a data processing system architecture for managing multiple data processes.

Background to the Invention

Traditionally, banking systems have been built around case-based workflow or entity-based workflow. To illustrate the point, a simplified scenario can be created around a customer who applies for a top-up home loan to assist with repayment difficulties. Typically, the bank will open a "Home Loan Case" with a view to approving or declining the loan by assessing the level of risk involved in facilitating the loan. Details of the customer, the facility type, the facility amount requested, details of the property and other collateral details are entered into this case. The case will then progress through the various stages of data capture, approval, documentation and post-loan processing.
However, a problem with these types of systems is that in reality the situation tends to be complicated and difficult to implement. For example: the customer may require processing through an Anti-Money Laundering workflow; the customer accounts may be going through an arrears management workflow; the security will go through a Security Perfection workflow; and the title deeds of a property may go through an Account Trust Receivable (ATR) workflow.
Prior art examples of such early warning risk indicator systems are described e.g. in US 6,202,053 B1, US 6,311,169 B2, and the Journal of Commercial Lending, June 1995, pages to 16, "How the RMA/Fair, Isaac credit-scoring model was built" by Latimer Asch.
US 6,202,053 B1 describes how, to assess the credit risk of an individual, a financial institution will develop a score for each credit applicant based on certain information. The applicant receives points for each item of information analysed by the financial institution. The amount of points awarded for each item, the items actually analysed, and the score necessary for approval may vary. This score is used to evaluate the risk involved in performing a certain transaction. In other words, the decision to approve or deny an applicant's request for e.g. a bank card or another type of transaction is based on a scoring system. The scoring system used to evaluate each applicant and the minimum score required for approval was applied uniformly by a financial institution to all its applicants. The use of such a scoring system for evaluating a risk involved with a transaction is rather superficial and could be made more secure by either monitoring the financial behaviour of the approved client after approval or increasing the score required for approval. The first alternative would require an increase in costs and efforts, whereas the second alternative might lead to unnecessarily declining a large number of clients.
Therefore, US 6,202,053 B1 proposes building a segmentation tree, developing a client score card for each segment, grouping clients into sub-populations corresponding to each segment, and applying the client score card to the applicants within the corresponding segment. Using an automated system to implement the generation of the client score cards and the scoring of applications further lowers the costs and effort of assessing a risk involved with a transaction.
The general background of the RMA/Fair, Isaac credit-scoring model is described in the above-mentioned article by Latimer Asch. The model is suitable for e.g. reducing the time spent processing small business loan applications using an automated solution which is based on a pooled-data scorecard. A scorecard is a tool used to calculate the risk associated with a credit application. It calculates credit risk based on multiple items of information called characteristics. Characteristics can come from several sources, including the credit application and consumer and business credit reports. Each characteristic is divided into two or more possible responses known as attributes. A numerical score is associated with each attribute, so for any credit application, the numerical attribute values for characteristics can be added together to provide a total score. Scoring, in principle, uses the same data a loan officer uses in his or her judgmental, or non-scoring, decision process. But scoring is faster, more objective, and more consistent. With the current regulatory pressure to provide more small business loans, prospective lenders need efficient, time-saving, cost-cutting tools. With credit-scoring, a lender can increase the number of approved applications without increasing risk, time, or other resources.
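The scorecard arithmetic described above can be illustrated with a short sketch. The characteristics, attributes and point values below are invented for the example and are not taken from the RMA/Fair, Isaac model:

```python
# Each characteristic maps possible responses (attributes) to points;
# the total score is the sum of the matched attribute scores.
# All names and values here are illustrative assumptions only.

scorecard = {
    "years_in_business": {"<2": 5, "2-5": 15, ">5": 25},
    "credit_history":    {"poor": 0, "fair": 10, "good": 20},
    "annual_revenue":    {"<100k": 5, "100k-1m": 15, ">1m": 25},
}

def total_score(application, scorecard):
    # Sum the score of the matching attribute for each characteristic.
    return sum(scorecard[ch][attr] for ch, attr in application.items())

application = {"years_in_business": "2-5",
               "credit_history": "good",
               "annual_revenue": "100k-1m"}

score = total_score(application, scorecard)
print(score)         # 50
print(score >= 40)   # True: approve at a hypothetical cutoff of 40
```

The same additive principle holds however many characteristics a real pooled-data scorecard uses; only the attribute tables and the cutoff change.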
The above-described scoring system, although performed automatically in a data processing system and able to handle a relatively large amount of data, has proved to be not very precise and cannot determine risks in real time. Some of the problems of the known scoring systems are that they are not flexible, they cannot take into account historical data, they are limited in the type of information which is taken into account, and they have limited reporting possibilities. Traditional systems, as described above, are limited in how they can cater for the level of complexity required. The flaws and problems with existing systems have been exposed by the recent financial fallout with banking institutions, especially in Ireland, where banking systems and protocols have clearly failed.
There is therefore a need to provide a data processing system to overcome the above mentioned problems.
Summary of the Invention

According to the invention there is provided, as set out in the appended claims, a data processing system comprising: a plurality of data processing terminals operated by respective users, said data processing terminals connected to a network; means for grouping a plurality of data elements into a single case data structure for processing; means for classifying and division of said data elements within the case structure into one or more workflows; and means for managing the status of activities of the workflows within the case structure such that individual data elements are processed simultaneously in parallel by a workflow engine, said workflow engine comprising a plurality of data processing layers adapted to be processed in parallel in any one or all of the data processing terminals, each layer comprising a layer start point, which may be any of a case, entity or workflow data structure, as well as a layer end point, which may be any of a case, entity or workflow data structure, wherein the end point is representative of a desired data processing outcome according to a combination of pre-defined criteria.
A key benefit of this engine is that it allows individual system entities (e.g. Customer, Account, Document, Security etc) to have their own independent workflows while simultaneously being part of a wider case based workflow.
This specific feature allows the system to precisely mimic complex real life processes, which are typically far more interactive than existing linear workflow engines can handle.
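As a rough illustration of this structure, a case can hold its own workflow while each entity inside it carries an independent one. The class names, entity kinds and stage names below are hypothetical, chosen for the example rather than taken from the specification:

```python
from dataclasses import dataclass, field

@dataclass
class Workflow:
    name: str
    stages: list
    current: int = 0

    def advance(self):
        # Move to the next stage if one remains.
        if self.current < len(self.stages) - 1:
            self.current += 1

    @property
    def stage(self):
        return self.stages[self.current]

@dataclass
class Entity:
    kind: str            # e.g. "Customer", "Account", "Security"
    workflow: Workflow   # each entity carries its own workflow

@dataclass
class Case:
    workflow: Workflow   # the wider case-based workflow
    entities: list = field(default_factory=list)

# A home-loan case whose entities progress independently of the case itself.
case = Case(Workflow("HomeLoan", ["capture", "approval", "documentation"]))
case.entities.append(Entity("Customer", Workflow("AML", ["screen", "review", "clear"])))
case.entities.append(Entity("Security", Workflow("SecurityPerfection", ["lodge", "perfect"])))

case.entities[0].workflow.advance()     # the AML workflow moves on...
print(case.workflow.stage)              # capture (...while the case stays put)
print(case.entities[0].workflow.stage)  # review
```

Advancing one entity's workflow leaves the case workflow and the other entities untouched, which is the independence the paragraph above describes.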
Following that, advantageously, according to the present invention, there is provided an improved system for provision of data over a network. In particular, the method is carried out within an existing system management environment, preferably running on a computer/network system of a business. The computer system may be or include an existing, running computer system. The system may be a custom system. Hence, the subject-matter of the present invention allows data to be exchanged in a more efficient manner between different computers, such as different entities, workstations, personal computers, etc. Further to that, the present invention allows, in a more efficient manner, visualization, i.e. the display of data on e.g. a remote computer, such as a client computer, a client terminal, etc. Therefore, the system according to the present invention, among other things, aims at providing a method which is adapted to serve, assist or replace activities of different kinds, such as selection of data and provision of data, and particularly allows a more efficient way of display of data. Further to that, even more advantageously, the system according to the present invention allows a user to perform his object in a more efficient manner, since the user is preferably automatically provided with data. In particular, the provided data is pre-selected, preferably according to the business context, even more particularly according to the attributes/attribute data of the user. Thereby, the selection of data provided for the user is decreased, which means that the selection process is more efficient. Accordingly, the Graphical User Interface (GUI) is preferably improved by displaying e.g. only data relevant for the user. In particular, preferably unnecessary data is not displayed, thus keeping the GUI as simple as possible.
Thereby, the visual presentation and design of the GUI is improved. Also the data can be appreciated by the user much better, since preferably only essential data is displayed.
Thereby, it is easier for the user to view the data.
In one embodiment each data processing layer stores a workflow data structure, each workflow data structure comprising data defining any of a workflow layer, a workflow start point, a workflow end point, a workflow layer stage, a data processing sub-layer, a sequential sub-layer stage and a parallel sub-layer stage.
In one embodiment each case data structure comprises a register of workflow data structures and entity data structures combined into discrete, project—specific, temporary or permanent relationships, each of which is representative of an entire workflow layer including any sub—layer.
In one embodiment there is provided means for selecting at least one or a plurality of risk factors by the workflow engine, and the output is compared against same for compliance; if the compliance check is positive, then the workflow engine is adapted to check whether another risk factor needs to be considered, such that the data processing flow loops until such time as all risk factors associated with the workflow, entity or case data updates have been processed.
In one embodiment there is provided an integrating data processing module adapted to perform formatting, addressing and translation data processing tasks for routing case, entity and workflow data to and from local and remote databases.
In one embodiment said integrating module channels data to and from a working module, which includes the workflow engine and the pre-defined rules, processing thresholds, user-defined sequences and data structure relationships.
In one embodiment the system receives and stores data feeds and external variables and broadcasts same to the workflow engine for updating said database.
In one embodiment said data feeds and external variables comprise any of real—time, deferred or projected financial data streams, data variables representative of market status and/or risk, business status and/or risk, real estate status and/or risk, commercial or other status and/or risk.
In one embodiment each data layer stores workflow processing thresholds, risk factors embodied as pre—defined rules, user defined processing sequences and data structure relationships which correspond essentially to a register of workflow data structures combined into discrete, temporary or permanent relationships, each of which is representative of systemic constraints applicable to any case.
In one embodiment there is provided a calculator module adapted with means to perform arithmetic calculus, and configured to link operations with the case, entity or workflow currently selected for input or consultation, and to integrate constraints and thresholds in order to generate alerts.
In one embodiment there is provided a task list module adapted with means to allow a user to consult workflows associated with cases, in aggregate or granular manner, to manage overall caseload.
In one embodiment there is provided a bulk-update valuation module based on a set of parameters.

In one embodiment there is provided an automated update module adapted to interrupt ongoing workflow layers and/or sub-layers, to alert a user that a particular threshold is no longer being met, and that action should be taken to stop the process.
There is also provided a computer program comprising program instructions for causing a computer to carry out the above method, which may be embodied on a record medium, carrier signal or read-only memory.
Brief Description of the Drawings

The invention will be more clearly understood from the following description of an embodiment thereof, given by way of example only, with reference to the accompanying drawings, in which:

Figure 1 illustrates a network environment in which a system according to the invention is embodied;

Figure 2 illustrates an embodiment of a state machine according to the present invention;

Figure 3 illustrates an architecture of the state machine shown in Figure 2 in an environment as shown in Figure 1;

Figure 4 illustrates details of data structures processed by the architecture shown in Figure 3;

Figure 5 illustrates relationships between data structures shown in Figure 4;

Figure 6 details data processing steps of the system shown in Figures 1 to 5, including steps of updating data structures and data structure relationships;

Figure 7 further details the data structure updating step of Figure 6, including a step of raising an alert;

Figure 8 further details the data structure relationship updating step of Figure 6, including a step of receiving an alert; and

Figure 9 illustrates details of a graphical user interface of the system shown in Figures 1 to 8.
Detailed Description of the Drawings

Referring now to the figures and initially Figure 1, there is shown a network environment in which a system according to the invention is embodied. The environment includes a plurality of data processing terminals 10, 20, 30 and 40 operated by respective users 11, 21, 31 and 41. The data processing terminals are connected to a network 50, for instance the World Wide Web or Internet, through respective network connection means 12, 22, 32, 42.
Each data processing terminal 10, 20, 30 and 40 includes at least data processing means, specifically a microprocessor connected with data storage means and network interfacing means; user input means, specifically an alphanumerical input device and optionally a pointing device; and display means to facilitate input, output and interaction of the respective user with the data processing terminal.
There is therefore the scope within the network environment shown in Figure 1 for any one networked data processing terminal to broadcast data to and receive data from any other networked data processing terminal.

Referring now to Figure 2, there is shown a block diagram of a state machine embodying the system according to the present invention.
The state machine herein shall be referred to as a “workflow engine”, although this denomination is not intended to limit the scope of the present disclosure.
The workflow engine 60 implements a layered data processing methodology according to which individual data structures, hereinafter referred to as cases, entities and workflows, are processed according to a combination of pre-defined rules, processing thresholds, user-defined sequences and data structure relationships. Accordingly, means are provided for grouping a plurality of data structures, or elements, into a single case structure for processing, for classifying and division of data within the case structure into one or more workflows, and for managing the status of activities of the workflows within the case structure, such that individual data elements are processed simultaneously in parallel by a workflow engine.
A plurality of data processing layers 70 are processed in parallel in any one or all of data processing terminals 10, 20, 30 and 40. Each layer 70 has a layer start point 71, which may be any of a case, entity or workflow data structure, as well as a layer end point 72, which again may be any of a case, entity or workflow data structure, wherein the end point 72 is representative of a desired data processing outcome according to a combination of pre-defined rules, processing thresholds, user-defined sequences and data structure relationships.

Between the layer start point 71 and the layer end point 72, the layer 70 may include any number of intermediary workflow stages 70n, as required by the data processing necessary to achieve the outcome. Within each workflow stage 70n, including start and end points 71, 72, each point or stage may itself include at least one data processing sub-layer 80.

Whilst each data processing layer 70 is essentially sequential, wherein a next workflow stage 70n is processed only upon receiving the output of a preceding workflow stage 70n-1, a data processing sub-layer 80 may itself comprise a combination of sequential 81 and parallel 82, 83 data processing sub-layer stages 80n, again according to a combination of pre-defined rules, processing thresholds, user-defined sequences and data structure relationships.
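The idea of a sequential layer whose stages may fan out into parallel sub-layer stages might be sketched as follows. The stage functions and the final combining criterion are illustrative assumptions only, not part of the specification:

```python
from concurrent.futures import ThreadPoolExecutor

def run_layer(start_point, stages):
    """Run a layer: stages execute in order, each receiving the previous
    stage's output; a stage given as a list is a parallel sub-layer whose
    sub-stages all receive the same input and whose results are gathered."""
    data = start_point
    for stage in stages:
        if isinstance(stage, list):
            # Parallel sub-layer stages: fan out, then gather the results.
            with ThreadPoolExecutor() as pool:
                data = list(pool.map(lambda f: f(data), stage))
        else:
            # Sequential stage: runs only on the preceding stage's output.
            data = stage(data)
    return data  # the layer end point: the desired processing outcome

result = run_layer(
    {"amount": 100_000},                             # layer start point
    [
        lambda d: {**d, "captured": True},           # sequential stage
        [                                            # parallel sub-stages
            lambda d: ("credit_check", d["amount"] < 250_000),
            lambda d: ("aml_check", d["captured"]),
        ],
        lambda checks: all(ok for _, ok in checks),  # combine criteria
    ],
)
print(result)  # True: both parallel checks passed
```

The sequential outer loop only advances once a stage's output is available, while the list-valued stage shows sub-layer stages running side by side on the same input, mirroring the layer 70 / sub-layer 80 split described above.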
With reference to Figure 3, an architecture of the state machine 60 is shown in the environment of Figure 1, wherein the workflow engine 60 configures data processing terminal 10 for processing data according to a set of rules which will be further detailed herein below, and wherein it is represented as an integral part of an architecture 90, including a plurality of databases, both local 100, 110 and remote 120, 130 and 140, respectively at data processing terminal 20, at data processing terminal 30 and at data processing terminal 40.
The architecture firstly includes an integrating data processing module 160, which performs relevant formatting, addressing and translation data processing tasks for routing case, entity and workflow data to and from the local and remote databases. The services 160 therefore include at least one Application Programmers Interface (API) 161, for example a SQL Server Integration Services (SSIS), and a server for an Internet (WEB) interface. The integrating module 160 channels data to and from the working module 170, which includes the workflow engine 60 and the pre-defined rules, processing thresholds, user-defined sequences and data structure relationships 170n.
The working module 170 channels processed data to a user interface 180, which will be further detailed herein below, and which is output to the display means of the data processing terminal 10, in order to facilitate interaction of the user 11 with the architecture.
Details of the data structures processed by the workflow engine 60 are shown in Figure 4, which are stored in the plurality of databases 100, 110, 120, 130 and 140. As in the example explained, the databases are maintained at respective, network-connected terminals, and each will be described hereinafter as storing a respective, specific type of data structure for the purposes of facilitating comprehension of the system according to the invention. It will be readily understood by the skilled person, however, that the databases may be fewer in number, may all be stored at a same terminal, and/or that the storage of data structures and data structure types per database may be performed according to different rules or logic.

Database 100 stores workflow data structures 200. Each workflow data structure 200 comprises data defining any of a workflow layer 70, a workflow start point 71, a workflow end point 72, a workflow layer stage 70n, a data processing sub-layer 80, a sequential sub-layer stage 81 or a parallel sub-layer stage 82, 83. Database 100 essentially stores the data structures required to progress the status of activities within the system.
Database 110 stores entity data structures. Each entity data structure comprises data defining any of a legal or individual entity 210, for instance the personal details of an individual, or data defining any further entities associated with one or a plurality of legal or individual entities 210, such as current balances and exposures 211, arrears and recoveries 212, credit applications 213, credit agreements 214 and amendments 215 to same, and collaterals 216 and valuations 217 of same and the like. Database 110 essentially stores the most granular data structures within the system, which are subjected to and/or are part of workflow processes, and representative of classification and division of data within the system.
Database 120 stores case data structures 220. Each case data structure 220 is essentially a register of workflow data structures 200n and entity data structures 210n combined into discrete, project-specific, temporary or permanent relationships, each of which is representative of an entire workflow layer 70, including any sub-layer 80, to achieve a particular outcome, for instance for a legal entity to apply for, negotiate, obtain and fulfil a credit agreement to term, and all of the workflow data structures required to achieve the outcome.
Database 130 receives and stores data feeds and external variables 230 and broadcasts same to the workflow engine 60 for updating databases 100, 110, 120 and 140. Data feeds and external variables 230 comprise any of real-time, deferred or projected financial data streams 231, for instance stock market indexes and LIBOR rates, as well as any further data variables representative of market status and/or risk 232, business status and/or risk 233, real estate status and/or risk 234, and commercial or other status and/or risk 235. Database 130 essentially receives a stream of external factor data which may or may not bias processing of workflow layers 70, and thus forwards it to the workflow engine 60 in connection with same.
Database 140 stores workflow processing thresholds 240, risk factors embodied as pre—defined rules 250, user— or business—defined processing sequences 260 and data structure relationships 270 which, in this database, correspond essentially to a register of workflow data structures 200 combined into discrete, temporary or permanent relationships, each of which is representative of statutory or systemic constraints applicable to any case 220, for instance regulator and compliance targets at a portfolio level, internal portfolio targets, solicitor exposure and the like.
This combination of data structures allows the system to precisely mimic complex real life business processes, which are typically far more interactive than existing linear workflow engines can handle. For this purpose, user- and/or system-defined relationships exist between most data structures across the databases within the system, examples of which are illustrated in Figure 5.
The example is based on a customer requesting a financial loan, initiating a new case 220 in database 120, which will register and inventory all the associations between workflow and entity data structures as explained below.
The customer entity 210 in database 110 is initially associated with a relevant loan application workflow 200 in database 100, comprising a start “application” point 71 and an end “offer” point 72, wherein the start “application” point 71 is associated with an application processing workflow 200 likewise in database 100, comprising one or a plurality of sub—layer stages 80m.
The application is subjected to the existence and value of a collateral in the first place, whereby the customer's existing collateral 216 and its value 217 in database 110 are associated with the loan application workflow 200 in database 100. This allows the application process to be initiated, the customer entity 210 to be associated with a credit application structure 213 in database 110, and the credit application structure 213 to be associated with the application processing workflow 200 in database 100.
Upon completion of the application processing workflow 200 by the system, a credit agreement 214 representative of the loan offer in database 110 is associated with the end “offer” point 72 of the loan application workflow 200 in database 100 and physically sent out to the customer according to the customer particulars 210.
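The chain of associations in this example can be pictured as entries in a simple case register. The identifier strings below are illustrative shorthand for the numbered structures; they are not an API or format defined in the specification:

```python
# A case (220) as an inventory of associations between entity structures
# (210, 213, 214, 216/217) and workflow structures (200), optionally
# anchored at a workflow point (71 or 72). Identifiers are made up.

case_register = []  # the new case 220 in database 120

def associate(entity, workflow, point=None):
    # Record one entity-to-workflow association in the case register.
    case_register.append((entity, workflow, point))

associate("customer:210", "loan_application:200", point="application:71")
associate("collateral:216+valuation:217", "loan_application:200")
associate("credit_application:213", "application_processing:200")
associate("credit_agreement:214", "loan_application:200", point="offer:72")

# The register now traces the full application-to-offer path.
print(len(case_register))  # 4
```

Reading the register top to bottom reproduces the narrative above: customer to application start point, collateral checked, credit application processed, and the agreement attached to the end "offer" point.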
The processing steps according to which the workflow engine 60 processes case, entity and workflow data elements are detailed in Figure 6. At step 601, the data processing terminal 10 is switched on by the user 11. At step 602, instructions configuring the data processing terminal 10 to embody substantially the architecture 90, including connectivity to remote databases 120, 130 and 140, are loaded in the terminal storage means.
The loading step 602 may be performed by reading instructions from a local medium or by obtaining instructions from a remote terminal across network 50.
Upon completion of the loading step 602, the workflow engine is initialised and the user interface 180 is output to the display means at step 603, at which stage the workflow engine may process case, entity or workflow data according to local or remote input.
A first question is asked at step 604, as to whether local user input has been received. If the question is answered positively, the workflow engine 60 next checks whether the input has been received in connection with an existing case, entity or workflow data structure at step 605. If the check is negative, then the workflow engine creates a corresponding case, entity or workflow data structure at step 606, according to the input received at step 604.
Alternatively, if the question of step 604 is answered negatively, then a further question is asked at step 607, as to whether the integrating module has received remote data input.

If the question of step 605 is answered positively, then the data processing flow proceeds to answer question 607 negatively, whereby the workflow engine 60 checks whether the input has been received in connection with an existing case, entity or workflow data structure relationship at step 608.
If the check is negative, then the workflow engine defines a corresponding case, entity or workflow data structure relationship at step 609, according to the input received at step 604. If the check is positive, however, then the local user input is read at the next step 610, which may for instance relate to a data update in connection with an existing case, entity, workflow, pre-defined rule, processing threshold, user-defined sequence or data structure relationship.
At step 611, the input data is processed by the workflow engine 60 in order to update the case, entity or workflow data structure. The input data is either the local input data read at step 610, or remote input data received by terminal 10 across network 50, which causes the data processing flow to answer question 607 positively.
At step 612, the updated case, entity or workflow data structure is processed by the workflow engine 60 in order to update the case, entity or workflow data structure relationships. At the next step 613, the user interface 180 is updated, so that the user 11 may either verify that the correct data has been input, or that the new case, entity or workflow data structure has been created, or that the new case, entity or workflow data structure relationship has been defined, and remain informed of any external factors embodied by the remote data of step 607.

At step 614, a further question is asked as to whether further input is required or has been received which, if answered positively, returns the data processing flow to the question of step 604. Alternatively, the question of step 614 is answered negatively, whereby a last question is asked at step 615, as to whether the workflow engine 60 and, generally, the architecture 90 should now be interrupted.

If the question of step 615 is answered negatively, then control returns again to the question of step 604. Alternatively, the question of step 615 is answered positively, whereby instructions are unloaded from the storage means at step 616 and the terminal may be stopped or switched off at step 617.
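A condensed, hypothetical sketch of this input-handling loop (roughly steps 604 to 612) might look as follows; the data shapes and keys are assumptions made purely for illustration:

```python
def process_inputs(inputs, structures, relationships):
    """inputs: iterable of (source, key, value) tuples, where source is
    'local' or 'remote' and key names a case/entity/workflow structure.
    Creates structures or relationships when new, otherwise updates them."""
    for source, key, value in inputs:
        if source == "local":
            if key not in structures:        # step 605 check fails...
                structures[key] = {}         # ...create structure (606)
            structures[key].update(value)    # read and apply input (610/611)
        else:                                # remote input (step 607)
            if key not in relationships:     # step 608 check fails...
                relationships[key] = set()   # ...define relationship (609)
            relationships[key].add(value)    # update relationships (612)

structures, relationships = {}, {}
process_inputs(
    [("local", "case:220", {"status": "open"}),
     ("remote", "case:220", "entity:210")],
    structures, relationships,
)
print(structures["case:220"])     # {'status': 'open'}
print(relationships["case:220"])  # {'entity:210'}
```

The real flow also refreshes the user interface (step 613) and polls for termination (steps 614 to 617); this sketch keeps only the create-or-update branching that the figure describes.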
The data processing system resulting from recursively performing steps 604 to 615, in relation to every workflow stage 70n and data processing sub-layer 80n, embodies a combination of tried and tested business techniques of: (i) Case Management - the grouping of data elements into a single case for processing; (ii) Entity Management - the classification and division of data within a system; and (iii) Workflow Management - managing the status of activities within the system.
The data processing system of the present invention is such that not only can the case have a workflow 70n but also every entity within the case may simultaneously have their own workflow 70n. The impact of this in system terms is significant in how it opens up business possibilities.
The workflow engine 60 is capable of managing these multiple parallel workflows 70n, 80n with ease, and allows banks to establish a clear, holistic perspective of their relationships with the customer while ensuring that they remain compliant with increasing regulator and risk management demands.
While the above example is relatively simple, the complexity that occurs in commercial and corporate banking is significantly greater, and the corresponding benefits of the workflow engine 60 are significant. Each individual entity within a workflow stage 70n can contain additional independent associated sub-workflows in a one-to-many fashion. The workflow engine 60 allows for end-to-end collateral management and loan provisioning processes to have fully automated tasks created and scheduled, with items being directed towards the required user skill sets in a fully configurable and auditable system.
The case, entity or workflow data structure updating step 611 is further detailed in Figure 7.
At step 701, a first question is asked as to whether the input data corresponds to a workflow data structure update.
If the question of step 701 is answered positively, then at step 702, the relevant workflow data structure is fetched by the workflow engine 60 from the relevant database 100 with a corresponding call to the integrating module 160.
Alternatively, the question of step 701 is answered negatively, whereby a further question is asked at step 703, as to whether the input data corresponds to an entity data structure update. If the question of step 703 is answered positively, then at step 704, the relevant entity data structure is fetched by the workflow engine 60 from the relevant database 110 with a corresponding call to the integrating module 160.

Alternatively, the question of step 703 is answered negatively, whereby a further question is asked at step 705, as to whether the input data corresponds to a case data structure update. If the question of step 705 is answered positively, then at step 706, the relevant case data structure is fetched by the workflow engine 60 from the relevant database 120 with a corresponding call to the integrating module 160.
Alternatively, the question of step 705 is answered negatively, signifying that the local or remote data is not applicable to an existing or new case, entity or workflow data structure and requires either further user input or an automatic update according to a predefined rule, whereby an alert is triggered at step 707.
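The three-way dispatch of steps 701 to 707 can be sketched as follows. The function and database names are hypothetical placeholders standing in for databases 100, 110 and 120 and the alert of step 707; the patent itself does not specify an implementation.

```python
# Hypothetical sketch of steps 701-707: the type of the incoming update
# determines which database the engine fetches from; any other input
# triggers the alert of step 707. All names are illustrative.
DATABASES = {
    "workflow": "database_100",  # step 702
    "entity": "database_110",    # step 704
    "case": "database_120",      # step 706
}

def dispatch(update_type):
    db = DATABASES.get(update_type)
    if db is None:
        return ("alert_707", None)  # step 707: unknown structure type
    return ("fetched", db)

print(dispatch("entity"))   # ('fetched', 'database_110')
print(dispatch("unknown"))  # ('alert_707', None)
```

A lookup table keeps the three questions of steps 701, 703 and 705 in one place, so adding a new structure type would not change the control flow.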
If a workflow, entity or case data structure is fetched according to steps 702, 704 or 706 respectively, then at step 708 the workflow engine 60, specifically the layer 70 or sub-layer 80, processes the input data. The updating of the relevant database 100, 110 or 120 at step 712 further to the processing of step 708 is dependent upon checking compliance of the data update against at least one, but preferably a plurality of, risk factors embodied as pre-defined rules and/or processing thresholds stored in database 140.
Thus, at step 709 at least one or the first of a plurality of risk factors is selected by the workflow engine 60 and the output of step 708 is compared against same for compliance at step 710. If the compliance check of step 710 is positive, then the workflow engine 60 next checks at step 711 whether another risk factor needs to be considered, and control returns to step 709 in the affirmative, the data processing flow looping through steps 709 to 711 until such time as all risk factors associated with the workflow, entity or case data updates have been considered.
If the compliance check of step 710 is negative however, signifying that the case, entity or workflow data update has failed a risk threshold and requires either further user input or an automatic update according to a predefined rule, an alert is again triggered at step 707.
When the last risk factor has been considered and the data update compliance checked, then the question of step 711 is answered negatively and the relevant database 100, 110 or 120 may then be updated at step 712.
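The compliance loop of steps 708 to 712 might be sketched as below. The threshold names and values are invented for illustration; a real implementation would draw the rules from database 140.

```python
def process_update(update, thresholds):
    """Sketch of steps 708-712: check the processed update against every
    risk threshold (loop of steps 709-711); a breach triggers the alert
    of step 707, while full compliance commits the update at step 712."""
    for name, limit in thresholds.items():  # steps 709/711: next risk factor
        if update.get(name, 0) > limit:     # step 710: compliance check fails
            return f"alert_707:{name}"
    return "database_updated_712"           # step 712: all factors passed

# Illustrative thresholds, standing in for rules stored in database 140.
thresholds = {"exposure": 500_000, "arrears": 10_000}
print(process_update({"exposure": 400_000, "arrears": 2_000}, thresholds))
# database_updated_712
print(process_update({"exposure": 600_000, "arrears": 2_000}, thresholds))
# alert_707:exposure
```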
The case, entity or workflow data structure relationship updating step 612 is further detailed in Figure 8.
After updating the relevant data structure itself, with reference to the inter-relationships upon which the workflow engine 60 relies to achieve the main benefit of the present invention, the workflow engine preferably propagates the effect of the data update across any data structures having an existing relationship with the data structure last updated at step 611.
Accordingly, at step 801 the related data structures are aggregated by the workflow engine 60 from the relevant database 100 with a corresponding call to the integrating module 160. At the next step 802, the relationship map linking the related data structures is parsed by the workflow engine 60 and a question is asked at step 803, as to whether the data update of step 611 corresponds to a new case, entity or workflow relationship defined at step 609.
In the affirmative, the relationship map is updated at step 804 with particulars of the new relationship, then the data update is propagated at step 805 to all related data structures according to the map. If the question of step 803 is answered negatively however, then there is no requirement to update the relationship map and control proceeds directly to the propagating of step 805.
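Steps 801 to 805 can be sketched roughly as follows, with a simple dictionary standing in for the relationship map; all identifiers are illustrative assumptions rather than names from the patent.

```python
# Sketch of steps 801-805: a relationship map links related data
# structures; a new relationship updates the map first (steps 803/804),
# then the data update is propagated to every related structure (step 805).
relationship_map = {"case-1": {"entity-A", "workflow-X"}}

def propagate(source, update, new_relation=None):
    related = relationship_map.setdefault(source, set())
    if new_relation is not None:  # step 803 answered positively
        related.add(new_relation)  # step 804: update the relationship map
    # Step 805: propagate the update to all related data structures.
    return {target: update for target in related}

result = propagate("case-1", {"balance": 1000}, new_relation="entity-B")
print(sorted(result))  # ['entity-A', 'entity-B', 'workflow-X']
```

When no new relationship is supplied, the map is left untouched and the update flows directly to the existing related structures, mirroring the negative branch of step 803.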
The workflow manager 60, when recursively processing steps 604 to 615, and by extension recursively processing steps 701 to 712 and 801 to 805 for every case, workflow and entity data element, monitors a customer's financial position by tracking the following key variables in real time:
- Current Balances, Exposures, Arrears and Recoveries
- Credit Applications or Amendments to existing credit agreements
- Collateral Valuations (based on internal and external data feeds)
- Regulator and Compliance Targets at a Portfolio Level
- Internal Portfolio Targets
- Solicitor Exposure

The workflow manager 60 effectively tracks all of the above inputs simultaneously and determines a customer's overall health, expressed in a data structure conforming to pre-defined processing parameters and thresholds. In particular, the system triggers alerts 707 if any of these thresholds have been breached. By extension, this granular level of control also allows the financial institution to track the overall portfolio position and ensure ongoing regulatory compliance.
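A minimal sketch of this threshold monitoring, with invented variable names and limits, might look like the following:

```python
# Illustrative data for one customer; field names and limits are assumed,
# not taken from the patent.
customer = {
    "current_balance": -120_000,
    "arrears": 4_500,
    "collateral_valuation": 150_000,
}
thresholds = {"arrears": 3_000}

def health_summary(data, limits):
    """Compare each monitored variable with its threshold and summarise
    the customer's overall position, surfacing breaches as 707-style alerts."""
    breaches = [k for k, v in data.items() if k in limits and v > limits[k]]
    return {"healthy": not breaches, "alerts_707": breaches}

print(health_summary(customer, thresholds))
# {'healthy': False, 'alerts_707': ['arrears']}
```

Aggregating the same summary over every customer would give the portfolio-level view the passage describes.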
A functional illustration of a graphical user interface for the above system is shown in Figure 9, which facilitates user interaction with the complex set of relationships detailed hereinbefore. The graphical user interface 180 is preferably embodied as a set of instructions configuring a browser program 901, to ensure inter-operability and platform-agnosticism.
The instructions preferably comprise a security API 902 to ensure that the data exchanged over the network between the terminal processing the instructions and remote terminals, which is sensitive and confidential by nature, cannot be easily intercepted. The instructions next comprise a database API 903, which essentially corresponds functionally to the integrating data processing module 160 previously described.

User-interactive features of the user interface 180 include a plurality of fast-access modules, briefly comprising a data entry module 904, a calculator module 905, a task list module 906, a reporting module 907 and a dashboard module 908.
Upon activation, the data entry module 904 allows the user to input and update case, workflow and entity data structures at step 604, which will be propagated as previously described.
The calculator module 905 allows the user to perform conventional arithmetic calculations; however, it may usefully be configured to link operations with the case, entity or workflow currently selected for input or consultation, and to integrate constraints and thresholds 240, 250, 260 in order to generate alerts 707.
The task list module 906 allows the user to consult workflows associated with cases, in aggregate or granular manner, for instance to manage overall caseload.
The reporting module 907 allows the user to query any or all of databases 100 to 140 for data stored therein according to reporting parameters, from a per-data-structure basis querying the data at its most granular level, to a total database contents basis querying the data at the most holistic level permissible.
The dashboard module 908 is preferably user-configurable on a discrete basis, and allows each distinct user of the system to select cases, workflows, entities, tasks and reports of current and/or periodical interest, in a summary and synthesised manner.
With this system, as well as allowing the user to complete a full data capture from start to finish as relevant data structures 200 to 260, each stage of the process can be subjected to a unique set of rules to calculate and score an individual based on the data capture, whereby the determination of whether they are eligible for the loan may be made on request. This calculation is based on the predefined intelligent web of relationships between the predefined entities.
For example, in order to calculate whether a particular deal should be approved, the decision is based on the mapping of relationships between the main entities for that business process, including security, exposure, financials, facilities, history, arrears and credit score.
A useful part of this system is also the ability to bulk-update valuations 217 based on a set of parameters. For example, an update could be issued to devalue all agricultural securities in a geographical location by a set percentage, based on an external factor 230 to 235. This automated update can then interrupt the ongoing workflow layers 70n and/or sub-layers 80n, to alert 707 the user that a particular threshold 240, 250 is no longer being met, and that action should be taken to decline the application.
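Such a bulk devaluation might be sketched as follows. The record fields, location and percentage are invented for illustration, with the returned identifiers standing in for the alerts 707 raised when a threshold 240, 250 is breached.

```python
# Illustrative securities; field names and values are assumptions.
securities = [
    {"id": 1, "type": "agricultural", "location": "west",
     "value": 100_000, "threshold": 85_000},
    {"id": 2, "type": "residential", "location": "west",
     "value": 200_000, "threshold": 85_000},
]

def bulk_devalue(items, sec_type, location, pct):
    """Devalue all matching securities by pct percent and return the ids
    of those whose new value falls below their threshold (alert 707)."""
    alerts = []
    for s in items:
        if s["type"] == sec_type and s["location"] == location:
            s["value"] *= (1 - pct / 100)
            if s["value"] < s["threshold"]:
                alerts.append(s["id"])  # threshold breached -> alert 707
    return alerts

breached = bulk_devalue(securities, "agricultural", "west", 20)
print(breached)  # [1]
```

Here only the agricultural security is devalued (100,000 to 80,000), dropping it below its 85,000 threshold, while the residential security is untouched.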
The embodiments of the invention described with reference to the drawings comprise a computer apparatus and/or processes performed in a computer apparatus. However, the invention also extends to computer programs, particularly computer programs stored on or in a carrier adapted to bring the invention into practice. The program may be in the form of source code, object code, or a code intermediate between source and object code, such as in partially compiled form or in any other form suitable for use in the implementation of the method according to the invention.
The carrier may comprise a storage medium such as ROM, e.g. CD-ROM, or a magnetic recording medium, e.g. a floppy disk or hard disk. The carrier may be an electrical or optical signal which may be transmitted via an electrical or an optical cable or by radio or other means.
In the specification the terms "comprise, comprises, comprised and comprising" or any variation thereof and the terms "include, includes, included and including" or any variation thereof are considered to be totally interchangeable and they should all be afforded the widest possible interpretation and vice versa.

The invention is not limited to the embodiments hereinbefore described but may be varied in construction and detail.

Claims (5)

Claims
1. A data processing system comprising: a plurality of data processing terminals operated by respective users, said data processing terminals connected to a network; means for grouping a plurality of data elements into a single case data structure for processing; means for classifying and division of said data elements within the case structure into one or more workflows; and means for managing the status of activities of the workflows within the case structure such that individual data elements are processed simultaneously in parallel by a workflow engine, said workflow engine comprises a plurality of data processing layers adapted to be processed in parallel in any one or all of the data processing terminals, each layer comprises a layer start point, which may be any of a case, entity or workflow data structure, as well as a layer end point, which may be any of a case, entity or workflow data structure, wherein the end point is representative of a desired data processing outcome according to a combination of pre-defined criteria.
2. The data processing system as claimed in claim 1 wherein each data processing layer stores a workflow data structure, each workflow data structure comprises data defining any of a workflow layer, a workflow start point, a workflow end point, a workflow layer stage, a data processing sub-layer, a sequential sub-layer stage and a parallel sub-layer stage.
3. The data processing system as claimed in claim 2 wherein each case data structure comprises a register of workflow data structures and entity data structures combined into discrete, project-specific, temporary or permanent relationships, each of which is representative of an entire workflow layer including any sub-layer.
4. The data processing system as claimed in any preceding claim comprising means for selecting at least one or a plurality of risk factors by the workflow engine, wherein the output is compared against same for compliance; if the compliance check is positive, then the workflow engine is adapted to check whether another risk factor needs to be considered, such that the data processing flow loops until such time as all risk factors associated with the workflow, entity or case data updates have been processed.
5. A data processing system substantially as hereinbefore described with reference to the accompanying description and/or drawings.
IE2010/0417A 2010-07-07 Data processing system and method IE20100417U1 (en)

Publications (2)

Publication Number Publication Date
IES85685Y1 IES85685Y1 (en) 2011-01-19
IE20100417U1 true IE20100417U1 (en) 2011-01-19
