IES85685Y1 - Data processing system and method - Google Patents
Data processing system and method

- Publication number: IES85685Y1 (IE2010/0417A)
- Authority: IE (Ireland)
Abstract
The invention provides a data processing system comprising means for grouping a plurality of data elements into a single case structure for processing; means for classifying and dividing data within the case structure into one or more workflows; and means for managing the status of activities of the workflows within the case structure such that individual data elements are processed simultaneously in parallel by a workflow engine.
Description
Title: Data Processing System and Method

Field of the Invention

The invention relates to a data processing system and method. In particular, the invention relates to a data processing system architecture for managing multiple data processes.
Background to the Invention

Traditionally, banking systems have been built around case-based workflow or entity-based workflow. To illustrate the point, a simplified scenario can be created around a customer who applies for a top-up home loan to assist with repayment difficulties. Typically, the bank will open a "Home Loan Case" with a view to approving or declining the loan by assessing the level of risk involved in facilitating the loan. Details of the customer, the facility type, the facility amount requested, details of the property and other collateral details are entered into this case. The case will then progress through the various stages of data capture, approval, documentation and post-loan processing.
However, a problem with these types of systems is that in reality the situation tends to be complicated and difficult to implement. For example:
- the customer may require processing through an Anti-Money Laundering workflow;
- the customer accounts may be going through an arrears management workflow;
- the security will go through a Security Perfection workflow;
- title deeds of a property may go through an Account Trust Receivable (ATR) workflow.
Prior art examples of such early warning risk indicator systems are described e.g. in US 6,202,053 B1, US 6,311,169 B2, and the Journal of Commercial Lending, June 1995, pages to 16, "How the RMA/Fair, Isaac Credit-scoring model was built" by Latimer Asch.

US 6,202,053 B1 describes how, to assess the credit risk of an individual, a financial institution will develop a score for each credit applicant based on certain information. The applicant receives points for each item of information analysed by the financial institution. The amount of points awarded for each item, the items actually analysed, and the score necessary for approval may vary. The score awarded is used to evaluate a risk involved in performing a certain transaction. In other words, the decision to approve or deny an applicant's request for e.g. a bank card or another type of transaction is based on a scoring system. The scoring system used to evaluate each applicant and the minimum score required for approval was applied uniformly by a financial institution to all its applicants. The use of such a scoring system for evaluating a risk involved with a transaction is rather superficial and could be made more secure by either monitoring the financial behaviour of the approved client after approval or increasing the score required for approval. The first alternative would require an increase in costs and efforts, whereas the second alternative might lead to unnecessarily declining a large number of clients. Therefore, US 6,202,053 B1 proposes to develop a segmentation tree, grouping clients into sub-populations corresponding to each segment, building a client score card for each segment, and applying the score card corresponding to the applicant's segment. Using an automated system to implement generation of the client score cards and scoring of the applications further lowers the costs and effort of assessing a risk involved with a transaction.
The general background of the RMA/Fair, Isaac credit-scoring model is described in the above-mentioned article by Latimer Asch. The model is suitable for e.g. reducing the loan time spent processing small business applications using an automated solution which is based on a pooled-data score card. A scorecard is a tool used to calculate the risk associated with a credit application. It calculates risk based on multiple items of credit information called characteristics. Characteristics can come from several sources, including the credit application and consumer and business credit reports. Each characteristic is divided into two or more possible responses known as attributes. A numerical score is associated with each attribute, so for any credit application, the numerical attribute values for the characteristics can be added together to provide a total score. Scoring, in principle, uses the same data a loan officer uses in his or her judgmental, or non-scoring, decision process. But scoring is faster, more objective, and more consistent. With the current regulatory pressure to provide more small business loans, prospective lenders need efficient, time-saving, cost-cutting tools. With credit-scoring, a lender can increase the number of approved applications without increasing risk, time, or other resources.
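The scorecard arithmetic described above can be sketched as follows. The characteristics, attribute point values and approval cutoff are invented for illustration; a real pooled-data scorecard would be statistically derived, as the article explains.

```python
# Minimal sketch of a credit scorecard: each characteristic has attributes,
# each attribute carries a numerical score, and the attribute values are
# summed into a total score compared against an approval cutoff.
SCORECARD = {
    "years_in_business":  {"<2": 10, "2-5": 25, ">5": 40},
    "payment_history":    {"poor": 5, "fair": 20, "good": 35},
    "credit_utilisation": {"high": 5, "medium": 15, "low": 30},
}

def total_score(application: dict) -> int:
    """Sum the numerical attribute value for each characteristic."""
    return sum(SCORECARD[ch][attr] for ch, attr in application.items())

def decide(application: dict, cutoff: int = 75) -> str:
    return "approve" if total_score(application) >= cutoff else "decline"

app = {"years_in_business": ">5", "payment_history": "good",
       "credit_utilisation": "medium"}
print(total_score(app))   # 40 + 35 + 15 = 90
print(decide(app))        # approve
```

The same additive mechanism underlies the segmentation-tree refinement of US 6,202,053 B1: a separate `SCORECARD` table would simply be selected per client segment.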
The scoring systems described above, although performed automatically in a data processing system and able to handle a relatively large amount of data, have proved not to be very precise and cannot determine risks in real time. Some of the problems of the known scoring systems are that they are not flexible, they cannot take into account historical data, they are limited in the type of information which is taken into account, and they have limited reporting possibilities. Traditional systems, as described above, are limited in how they can cater for the level of complexity required. The flaws and problems with existing systems have been exposed with the recent financial fallout of banking institutions, especially in Ireland, where banking systems and protocols have clearly failed.
There is therefore a need to provide a data processing system to overcome the above mentioned problems.
Summary of the Invention

According to the invention there is provided, as set out in the appended claims, a data processing system comprising: a plurality of data processing terminals operated by respective users, said data processing terminals connected to a network; means for grouping a plurality of data elements into a single case data structure for processing; means for classifying and division of said data elements within the case structure into one or more workflows; and means for managing the status of activities of the workflows within the case structure such that individual data elements are processed simultaneously in parallel by a workflow engine, said workflow engine comprising a plurality of data processing layers adapted to be processed in parallel in any one or all of the data processing terminals, each layer comprising a layer start point, which may be any of a case, entity or workflow data structure, as well as a layer end point, which may be any of a case, entity or workflow data structure, wherein the end point is representative of a desired data processing outcome according to a combination of pre-defined criteria.
A key benefit of this engine is that it allows individual system entities (e.g. Customer, Account, Document, Security etc) to have their own independent workflows while simultaneously being part of a wider case based workflow.
This specific feature allows the system to precisely mimic complex real-life processes, which are typically far more interactive than existing linear workflow engines can handle.
Following that, advantageously, according to the present invention there is provided an improved system for provision of data over a network. In particular, the method is carried out within an existing system management environment, preferably running on a computer/network system of a business. The computer system may be or include an existing, running computer system. The system may be a custom system.
Hence, the subject-matter of the present invention allows, in a more efficient manner, the exchange of data between different entities, such as different computers, workstations, personal computers, etc. Further to that, the present invention allows, in a more efficient manner, the visualization, i.e. the display, of data on e.g. a remote computer, such as a client computer, a client terminal, etc. Therefore, the system according to the present invention, among other things, aims at providing a method which is adapted to serve, assist or replace activities of different kinds, such as selection of data, and particularly allows provision of a more efficient way of display of data. Further to that, even more advantageously, the system according to the present invention allows a user to perform his object in a more efficient manner, since the user is preferably automatically provided with data. In particular, the provided data is preferably pre-selected according to the business context, even more particularly according to attributes/attribute data of the user.
Thereby, the selection data provided for the user is decreased, which means that the selection process is more efficient. Accordingly, e.g. the Graphical User Interface (GUI) preferably is improved by displaying data relevant for the user. In particular, unnecessary data is preferably not displayed, thus keeping the GUI as simple as possible.
Thereby, the visual presentation and design of the GUI is improved. Also the data can be appreciated by the user much better, since preferably only essential data is displayed.
Thereby, it is easier for the user to view the data.
In one embodiment each data processing layer stores a workflow data structure, each workflow data structure comprising data defining any of a workflow layer, a workflow start point, a workflow end point, a workflow layer stage, a data processing sub-layer, a sequential sub-layer stage and a parallel sub-layer stage.
In one embodiment each case data structure comprises a register of workflow data structures and entity data structures combined into discrete, project—specific, temporary or permanent relationships, each of which is representative of an entire workflow layer including any sub—layer.
In one embodiment there is provided means for selecting at least one or a plurality of risk factors by the workflow engine, and the output is compared against same for compliance; if the compliance check is positive, then the workflow engine is adapted to check whether another risk factor needs to be considered, such that the data processing flow loops until such time as all risk factors associated with the workflow, entity or case data updates have been processed.
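The risk-factor loop of this embodiment might be sketched as follows. The compliance check, the factor names and the threshold convention are placeholder assumptions, not taken from the claims.

```python
# Sketch of the looping compliance check: each risk factor associated with
# a case update is evaluated in turn until all have been processed.
def compliant(risk_factor: str, case_data: dict) -> bool:
    # Placeholder check against a pre-defined threshold for that factor.
    return case_data.get(risk_factor, 0) <= case_data.get(f"{risk_factor}_limit", 100)

def process_risk_factors(risk_factors: list, case_data: dict) -> list:
    results = []
    remaining = list(risk_factors)
    while remaining:                      # loop until all factors processed
        factor = remaining.pop(0)
        results.append((factor, compliant(factor, case_data)))
    return results

case = {"exposure": 80, "exposure_limit": 100,
        "arrears": 3, "arrears_limit": 2}
print(process_risk_factors(["exposure", "arrears"], case))
# [('exposure', True), ('arrears', False)]
```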
In one embodiment there is provided an integrating data processing module adapted to perform formatting, addressing and translation data processing tasks for routing case, entity and workflow data to and from local and remote databases.
In one embodiment said integrating module channels data to and from a working module, which includes the workflow engine and the pre-defined rules, processing thresholds, user-defined sequences and data structure relationships.
In one embodiment the system receives and stores data feeds and external variables and broadcasts same to the workflow engine for updating said database.
In one embodiment said data feeds and external variables comprise any of real—time, deferred or projected financial data streams, data variables representative of market status and/or risk, business status and/or risk, real estate status and/or risk, commercial or other status and/or risk.
In one embodiment each data layer stores workflow processing thresholds, risk factors embodied as pre—defined rules, user defined processing sequences and data structure relationships which correspond essentially to a register of workflow data structures combined into discrete, temporary or permanent relationships, each of which is representative of systemic constraints applicable to any case.
In one embodiment there is provided a module adapted with means to perform arithmetic calculations, and configured to link operations with the case, entity or workflow currently selected for input or consultation, and to integrate constraints and thresholds in order to generate alerts.
In one embodiment there is provided a task list module adapted with means to allow a user to consult workflows associated with cases, in aggregate or granular manner, to manage overall caseload.
In one embodiment there is provided a bulk—update valuation module based on a set of parameters.
In one embodiment there is provided an automated update module adapted to interrupt ongoing workflow layers and/or sub-layers, to alert a user that a particular threshold is no longer being met, and that action should be taken to stop the process.
There is also provided a computer program comprising program instructions for causing a computer to carry out the above method, which may be embodied on a record medium, carrier signal or read-only memory.
Brief Description of the Drawings

The invention will be more clearly understood from the following description of an embodiment thereof, given by way of example only, with reference to the accompanying drawings, in which:
- Figure 1 illustrates a network environment in which a system according to the invention is embodied;
- Figure 2 illustrates an embodiment of a state machine according to the present invention;
- Figure 3 illustrates an architecture of the state machine shown in Figure 2 in an environment as shown in Figure 1;
- Figure 4 illustrates details of data structures processed by the architecture shown in Figure 3;
- Figure 5 illustrates relationships between the data structures of Figure 4;
- Figure 6 details data processing steps of the system shown in Figures 1 to 5, including steps of updating data structures and data structure relationships;
- Figure 7 further details the data structure updating step of Figure 6, including a step of raising an alert;
- Figure 8 further details the data structure relationship updating step of Figure 6, including a step of receiving an alert; and
- Figure 9 illustrates details of a graphical user interface of the system shown in Figures 1 to 8.
Detailed Description of the Drawings

Referring now to the figures, and initially Figure 1, there is shown a network environment in which a system according to the invention is embodied.
The environment includes a plurality of data processing terminals 10, 20, 30 and 40 operated by respective users 11, 21, 31 and 41. The data processing terminals are connected to a network 50, for instance the World Wide Web or Internet, through respective network connection means 12, 22, 32, 42.
Each data processing terminal 10, 20, 30 and 40 includes at least data processing means, specifically a microprocessor connected with data storage means and network interfacing means; user input means, specifically an alphanumerical input device and optionally a pointing device; and display means to facilitate input, output and interaction of the respective user with the data processing terminal.
There is therefore the scope within the network environment shown in Figure 1 for any one networked data processing terminal to broadcast data to and receive data from any other networked data processing terminal.
Referring now to Figure 2, there is shown a block diagram of a state machine embodying the system according to the present invention. The state machine shall herein be referred to as a "workflow engine", although this denomination is not intended to limit the scope of the present disclosure.
The workflow engine 60 implements a layered data processing methodology, according to which individual data structures, hereinafter referred to as cases, entities and workflows, are processed according to a combination of pre—defined rules, processing thresholds, user—defined sequences and data structure relationships. Accordingly, means are provided for grouping a plurality of data structures, or elements, into a single case structure for processing, for classifying and division of data within the case structure into one or more workflows, and for managing the status of activities of the workflows within the case structure, such that individual data elements are processed simultaneously in parallel by a workflow engine.
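The grouping and parallel processing just described can be sketched, purely illustratively, as follows. The workflow names, the data elements and the use of a thread pool are assumptions for demonstration; the patent does not prescribe an implementation.

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch: data elements are grouped into a single case structure, divided
# into named workflows, and the workflows are processed simultaneously.
def run_workflow(name: str, elements: list) -> str:
    # Placeholder for a real workflow layer's processing.
    return f"{name}: processed {len(elements)} element(s)"

case = {   # a single case structure grouping several data elements
    "aml_check":           ["customer"],
    "arrears_mgmt":        ["account_1", "account_2"],
    "security_perfection": ["collateral_deed"],
}

with ThreadPoolExecutor() as pool:     # workflows run in parallel
    futures = {name: pool.submit(run_workflow, name, els)
               for name, els in case.items()}
    for name, fut in futures.items():
        print(fut.result())
```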
A plurality of data processing layers 70 are processed in parallel in any one or all of the data processing terminals 10, 20, 30 and 40. Each layer 70 has a layer start point 71, which may be any of a case, entity or workflow data structure, as well as a layer end point 72, which again may be any of a case, entity or workflow data structure, wherein the end point 72 is representative of a desired data processing outcome according to a combination of pre-defined rules, processing thresholds, user-defined sequences and data structure relationships.
Between the layer start point 71 and the layer end point 72, the layer 70 may include any number of intermediary workflow stages 70n, as required by the data processing necessary to achieve the outcome. Within each workflow stage 70n, including the start and end points 71, 72, each point or stage may itself include at least one data processing sub-layer 80. Whilst processing each data layer 70 is essentially sequential, wherein a next workflow stage 70n is processed only upon receiving the output of a preceding workflow stage 70n-1, a data processing sub-layer 80 may itself comprise a combination of sequential 81 and parallel 82, 83 data processing sub-layer stages 80n, again according to a combination of pre-defined rules, processing thresholds, user-defined sequences and data structure relationships.
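The sequential-layer / mixed sub-layer behaviour above might be sketched as follows. The stage functions and the convention that a nested list means "run these stages in parallel and combine their outputs" are illustrative assumptions.

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch: workflow stages 70n run sequentially, each consuming the output
# of its predecessor 70n-1; a sub-layer may mix sequential (81) and
# parallel (82, 83) stages.
def run_sub_layer(stages, value):
    """stages: flat items run sequentially; a nested list runs in parallel."""
    for stage in stages:
        if isinstance(stage, list):            # parallel stages 82, 83
            with ThreadPoolExecutor() as pool:
                value = sum(pool.map(lambda f: f(value), stage))
        else:                                  # sequential stage 81
            value = stage(value)
    return value

double = lambda x: x * 2
inc = lambda x: x + 1

# double, then (double and inc in parallel, outputs summed), then inc
result = run_sub_layer([double, [double, inc], inc], 3)
print(result)  # double(3)=6; parallel: 12 + 7 = 19; inc -> 20
```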
With reference to Figure 3, an architecture of the state machine 60 is shown in the environment of Figure 1, wherein the workflow engine 60 configures the data processing terminal for processing data according to a set of rules which will be further detailed herein below, and wherein it is represented as an integral part of an architecture 90 including a plurality of databases, both local 100, 110 and remote 120, 130 and 140, respectively at data processing terminal 20, data processing terminal 30 and data processing terminal 40.
The architecture firstly includes an integrating data processing module 160, which performs relevant formatting, addressing and translation data processing tasks for routing case, entity and workflow data to and from the local and remote databases. The services of module 160 therefore include at least one Application Programmers Interface (API) 161, an Internet interface (WEB) 162 and a server, for example a SQL Server Integration Services (SSIS) server.
The integrating module 160 channels data to and from the working module 170, which includes the workflow engine 60 and the pre-defined rules, processing thresholds, user-defined sequences and data structure relationships 170n.
The working module 170 channels processed data to a user interface 180, which will be further detailed herein below and which is output to the display means of the data processing terminal 10, in order to facilitate interaction of the user 11 with the architecture.
Details of the data structures processed by the architecture and the workflow engine 60 thereof are shown in Figure 4; these are stored in the plurality of databases 100, 110, 120, 130 and 140. As explained, in the example the databases are maintained at respective, network-connected terminals, and each will be described hereinafter as storing a respective, specific type of data structure for purposes of facilitating comprehension of the system according to the invention. It will be readily understood by the skilled person, however, that the databases may be fewer in number, may all be stored at a same terminal, and/or that the storage of data and data structure type per database may be performed according to different rules or logic.
Database 100 stores workflow data structures 200. Each workflow data structure 200 comprises data defining any of a workflow layer 70, a workflow start point 71, a workflow end point 72, a workflow layer stage 70n, a data processing sub-layer 80, a sequential 81 or a parallel 82, 83 sub-layer stage 80n. Database 100 essentially stores the data structures required to progress the status of activities within the system.
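The fields of the workflow data structure 200 might be modelled, purely as an illustrative sketch, with the following record types. All class and field names are assumptions mapped onto the reference numerals of the description.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SubLayerStage:
    """A sub-layer stage 80n; sequential (81) or parallel (82, 83)."""
    name: str
    parallel: bool = False

@dataclass
class WorkflowDataStructure:
    """Sketch of workflow data structure 200 stored in database 100."""
    layer: str                    # workflow layer 70
    start_point: str              # workflow start point 71
    end_point: str                # workflow end point 72
    stages: List[str] = field(default_factory=list)           # stages 70n
    sub_layers: List[SubLayerStage] = field(default_factory=list)  # 80/80n

wf = WorkflowDataStructure(
    layer="loan_application", start_point="application", end_point="offer",
    stages=["data_capture", "approval", "documentation"],
    sub_layers=[SubLayerStage("credit_check"),
                SubLayerStage("valuation", parallel=True)],
)
print(wf.end_point)  # offer
```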
Database 110 stores entity data structures. Each entity data structure comprises data defining any of a legal or individual entity 210, for instance the personal details of an individual, or data defining any further entities associated with one or a plurality of legal or individual entities 210, such as current balances and exposures 211, arrears and recoveries 212, credit applications 213, credit agreements 214 and amendments 215 to same, collaterals 216 and valuations 217 of same, and the like. Database 110 essentially stores the most granular data structures within the system, which are subjected to and/or are part of workflow processes, and representative of the classification and division of data within the system.
Database 120 stores case data structures 220. Each case data structure 220 is essentially a register of workflow data structures 200m and entity data structures 210n combined into discrete, project-specific, temporary or permanent relationships, each of which is representative of an entire workflow layer 70, including any sub-layer 80, to achieve a particular outcome, for instance for a legal entity to apply for, negotiate, obtain and fulfil a credit agreement to term, and all of the workflow data structures required to achieve the outcome.
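The register character of the case data structure 220 can be sketched as below. The identifiers and methods are hypothetical; the point is that a case is a set of workflow-to-entity relationships, not a container of raw data.

```python
# Sketch of case data structure 220: a register of relationships between
# workflow data structures (200m) and entity data structures (210n).
class Case:
    def __init__(self, case_id, permanent=False):
        self.case_id = case_id
        self.permanent = permanent        # temporary vs permanent relationships
        self.register = []                # (workflow_id, entity_id) pairs

    def associate(self, workflow_id, entity_id):
        self.register.append((workflow_id, entity_id))

    def workflows_for(self, entity_id):
        return [w for w, e in self.register if e == entity_id]

case = Case("home_loan_case_1")
case.associate("loan_application", "customer_210")
case.associate("security_perfection", "collateral_216")
case.associate("arrears_management", "customer_210")
print(case.workflows_for("customer_210"))
# ['loan_application', 'arrears_management']
```

Note how the same entity can appear in several workflows simultaneously, which is the independent-workflow property the summary emphasises.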
Database 130 receives and stores data feeds and external variables 230 and broadcasts same to the workflow engine for updating databases 100, 110, 120 and 140. Data feeds and external variables 230 comprise any of real-time, deferred or projected financial data streams 231, for instance stock market indexes and Libor rates, as well as any further data variables representative of market status and/or risk 232, business status and/or risk 233, real estate status and/or risk 234, commercial or other status and/or risk 235. Database 130 essentially receives a stream of external factor data which may or may not bias the processing of workflow layers 70, and thus forwards it to the workflow engine 60 in connection with same.
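The receive-and-broadcast role of database 130 resembles a publish/subscribe pattern, sketched below. The feed names and the observer wiring are assumptions for illustration only.

```python
# Sketch: database 130 receives external data feeds and broadcasts them
# to subscribers such as the workflow engine 60.
class FeedStore:
    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def receive(self, feed_name, value):
        # Store (omitted here) and broadcast the update to all subscribers.
        for cb in self.subscribers:
            cb(feed_name, value)

received = []                               # stand-in for the workflow engine
store = FeedStore()
store.subscribe(lambda name, value: received.append((name, value)))
store.receive("libor_3m", 0.0525)
store.receive("market_risk", "elevated")
print(received)  # [('libor_3m', 0.0525), ('market_risk', 'elevated')]
```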
Database 140 stores workflow processing thresholds, risk factors embodied as pre-defined rules 250, user- or business-defined processing sequences 260 and data structure relationships 270 which, in this database, correspond essentially to a register of workflow data structures 200 combined into discrete, temporary or permanent relationships, each of which is representative of statutory or systemic constraints applicable to any case 220, for instance regulator and compliance targets at a portfolio level, internal portfolio targets, solicitor exposure and the like.
This combination of data structures allows the system to precisely mimic complex real-life business processes, which are typically far more interactive than existing linear workflow engines can handle. For this purpose, user- and/or system-defined relationships exist between most data structures across the databases within the system, examples of which are illustrated in Figure 5.
The example is based on a customer requesting a financial loan, initiating a new case 220 in database 120, which will register and inventory the associations between workflow and entity data structures as explained below.
The customer entity 210 in database 110 is initially associated with a relevant loan application workflow 200 in database 100, comprising a start "application" point 71 and an end "offer" point 72, wherein the start "application" point 71 is associated with an application processing workflow 200, likewise in database 100, comprising one or a plurality of sub-layer stages 80n.
The application is subjected to the existence and value of a collateral in the first place, whereby the customer's existing collateral 216 and its value 217 in database 110 are associated with the loan application workflow 200 in database 100. This allows the application process to be initiated, the customer entity 210 to be associated with a credit application structure 213 in database 110, and the credit application structure 213 to be associated with the application processing workflow 200 in database 100.
Upon completion of the application processing workflow 200 by the system, a credit agreement 214 representative of the loan offer in database 110 is associated with the end “offer” point 72 of the loan application workflow 200 in database 100 and physically sent out to the customer according to the customer particulars 210.
The processing steps according to which the workflow engine 60 processes case, entity and workflow data elements are detailed in Figure 6.
At step 601, the data processing terminal 10 is switched on by the user 11. At step 602, instructions configuring the data processing terminal 10 to embody substantially the architecture 90, including connectivity to the remote databases 120, 130 and 140, are loaded in the terminal storage means.
The loading step 602 may be performed by reading instructions from a local medium or by obtaining instructions from a remote terminal across the network 50.
Upon completion of the loading step 602, the workflow engine is initialised and the user interface 180 is output to the display means at step 603, at which stage the workflow engine may process case, entity or workflow data according to local or remote input.
A first question is asked at step 604, as to whether local user input has been received. If the question is answered positively, the workflow engine 60 next checks whether the input has been received in connection with an existing case, entity or workflow data structure at step 605. If the check is negative, then the workflow engine creates a corresponding case, entity or workflow data structure at step 606, according to the input received at step 604.
Alternatively, if the question of step 604 is answered negatively, then a further question is asked at step 607, as to whether the integrating module has received remote data input. If the question of step 605 is answered positively, then the data processing flow proceeds to answer question 607 negatively, whereby the workflow engine 60 checks whether the input has been received in connection with an existing case, entity or workflow data structure relationship at step 608.
If the check is negative, then the workflow engine defines a corresponding case, entity or workflow data structure relationship at step 609, according to the input received at step 604. If the check is positive, however, then the local user input is read at the next step 610, which may for instance relate to a data update in connection with an existing case, entity, workflow, pre-defined rule, threshold, user-defined data processing sequence or structure relationship.
At step 611, the input data is processed by the workflow engine 60 in order to update the case, entity or workflow data structure. The input data is either the local input data read at step 610, or remote input data received by terminal 10 across network 50, which causes the data processing flow to answer question 607 positively.
At step 612, the updated case, entity or workflow data structure is processed by the workflow engine 60 in order to update the case, entity or workflow data structure relationships. At the next step 613, the user interface 180 is updated, so that the user 11 may either verify that the correct data has been input, or that the new case, entity or workflow data structure has been created, or that the new case, entity or workflow data structure relationship has been defined, and remain informed of any external factors embodied by the remote data of step 607.
At step 614, a further question is asked as to whether further input is required or has been received which, if answered positively, returns the data processing flow to the question of step 604. Alternatively, the question of step 614 is answered negatively, whereby a last question is asked at step 615, as to whether the workflow engine 60 and, generally, the architecture 90 should now be interrupted.

If the question of step 615 is answered negatively, then control again returns to the question of step 604. Alternatively, the question of step 615 is answered positively, whereby instructions are unloaded from the storage means at step 616 and the terminal may be stopped or switched off at step 617.
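The control flow of steps 604 to 617 can be condensed into the following event-loop sketch. Inputs are simulated as a list of events and the databases as dictionaries; the branch structure is a simplification of Figure 6, with the original step numbers kept as comments.

```python
# Sketch of the Figure 6 processing loop (steps 604-617), simplified.
def run_engine(events):
    structures, relationships, log = {}, {}, []
    for kind, key, payload in events:
        if kind == "local":                       # step 604 answered yes
            if key not in structures:             # step 605 negative
                structures[key] = payload         # step 606: create structure
            elif key not in relationships:        # step 608 negative
                relationships[key] = payload      # step 609: define relationship
            else:
                structures[key] = payload         # steps 610-611: update
        elif kind == "remote":                    # step 607 answered yes
            structures[key] = payload             # step 611: update
        log.append(key)                           # step 613: refresh UI
        if kind == "stop":                        # step 615 positive
            break                                 # steps 616-617: unload, stop
    return structures, relationships, log

s, r, log = run_engine([
    ("local", "case_1", "created"),
    ("remote", "case_1", "updated"),
    ("stop", "shutdown", None),
])
print(s["case_1"])  # updated
```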
The data processing system resulting from recursively performing steps 604 to 615, in relation to every workflow stage 70n and data processing sub-layer 80n, embodies a combination of tried and tested business techniques of: (i) Case Management, the grouping of data elements into a single case for processing; (ii) Entity Management, the classification and division of data within a system; and (iii) Workflow Management, managing the status of activities within the system.
The data processing system of the present invention is such that not only can the case have a workflow 70n but also every entity within the case may simultaneously have their own workflow 70n. The impact of this in system terms is significant in how it opens up business possibilities.
The workflow engine 60 is capable of managing these multiple parallel workflows 70n, 80n with ease and allows the banks to clearly establish a holistic perspective of the relationships with the customer, ensuring that they remain compliant with the increasing regulator and risk management demands.
While the above example is relatively simple, the complexity that occurs in commercial and corporate banking is significantly greater, and the corresponding benefits of the workflow engine 60 are significant.
Each individual entity within a workflow stage 70n can contain additional independent sub-workflows 80n in an associated one-to-many fashion. The workflow engine 60 fully allows for end-to-end collateral management and loan provisioning processes to have automated tasks created and scheduled, with items being directed towards the required user skill sets in a fully configurable, auditable system.
The case, entity or workflow data structure updating step 611 is further detailed in Figure 7.
At step 701, a first question is asked as to whether the input data corresponds to a workflow data structure update.
If the question of step 701 is answered positively, then at step 702, the relevant workflow data structure is fetched by the workflow engine 60 from the relevant database 100 with a corresponding call to the integrating module 160.
Alternatively, the question of step 701 is answered negatively, whereby a further question is asked at step 703, as to whether the input data corresponds to an entity data structure update. If the question of step 703 is answered positively, then at step 704, the relevant entity data structure is fetched by the workflow engine 60 from the relevant database 110 with a corresponding call to the integrating module 160.
Alternatively, the question of step 703 is answered negatively, whereby a further question is asked at step 705, as to whether the input data corresponds to a case data structure update. If the question of step 705 is answered positively, then at step 706, the relevant case data structure is fetched by the workflow engine 60 from the relevant database 120 with a corresponding call to the integrating module 160.
Alternatively, the question of step 705 is answered negatively, signifying that the local or remote data is not applicable to an existing or new case, entity or workflow data structure and requires either further user input or an automatic update according to a predefined rule, whereby an alert is triggered at step 707.
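The dispatch of steps 701 to 707 can be sketched as follows. The function name, the dictionary representation of the databases and the return values are hypothetical illustrations; the patent specifies only the flowchart logic:

```python
# Hypothetical sketch of the step 701-707 dispatch: the input data is
# matched against the workflow, entity and case databases in turn; if
# none applies, an alert is raised (step 707).
def fetch_structure(update_kind, databases):
    dispatch = {
        "workflow": 100,  # step 702: workflow structures in database 100
        "entity": 110,    # step 704: entity structures in database 110
        "case": 120,      # step 706: case structures in database 120
    }
    db_id = dispatch.get(update_kind)
    if db_id is None:
        return ("alert", 707)       # step 707: trigger an alert
    return ("fetched", databases[db_id])

databases = {100: "workflow-db", 110: "entity-db", 120: "case-db"}
```

For example, `fetch_structure("entity", databases)` resolves to database 110, while an unrecognised update kind falls through to the alert of step 707.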
If a workflow, entity or case data structure is fetched according to steps 702, 704 or 706 respectively, then at step 708 the workflow engine 60, specifically the layer 70 or sub-layer 80, processes the input data.
The updating of the relevant database 100, 110 or 120 at step 712, further to the processing of step 708, is dependent upon checking compliance of the data update against at least one, but preferably a plurality of, risk factors embodied as pre-defined rules and/or processing thresholds stored in database 140.
Thus, at step 709 at least one or the first of a plurality of risk factors is selected by the workflow engine 60 and the output of step 708 is compared against same for compliance at step 710. If the compliance check of step 710 is positive, then the workflow engine 60 next checks at step 711 whether another risk factor needs to be considered and, in the affirmative, control returns to step 709, the data processing flow looping through steps 709 to 711 until such time as all risk factors associated with the workflow, entity or case data updates have been considered.
If the compliance check of step 710 is negative however, signifying that the case, entity or workflow data update has failed a risk threshold and requires either further user input or an automatic update according to a predefined rule, an alert is again triggered at step 707.
When the last risk factor has been considered and the data update compliance checked, then the question of step 711 is answered negatively and the relevant database 100, 110 or 120 may then be updated at step 712.
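The compliance loop of steps 709 to 712 can be sketched as below, under the assumption, made for illustration only, that each risk factor is a predicate over the processed update:

```python
# Sketch of the compliance loop of steps 709-712: every risk factor is
# checked in turn; a single failure triggers the alert of step 707,
# otherwise the relevant database is updated at step 712.
def check_and_update(update, risk_factors, database):
    for factor in risk_factors:       # steps 709/711: iterate all factors
        if not factor(update):        # step 710: compliance check
            return "alert-707"        # failed a risk threshold
    database.append(update)           # step 712: update the database
    return "updated-712"

db = []
under_limit = lambda u: u["amount"] <= 10_000   # hypothetical threshold
positive = lambda u: u["amount"] > 0            # hypothetical rule
```

A compliant update such as `{"amount": 5000}` passes both predicates and is written to the database; an update breaching the limit returns the alert path without any write.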
The case, entity or workflow data structure relationship updating step 612 is further detailed in Figure 8.
After updating the relevant data structure itself, and with reference to the inter-relationships upon which the workflow engine 60 relies to achieve the main benefit of the present invention, the workflow engine preferably propagates the effect of the data update across any data structures having an existing relationship with the data structure last updated at step 611.
Accordingly, at step 801 the related data structures are aggregated by the workflow engine 60 from the relevant database 100 with a corresponding call to the integrating module 160. At the next step 802, the relationship map linking the related data structures is parsed by the workflow engine 60 and a question is asked at step 803, as to whether the data update of step 611 corresponds to a new case, entity or workflow relationship defined at step 609.
In the affirmative, the relationship map is updated at step 804 with particulars of the new relationship, then the data update is propagated at step 805 to all related data structures according to the map. If the question of step 803 is answered negatively however, then there is no requirement to update the relationship map and control proceeds directly to the propagating of step 805.
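Steps 801 to 805 can be sketched as below. The dictionary-based relationship map and the function signature are hypothetical simplifications of the mechanism the patent describes:

```python
# Sketch of steps 801-805: related structures are aggregated via a
# relationship map; a new relationship extends the map (step 804) before
# the update is propagated to every related structure (step 805).
def propagate(update, source, relationship_map, structures, new_link=None):
    if new_link is not None:                    # step 803 answered yes
        relationship_map.setdefault(source, []).append(new_link)  # step 804
    for related in relationship_map.get(source, []):              # step 805
        structures[related].append(update)
    return structures

structures = {"case-1": [], "entity-A": [], "workflow-X": []}
rel_map = {"case-1": ["entity-A"]}
propagate("balance-change", "case-1", rel_map, structures,
          new_link="workflow-X")
```

After the call, both the pre-existing related structure and the newly linked one carry the propagated update, mirroring the two branches out of step 803.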
The workflow manager 60, when recursively processing steps 604 to 615, and by extension recursively processing steps 701 to 712 and 801 to 805 for every case, workflow and entity data element, monitors a customer's financial position by tracking the following key variables in real time:
- Current Balances, Exposures, Arrears and Recoveries
- Credit Applications or Amendments to existing credit agreements
- Collateral Valuations (based on internal and external data feeds)
- Regulator and Compliance Targets at a Portfolio Level
- Internal Portfolio Targets
- Solicitor Exposure

The workflow manager 60 effectively tracks all of the above inputs simultaneously and determines a customer's overall health, expressed in a data structure conforming to pre-defined processing parameters and thresholds. In particular, the system triggers alerts 707 if any thresholds have been breached. By extension, this granular level of control also allows the financial institution to track the overall portfolio position and ensure ongoing regulator compliance.
A functional illustration of a graphical user interface for the above system is shown in Figure 9, to facilitate user interaction with the complex set of relationships detailed hereinbefore. The graphical user interface 180 is preferably embodied as a set of instructions configuring a browser program 901, to ensure inter-operability and platform-agnosticism.
The instructions preferably comprise a security API 902 to ensure that the data exchanged over the network between the terminal processing the instructions and remote terminals, which is sensitive and confidential by nature, cannot be easily intercepted. The instructions next comprise a database API, which essentially corresponds functionally to the integrating data processing module 160 previously described.
User-interactive features of the user interface 180 include a plurality of fast-access modules, briefly comprising a data entry module 904, a calculator module 905, a task list module 906, a reporting module 907 and a dashboard module 908. Upon activation, the data entry module 904 allows the user to input and update case, workflow and entity data structures at step 604, which will be propagated as previously described.
The calculator module 905 allows the user to perform conventional arithmetic calculus, however it may usefully be configured to link operations with the case, entity or workflow currently selected for input or consultation, and integrate constraints and thresholds 240, 250, 260 in order to generate alerts 707.
The task list module 906 allows the user to consult workflows associated with cases, in aggregate or granular manner, for instance to manage overall caseload.
The reporting module 907 allows the user to query any or all of databases 100 to 140 for data stored therein according to reporting parameters, from a per-data-structure basis, querying the data at its most granular level, to a total database contents basis, querying the data at the most holistic level permissible.
The dashboard module 908 is preferably user-configurable on a discrete basis, and allows each distinct user of the system to select cases, workflows, entities, tasks and reports of current and/or periodical interest, in a summary and synthesised manner.
With this system, as well as allowing the user to complete a full data capture from start to finish as the relevant data structures 200 to 260, each stage of the process can be subjected to a unique set of rules to calculate and score an individual based on the data capture, whereby the determination of whether they are eligible for the loan may be made on request. This calculation is based on the predefined intelligent web of relationships between the predefined entities.
For example, in order to calculate whether a particular deal should be approved, the decision is based on the mapping of relationships between the main entities for that business, including security, financials, process, exposure, facilities, history, arrears and credit score.
A useful part of this system is also the ability to bulk-update valuations 217 based on a set of parameters. For example, an update could be issued to devalue all agricultural securities in a geographical location by a set percentage, based on an external factor 230 to 235. This automated update can then interrupt the ongoing workflow layers 70n and/or sub-layers 80n, to alert 707 the user that a particular threshold 240, 250 is no longer being met, and that action should be taken to decline the application.
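The bulk-update just described can be sketched as below. The record fields, function name and numeric values are hypothetical, chosen only to mirror the agricultural-securities example in the text:

```python
# Hypothetical sketch of the bulk update: devalue all securities of a
# given kind in a given location by a set percentage, and collect an
# alert (707) for any valuation that falls below its threshold.
def bulk_devalue(securities, kind, location, pct, threshold):
    alerts = []
    for s in securities:
        if s["kind"] == kind and s["location"] == location:
            s["valuation"] *= (1 - pct)
            if s["valuation"] < threshold:
                alerts.append(s["id"])   # alert 707: threshold breached
    return alerts

portfolio = [
    {"id": 1, "kind": "agricultural", "location": "west", "valuation": 100_000},
    {"id": 2, "kind": "agricultural", "location": "east", "valuation": 100_000},
]
alerts = bulk_devalue(portfolio, "agricultural", "west", 0.30, 80_000)
```

Only the security matching both parameters is devalued; because its new valuation falls below the threshold, it appears in the alert list while the unaffected security is untouched.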
The embodiments in the invention described with reference to the drawings comprise a computer apparatus and/or processes performed in a computer apparatus. However, the invention also extends to computer programs, particularly computer programs stored on or in a carrier adapted to bring the invention into practice. The program may be in the form of source code, object code, or a code intermediate source and object code, such as in partially compiled form or in any other form suitable for use in the implementation of the method according to the invention.
The carrier may comprise a storage medium such as ROM, e.g. CD ROM, or a magnetic recording medium, e.g. a floppy disk or hard disk. The carrier may be an electrical or optical signal which may be transmitted via an electrical or optical cable or by radio or other means.
In the specification the terms "comprise, comprises, comprised and comprising" or any variation thereof and the terms "include, includes, included and including" or any variation thereof are considered to be totally interchangeable and they should all be afforded the widest possible interpretation and vice versa. The invention is not limited to the embodiments hereinbefore described but may be varied in construction and detail.
Claims (5)
1.A data processing system comprising: a plurality of data processing terminals operated by respective users, said data processing terminals connected to a network; means for grouping a plurality of data elements into a single case data structure for processing; means for classifying and division of said data elements within the case structure into one or more workflows; and means for managing the status of activities of the workflows within the case structure such that individual data elements are processed simultaneously in parallel by a workflow engine, said workflow engine comprising a plurality of data processing layers adapted to be processed in parallel in any one or all of the data processing terminals, each layer comprising a layer start point, which may be any of a case, entity or workflow data structure, as well as a layer end point, which may be any of a case, entity or workflow data structure, wherein the end point is representative of a desired data processing outcome according to a combination of pre-defined criteria.
2.The data processing system as claimed in claim 1 wherein each data processing layer stores a workflow data structure, each workflow data structure comprising data defining any of a workflow layer, a workflow start point, a workflow end point, a workflow layer stage, a data processing sub-layer, a sequential sub-layer stage and a parallel sub-layer stage.
3.The data processing system as claimed in claim 2 wherein each case data structure comprises a register of workflow data structures and entity data structures combined into discrete, project-specific, temporary or permanent relationships, each of which is representative of an entire workflow layer including any sub—layer.
4.The data processing system as claimed in any preceding claim comprising means for selecting at least one or a plurality of risk factors by the workflow engine, the output being compared against same for compliance; if the compliance check is positive, then the workflow engine is adapted to check whether another risk factor needs to be considered, such that the data processing flow loops until such time as all risk factors associated with the workflow, entity or case data updates have been processed.
5.A data processing system as substantially hereinbefore described with reference to the accompanying description and/or drawings.
Publications (2)
Publication Number | Publication Date |
---|---|
IES85685Y1 true IES85685Y1 (en) | 2011-01-19 |
IE20100417U1 IE20100417U1 (en) | 2011-01-19 |