US20220359080A1 - Multi-model member outreach system - Google Patents
- Publication number
- US20220359080A1 (U.S. application Ser. No. 17/861,672)
- Authority
- US
- United States
- Prior art keywords
- outreach
- ticket
- members
- predictions
- prediction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06N20/20 — Machine learning; ensemble learning
- G06N3/02, G06N3/08 — Neural networks; learning methods
- G06N3/044 — Recurrent networks, e.g. Hopfield networks
- G06N3/045 — Combinations of networks
- G06N5/04 — Knowledge-based models; inference or reasoning models
- G06Q10/10 — Administration; office automation; time management
- G06Q30/0203 — Market surveys; market polls
- G06Q30/0204 — Market segmentation
- G06Q30/0205 — Location or geographical consideration
- G06Q30/0206 — Price or cost determination based on market factors
- G06Q40/08 — Insurance
- G16H10/60 — ICT for patient-specific data, e.g. electronic patient records
- G16H50/20 — ICT for computer-aided diagnosis
- G16H50/30 — ICT for calculating health indices; individual health risk assessment
Definitions
- a health plan can be a health insurance plan that fully, or partially, covers the costs of medical services for members of the health insurance plan.
- an administrator can administer a health plan on behalf of a sponsor, such as an employer that offers a health plan to its employees and/or their families.
- the administrator may be a third party that administers the health plan for the employer, and members may be the employees, and/or family members of the employee, that are covered by the health plan.
- member outreach services may help members find healthcare services, help members make healthcare appointments, help members find transportation for healthcare appointments, educate members about medical conditions, educate members about prescriptions or other medical treatments, and/or otherwise assist members in overcoming hurdles in obtaining healthcare services.
- FIG. 1 depicts an example of a member outreach system.
- FIG. 2 depicts an example of data that can be stored in a healthcare data repository.
- FIG. 3 depicts an example of a prediction engine.
- FIG. 4 depicts a non-limiting example of a prediction engine in which an ensemble of base predictive models includes instances of four types of base predictive models.
- FIG. 5 depicts an example of a curve that can be used by a prediction combination model to determine relative outreach priorities associated with base predictions and/or ticket feedback.
- FIG. 6 shows a flowchart of an example method for generating predictions and outreach tickets, and for adjusting weights in a prediction engine according to ticket feedback.
- FIG. 7 shows an example system architecture for a computing device associated with a member outreach system.
- the systems and methods described herein are associated with predictive models that are configured to generate predictions indicating when members of a health plan may benefit from member outreach services.
- the health plan can be an insurance plan, or other benefit plan, that provides one or more benefits to members of the health plan.
- a health plan can provide benefits that fully or partially cover the costs of healthcare services for members of the health plan.
- a health plan can be offered and/or managed by an administrator.
- the administrator may provide information about the health plan to members and potential members.
- the administrator may adjudicate healthcare claims associated with members of the health plan, when such healthcare claims are submitted by members, medical providers, or other entities.
- An administrator may also, or alternately, offer member outreach services to members. For example, representatives of an administrator can contact members to help the members find in-person healthcare services, help the members make healthcare appointments, help the members find transportation to and from healthcare appointments, educate members about their conditions, treatments, or medications, and/or otherwise assist members in overcoming hurdles in obtaining healthcare services.
- member outreach services may be particularly beneficial to “high cost” members and/or “rising risk” members of a health plan.
- High cost members can be members who are predicted to incur relatively high healthcare expenses over a period of time.
- Rising risk members can, in some examples, be members whose healthcare expenses are predicted to increase during a future time period, relative to previous time periods.
- Rising risk members may also, or alternatively, be members whose risks of being admitted or readmitted to a hospital or other healthcare facility are predicted to increase, whose utilization of healthcare services are predicted to increase, whose pharmaceutical usage is predicted to advance to more expensive medications and/or to medications with increased side effects, whose suffering levels are predicted to increase, whose mortality risks are predicted to increase, and/or whose risk in any other category is predicted to increase during a future time period.
- member outreach services may prompt early medical intervention or other proactive steps, for instance before such high or rising healthcare expenses are incurred or other detrimental effects are experienced. Accordingly, member outreach services may proactively improve the health of a member and/or reduce healthcare expenses ultimately incurred by the member.
- a typical conventional risk adjustment model may multiply an age of a member by a first value to generate a first cost estimate, a tobacco smoking status of the member by a second value to generate a second cost estimate, an indicator of the member's chronic disease history by a third value to generate a third cost estimate, and/or multiply other factors by other values to generate other cost estimates.
- Such conventional risk adjustment models then add the various cost estimates together to calculate a final estimate of the member's expected costs over the next year, or other time period.
- cost estimates generated by such conventional linear regression-based risk adjustment models have often had a poor correlation to actual healthcare costs that members ultimately incurred.
- many conventional risk adjustment models are less than 60% accurate at predicting which one of two given members will have higher costs over a period of time. Accordingly, such conventional methods of determining cost estimates may be insufficient to adequately identify high cost and/or rising risk members that may benefit the most from member outreach services.
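The conventional risk adjustment approach described above can be sketched as a simple sum of per-factor cost estimates. The coefficient values below are illustrative assumptions for demonstration only, not taken from any real risk adjustment model:

```python
# Sketch of a conventional linear risk adjustment model: each member factor
# is multiplied by a coefficient, and the per-factor estimates are summed.
# Coefficient values are hypothetical, chosen only to illustrate the shape
# of the calculation.
def linear_risk_estimate(age, smokes, chronic_conditions):
    """Sum per-factor cost estimates into a predicted annual cost."""
    cost = 0.0
    cost += age * 85.0                     # first cost estimate: age factor
    cost += 1500.0 if smokes else 0.0      # second: tobacco smoking status
    cost += chronic_conditions * 2400.0    # third: chronic disease history
    return cost

print(linear_risk_estimate(age=55, smokes=True, chronic_conditions=2))  # 10975.0
```

Because each factor contributes independently and linearly, such a model cannot capture interactions between factors, which is one reason its pairwise accuracy can fall below 60%.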
- member outreach services may not be able to assist a first member if, for example, the first member's predicted costs stem from a hospitalization that has already occurred.
- if a cost prediction for a second member indicates that the second member's healthcare costs are expected to rise by $10,000 over the next year, there may still be time for member outreach services to assist the second member before those costs are incurred, and/or to prevent those costs from being incurred.
- the second member may be a better candidate for member outreach services because the second member's situation is more "intervenable" by member outreach services than the first member's.
- the systems and methods described herein can prompt member outreach services to be offered to members based on predictions generated by an ensemble of different base predictive models that are weighted based on feedback from representatives.
- the ensemble of base predictive models can generate a set of base predictions for a member of a health plan.
- the base predictions can be combined into a final prediction, using weights associated with the various base predictive models.
- the predictions can, for example, indicate predicted costs and/or changes in costs of members over a future period of time.
- the predictions may also indicate predicted utilizations and/or predicted risk metrics associated with members. The predictions can thus indicate which members are high cost and/or rising risk members that should be contacted to offer member outreach services.
- feedback from representatives who offer member outreach services to members can be used to adjust the weights used to combine the base predictions into the final prediction. In some cases, this may cause some lower cost members to be prioritized for member outreach services over some higher cost members, if the representative feedback indicates that some lower cost members are more “intervenable” using member outreach services than some higher cost members.
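The weighted-ensemble scheme described above can be sketched as follows. The multiplicative feedback rule used here is an assumption for illustration; the source does not prescribe a specific weight-update rule, and the model names and numbers are hypothetical:

```python
# Sketch: combine base predictions into a final prediction using per-model
# weights, then adjust the weights from representative feedback.

def combine(base_predictions, weights):
    """Weighted average of base model predictions."""
    total_w = sum(weights)
    return sum(p * w for p, w in zip(base_predictions, weights)) / total_w

def update_weights(weights, feedback):
    """Scale each model's weight by a feedback factor (assumed convention:
    >1 when that model's tickets proved 'intervenable', <1 otherwise),
    then renormalize so the weights sum to 1."""
    scaled = [w * f for w, f in zip(weights, feedback)]
    norm = sum(scaled)
    return [w / norm for w in scaled]

# Four hypothetical base models (e.g., linear regression, random forest,
# deep learning, decision tree), initially weighted equally.
preds = [12000.0, 9000.0, 15000.0, 10000.0]
weights = [0.25, 0.25, 0.25, 0.25]
print(combine(preds, weights))  # 11500.0

# Representatives found tickets driven by the third model most actionable,
# so its weight grows relative to the others.
weights = update_weights(weights, [0.9, 0.8, 1.4, 0.9])
print(combine(preds, weights))  # 12000.0
```

Because feedback shifts weight toward models whose predictions led to actionable outreach, a member flagged strongly by an "intervenable"-correlated model can outrank a nominally higher-cost member.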
- the predictions made by the systems and methods described herein can be more accurate than conventional predictions made using linear regression methods alone.
- test data has shown improved performance of an example of the ensemble-based model described herein, relative to previous risk models that seek to predict who will become a high-risk claimant or who is a rising risk member.
- the test data indicated that the ensemble-based model described herein performed better than other risk models with respect to metrics including a percentage of cost variation explained by the model, a mean absolute error from perfect accuracy, and an area under the curve (C-statistic) indicating a probability of correctly predicting which of two people are higher risk.
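The C-statistic mentioned above is the probability that, for a random pair of members with different actual outcomes, the model assigns the higher predicted risk to the member who actually turned out higher risk. A minimal pairwise computation (with illustrative data, not the patent's test results) might look like:

```python
# Sketch: compute the C-statistic (concordance index) by checking every
# informative pair of members. Ties in the predictions count as half credit.
from itertools import combinations

def c_statistic(predicted, actual):
    concordant, ties, pairs = 0, 0, 0
    for i, j in combinations(range(len(actual)), 2):
        if actual[i] == actual[j]:
            continue  # pairs with equal actual outcomes are not informative
        pairs += 1
        hi = i if actual[i] > actual[j] else j  # member with higher actual cost
        lo = j if hi == i else i
        if predicted[hi] > predicted[lo]:
            concordant += 1
        elif predicted[hi] == predicted[lo]:
            ties += 1
    return (concordant + 0.5 * ties) / pairs

# Hypothetical risk scores and actual annual costs for four members.
print(c_statistic([0.9, 0.4, 0.7, 0.2], [10000, 3000, 8000, 1000]))  # 1.0
```

A C-statistic of 0.5 is no better than chance, so the sub-60% pairwise accuracy attributed to conventional models above corresponds to a C-statistic below 0.6.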
- the ensemble-based model described herein can improve identification of members who may benefit the most from member outreach services that assist the members with finding care, arranging care, and that provide other medical and social support and education to members. Accordingly, time and computing resources associated with member outreach services can be more efficiently directed to those members who may benefit the most from those member outreach services. Additionally, attributes that lead the ensemble-based model to identify a member as a high cost or rising risk member can be identified to a representative in an outreach ticket, such that the identification of such attributes in an outreach ticket can improve efficiency of a member outreach workflow and/or reduce computing resources used by the representative.
- a representative can use information provided directly in an outreach ticket to understand why a member may benefit from member outreach services.
- FIG. 1 depicts an example of a member outreach system 100 associated with an administrator of a health plan.
- the administrator can administer the health plan on behalf of a sponsor.
- the sponsor can be an employer that offers a health plan to its employees and/or family members of the employees, and the administrator can be a third-party entity that manages the health plan for the employer.
- the sponsor can be an insurance company or any other entity that offers a health plan, but uses a third-party administrator to manage administration of the health plan.
- the administrator can itself be an insurance company or other entity that directly offers a health plan.
- the member outreach system 100 can be associated with member outreach services that can be provided by the administrator to members 102 of the health plan.
- Member outreach services may, for example, help members 102 find in-person healthcare services, help members 102 make healthcare appointments, help members 102 find transportation to and from healthcare appointments, educate members 102 about their medical conditions or medications, and/or otherwise assist members 102 in overcoming hurdles in obtaining healthcare services.
- the member outreach system 100 can predict when members 102 may benefit from member outreach services. In some examples, when the member outreach system 100 predicts that a member 102 may benefit from member outreach services, the member outreach system 100 can prompt a representative 104 to attempt to contact the member 102 to offer the member outreach services.
- a representative 104 can be a nurse, physician, pharmacist, social worker, member outreach agent, care coordinator, or any other type of representative of the administrator or an associated entity.
- the member outreach system 100 can automatically send notifications or other information directly to the member 102 , as discussed further below.
- predictions made by the member outreach system can be used to refer members 102 to partner providers or other entities. Accordingly, such partner providers may be able to provide healthcare services directly to members 102 in addition to, or instead of, member outreach services provided by representatives 104 .
- the member outreach system 100 can include one or more computing devices 106 , such as servers, workstations, cloud computing elements, and/or other computing devices.
- the computing devices 106 can have, and/or access, at least one healthcare data repository 108 .
- the computing devices 106 can also have a prediction engine 110 and a ticket manager 112 .
- the ticket manager 112 may be part of the prediction engine 110 .
- the prediction engine 110 and the ticket manager 112 can be separate components that execute on the same or different computing devices 106 .
- One or more healthcare data repositories 108 may also be stored on the same or different computing devices 106 than the prediction engine 110 and/or the ticket manager 112 .
- representatives 104 can use computing devices 106 to access the ticket manager 112 and/or other elements of the member outreach system 100 .
- a representative 104 can use a computer, workstation, mobile device, or other type of computing device 106 to locally execute an application that interfaces with the ticket manager 112 and/or other elements of the member outreach system 100 executing on a server or other remote computing device 106 .
- a representative 104 can use a computer, workstation, mobile device, or other type of computing device 106 to access the ticket manager 112 and/or other elements of the member outreach system 100 executing on one or more different computing devices 106 via a web browser or other user interface.
- a representative 104 may directly use the same one or more computing devices 106 that store and/or execute the healthcare data repository 108 , prediction engine 110 , and/or ticket manager 112 .
- FIG. 2 depicts an example of data that can be stored in a healthcare data repository 108 .
- a healthcare data repository 108 can be a database or other type of data repository that stores data associated with members 102 of a health plan, data about healthcare claims associated with members 102 of the health plan, data about healthcare providers, and/or other data.
- a healthcare data repository 108 can use HIPAA-compliant security measures, and/or other security measures, to protect the privacy of stored information.
- a healthcare data repository 108 can store enrollment data 202 , claim data 204 , engagement data 206 , demographic data 208 , provider data 210 , and/or other types of data.
- Enrollment data 202 can include information about members 102 that has been provided by members, sponsors, partner networks, and/or other entities.
- enrollment data 202 can include a member's name, age, gender, ZIP code, other contact information, and/or other types of information about the member 102 .
- information in the enrollment data 202 may have been provided by a member 102 when the member 102 filled out enrollment paperwork online or on paper forms when the member 102 signed up for the health plan.
- enrollment data 202 can be provided by a sponsor of the health plan, such as an employer that enrolls employees in a health plan and provides employee information as enrollment data 202 .
- enrollment data 202 can be provided by, and/or updated by, members 102 through a health plan account website, mobile application, and/or other mechanism.
- Claim data 204 can include information about insurance claims or other claims that have been submitted in association with members 102 of one or more health plans. Claims may be submitted by members 102 , healthcare providers, and/or other sources. Claims may be associated with healthcare services, such as medical services, pharmaceutical services, dental services, vision care services, and/or other types of services. For example, when a member 102 obtains healthcare services from a healthcare provider, the member 102 and/or the healthcare provider may submit a claim to have some or all of the cost of the healthcare services be covered by a health plan. In some examples, claims may be submitted directly to the administrator. In other examples, claims may be submitted to insurance companies or other entities, who then provide claim data 204 to the administrator. In some examples, claim data 204 can include information from partner networks, partner providers, other administrators, and/or other entities about claims submitted against other health plans.
- claim data 204 can include names, membership numbers, contact information, and/or other information about members 102 associated with submitted claims.
- Claim data 204 can include names, identifiers, contact information, and/or other information about healthcare providers associated with submitted claims.
- claim data 204 can include a national provider identification number of a healthcare provider associated with a claim.
- information in claim data 204 about healthcare providers can also correspond to provider data 210 stored in the healthcare data repository 108 .
- claim data 204 can include diagnosis information associated with submitted claims.
- diagnosis information may be included in a submitted claim using a diagnosis code, such as diagnosis codes defined in the 10th revision of the International Classification of Diseases, Clinical Modification (ICD-10-CM), or other types of diagnosis codes.
- ICD-10-CM codes or other diagnosis codes can be categorized into diagnosis categories within claim data 204 , such as Clinical Classification Software (CCS) codes used by the Agency for HealthCare Quality and Research.
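The categorization of diagnosis codes into broader categories can be sketched as a prefix lookup. The mapping below is a tiny hypothetical excerpt for illustration, not the actual CCS table:

```python
# Sketch: group ICD-10-CM diagnosis codes into broader clinical categories,
# analogous to the CCS grouping described above. The lookup table is a
# hypothetical excerpt; real CCS/CCSR mappings cover tens of thousands of codes.
CATEGORY_LOOKUP = {
    "E11": "Diabetes mellitus with complications",
    "I10": "Essential hypertension",
    "J45": "Asthma",
}

def categorize(icd10_code):
    """Map a full ICD-10-CM code to a category via its 3-character prefix."""
    return CATEGORY_LOOKUP.get(icd10_code[:3], "Uncategorized")

print(categorize("E11.9"))  # Diabetes mellitus with complications
```

Grouping codes this way reduces tens of thousands of distinct diagnosis codes to a few hundred categories, which makes them far more usable as predictive-model features.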
- claim data 204 may also include information from pre-authorization requests that have been submitted prior to planned healthcare procedures. For instance, a pre-authorization request for a radiology procedure may be submitted by a provider to check whether a member's health plan covers that radiology procedure. In this example, information in or derived from the pre-authorization request can be added to the claim data 204 in the healthcare repository to indicate that a healthcare procedure may be planned for a member 102 , even if a final claim for that procedure has not yet been received.
- claim data 204 may be received from partners of the administrator that offer services that may not be directly covered by a health plan.
- an employer may be a sponsor that offers employees a health plan that does not directly cover fertility services.
- the administrator may partner with certain fertility service providers, and recommend such fertility service providers to members even if they are not directly covered by the members' health plan.
- the sponsor employer may offer fertility services through a partner provider as a separate benefit apart from a health plan.
- partner providers may provide the administrator with some types of data about whether members 102 have used the partner provider's services, and such data can be stored as claim data 204 in the healthcare data repository 108 even if no official claim has been submitted in association with those services.
- Engagement data 206 may include information about how members 102 have used websites, mobile applications, and/or other resources provided by the administrator.
- the administrator may provide websites or mobile applications that allow members 102 to search for providers, contact providers, access a member portal to update enrollment data 202 , and/or perform other actions associated with a health plan.
- the engagement data 206 may indicate whether members 102 have registered for an account to access a website and/or mobile application, what searches those members 102 have performed in the website and/or mobile application, whether members 102 clicked through on search results to view provider information and/or contact providers, whether members 102 have taken action on care recommendations provided to them through the website and/or mobile application, and/or other types of usage data or engagement data.
- the engagement data 206 may also indicate locations of members 102 .
- the mobile application may report location information about the smartphone, such as GPS coordinates.
- the mobile application can use that location information to display information about healthcare providers who are near the member's location, such as information about nearby healthcare providers that is drawn from the provider data 210 or other data sources.
- Location information associated with a mobile application and/or smartphone can also be stored in engagement data 206 in association with a member 102 .
- Demographic data 208 can include information, such as publicly available information, about people and/or environmental conditions in different geographical locations.
- demographic data 208 can indicate, for a given ZIP code, education levels of people in the ZIP code, air pollution rates in the ZIP code, cancer rates in the ZIP code, a number of fast food restaurants in the ZIP code, and/or other types of information.
- Provider data 210 can include information about healthcare providers.
- the provider data 210 can include names of healthcare providers, names of doctors or other healthcare workers who are associated with the healthcare providers, locations of the healthcare providers, contact information for the healthcare providers, information about specialties of the healthcare providers, national provider identification numbers of the healthcare providers, and/or other information about healthcare providers.
- the healthcare repository 108 can be accessible or indexed at a member level.
- data in the healthcare repository 108 can be correlated, filtered, indexed, or searched at a member level.
- information about the member 102 can be obtained from the enrollment data 202
- information from claims associated with the member 102 can be obtained from the claim data 204
- information about how the member 102 has used a website and/or mobile application can be obtained from the engagement data 206
- information about demographics and/or environmental conditions in the member's ZIP code can be obtained from the demographic data 208 .
- the healthcare repository 108 may also, or alternately, be accessible or indexed at a provider level.
- data in the healthcare repository 108 can be correlated, filtered, indexed, or searched at a provider level, for instance in association with one or more healthcare providers identified in provider data 210 and/or in claim data 204 .
- claim data 204 for claims associated with certain providers may be used to identify trends associated with individual providers, trends within one set of providers relative to other providers, geographical provider trends, and/or other provider-level data.
- claim data 204 may reveal that one provider generally recommends physical therapy before scheduling a type of surgery, while another provider schedules that type of surgery without first attempting physical therapy.
- claim data 204 may reveal local trends, for instance that providers in one locality are more likely to recommend a certain type of procedure as treatment for a particular medical condition than similar providers in a different locality.
- provider-level data can be used in the prediction engine 110 to generate predictions 114 of costs and/or visits associated with the providers.
- provider-level data can also be used when representatives 104 provide member outreach services to members 102 . For example, if provider-level data indicates that a first provider often charges much more for a type of procedure than a second provider, representatives 104 may recommend or suggest to members 102 that the members 102 visit the second provider.
- the healthcare repository 108 can be accessible or indexed at both the member level and the provider level.
- enrollment data 202 and/or claim data 204 may identify demographic information about a member 102 , such as the member's age, sex, and ZIP code. Such member-specific demographic information can be referenced against claim data 204 associated with healthcare providers members 102 have visited, to determine practice patterns of the healthcare providers. Such member-level data and provider-level data may accordingly be used together in the prediction engine 110 , as discussed further below.
- the prediction engine 110 can use data from the one or more healthcare data repositories 108 to generate predictions 114 associated with the members 102 . Examples of how the prediction engine 110 can be configured to generate predictions 114 are discussed in more detail below with respect to FIG. 3 and FIG. 4 .
- a prediction 114 generated by the prediction engine 110 for a member 102 can indicate one or more cost estimates associated with the member 102 , such as a total estimated healthcare cost, and/or a change in healthcare costs relative to previous healthcare costs, associated with the member 102 over the next year or other future time period.
- a prediction 114 may include an indication that a member's healthcare costs are predicted to rise, or decrease, by a predicted amount over the next year.
- a prediction 114 may also, or alternately, indicate one or more estimated utilizations associated with the health plan.
- a prediction 114 can include an estimated number of visits to healthcare providers in total, and/or an estimated number of visits to providers in one or more provider categories, over the next year or other future time period.
- a prediction 114 may indicate that, based on data in one or more healthcare data repositories 108 , a member 102 is predicted to incur $15,000 more in healthcare costs over the next year than the previous year, and is predicted to visit healthcare providers ten times over the next year, including three visits to emergency rooms, four visits to a primary care physician, two visits to a cardiologist, and one visit to a chiropractor.
- a prediction 114 may also, or alternately, indicate one or more estimated risk metrics associated with a member 102 .
- a prediction 114 can indicate predicted risk metric values such as a predicted risk of hospitalization, a predicted risk of readmission to a hospital, a predicted mortality rate, predicted suffering or pain levels, a predicted risk of pharmaceutical advancement to more expensive medications and/or to medications with increased levels of side effects, and/or other types of predicted risk metrics.
- member outreach services can be offered to members 102 based on predicted risk metrics of the members 102 , and/or comparisons between predicted risk metrics for different members 102 .
- predictions 114 may indicate that a pain level of a first member 102 is expected to be higher than a pain level of a second member 102 , based on claim data 204 indicating that the first member 102 has been diagnosed with a disease that is generally more painful than a disease with which the second member 102 has been diagnosed.
- the first member 102 may in some cases be prioritized for member outreach services over the second member 102 , as the member outreach services may help reduce the higher predicted pain levels of the first member 102 .
- a prediction 114 may indicate that a suffering level and/or mortality risk for a particular member 102 is expected to increase over the next two months. Accordingly, the member 102 can be prioritized for member outreach services that may help reduce, or lessen the increase in, the member's suffering level or mortality risk.
- the various types of information associated with a member 102 in the healthcare data repositories 108 may provide more complete information about a member 102 for the prediction engine 110 than could be provided by claim data 204 alone.
- engagement data 206 may indicate that a member 102 was using a mobile application to search for and contact podiatrists before claim data 204 indicates that foot care claims for the member 102 have been submitted. The engagement data 206 may therefore be used by the prediction engine 110 to predict that the member's healthcare costs may be expected to rise over the next month due to possible upcoming foot care services, even if claim data 204 does not yet indicate that foot care services have been provided to the member 102 .
- Provider-level data from the healthcare data repositories 108 can be used by the prediction engine 110 to make predictions 114 associated with members 102 .
- enrollment data 202 , engagement data 206 , and/or provider data 210 may indicate that a member 102 is located in a geographical area where provider-level data indicates that an expensive procedure is used more often to treat a certain condition than in other locations.
- past claim data 204 and/or provider data 210 may indicate that a specific provider, which engagement data 206 indicates a member 102 has searched for, routinely uses an expensive procedure more often than other nearby providers.
- the prediction engine 110 may predict that the member's costs may be expected to rise due to a relatively high chance of the expensive procedure being performed.
- machine learning and/or other artificial intelligence in the prediction engine 110 can analyze data in one or more healthcare data repositories 108 , at the member-level and/or the provider-level, to generate predictions 114 based on multiple factors and/or interactions between multiple factors.
- various factors from enrollment data 202 , claim data 204 , engagement data 206 , and/or demographic data 208 may indicate that people in relatively-low income ZIP codes who take insulin to treat diabetes have a spike in emergency room visits during the last week of each month, possibly because such people have relatively little money left from a first-of-the-month paycheck and do not have sufficient food to prevent insulin dosages from leading to low blood sugar.
- the prediction engine 110 can take such factors, and/or interactions between such factors, into account when generating predictions 114 of costs, health plan utilizations, risk metrics, and/or other values.
- the prediction engine 110 can include multiple base predictive models 304 that can take various factors from one or more healthcare data repositories 108 into account when generating base predictions 306 .
- the base predictions 306 from multiple base predictive models 304 can be combined into a final prediction 114 .
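The combination of base predictions into a final prediction can be sketched as a weighted average. A minimal illustration follows; the model names, weights, and dollar values are assumptions for illustration, not values from the disclosure:

```python
# Illustrative sketch: combine base predictions 306 from multiple base
# predictive models 304 into a final prediction 114 via a weighted average.
# Model names and weights below are hypothetical.

def combine_base_predictions(base_predictions, weights):
    """Weighted average of base-model cost estimates."""
    total_weight = sum(weights[name] for name in base_predictions)
    return sum(
        weights[name] * value for name, value in base_predictions.items()
    ) / total_weight

base_predictions = {
    "linear": 12000.0,        # regularized linear model estimate
    "gbm": 15000.0,           # gradient boosted machine estimate
    "random_forest": 14000.0, # random forest estimate
}
weights = {"linear": 0.2, "gbm": 0.5, "random_forest": 0.3}

final_prediction = combine_base_predictions(base_predictions, weights)
# final_prediction == 14100.0
```

In practice the weights themselves could be adjusted over time, for example based on ticket feedback as described later in this section.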
- the ticket manager 112 can be configured to generate outreach tickets 116 for representatives 104 .
- the outreach tickets 116 can be viewable and/or accessible via computing devices 106 used by the representatives 104 .
- the representatives 104 can at least attempt to contact members 102 identified by outreach tickets 116 to offer member outreach services to the members 102 .
- the representatives 104 can also use computing devices 106 to provide ticket feedback 118 to the ticket manager 112 about outreach tickets 116 and/or corresponding members 102 .
- the prediction engine 110 can use ticket feedback 118 provided by representatives 104 to adjust how the prediction engine 110 generates subsequent predictions 114 .
- the computing devices 106 may also, or alternately, provide automated notifications 120 to members 102 in response to generated predictions 114 .
- the ticket manager 112 can generate outreach tickets 116 based in part on cost estimates, utilization metrics, risk metrics, and/or other data in predictions 114 generated by the prediction engine 110 .
- a prediction 114 may indicate that a member's total costs are predicted to be over $50,000, or another threshold value, over the next year, and thus indicate that the member 102 is predicted to be a “high cost” member 102 .
- a prediction 114 may indicate that a member's healthcare costs are predicted to rise by over $1,000, or another threshold value, over the next month relative to the previous month, and thus indicate that the member 102 is predicted to be a “rising risk” member 102 .
- a member 102 may also, or alternately, be considered to be a “rising risk” member 102 if risk metrics in a prediction 114 indicate that the member 102 is predicted to utilize or visit healthcare providers in a future time period more often than the member 102 has in the past, that the member's risk of admission or readmission to a hospital or other healthcare facility is predicted to increase, that the member's suffering levels or mortality risks are expected to increase relative to previous values, that the member's pharmaceutical usage is predicted to advance to more expensive medications and/or to medications with increased side effects, and/or that the member's risk in any other category is predicted to increase during a future time period.
- the ticket manager 112 may generate an outreach ticket 116 associated with the member 102 due to the total costs, or one or more predicted rises in costs, utilizations, and/or risk metrics, exceeding a threshold amount.
- member outreach services may be able to proactively assist the member 102 to obtain healthcare services and/or otherwise manage the healthcare issues associated with the predicted costs, utilizations, or risk metrics.
- member outreach services may be able to assist the member 102 with proactively seeking medical care, before a condition worsens and more costly healthcare services may be needed to manage the condition.
- Member outreach services may therefore proactively improve the health of the member 102 , reduce the healthcare costs ultimately incurred by the member 102 , reduce a number of utilizations associated with the member 102 , and/or reduce risk metrics associated with the member 102 . Accordingly, in some examples, member outreach services may cause healthcare costs incurred by a member 102 over a period of time to be lower than cost estimates in a prediction 114 for that period of time.
- An outreach ticket 116 can identify a corresponding member 102 , for example by including the member's name, contact information, and/or other demographic information.
- the outreach ticket 116 may also indicate one or more reasons why the member 102 may benefit from member outreach services, based on information in a final prediction 114 and/or base predictions 306 .
- key variables from the healthcare data repository 108 that most influenced the final prediction 114 and/or base predictions 306 can be identified and flagged within an outreach ticket 116 .
- a Local Interpretable Model-Agnostic Explanations (LIME) method can be used to identify the variables that most influenced a prediction 114 or its component base predictions 306 .
- a LIME analysis may indicate that a new cancer diagnosis or a change in prescribed medications was one of the primary reasons the prediction engine 110 generated a prediction 114 indicating that a member 102 is expected to have a rise in costs, a rise in utilizations, and/or increased risk metrics over an upcoming period of time, while a recent flu shot is not a key variable associated with the predicted rise in costs, utilizations, or risk metrics.
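The intuition behind such an analysis can be illustrated with a simplified perturbation-based stand-in for LIME: rank the input variables by how much the prediction changes when each one is removed. The toy prediction function, coefficients, and feature names below are hypothetical:

```python
# Simplified stand-in for a LIME-style analysis. This is not the LIME
# library itself; it only illustrates the idea of ranking variables by
# their influence on a prediction. All names and values are hypothetical.

def predict_cost_rise(features):
    # Toy scoring: a new cancer diagnosis and a medication change drive
    # the predicted cost rise; a flu shot contributes almost nothing.
    return (
        20000 * features.get("new_cancer_diagnosis", 0)
        + 5000 * features.get("medication_change", 0)
        + 50 * features.get("flu_shot", 0)
    )

def key_variables(features, top_n=2):
    """Rank features by the change in prediction when each is zeroed out."""
    baseline = predict_cost_rise(features)
    influence = {}
    for name in features:
        perturbed = dict(features, **{name: 0})
        influence[name] = abs(baseline - predict_cost_rise(perturbed))
    return sorted(influence, key=influence.get, reverse=True)[:top_n]

member = {"new_cancer_diagnosis": 1, "medication_change": 1, "flu_shot": 1}
top = key_variables(member)
# top == ['new_cancer_diagnosis', 'medication_change']
```

The real LIME method fits a local interpretable surrogate model around the prediction rather than simply zeroing features, but the output is the same in spirit: a ranked list of the variables that most influenced the prediction.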
- the ticket manager 112 can generate an outreach ticket 116 that includes indicators of the identified key variables.
- the ticket manager 112 can generate an outreach ticket 116 by filling in blanks of a template with language associated with the identified key variables, such as the new cancer diagnosis or the change in prescribed medications, to indicate to a representative 104 why the corresponding member 102 may benefit from member outreach services.
- the prediction engine 110 may generate a prediction indicating that the member's costs are predicted to rise due to increased visits to a diabetes specialist to manage conditions resulting from missed medications.
- the ticket manager 112 may generate an outreach ticket 116 indicating to a representative 104 that the member 102 may need assistance with insulin medications or other issues associated with diabetes.
- the ticket manager 112 may use data in a prediction 114 , such as cost predictions, predicted total utilizations across all types of providers, predicted utilizations of one or more specific providers or types of providers, and/or predicted risk metrics to select corresponding words, phrases, or values for variables in templates for outreach tickets 116 .
- the ticket manager 112 may have one or more templates for outreach tickets 116 .
- a template may or may not have some predefined language, and may include blank spaces where words or phrases can be inserted, variable fields where values can be entered or selected, or other portions that can be otherwise changed based on a prediction 114 .
- the ticket manager 112 may select words, phrases, or other values for corresponding elements of a template based on key variables of predictions 114 generated by the prediction engine 110 .
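The template-filling step described above can be sketched as follows; the template wording, reason phrases, and member details are hypothetical placeholders, not language from the disclosure:

```python
# Illustrative sketch of filling an outreach-ticket template with language
# tied to the key variables of a prediction 114. All wording is hypothetical.

TEMPLATE = (
    "Member {name} may benefit from outreach: costs are predicted to rise "
    "by ${cost_rise:,.0f} over the next {period}, primarily due to {reason}."
)

# Mapping from key variables to template language.
REASON_PHRASES = {
    "new_cancer_diagnosis": "a new cancer diagnosis",
    "medication_change": "a change in prescribed medications",
}

def generate_ticket(name, cost_rise, period, key_variable):
    return TEMPLATE.format(
        name=name,
        cost_rise=cost_rise,
        period=period,
        reason=REASON_PHRASES.get(key_variable, "recent claim activity"),
    )

ticket = generate_ticket("J. Doe", 15000, "year", "medication_change")
```

A representative reading the generated text would see both the predicted cost rise and the key variable behind it, which is the interpretability property the ticket feedback discussed below is meant to score.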
- a generated outreach ticket 116 may assist a representative 104 in understanding what types of member outreach services should be offered to a member 102 . For example, if key variables from predictions 114 indicated that a first member 102 has been diagnosed with a foot problem and also has been diagnosed with diabetes, an outreach ticket 116 may identify those diagnoses. A representative 104 may accordingly understand from the outreach ticket 116 that the first member 102 may need assistance with insulin medications or other diabetes-related issues. However, if a prediction 114 for a second member 102 indicates a diagnosis of a similar foot issue but also indicates that the second member 102 has been visiting a sports therapist, the member's foot issue may have occurred because the second member 102 is a runner.
- a representative 104 may understand from a corresponding outreach ticket 116 that the second member 102 may need a different type of patient outreach services than the first member 102 who has diabetes.
- ticket feedback 118 can be received from a representative 104 about how “interpretable” the information in the outreach ticket 116 was to the representative 104 , such as a score indicating whether the outreach ticket 116 expressed understandable and/or useful information to the representative 104 .
- the ticket manager 112 can provide the outreach tickets 116 to representatives 104 .
- the ticket manager 112 can add a generated outreach ticket 116 to a queue of outreach tickets 116 available to representatives 104 .
- a representative 104 may use a user interface of the ticket manager 112 , or other application executing on a computing device 106 , to view and/or select outreach tickets 116 from the queue of outreach tickets 116 .
- the ticket manager 112 can send a generated outreach ticket 116 to a representative 104 via an email, text message, or other type of electronic notification, and/or otherwise provide the outreach ticket 116 to a representative 104 .
- the outreach tickets 116 may be associated with priority levels, and the ticket manager 112 can arrange, sort, and/or filter outreach tickets 116 in the queue based on priority levels of the outreach tickets 116 . For instance, the ticket manager 112 can arrange a queue such that more urgent outreach tickets 116 are prioritized over, and/or placed closer to the front or to the top of the queue than, less urgent outreach tickets 116 . As will be discussed further below, a priority level of an outreach ticket 116 can be based at least in part on cost estimates, utilization estimates, risk metric estimates, or other data in an associated prediction 114 .
- a first outreach ticket 116 associated with a first prediction 114 indicating that a first member's costs are expected to rise by $15,000 may be a higher priority than a second outreach ticket 116 associated with a second prediction 114 indicating that a second member's costs are expected to rise by $10,000.
- a priority level of an outreach ticket 116 can also be based, at least in part, on ticket feedback 118 from representatives 104 .
- the ticket feedback 118 can indicate how “interpretable” previous similar outreach tickets 116 were to the representatives 104 , based on how well the representatives 104 subjectively felt that the previous outreach tickets 116 expressed information about why a member 102 was to be contacted.
- the ticket feedback 118 can also, or alternately, indicate how “intervenable” members 102 identified by previous similar outreach tickets 116 were, based on subjective input from representatives 104 about whether those members 102 were good candidates for member outreach services.
- the second outreach ticket 116 may be prioritized above the first outreach ticket 116 even though the first outreach ticket 116 is associated with a higher predicted rise in costs.
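The queue ordering just described, where feedback on similar past tickets can outweigh a larger predicted cost rise, can be sketched as a simple blended priority score. The normalization, cap, and weighting below are assumptions chosen only to illustrate the effect:

```python
# Illustrative priority ordering for the outreach-ticket queue. The blend
# of predicted cost rise and feedback score is a hypothetical choice.

def priority(ticket):
    # Normalize the predicted cost rise onto a 0-10 scale (capped), then
    # add a 0-10 combined feedback score for similar past tickets.
    cost_component = min(ticket["cost_rise"] / 2000, 10)
    return cost_component + ticket["feedback_score"]

queue = [
    # Higher predicted rise, but past feedback says such members were
    # poor candidates for intervention.
    {"id": "T1", "cost_rise": 15000, "feedback_score": 2},
    # Lower predicted rise, but past feedback says such members were
    # good candidates for intervention.
    {"id": "T2", "cost_rise": 10000, "feedback_score": 9},
]
queue.sort(key=priority, reverse=True)
# queue[0]["id"] == "T2": the lower-cost ticket is prioritized first.
```

Under this blend, T2 scores 5 + 9 = 14 while T1 scores 7.5 + 2 = 9.5, reproducing the situation described above in which the second ticket jumps ahead of the first despite its smaller predicted cost rise.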
- a representative 104 can at least attempt to contact the member 102 to offer and/or provide member outreach services to the member 102 .
- the representative 104 can place a phone call to a phone number of the member 102 , send a text message to the phone number of the member 102 , send an email to an email address of the member 102 , initiate a video chat with the member 102 , and/or otherwise at least attempt to contact the member 102 to provide member outreach services to the member 102 .
- the computing devices 106 can alternately, or additionally, attempt to contact a member 102 identified by a prediction 114 and/or an outreach ticket 116 via an automated notification 120 .
- the computing devices 106 can be configured to send an email to an email address of the member 102 , send a text message to a phone number of the member 102 , make a phone call to a phone number of the member 102 to play a prerecorded audio message for the member 102 via the phone call, display a flagged notification in a member portal accessible via a website or mobile application, display a notification on a smartphone via a mobile application, and/or attempt to contact the member 102 using any other automated communication mechanism.
- An automated notification 120 can, based on a prediction 114 , indicate to the member 102 why the member 102 is being contacted.
- an automated notification 120 can be an automated email that reminds a member 102 to take a medication and/or that provides medication instructions.
- a healthcare data repository 108 may indicate communication preferences of members 102 .
- communication preferences of a member 102 may indicate that the member 102 prefers phone calls over email communications.
- the computing devices 106 may follow the communication preferences of the member 102 by providing an outreach ticket 116 to a representative 104 , so that the representative 104 can place a phone call to the member 102 . If the communication preferences of the member 102 instead indicate that the member 102 prefers email communications over phone calls, the computing devices 106 may instead attempt to contact the member 102 using an emailed automated notification 120 instead of providing an outreach ticket 116 to a representative 104 .
- communication preferences of a member 102 may be based on enrollment data 202 , such as from preference information indicated by the member 102 on a health plan enrollment form.
- the computing devices 106 can infer or derive communication preferences of a member 102 based on engagement data 206 or other data about interactions with the member 102 over time. For instance, if a member 102 answers phone calls from administrator personnel more often than the member 102 responds to emails from the administrator, the computing devices 106 can infer that the member 102 prefers to communicate via phone calls and indicate that inferred preference in data about communication preferences of the member 102 .
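The inference described above can be reduced to a simple comparison of response rates across channels. A minimal sketch, with hypothetical counts and channel names:

```python
# Sketch of inferring a communication preference from engagement history,
# as described above. Counts and channel labels are hypothetical.

def infer_preference(calls_answered, emails_answered):
    """Prefer the channel the member has historically responded to more."""
    return "phone" if calls_answered > emails_answered else "email"

preference = infer_preference(calls_answered=8, emails_answered=3)
# preference == "phone"
```

A production system would likely weight recency and normalize by the number of contact attempts per channel, but the core inference is this comparison.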
- the computing devices 106 may override communication preferences of a member 102 . For example, if a prediction 114 and/or outreach ticket 116 is generated with a priority level above a certain threshold, for instance if the member 102 is predicted to incur a sudden rise in costs over a short period of time, the ticket manager 112 may provide an outreach ticket 116 to a representative 104 . The outreach ticket 116 may instruct the representative 104 to call the member 102 to address conditions associated with the predicted sudden rise in costs, even if communication preferences of the member 102 indicate that the member 102 prefers email communications.
- the representative 104 may at least attempt to contact the member 102 to provide member outreach services, as discussed above.
- the representative can also return ticket feedback 118 associated with the outreach ticket 116 to the computing devices 106 .
- the ticket feedback 118 can include subjective feedback from the representative 104 about an outreach ticket 116 .
- the ticket manager 112 can provide a survey, form, user interface, and/or other mechanisms that representatives 104 can use to provide ticket feedback 118 .
- ticket feedback 118 may indicate a score, based on subjective input from a representative 104 , of how “intervenable” a member 102 was via member outreach services. For example, if a representative 104 could not reach a member 102 to offer member outreach services because the member 102 had already been hospitalized for a medical condition, the representative 104 may provide ticket feedback 118 with a score indicating that the member 102 was a relatively poor candidate for member outreach services because no intervention prior to hospitalization was possible. Similarly, if a representative 104 determines that a member 102 has a terminal disease for which little meaningful intervention is possible, the representative 104 may also provide ticket feedback 118 with a score indicating that the member 102 was a relatively poor candidate for member outreach services.
- the representative 104 may provide ticket feedback 118 with a score indicating that the member 102 was a relatively good candidate for member outreach services. For instance, if an outreach ticket 116 indicates that a member 102 was recently diagnosed with asthma, the representative 104 may be able to educate the member 102 about how to use an inhaler and thereby reduce the risk of the member 102 visiting an emergency room due to the member's asthma.
- ticket feedback 118 may also, or alternately, indicate a score, based on subjective input from a representative 104 , of how “interpretable” an outreach ticket 116 was to the representative 104 .
- a representative 104 may not understand from the outreach ticket 116 why the member 102 is to be contacted.
- the representative 104 may need to call the member 102 without understanding the member's situation, which may impact the quality or types of member outreach services provided to the member 102 .
- the representative 104 may provide ticket feedback 118 indicating that the outreach ticket 116 itself was unclear and/or unhelpful to the representative 104 .
- ticket feedback 118 can be based on Likert scale assessments by a representative 104 of one or more aspects of an outreach ticket 116 .
- a representative 104 can provide a rating on a scale of 0 to 5, or any other scale, for categories such as “intervenability,” “interpretability,” and/or other categories associated with an outreach ticket 116 .
- the representative 104 can provide an intervenability rating on a 0 to 5 scale based on whether the representative 104 felt that the member 102 identified by the outreach ticket 116 was a suitable candidate for member outreach services, and can also provide an interpretability rating on a 0 to 5 scale based on whether the representative 104 felt that the outreach ticket 116 sufficiently explained a health issue or rise in costs associated with the member 102 that could be addressed using member outreach services.
- scores in different categories can be summed together, or otherwise combined, into a total score for the ticket feedback 118 . For example, if a representative 104 provides an intervenability score of “3” and an interpretability score of “4,” the ticket feedback 118 may indicate a total score of “7” for an outreach ticket 116 .
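The score combination in the example above can be sketched directly; the category names follow the ones used in this section, and the 0-to-5 validation reflects the Likert scale described above:

```python
# Minimal sketch of combining Likert-scale category scores into the total
# feedback score described above.

def total_feedback_score(scores):
    for name, value in scores.items():
        if not 0 <= value <= 5:
            raise ValueError(f"{name} must be on the 0-5 scale")
    return sum(scores.values())

total = total_feedback_score({"intervenability": 3, "interpretability": 4})
# total == 7, matching the example above
```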
- the ticket manager 112 can provide a user interface for representatives 104 to enter scores for ticket feedback.
- the user interface can include sliders, radio buttons, dropdown menus, freeform data entry fields, and/or other user interface elements that representatives 104 can use to enter intervenability scores, interpretability scores, and/or other types of ticket feedback 118 associated with outreach tickets 116 .
- Ticket feedback 118 , for example as submitted to the ticket manager 112 , can be provided to, and/or shared with, the prediction engine 110 .
- the prediction engine 110 can use the ticket feedback 118 to adjust weights used within the prediction engine 110 to combine base predictions into a final prediction 114 , as discussed further below with respect to FIG. 3 and FIG. 4 .
- the member outreach system 100 can also, or alternatively, use a prediction 114 to determine if or when a member 102 is suitable for intervention by a healthcare provider that partners with the administrator. For example, in addition to, or instead of, providing an outreach ticket 116 to a representative, the member outreach system 100 may generate a referral for a partner provider, such that the partner provider can contact the member 102 to directly offer corresponding healthcare services to the member 102 .
- the prediction engine 110 can use other types of feedback associated with the partner provider and/or the member, in addition to or instead of ticket feedback 118 , to adjust predictions 114 or how predictions 114 are generated.
- the prediction engine 110 can determine if that treatment ultimately reduces actual healthcare costs incurred by the member 102 , reduces predicted healthcare costs during future time periods, and/or improves the member's health over time. If such actual or predicted cost reductions and/or health improvements do occur, the prediction engine 110 can use machine learning and/or other artificial intelligence techniques to learn that similar members 102 should be referred to the partner provider in the future.
- FIG. 3 depicts an example of the prediction engine 110 .
- the prediction engine 110 can include an ensemble of multiple base predictive models 304 .
- the base predictive models 304 can be configured to, based at least in part on healthcare data 302 from one or more healthcare data repositories 108 , generate base predictions 306 associated with members 102 of a health plan.
- the healthcare data 302 used by the base predictive models 304 can include enrollment data 202 , claim data 204 , engagement data 206 , demographic data 208 , provider data 210 , and/or other types of data at a member level and/or at a provider level, as discussed above.
- an individual base prediction 306 produced by an individual base model 304 can include an estimate of costs that a member 102 will incur over a period of time, and/or an estimated change in such costs relative to previous costs incurred by the member 102 .
- a base prediction 306 may also include estimated utilization data, such as predictions of how many times a member 102 will visit providers in one or more categories, and/or in total, over a period of time.
- the base predictive models 304 can be machine learning models, artificial intelligence models, and/or other types of predictive models.
- the base predictive models 304 can include, or be based on, regularized linear algorithms, Gradient Boosted Machines (GBMs), Random Forest algorithms, deep learning algorithms, recurrent neural networks, other types of neural networks, nearest-neighbor algorithms, support-vector networks, linear regression, logistic regression, other types of regression analysis, decision trees, and/or other types of artificial intelligence or machine learning frameworks.
- a base predictive model 304 can be a tree-based learning model that generates trees with different branches associated with different factors.
- a tree-based learning model can generate a tree with a branch for whether a member 102 has diabetes, a subsequent branch for whether the member 102 is on insulin, a subsequent branch for whether the member 102 is in a low-income ZIP code, and a subsequent branch for whether it is the last week of the month. If data for a member 102 indicates that the answer is “yes” for all of these branches, the tree may indicate a base prediction 306 at an ending node of the branches with predicted costs, utilizations, and/or risk metrics for members 102 who meet those criteria. For example, as discussed above, members 102 who meet the criteria of this example have increased chances of visiting emergency rooms.
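The branch sequence above can be sketched as a simple decision path; the field names and the returned labels here are hypothetical illustrations, not identifiers from the source.

```python
# Hypothetical sketch of the example tree path: each branch checks one
# factor, and the ending node holds a base prediction for members who
# meet all of the criteria.
def tree_base_prediction(member):
    """Walk the example branches; the ending node is the base prediction."""
    if (member["has_diabetes"]
            and member["on_insulin"]
            and member["low_income_zip"]
            and member["last_week_of_month"]):
        return "elevated emergency-room-visit risk"
    return "baseline risk"

member = {
    "has_diabetes": True,
    "on_insulin": True,
    "low_income_zip": True,
    "last_week_of_month": True,
}
print(tree_base_prediction(member))  # elevated emergency-room-visit risk
```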
- the ensemble of base predictive models 304 can include at least two types of predictive models.
- FIG. 4 depicts a non-limiting example in which the prediction engine 110 includes four types of base predictive models 304 , including regularized linear models 402 , GBM models 404 , Random Forest models 406 , and deep learning models 408 .
- the prediction engine 110 may include different types of base predictive models 304 than are shown in FIG. 4 .
- the prediction engine 110 can have two different types of base predictive models 304 , three different types of base predictive models 304 , four different types of base predictive models 304 , five different types of base predictive models 304 , or any other number of different types of base predictive models 304 .
- Different types of predictive models may have different strengths and weaknesses, such that some types of predictive models may be better suited to make base predictions 306 about certain types of situations than other types of predictive models. Accordingly, even if a particular type of predictive model is not well-suited to make base predictions 306 for a certain type of situation associated with a member 102 , one or more other types of predictive models in the ensemble may be better suited to make base predictions 306 for that type of situation.
- base predictions 306 from multiple base predictive models 304 can be combined together into a final prediction 114 . Accordingly, even if individual base predictive models 304 are relatively imperfect, the combined final prediction 114 can more closely approximate actual results than the individual base predictions 306 taken in isolation.
- At least some of the base predictive models 304 can be machine learning or artificial intelligence models that can be trained based on historical data in the healthcare data 302 .
- a base predictive model 304 can be trained to identify which features within historical healthcare data 302 associated with a member 102 are predictors of past healthcare costs incurred by that member 102 , past visits to healthcare providers by that member 102 and/or past risk metrics associated with the member 102 .
- the base predictive models 304 can use current healthcare data 302 corresponding to the identified features to generate base predictions 306 of future costs, future utilizations, and/or future risk metrics associated with members 102 .
- the ensemble of base predictive models 304 may include one or more instances of each type of predictive model present in the ensemble.
- the ensemble of base predictive models 304 can include a first set of instances of a first type of predictive model, and also include a second set of instances of a second type of predictive model.
- each instance of a base predictive model 304 can be trained on a random subset of the healthcare data 302 . Accordingly, one instance of a base predictive model 304 may be trained on a different random subset of the healthcare data 302 than a different instance of a base predictive model 304 .
- the prediction engine 110 can also have a prediction combination model 308 .
- the prediction combination model 308 can be configured to combine a set of base predictions 306 , generated by multiple base predictive models 304 , into a final prediction 114 .
- the prediction combination model 308 can, at least in part, use weights 310 associated with different types of base predictive models 304 and/or individual instances of base predictive models 304 to combine corresponding base predictions 306 into a final prediction 114 .
- the prediction combination model 308 may combine two base predictions 306 into a prediction that the member's costs will rise by $15,000, if the base predictive models 304 that made the base predictions 306 are weighted equally. However, weights 310 associated with different base predictive models 304 may not be equal, and numerous base predictions 306 generated by numerous base predictive models 304 with varying weights 310 can be used to generate the final prediction 114 .
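The weighted combination can be sketched as a weighted average. The $10,000/$20,000 pair below is an invented example pair consistent with the $15,000 equal-weight result described above, not values from the source.

```python
# Sketch of combining base predictions into a final prediction using
# per-model weights.
def combine_predictions(base_predictions, weights):
    """Weighted average of base predictions of a member's cost rise."""
    total_weight = sum(weights)
    return sum(p * w for p, w in zip(base_predictions, weights)) / total_weight

# Equal weights reproduce the $15,000 example from the text.
print(combine_predictions([10_000, 20_000], [0.5, 0.5]))  # 15000.0
```

With unequal weights, the model whose weight is larger pulls the final prediction toward its own estimate.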
- the prediction combination model 308 can be a machine learning model, artificial intelligence model, or other type of predictive model.
- the prediction combination model 308 may be referred to as a “meta-learner” or a “super learner.”
- the prediction combination model 308 can use gradient boosting meta-learning and/or a stacking method to determine the weights 310 to use when combining base predictions 306 into a final cost prediction 114 .
- the trained base predictive models 304 can generate base predictions 306 on remaining samples of the historical healthcare data 302 .
- Such base predictions 306 can be evaluated using an area-under-the-curve statistic, known as a C-statistic, which indicates a probability of correctly identifying which of two members 102 is the “high cost” member 102 with healthcare costs above a threshold value.
- C-statistics associated with different instances of base predictive models 304 can accordingly indicate relative performances of the different instances of the base predictive models 304 , and in some examples can be used to determine corresponding weights 310 to select for the different instances of the base predictive models 304 .
- a base predictive model 304 can be trained until its C-statistic meets a threshold value.
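The C-statistic evaluation described above is equivalent to a pairwise concordance computation, which can be sketched in pure Python; all labels and cost values below are invented for illustration.

```python
# Sketch of the C-statistic: the probability that a randomly chosen
# "high cost" member is ranked above a randomly chosen lower-cost member
# by the model's predictions (ties count half).
def c_statistic(high_cost_labels, predicted_costs):
    pos = [s for l, s in zip(high_cost_labels, predicted_costs) if l == 1]
    neg = [s for l, s in zip(high_cost_labels, predicted_costs) if l == 0]
    concordant = sum(1.0 if p > n else 0.5 if p == n else 0.0
                     for p in pos for n in neg)
    return concordant / (len(pos) * len(neg))

# Members 2 and 4 are "high cost" (actual costs above a threshold).
labels = [0, 1, 0, 1, 0]
predicted = [15_000, 60_000, 8_000, 70_000, 40_000]
print(c_statistic(labels, predicted))  # 1.0: both high-cost members ranked highest
```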
- base predictive models 304 can also be subject to regularization, such as elastic net regularization, to prevent the base predictive models 304 from overfitting their base predictions 306 .
- Each instance of a base predictive model 304 can also be assigned a corresponding weight 310 in the prediction combination model 308 .
- the weights 310 for different base predictive models 304 can initially be equal, or be set to a predefined starting value. However, thereafter the prediction combination model 308 can use ticket feedback 118 provided by representatives 104 to continually or periodically adjust the weights 310 associated with different base predictive models 304 . In some examples, the prediction combination model 308 may adjust weights 310 for different base predictive models 304 until predictions 114 generated using a weighted average of base predictions 306 maximize a C-statistic in one or more cross-validation samples from training data.
- the prediction engine 110 may generate a set of predictions 114 for a set of members 102 , and the ticket manager 112 can generate corresponding outreach tickets 116 for the members 102 that are prioritized in a queue based on which members 102 are predicted to have the highest rise in costs. However, if ticket feedback 118 indicates that the member 102 associated with the highest-priority outreach ticket 116 was not as intervenable, and/or that the outreach ticket 116 was not as interpretable, as a lower-priority outreach ticket 116 , the prediction engine 110 can adjust the weights 310 so that base predictions 306 from base predictive models 304 that indicated the member 102 was a high priority are down-weighted in the future.
- a loss function in the prediction combination model 308 can be adjusted to down-weight base predictive models 304 that were associated with lower ticket feedback 118 on a Likert scale or other scale, and up-weight base predictive models 304 that were associated with higher ticket feedback 118 on a Likert scale or other scale.
- An example of adjusting weights 310 based on ticket feedback 118 is discussed further below with respect to FIG. 5 .
- the prediction engine 110 can also use other types of feedback to adjust how base predictions 306 and/or final predictions 114 are generated. For example, if a prediction 114 indicates that a member's healthcare costs are expected to rise due to a new diagnosis, the prediction engine 110 can use claim data 204 to determine whether claims associated with that diagnosis are ultimately received. In some examples, the lack of later-submitted claims associated with a diagnosis can be an indication that member outreach services have been successful. As another example, the prediction engine 110 or the ticket manager 112 can track whether outreach tickets 116 are transferred between representatives 104 .
- the prediction engine 110 may use transfer data to increase the chances of similar outreach tickets 116 indicating medication issues and/or being directed to pharmacists in the future.
- FIG. 4 depicts a non-limiting example of a prediction engine 110 in which the ensemble of base predictive models 304 includes instances of four types of base predictive models 304 , including regularized linear models 402 , Gradient Boosted Machine (GBM) models 404 , Random Forest models 406 , and deep learning models 408 .
- FIG. 4 is not intended to be limiting, as the prediction engine 110 may include different types of predictive models than are shown in FIG. 4 , and/or can include fewer, or more, than four types of base predictive models 304 .
- a regularized linear model 402 can be a base predictive model 304 that outputs a linear model prediction 410 .
- a linear model prediction 410 can be a base prediction 306 of costs, utilization, risk metrics, and/or other information about a member 102 .
- a regularized linear model 402 can be based on elastic net regularization, which seeks to minimize a penalty function over a grid of values for a regularization parameter, λ. Such regularization can avoid overfitting to noise.
- the regularization parameter, λ, can control the strength of the penalty, and a balancing parameter, α, can control the balance between lasso regression and ridge regression.
- the lasso regression can be L1 regularization, which selects one correlated covariate and removes other covariates from the equation.
- the ridge regression can be L2 regularization, which shrinks coefficients of correlated covariates towards each other.
- a regularized linear model 402 can seek to minimize the negative binomial log-likelihood expressed in the following equation (Equation 1):
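The body of Equation 1 did not survive extraction. A standard elastic-net penalized objective that matches the surrounding description — a log-likelihood term that identifies predictive x values, plus a λ-scaled penalty that mixes the lasso (α) and ridge (1−α) terms to pull out noise — is, as a reconstruction rather than the verbatim equation:

```latex
\min_{\beta_0,\,\beta}\;
  -\frac{1}{N}\sum_{i=1}^{N} \ell\!\left(y_i,\; \beta_0 + x_i^{\top}\beta\right)
  \;+\; \lambda \sum_{j=1}^{p}\left( \alpha\,\lvert\beta_j\rvert
  \;+\; \frac{1-\alpha}{2}\,\beta_j^{2} \right)
```

Here ℓ is the per-member log-likelihood, β are the covariate coefficients, α = 1 recovers pure lasso (L1), and α = 0 recovers pure ridge (L2).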
- the regularization algorithm can perform coordinate descent on a quadratic approximation to the log-likelihood.
- Generalized linear modeling estimators can be trained on healthcare data 302 in the prediction engine 110 to find an optimal λ value using Equation 1, at each α value, across internal cross-validations on a training dataset from the healthcare data 302 .
- a first portion of Equation 1 (ahead of the λ term) can determine x values that are most predictive.
- a cancer diagnosis can be highly predictive of future costs, while a diagnosis of an eyelid flutter may not be highly predictive of future costs.
- an orthopedic surgery diagnosis can indicate that a member's condition may be “intervenable” through member outreach services, while an eyelid flutter diagnosis may not be “intervenable” through member outreach services.
- the second portion of Equation 1 (multiplied by the λ value) can assist with pulling out noise in the data.
- the λ value can be determined through machine learning to balance the two portions of the equation.
- a GBM model 404 can be a base predictive model 304 that outputs a GBM prediction 412 .
- a GBM prediction 412 can be a base prediction 306 of costs, utilization, risk metrics, and/or other information about a member 102 .
- GBM models 404 can develop regression trees.
- a GBM model 404 can be an ensemble learning algorithm that sequentially builds classification trees on features of derivation datasets, such as datasets from healthcare data 302 .
- regression trees can be developed in a GBM model 404 by sequentially selecting covariates to parse each derivation dataset into subsets with high between-group and low within-group variance in future predicted annualized cost.
- Gradient boosting can seek to improve the choice of covariates and their combinations over time to minimize error between observed and predicted outcome rates in each terminal node, while repeatedly sampling from the underlying data to prevent overfitting.
- a GBM model 404 can generate a first tree based on a random sample of healthcare data 302 , and then generate subsequent trees based in part on previous trees. For example, a second tree may be based on a different random sample than the first tree, and the GBM model 404 can determine if predictions of the first tree or the second tree are more accurate, and adjust trees accordingly. As such, the GBM model 404 can generate trees by learning from preceding trees.
- a GBM model 404 can follow Equations 2 through 7, presented below, to build k regression trees with incremental boosts of the function f over M iterations:
- a GBM model 404 can be tuned, and/or can be sensitive to hyperparameter choices. Accordingly, some hyperparameters can be varied across broad ranges when multiple GBM models 404 are trained.
- GBM estimators can be trained with common values of a learning rate, a maximum tree depth, and a column sample rate.
- the learning rate can control a weighted average prediction of the trees, and can refer to a rate (for example ranging from 0 to 1) at which the GBM models 404 change parameters in response to error (i.e., learn) when building an estimator.
- Lower learning rate values may enable slower learning, but may use more trees to achieve a level of fit than would be used with a higher learning rate.
- lower learning rates can also help to avoid overfitting, even if the lower learning rates may be more computationally expensive.
- the maximum tree depth can refer to a number of layers of the tree, and thus can control how many splits can occur to separate a population into subpopulations. For example, in some algorithms, a tree may no longer split if either a maximum depth is reached, or a minimum improvement in prediction (a relative improvement in squared error reduction for a split > 10⁻⁵) is not satisfied. In some situations, deeper trees may provide more accuracy on a derivation dataset, but may also lead to overfitting.
- the sample rate can refer to a fraction of the data sampled among columns of the data for each tree, which can help improve sampling accuracy.
- the sample rate can be expressed on a 0 to 1 scale, such that, for instance, 0.7 refers to 70% of the data sampled. On this scale, values of the sampling rate that are less than 1 may help improve generalization by preventing overfitting.
- stochastic gradient boosting can be used, such that to fit each GBM estimator, in each learning iteration a subsample of training data can be drawn at random without replacement.
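The sequential, shrinkage-based fitting described above can be sketched in miniature with one feature and single-split "stump" trees. All data is invented, and the stochastic row/column subsampling is omitted for brevity.

```python
# Minimal gradient-boosting sketch: each "tree" is a one-split stump fit
# to the current residuals, and its contribution is shrunk by the
# learning rate before the next stump is fit.
def fit_stump(xs, residuals):
    """Find the single split on x that most reduces squared error."""
    best = None
    for threshold in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= threshold]
        right = [r for x, r in zip(xs, residuals) if x > threshold]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = (sum((r - lm) ** 2 for r in left)
               + sum((r - rm) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, threshold, lm, rm)
    return best[1:]

def boost(xs, ys, n_trees=50, learning_rate=0.1):
    """Sequentially fit stumps to residuals, shrunk by the learning rate."""
    predictions = [sum(ys) / len(ys)] * len(ys)  # start from the mean
    for _ in range(n_trees):
        residuals = [y - p for y, p in zip(ys, predictions)]
        threshold, lm, rm = fit_stump(xs, residuals)
        predictions = [p + learning_rate * (lm if x <= threshold else rm)
                       for x, p in zip(xs, predictions)]
    return predictions

ages = [30, 35, 40, 60, 65, 70]                      # hypothetical member ages
costs = [1_000, 1_200, 1_100, 9_000, 9_500, 9_200]   # hypothetical annual costs
print(boost(ages, costs))
```

A lower learning rate makes each stump's contribution smaller, so more trees are needed to reach the same fit, mirroring the trade-off described above.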
- a Random Forest model 406 can be a base predictive model 304 that outputs a Random Forest prediction 414 .
- a Random Forest prediction 414 can be a base prediction 306 of costs, utilization, risk metrics, and/or other information about a member 102 .
- Random Forest models 406 can also be associated with regression trees.
- Random Forest models 406 can use a broad range of hyperparameter starting values entered into a random grid search.
- a tree learning algorithm can apply a modified version of bootstrap aggregating (bagging), in which the tree learning algorithm selects, at each candidate split in the tree learning process, a random subset of covariates to build and train the tree (feature bagging). Trees can be trained in parallel and split until either a maximum depth is reached or a minimum improvement in prediction (such as a relative improvement in squared error reduction for a split > 10⁻⁵) is not satisfied. Forests can be fit with varied values of a maximum tree depth and column sampling rates.
- a Random Forest algorithm may generate trees based on which factors or variables are most predictive of actual results. For instance, a Random Forest algorithm may determine from healthcare data 302 that age is the most important predictor of hospitalizations, and that high blood pressure diagnoses are the second-most important predictor of hospitalizations. In some examples, a Random Forest model 406 may generate numerous trees based on different random subsets of data, and generate a final tree by averaging those trees to avoid overfitting.
- Random Forest models 406 may use greater maximum tree depth values, and sample from a smaller number of covariates in each tree, than GBM models 404 . In some cases, the greater depth of Random Forest models 406 may cause the Random Forest models 406 to be more efficient on longer datasets, while GBM models 404 may be more efficient on wider datasets. In the Random Forest models 406 , overfitting can be reduced by randomizing tree building and combining the trees together. In some examples, a majority vote of the resulting trees' predictions can be used to produce a final Random Forest prediction 414 from a forest of underlying trees.
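The bagging-and-averaging behavior described above can be sketched in miniature with invented single-feature data; real Random Forest models 406 would use deeper trees and feature bagging across many covariates.

```python
import random

# Sketch of bagging: each "tree" (here a one-split stump) is fit on a
# bootstrap resample, and the forest averages the stumps' predictions.
def fit_stump(points):
    """points: list of (x, y); return (threshold, left_mean, right_mean)."""
    xs = sorted({x for x, _ in points})
    if len(xs) == 1:  # degenerate bootstrap sample: predict the mean
        m = sum(y for _, y in points) / len(points)
        return (xs[0], m, m)
    best = None
    for t in xs[:-1]:
        left = [y for x, y in points if x <= t]
        right = [y for x, y in points if x > t]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = (sum((y - lm) ** 2 for y in left)
               + sum((y - rm) ** 2 for y in right))
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    return best[1:]

def predict_stump(stump, x):
    t, lm, rm = stump
    return lm if x <= t else rm

def random_forest(points, n_trees=25, seed=0):
    """Fit each stump on a bootstrap resample; average their predictions."""
    rng = random.Random(seed)
    forest = [fit_stump([rng.choice(points) for _ in points])
              for _ in range(n_trees)]
    return lambda x: sum(predict_stump(s, x) for s in forest) / n_trees

# Hypothetical (age, annual cost) pairs.
data = [(30, 1_000), (40, 2_000), (55, 8_000), (70, 12_000), (80, 15_000)]
predict = random_forest(data)
print(round(predict(75)))  # averaged prediction for an older member
print(round(predict(35)))  # averaged prediction for a younger member
```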
- a deep learning model 408 can be a base predictive model 304 that outputs a deep learning prediction 416 .
- a deep learning prediction 416 can be a base prediction 306 of costs, utilization, risk metrics, and/or other information about a member 102 .
- Deep learning models 408 can be based on neural networks.
- a deep learning model 408 can include a multi-layer, feedforward network of neurons, where each neuron takes a weighted combination of input signals as shown in Equation 8:
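The body of Equation 8 did not survive extraction. A conventional form of the weighted combination described here, with weights wᵢ, inputs xᵢ, and bias b, is, as a reconstruction rather than the verbatim equation:

```latex
\alpha \;=\; \sum_{i} w_i x_i \;+\; b,
\qquad \text{output} \;=\; f(\alpha)
```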
- Equation 8 reflects a weighted sum of input covariates, with additional bias b, the neuron's activation threshold.
- the neuron can produce an output, f(α), where f represents a nonlinear activation function applied to the weighted combination α.
- an input layer can match the covariate space, and can be followed by multiple layers of neurons to produce abstractions from input data, ending with a classification layer to match a discrete set of outcomes.
- Each layer of neurons can provide inputs to the next layer, and weights and bias values can determine the output from the neural network.
- learning can involve adapting such weights to minimize a mean squared error between the predicted outcome and the observed outcome, across individuals in the derivation dataset.
- Deep learning estimators can be developed by training across multiple activation functions and hidden layer sizes.
- Such activation functions can, for example, include “maxout,” “rectifier,” or “tanh.”
- a maxout activation function can be a generalization of a rectified linear function given by Equation 9:
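The body of Equation 9 is likewise missing; the standard two-channel maxout form consistent with the description that follows is, as a reconstruction:

```latex
f(x) \;=\; \max\!\left(w_1 \cdot x + b_1,\;\; w_2 \cdot x + b_2\right)
```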
- each neuron can choose the largest output of two input channels, where each input channel has a weight and bias value.
- “dropout” can be used, in which the function is constrained so that each neuron in the neural network suppresses its activation with probability P (e.g., 0.2 for input neurons and 0.5 for hidden neurons), which scales network weight values towards 0 and can cause each training iteration to train a different estimator, thereby enabling exponentially larger numbers of estimators to be averaged as an ensemble to prevent overfitting and improve generalization.
- a rectifier activation function can be a version of the maxout activation function in which the output of one channel is always zero.
- a deep learning model 408 can assign different variables, such as ages of members 102 , diagnoses of members 102 , prescriptions of members 102 , and/or other factors to different columns.
- the deep learning model 408 can identify which of those variables, and/or which combinations of variables, are most predictive of actual results in a training dataset, such as incurred healthcare costs and visits to healthcare providers.
- a GBM meta-learner 418 can be a prediction combination model 308 that combines one or more linear model predictions 410 from one or more regularized linear models 402 , one or more GBM predictions 412 from one or more GBM models 404 , one or more Random Forest predictions 414 from one or more Random Forest models 406 , and/or one or more deep learning predictions 416 from one or more deep learning models 408 into a final prediction 114 .
- the GBM meta-learner 418 may use Gradient Boosted Machine techniques to determine weights 310 for different instances of base predictive models 304 , and use those weights 310 to combine corresponding base predictions 306 into a final prediction 114 .
- the GBM meta-learner 418 can select linear model weights 420 for corresponding regularized linear models 402 , select GBM weights 422 for corresponding GBM models 404 , select Random Forest weights 424 for corresponding Random Forest models 406 , and select deep learning weights 426 for corresponding deep learning models 408 .
- the GBM meta-learner 418 can use the linear model weights 420 , GBM weights 422 , Random Forest weights 424 , and deep learning weights 426 to generate the prediction 114 using weighted averages, or other weighted combinations, of associated linear model predictions 410 , GBM predictions 412 , Random Forest predictions 414 , and deep learning predictions 416 .
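The weighted combination across the four model families can be sketched as follows; the weight values and base predictions are invented for illustration and do not come from the source.

```python
# Illustrative weighted combination across the four model families.
weights = {
    "linear": 0.15,          # linear model weight 420
    "gbm": 0.40,             # GBM weight 422
    "random_forest": 0.30,   # Random Forest weight 424
    "deep": 0.15,            # deep learning weight 426
}
base_predictions = {  # each family's predicted cost rise for one member
    "linear": 9_000,
    "gbm": 14_000,
    "random_forest": 12_000,
    "deep": 11_000,
}

final_prediction = (sum(weights[m] * base_predictions[m] for m in weights)
                    / sum(weights.values()))
print(final_prediction)  # 12200.0
```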
- the GBM meta-learner 418 may adjust the linear model weights 420 , GBM weights 422 , Random Forest weights 424 , and deep learning weights 426 . Accordingly, if ticket feedback 118 shows over time that one or more of the regularized linear models 402 , GBM models 404 , Random Forest models 406 , or deep learning models 408 generate base predictions 306 with priority levels that more closely match the ticket feedback 118 from representatives 104 , the better-performing predictive models can be up-weighted and the worse-performing predictive models can be down-weighted.
- future predictions 114 generated based on the adjusted weights 310 may have outreach priorities that are closer to how representatives 104 would rank their outreach priorities, and member outreach services can be offered to corresponding members 102 in a more timely manner and/or in a more effective order.
- FIG. 5 depicts an example of a curve that can be used by a prediction combination model 308 to determine relative outreach priorities 502 associated with base predictions 306 and/or ticket feedback 118 .
- base predictions 306 and final predictions 114 can include cost estimates, utilization estimates, risk metric estimates, and/or other information associated with members 102 .
- the information in base predictions 306 and final predictions 114 can be used to determine an outreach priority 502 for associated outreach tickets 116 .
- if a final prediction 114 indicates that a first member's costs are expected to rise more than a second member's costs, the prediction engine 110 or the ticket manager 112 may determine that an outreach ticket 116 associated with the first member 102 has a greater outreach priority 502 than an outreach ticket 116 associated with the second member 102 . Accordingly, the ticket manager 112 may arrange a queue of outreach tickets 116 based on corresponding metrics of outreach priorities 502 .
- the outreach priority 502 associated with a set of outreach tickets 116 can be scaled to a bell curve as shown in FIG. 5 .
- outreach priorities 502 of a set of outreach tickets 116 can be scaled into a Z-score bell curve with a mean of zero and a standard deviation of 1. On such a bell curve, most outreach tickets 116 may fall near the center of the bell curve. However, some outreach tickets 116 may fall in high-priority ranges that indicate that it may be more critical to contact corresponding members 102 to offer member outreach services than other members 102 whose outreach tickets 116 have lower scaled outreach priorities 502 .
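The Z-score scaling described above can be sketched as follows; the raw priority values (e.g., predicted cost rises) are invented for illustration.

```python
# Scale raw outreach priorities to Z-scores with mean 0 and standard
# deviation 1, so most tickets cluster near the center of the bell curve
# and outliers land in the high-priority tail.
from statistics import mean, pstdev

raw_priorities = [5_000, 7_000, 8_000, 9_000, 30_000]
mu, sigma = mean(raw_priorities), pstdev(raw_priorities)
z_scores = [(p - mu) / sigma for p in raw_priorities]
print([round(z, 2) for z in z_scores])  # the last ticket sits far right of the mean
```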
- although outreach priorities 502 of outreach tickets 116 can be based on final predictions 114 generated using a weighted combination of multiple base predictions 306 , scaled outreach priorities 502 that individual base predictions 306 would have had alone can also be plotted on such a bell curve.
- Ticket feedback 118 can also be scaled to the same bell curve. As discussed above, in some examples, ticket feedback 118 may be based on intervenability scores, interpretability scores, and/or a combination of both scores. For instance, if a representative 104 rates an outreach ticket 116 with an intervenability score of three out of five and an interpretability score of four out of five, the ticket feedback 118 can overall rate the outreach ticket 116 as a seven out of ten on an outreach priority 502 scale. Such scores in ticket feedback 118 can be scaled to the same Z-score bell curve discussed above with respect to outreach tickets 116 , even if outreach tickets 116 are prioritized in queues based on predicted rises in costs, utilizations, risk metrics, and/or other outreach priority 502 factors.
- FIG. 5 shows that base prediction 306 B for a member 102 had a scaled outreach priority 502 that was closer to the scaled outreach priority 502 of actual ticket feedback 118 associated with that member's outreach ticket 116 than base prediction 306 A.
- the prediction combination model 308 may accordingly adjust weights 310 by down-weighting a first base predictive model 304 A that generated base prediction 306 A and/or up-weighting a second base predictive model 304 B that generated better-performing base prediction 306 B.
- Base predictions 306 from both the first base predictive model 304 A and the second base predictive model 304 B can continue to be used to generate final predictions in the prediction engine 110 according to the adjusted weights 310 .
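The reweighting described above can be sketched as follows. The error values, rate, and model names are hypothetical: a model whose scaled base prediction sat farther from the scaled ticket feedback is down-weighted, the closer model is up-weighted, and both continue to contribute.

```python
# Hedged sketch of feedback-driven reweighting of two base predictive models.
def reweight(weights, errors, rate=0.5):
    """errors: distance between each model's scaled prediction and the
    scaled ticket feedback; a larger error shrinks the weight more."""
    adjusted = {m: w / (1.0 + rate * errors[m]) for m, w in weights.items()}
    total = sum(adjusted.values())
    return {m: w / total for m, w in adjusted.items()}

weights = {"model_304A": 0.5, "model_304B": 0.5}
errors = {"model_304A": 1.8, "model_304B": 0.3}  # 304A missed by more
print(reweight(weights, errors))
```

After normalization, model 304B's weight exceeds model 304A's, but neither weight drops to zero, matching the description that both models keep contributing.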
- the prediction engine 110 or a ticket manager 112 can determine if the final prediction 114 includes one or more values that meet or exceed corresponding threshold values.
- the prediction 114 can include a predicted estimate of a rise or decrease in healthcare costs associated with the member 102 over a period of time, relative to previous periods of time.
- the prediction engine 110 or the ticket manager 112 may determine if the prediction 114 indicates a rise in costs that exceeds a predefined threshold value.
- the prediction 114 can include a predicted utilization estimate of a number of visits the member 102 will make to healthcare providers in total, or by category, over a period of time.
- the ticket manager 112 can generate an outreach ticket 116 for the member 102 at block 608 . For example, if a prediction 114 indicates that healthcare costs associated with a member 102 are expected to rise more than a threshold amount over the next year (e.g., the prediction 114 meets or exceeds a threshold), an outreach ticket 116 can be generated such that a representative 104 can attempt to contact the member 102 to offer member outreach services.
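The threshold check at block 608 can be sketched minimally as follows; the threshold value and member identifiers are invented for illustration.

```python
# Generate an outreach ticket only when the predicted cost rise meets or
# exceeds a predefined threshold.
THRESHOLD = 10_000  # assumed predefined rise-in-cost threshold

def maybe_generate_ticket(member_id, predicted_cost_rise):
    if predicted_cost_rise >= THRESHOLD:
        return {"member": member_id, "priority": predicted_cost_rise}
    return None  # below threshold: no ticket is generated

print(maybe_generate_ticket("M-1", 15_000))  # ticket dict for M-1
print(maybe_generate_ticket("M-2", 2_000))   # None
```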
- if a prediction 114 for a member 102 does not meet the threshold, the process can return to block 602 and the prediction engine 110 can continue generating base predictions 306 for the same member 102 and/or other members 102 .
- if a prediction 114 indicates that costs, utilizations, or risk metrics associated with a member 102 are expected to decrease relative to previous levels, or are expected to rise by less than a threshold amount, there may be little or no benefit to offering member outreach services to that member 102 .
- ticket feedback 118 can be received from a representative 104 via a survey, form, user interface, or other mechanism.
- Ticket feedback 118 may provide scores, rankings, or other feedback from representatives 104 in one or more categories, such as categories for how “intervenable” a member 102 identified in an outreach ticket 116 was, and/or how “interpretable” information in the outreach ticket 116 itself was.
- intervenability and/or interpretability scores can be provided by representatives on a Likert scale of 0-5, or another scale.
- ticket feedback 118 can be a combination of both intervenability and interpretability scores on a 0-10 scale, or another scale.
- the GBM predictions 412 and the deep learning predictions 416 can be combined, along with other base predictions 306 from other base predictive models 304 using corresponding weights 310 , into final predictions 114 for the set of members 102 , and corresponding outreach tickets 116 can be generated. If ticket feedback 118 then indicates that representatives 104 subjectively felt that, among the set of members 102 , the particular member 102 was the third-most suitable for member outreach services, the weights 310 associated with the base predictive models 304 can be adjusted accordingly.
- non-transitory computer-readable media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transitory medium which can be used to store desired information and which can be accessed by one or more computing devices 106 associated with the member outreach system 100 . Any such non-transitory computer-readable media may be part of the computing devices 106 .
- the processor(s) 706 can be a central processing unit (CPU), a graphics processing unit (GPU), both a CPU and a GPU, or any other type of processing unit.
- Each of the one or more processor(s) 706 may have numerous arithmetic logic units (ALUs) that perform arithmetic and logical operations, as well as one or more control units (CUs) that extract instructions and stored content from processor cache memory, and then executes these instructions by calling on the ALUs, as necessary, during program execution.
- the processor(s) 706 may also be responsible for executing computer applications stored in the memory 702 , which can be associated with common types of volatile (RAM) and/or nonvolatile (ROM) memory.
- the input devices 714 can include any sort of input devices known in the art.
- input devices 714 can include a microphone, a keyboard/keypad, and/or a touch-sensitive display, such as the touch-sensitive display screen described above.
- a keyboard/keypad can be a push button numeric dialing pad, a multi-key keyboard, or one or more other types of keys or buttons, and can also include a joystick-like controller, designated navigation buttons, or any other type of input mechanism.
- the machine readable media 718 can store one or more sets of instructions, such as software or firmware, that embodies any one or more of the methodologies or functions described herein.
- the instructions can also reside, completely or at least partially, within the memory 702 , processor(s) 706 , and/or communication interface(s) 708 during execution thereof by the one or more computing devices 106 of the member outreach system 100 .
- the memory 702 and the processor(s) 706 also can constitute machine readable media 718 .
Description
- This U.S. Patent Application is a continuation of, and claims priority to, U.S. patent application Ser. No. 16/881,981, filed on May 22, 2020, which claims priority to U.S. Provisional Patent Application No. 62/969,573, entitled “MULTI-MODEL MEMBER RISK PREDICTION AND FEEDBACK,” filed on Feb. 3, 2020, both of which are incorporated herein by reference in their entireties.
- Administrators can offer and manage health plans that provide one or more benefits to members of the health plans. For example, a health plan can be a health insurance plan that fully, or partially, covers the costs of medical services for members of the health insurance plan. In some examples, an administrator can administer a health plan on behalf of a sponsor, such as an employer that offers a health plan to its employees and/or their families. In this example, the administrator may be a third party that administers the health plan for the employer, and members may be the employees, and/or family members of the employee, that are covered by the health plan.
- An administrator can also offer member outreach services to members of a health plan. For example, member outreach services may help members find healthcare services, help members make healthcare appointments, help members find transportation for healthcare appointments, educate members about medical conditions, educate members about prescriptions or other medical treatments, and/or otherwise assist members in overcoming hurdles in obtaining healthcare services.
- The detailed description is set forth with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical items or features.
-
FIG. 1 depicts an example of a member outreach system. -
FIG. 2 depicts an example of data that can be stored in a healthcare data repository. -
FIG. 3 depicts an example of a prediction engine. -
FIG. 4 depicts a non-limiting example of a prediction engine in which an ensemble of base predictive models includes instances of four types of base predictive models. -
FIG. 5 depicts an example of a curve that can be used by a prediction combination model to determine relative outreach priorities associated with base predictions and/or ticket feedback. -
FIG. 6 shows a flowchart of an example method for generating predictions and outreach tickets, and for adjusting weights in a prediction engine according to ticket feedback. -
FIG. 7 shows an example system architecture for a computing device associated with a member outreach system.
- The systems and methods described herein are associated with predictive models that are configured to generate predictions indicating when members of a health plan may benefit from member outreach services. The health plan can be an insurance plan, or other benefit plan, that provides one or more benefits to members of the health plan. For example, a health plan can provide benefits that fully or partially cover the costs of healthcare services for members of the health plan.
- A health plan can be offered and/or managed by an administrator. As an example, the administrator may provide information about the health plan to members and potential members. As another example, the administrator may adjudicate healthcare claims associated with members of the health plan, when such healthcare claims are submitted by members, medical providers, or other entities.
- An administrator may also, or alternately, offer member outreach services to members. For example, representatives of an administrator can contact members to help the members find in-person healthcare services, help the members make healthcare appointments, help the members find transportation to and from healthcare appointments, educate members about their conditions, treatments, or medications, and/or otherwise assist members in overcoming hurdles in obtaining healthcare services.
- In some examples, member outreach services may be particularly beneficial to “high cost” members and/or “rising risk” members of a health plan. High cost members can be members who are predicted to incur relatively high healthcare expenses over a period of time. Rising risk members can, in some examples, be members whose healthcare expenses are predicted to increase during a future time period, relative to previous time periods. Rising risk members may also, or alternatively, be members whose risks of being admitted or readmitted to a hospital or other healthcare facility are predicted to increase, whose utilization of healthcare services is predicted to increase, whose pharmaceutical usage is predicted to advance to more expensive medications and/or to medications with increased side effects, whose suffering levels are predicted to increase, whose mortality risks are predicted to increase, and/or whose risk in any other category is predicted to increase during a future time period. For such high cost and/or rising risk members, member outreach services may prompt early medical intervention or other proactive steps, for instance before such high or rising healthcare expenses are incurred or other detrimental effects are experienced. Accordingly, member outreach services may proactively improve the health of a member and/or reduce healthcare expenses ultimately incurred by the member.
- However, it can be difficult to identify which members are high cost or rising risk members that should be contacted to offer member outreach services. Some administrators and insurers use risk adjustment models to predict future costs of members, and can accordingly use predictions generated by the risk adjustment models to identify high cost and/or rising risk members. However, conventional risk adjustment models generally use relatively simple linear regression algorithms to estimate a member's future costs. As an example, a typical conventional risk adjustment model may multiply an age of a member by a first value to generate a first cost estimate, a tobacco smoking status of the member by a second value to generate a second cost estimate, an indicator of the member's chronic disease history by a third value to generate a third cost estimate, and/or multiply other factors by other values to generate other cost estimates. These conventional risk adjustment models then add the various cost estimates together to calculate a final cost estimate of the member's expected costs over the next year, or other time period.
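- The conventional factor-times-coefficient approach described above can be sketched in a few lines. This is a hypothetical illustration only: the factor names and coefficient values below are invented, not taken from any actual risk adjustment model.

```python
# Invented coefficients for illustration; a real risk adjustment model
# would have many more factors, calibrated from historical claims.
LINEAR_COEFFICIENTS = {
    "age": 120.0,                  # estimated dollars of cost per year of age
    "is_smoker": 1500.0,           # flat adjustment for tobacco smoking status
    "chronic_conditions": 2200.0,  # adjustment per recorded chronic condition
}

def linear_cost_estimate(member: dict) -> float:
    """Multiply each factor by its coefficient and sum the products."""
    return sum(
        coefficient * float(member.get(factor, 0))
        for factor, coefficient in LINEAR_COEFFICIENTS.items()
    )

# A 50-year-old smoker with two chronic conditions:
# 50 * 120 + 1 * 1500 + 2 * 2200 = 11900.0
estimate = linear_cost_estimate({"age": 50, "is_smoker": 1, "chronic_conditions": 2})
```

As the surrounding text notes, such purely additive models cannot capture interactions between factors, which is one reason their estimates correlate poorly with actual incurred costs.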
- However, cost estimates generated by such conventional linear regression-based risk adjustment models have often had a poor correlation to actual healthcare costs that members ultimately incurred. For example, many conventional risk adjustment models are less than 60% accurate at predicting which one of two given members will have higher costs over a period of time. Accordingly, the poor performance of such conventional methods of determining cost estimates may be insufficient to adequately identify high cost and/or rising risk members that may benefit the most from member outreach services.
- It can also be difficult to identify when representatives should contact members to offer member outreach services. For example, even if a cost prediction indicates that a first member's healthcare costs are expected to rise by $20,000 over the next year, member outreach services may not be able to assist the first member if the first member has already been hospitalized. However, if a cost prediction for a second member indicates that the second member's healthcare costs are expected to rise by $10,000 over the next year, there may still be time for member outreach services to assist the second member before those costs are incurred, and/or to prevent those costs from being incurred. Accordingly, in this example, although the first member has a greater predicted rise in costs than the second member, the second member may be a better candidate for member outreach services because the second member's situation is more “intervenable” by member outreach services than the first member.
- The systems and methods described herein can prompt member outreach services to be offered to members based on predictions generated by an ensemble of different base predictive models that are weighted based on feedback from representatives. For example, the ensemble of base predictive models can generate a set of base predictions for a member of a health plan. The base predictions can be combined into a final prediction, using weights associated with the various base predictive models. The predictions can, for example, indicate predicted costs and/or changes in costs of members over a future period of time. The predictions may also indicate predicted utilizations and/or predicted risk metrics associated with members. The predictions can thus indicate which members are high cost and/or rising risk members that should be contacted to offer member outreach services. Over time, feedback from representatives who offer member outreach services to members can be used to adjust the weights used to combine the base predictions into the final prediction. In some cases, this may cause some lower cost members to be prioritized for member outreach services over some higher cost members, if the representative feedback indicates that some lower cost members are more “intervenable” using member outreach services than some higher cost members.
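- A minimal sketch of the two mechanisms just described follows: combining base predictions into a final prediction using per-model weights, and adjusting those weights from representative feedback. The model names, weight values, and the multiplicative update rule are all assumptions for illustration, not the patented implementation.

```python
def combine_predictions(base_predictions: dict, weights: dict) -> float:
    """Weighted average of the base model predictions for one member."""
    total = sum(weights[model] for model in base_predictions)
    return sum(weights[model] * value for model, value in base_predictions.items()) / total

def update_weights(weights: dict, feedback_error: dict, rate: float = 0.1) -> dict:
    """Shrink each model's weight in proportion to its feedback error, then renormalize."""
    adjusted = {m: w * (1.0 - rate * feedback_error.get(m, 0.0)) for m, w in weights.items()}
    norm = sum(adjusted.values())
    return {m: w / norm for m, w in adjusted.items()}

weights = {"random_forest": 0.5, "deep_learning": 0.5}
base = {"random_forest": 12000.0, "deep_learning": 8000.0}
final = combine_predictions(base, weights)  # 10000.0 with equal weights

# Feedback indicates the random forest's predictions were less useful
# for outreach; its relative weight shrinks for future combinations.
weights = update_weights(weights, {"random_forest": 0.5})
```

Over many feedback cycles, such an update shifts the final prediction toward the base models whose outputs best identified "intervenable" members.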
- Accordingly, the predictions made by the systems and methods described herein can be more accurate than conventional predictions made using linear regression methods alone. For example, test data has shown improved performance of an example of the ensemble-based model described herein, relative to previous risk models that seek to predict who will become a high-risk claimant or who is a rising risk member. For example, based on various input data types including diagnosis-only data, pharmacy-only data, diagnosis and pharmacy data, and/or a combination of diagnosis data, pharmacy data, and prior year costs, the test data indicated that the ensemble-based model described herein performed better than other risk models with respect to metrics including a percentage of cost variation explained by the model, a mean absolute error from perfect accuracy, and an area under the curve (C-statistic) indicating a probability of correctly predicting which of two people is higher risk. Based on this improved accuracy of predictions, the systems and methods described herein can better identify which members are likely to become high cost and/or rising risk members, and may therefore be better candidates for member outreach services.
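- The C-statistic mentioned above can be computed as the fraction of member pairs that a model ranks in the same order as their actual costs. The sketch below uses invented sample values; it is a standard formulation of the metric, not code from the described system.

```python
from itertools import combinations

def c_statistic(predicted: list, actual: list) -> float:
    """Probability of correctly predicting which of two members has higher actual cost."""
    concordant, tied, comparable = 0, 0, 0
    for i, j in combinations(range(len(actual)), 2):
        if actual[i] == actual[j]:
            continue  # pairs with identical actual costs are not comparable
        comparable += 1
        if predicted[i] == predicted[j]:
            tied += 1  # a tied prediction counts as half correct
        elif (predicted[i] - predicted[j]) * (actual[i] - actual[j]) > 0:
            concordant += 1  # model and reality order the pair the same way
    return (concordant + 0.5 * tied) / comparable

# Four members; the model mis-orders one of the six comparable pairs.
score = c_statistic([5000, 9000, 2000, 4000], [6000, 12000, 1000, 7000])  # 5/6
```

A C-statistic of 0.5 is no better than chance; the text above notes that many conventional risk adjustment models score below 0.6 on this metric.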
- As such, the ensemble-based model described herein can improve identification of members who may benefit the most from member outreach services that assist the members with finding care, arranging care, and that provide other medical and social support and education to members. Accordingly, time and computing resources associated with member outreach services can be more efficiently directed to those members who may benefit the most from those member outreach services. Additionally, attributes that lead the ensemble-based model to identify a member as a high cost or rising risk member can be identified to a representative in an outreach ticket, such that the identification of such attributes in an outreach ticket can improve efficiency of a member outreach workflow and/or reduce computing resources used by the representative. For example, rather than cold-calling the member without prior knowledge of what the member may need, or using time and computing resources to independently research the member's situation before contacting the member, a representative can use information provided directly in an outreach ticket to understand why a member may benefit from member outreach services.
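- The kind of outreach ticket described above, which surfaces the attributes that drove the model to flag a member, might be assembled as follows. This is a hedged sketch: the field names, attribute names, and contribution scores are invented for illustration and are not the patent's ticket format.

```python
def build_outreach_ticket(member_id: str, predicted_cost_change: float,
                          attribute_contributions: dict, top_n: int = 3) -> dict:
    """Attach the highest-contributing attributes to the ticket, largest first."""
    ranked = sorted(attribute_contributions.items(), key=lambda kv: kv[1], reverse=True)
    return {
        "member_id": member_id,
        "predicted_cost_change": predicted_cost_change,
        "top_factors": [name for name, _ in ranked[:top_n]],
    }

# Hypothetical member flagged for a predicted $15,000 cost rise, with
# illustrative attribute-contribution scores from the model.
ticket = build_outreach_ticket(
    "M-001",
    15000.0,
    {"recent_er_visits": 0.45, "new_insulin_prescription": 0.30,
     "missed_followup_appointment": 0.15, "zip_code_demographics": 0.10},
)
```

A representative reading such a ticket sees at a glance why the member was flagged, rather than researching the member's situation from scratch before calling.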
-
FIG. 1 depicts an example of a member outreach system 100 associated with an administrator of a health plan. In some examples, the administrator can administer the health plan on behalf of a sponsor. For example, the sponsor can be an employer that offers a health plan to its employees and/or family members of the employees, and the administrator can be a third-party entity that manages the health plan for the employer. In other examples, the sponsor can be an insurance company or any other entity that offers a health plan, but uses a third-party administrator to manage administration of the health plan. In still other examples, the administrator can itself be an insurance company or other entity that directly offers a health plan. - The
member outreach system 100 can be associated with member outreach services that can be provided by the administrator to members 102 of the health plan. Member outreach services may, for example, help members 102 find in-person healthcare services, help members 102 make healthcare appointments, help members 102 find transportation to and from healthcare appointments, educate members 102 about their medical conditions or medications, and/or otherwise assist members 102 in overcoming hurdles in obtaining healthcare services. - The
member outreach system 100 can predict when members 102 may benefit from member outreach services. In some examples, when the member outreach system 100 predicts that a member 102 may benefit from member outreach services, the member outreach system 100 can prompt a representative 104 to attempt to contact the member 102 to offer the member outreach services. A representative 104 can be a nurse, physician, pharmacist, social worker, member outreach agent, care coordinator, or any other type of representative of the administrator or an associated entity. In other examples, when the member outreach system 100 predicts that a member 102 may benefit from member outreach services, the member outreach system 100 can automatically send notifications or other information directly to the member 102, as discussed further below. In still other examples, predictions made by the member outreach system can be used to refer members 102 to partner providers or other entities. Accordingly, such partner providers may be able to provide healthcare services directly to members 102 in addition to, or instead of, member outreach services provided by representatives 104. - The
member outreach system 100 can include one or more computing devices 106, such as servers, workstations, cloud computing elements, and/or other computing devices. The computing devices 106 can have, and/or access, at least one healthcare data repository 108. The computing devices 106 can also have a prediction engine 110 and a ticket manager 112. In some examples, the ticket manager 112 may be part of the prediction engine 110. In other examples, the prediction engine 110 and the ticket manager 112 can be separate components that execute on the same or different computing devices 106. One or more healthcare data repositories 108 may also be stored on the same or different computing devices 106 than the prediction engine 110 and/or the ticket manager 112. - In some examples,
representatives 104 can use computing devices 106 to access the ticket manager 112 and/or other elements of the member outreach system 100. For example, a representative 104 can use a computer, workstation, mobile device, or other type of computing device 106 to locally execute an application that interfaces with the ticket manager 112 and/or other elements of the member outreach system 100 executing on a server or other remote computing device 106. As another example, a representative 104 can use a computer, workstation, mobile device, or other type of computing device 106 to access the ticket manager 112 and/or other elements of the member outreach system 100 executing on one or more different computing devices 106 via a web browser or other user interface. In still other examples, a representative 104 may directly use the same one or more computing devices 106 that store and/or execute the healthcare data repository 108, prediction engine 110, and/or ticket manager 112. -
FIG. 2 depicts an example of data that can be stored in a healthcare data repository 108. A healthcare data repository 108 can be a database or other type of data repository that stores data associated with members 102 of a health plan, data about healthcare claims associated with members 102 of the health plan, data about healthcare providers, and/or other data. In some examples, a healthcare data repository 108 can use HIPAA-compliant security measures, and/or other security measures, to protect the privacy of stored information. In some examples, a healthcare data repository 108 can store enrollment data 202, claim data 204, engagement data 206, demographic data 208, provider data 210, and/or other types of data. -
Enrollment data 202 can include information about members 102 that has been provided by members, sponsors, partner networks, and/or other entities. For example, enrollment data 202 can include a member's name, age, gender, ZIP code, other contact information, and/or other types of information about the member 102. In some examples, information in the enrollment data 202 may have been provided by a member 102 when the member 102 filled out enrollment paperwork online or on paper forms when the member 102 signed up for the health plan. In other examples, enrollment data 202 can be provided by a sponsor of the health plan, such as an employer that enrolls employees in a health plan and provides employee information as enrollment data 202. In still other examples, enrollment data 202 can be provided by, and/or updated by, members 102 through a health plan account website, mobile application, and/or other mechanism. - Claim
data 204 can include information about insurance claims or other claims that have been submitted in association with members 102 of one or more health plans. Claims may be submitted by members 102, healthcare providers, and/or other sources. Claims may be associated with healthcare services, such as medical services, pharmaceutical services, dental services, vision care services, and/or other types of services. For example, when a member 102 obtains healthcare services from a healthcare provider, the member 102 and/or the healthcare provider may submit a claim to have some or all of the cost of the healthcare services be covered by a health plan. In some examples, claims may be submitted directly to the administrator. In other examples, claims may be submitted to insurance companies or other entities, who then provide claim data 204 to the administrator. In some examples, claim data 204 can include information from partner networks, partner providers, other administrators, and/or other entities about claims submitted against other health plans. - Data included within submitted claims, data provided in association with submitted claims, and/or data derived from submitted claims can be stored as
claim data 204 within a healthcare data repository 108. For example, claim data 204 can include names, membership numbers, contact information, and/or other information about members 102 associated with submitted claims. Claim data 204 can include names, identifiers, contact information, and/or other information about healthcare providers associated with submitted claims. For instance, claim data 204 can include a national provider identification number of a healthcare provider associated with a claim. In some examples, information in claim data 204 about healthcare providers can also correspond to provider data 210 stored in the healthcare data repository 108. - In some examples, claim
data 204 can include diagnosis information associated with submitted claims. For instance, diagnosis information may be included in a submitted claim using a diagnosis code, such as diagnosis codes defined in the 10th revision of the International Classification of Diseases, Clinical Modification (ICD-10-CM), or other types of diagnosis codes. In some examples, ICD-10-CM codes or other diagnosis codes can be categorized into diagnosis categories within claim data 204, such as Clinical Classifications Software (CCS) codes used by the Agency for Healthcare Research and Quality. - In some examples, claim
data 204 may also include information from pre-authorization requests that have been submitted prior to planned healthcare procedures. For instance, a pre-authorization request for a radiology procedure may be submitted by a provider to check whether a member's health plan covers that radiology procedure. In this example, information in or derived from the pre-authorization request can be added to the claim data 204 in the healthcare repository to indicate that a healthcare procedure may be planned for a member 102, even if a final claim for that procedure has not yet been received. - In some examples, claim
data 204 may be received from partners of the administrator that offer services that may not be directly covered by a health plan. For example, an employer may be a sponsor that offers employees a health plan that does not directly cover fertility services. However, the administrator may partner with certain fertility service providers, and recommend such fertility service providers to members even if they are not directly covered by the members' health plan. Alternatively, the sponsor employer may offer fertility services through a partner provider as a separate benefit apart from a health plan. Accordingly, such partner providers may provide the administrator with some types of data about whether members 102 have used the partner provider's services, and such data can be stored as claim data 204 in the healthcare data repository 108 even if no official claim has been submitted in association with those services. -
Engagement data 206 may include information about how members 102 have used websites, mobile applications, and/or other resources provided by the administrator. For example, the administrator may provide websites or mobile applications that allow members 102 to search for providers, contact providers, access a member portal to update enrollment data 202, and/or perform other actions associated with a health plan. The engagement data 206 may indicate whether members 102 have registered for an account to access a website and/or mobile application, what searches those members 102 have performed in the website and/or mobile application, whether members 102 clicked through on search results to view provider information and/or contact providers, whether members 102 have taken action on care recommendations provided to them through the website and/or mobile application, and/or other types of usage data or engagement data. In some examples, the engagement data 206 may also indicate locations of members 102. For example, when a member 102 uses a mobile application on a smartphone, the mobile application may report location information about the smartphone, such as GPS coordinates. The mobile application can use that location information to display information about healthcare providers who are near the member's location, such as information about nearby healthcare providers that is drawn from the provider data 210 or other data sources. Location information associated with a mobile application and/or smartphone can also be stored in engagement data 206 in association with a member 102. -
Demographic data 208 can include information, such as publicly available information, about people and/or environmental conditions in different geographical locations. For example, demographic data 208 can indicate, for a given ZIP code, education levels of people in the ZIP code, air pollution rates in the ZIP code, cancer rates in the ZIP code, a number of fast food restaurants in the ZIP code, and/or other types of information. -
Provider data 210 can include information about healthcare providers. For example, the provider data 210 can include names of healthcare providers, names of doctors or other healthcare workers who are associated with the healthcare providers, locations of the healthcare providers, contact information for the healthcare providers, information about specialties of the healthcare providers, national provider identification numbers of the healthcare providers, and/or other information about healthcare providers. - The
healthcare repository 108 can be accessible or indexed at a member level. For example, data in the healthcare repository 108 can be correlated, filtered, indexed, or searched at a member level. As a non-limiting example, for a particular member 102, information about the member 102 can be obtained from the enrollment data 202, information from claims associated with the member 102 can be obtained from the claim data 204, information about how the member 102 has used a website and/or mobile application can be obtained from the engagement data 206, and information about demographics and/or environmental conditions in the member's ZIP code can be obtained from the demographic data 208. - In some examples, the
healthcare repository 108 may also, or alternately, be accessible or indexed at a provider level. For example, data in the healthcare repository 108 can be correlated, filtered, indexed, or searched at a provider level, for instance in association with one or more healthcare providers identified in provider data 210 and/or in claim data 204. As a non-limiting example, claim data 204 for claims associated with certain providers may be used to identify trends associated with individual providers, trends within one set of providers relative to other providers, geographical provider trends, and/or other provider-level data. As another non-limiting example, claim data 204 may reveal that one provider generally recommends physical therapy before scheduling a type of surgery, while another provider schedules that type of surgery without first attempting physical therapy. As yet another non-limiting example, claim data 204 may reveal local trends, for instance that providers in one locality are more likely to recommend a certain type of procedure as treatment for a particular medical condition than similar providers in a different locality. - In some examples, provider-level data can be used in the
prediction engine 110 to generate predictions 114 of costs and/or visits associated with the providers. In other examples, provider-level data can also be used when representatives 104 provide member outreach services to members 102. For example, if provider-level data indicates that a first provider often charges much more for a type of procedure than a second provider, representatives 104 may recommend or suggest to members 102 that the members 102 visit the second provider. - In some examples, the
healthcare repository 108 can be accessible or indexed at both the member level and the provider level. As an example, enrollment data 202 and/or claim data 204 may identify demographic information about a member 102, such as the member's age, sex, and ZIP code. Such member-specific demographic information can be referenced against claim data 204 associated with healthcare providers that members 102 have visited, to determine practice patterns of the healthcare providers. Such member-level data and provider-level data may accordingly be used together in the prediction engine 110, as discussed further below. - Returning to
FIG. 1, the prediction engine 110 can use data from the one or more healthcare data repositories 108 to generate predictions 114 associated with the members 102. Examples of how the prediction engine 110 can be configured to generate predictions 114 are discussed in more detail below with respect to FIG. 3 and FIG. 4. - A
prediction 114 generated by the prediction engine 110 for a member 102 can indicate one or more cost estimates associated with the member 102, such as a total estimated healthcare cost, and/or a change in healthcare costs relative to previous healthcare costs, associated with the member 102 over the next year or other future time period. For example, a prediction 114 may include an indication that a member's healthcare costs are predicted to rise, or decrease, by a predicted amount over the next year. - A
prediction 114 may also, or alternately, indicate one or more estimated utilizations associated with the health plan. For example, a prediction 114 can include an estimated number of visits to healthcare providers in total, and/or an estimated number of visits to providers in one or more provider categories, over the next year or other future time period. As a non-limiting example, a prediction 114 may indicate that, based on data in one or more healthcare data repositories 108, a member 102 is predicted to incur $15,000 more in healthcare costs over the next year than the previous year, and is predicted to visit healthcare providers ten times over the next year, including three visits to emergency rooms, four visits to a primary care physician, two visits to a cardiologist, and one visit to a chiropractor. - A
prediction 114 may also, or alternately, indicate one or more estimated risk metrics associated with a member 102. For example, a prediction 114 can indicate predicted risk metric values such as a predicted risk of hospitalization, a predicted risk of readmission to a hospital, a predicted mortality rate, predicted suffering or pain levels, a predicted risk of pharmaceutical advancement to more expensive medications and/or to medications with increased levels of side effects, and/or other types of predicted risk metrics. - In some examples, member outreach services can be offered to
members 102 based on predicted risk metrics of the members 102, and/or comparisons between predicted risk metrics for different members 102. For example, predictions 114 may indicate that a pain level of a first member 102 is expected to be higher than a pain level of a second member 102, based on claim data 204 indicating that the first member 102 has been diagnosed with a disease that is generally more painful than a disease with which the second member 102 has been diagnosed. In this example, the first member 102 may in some cases be prioritized for member outreach services over the second member 102, as the member outreach services may help reduce the higher predicted pain levels of the first member 102. As another example, a prediction 114 may indicate that a suffering level and/or mortality risk for a particular member 102 is expected to increase over the next two months. Accordingly, the member 102 can be prioritized for member outreach services that may help reduce, or lessen the increase in, the member's suffering level or mortality risk. - In some examples, the various types of information associated with a
member 102 in the healthcare data repositories 108 may provide more complete information about a member 102 for the prediction engine 110 than could be provided by claim data 204 alone. For instance, engagement data 206 may indicate that a member 102 was using a mobile application to search for and contact podiatrists before claim data 204 indicates that foot care claims for the member 102 have been submitted. The engagement data 206 may therefore be used by the prediction engine 110 to predict that the member's healthcare costs may be expected to rise over the next month due to possible upcoming foot care services, even if claim data 204 does not yet indicate that foot care services have been provided to the member 102. - Provider-level data from the
healthcare data repositories 108 can be used by the prediction engine 110 to make predictions 114 associated with members 102. For example, enrollment data 202, engagement data 206, and/or provider data 210 may indicate that a member 102 is located in a geographical area where provider-level data indicates that an expensive procedure is used more often to treat a certain condition than in other locations. As another example, past claim data 204 and/or provider data 210 may indicate that a specific provider, which engagement data 206 indicates a member 102 has searched for, routinely uses an expensive procedure more often than other nearby providers. In these examples, the prediction engine 110 may predict that the member's costs may be expected to rise due to a relatively high chance of the expensive procedure being performed. - Overall, machine learning and/or other artificial intelligence in the
prediction engine 110 can analyze data in one or more healthcare data repositories 108, at the member-level and/or the provider-level, to generate predictions 114 based on multiple factors and/or interactions between multiple factors. For example, various factors from enrollment data 202, claim data 204, engagement data 206, and/or demographic data 208 may indicate that people in relatively low-income ZIP codes who take insulin to treat diabetes have a spike in emergency room visits during the last week of each month, possibly because such people have relatively little money left from a first-of-the-month paycheck and do not have sufficient food to prevent insulin dosages from leading to low blood sugar. The prediction engine 110 can take such factors, and/or interactions between such factors, into account when generating predictions 114 of costs, health plan utilizations, risk metrics, and/or other values. As will be discussed below with respect to FIG. 3, the prediction engine 110 can include multiple base predictive models 304 that can take various factors from one or more healthcare data repositories 108 into account when generating base predictions 306. The base predictions 306 from multiple base predictive models 304 can be combined into a final prediction 114. - Based on
predictions 114 generated by the prediction engine 110, the ticket manager 112 can be configured to generate outreach tickets 116 for representatives 104. The outreach tickets 116 can be viewable and/or accessible via computing devices 106 used by the representatives 104. The representatives 104 can at least attempt to contact members 102 identified by outreach tickets 116 to offer member outreach services to the members 102. The representatives 104 can also use computing devices 106 to provide ticket feedback 118 to the ticket manager 112 about outreach tickets 116 and/or corresponding members 102. As will be discussed further below, the prediction engine 110 can use ticket feedback 118 provided by representatives 104 to adjust how the prediction engine 110 generates subsequent predictions 114. In some examples, the computing devices 106 may also, or alternately, provide automated notifications 120 to members 102 in response to generated predictions 114. - In some examples, the
ticket manager 112 can generate outreach tickets 116 based in part on cost estimates, utilization metrics, risk metrics, and/or other data in predictions 114 generated by the prediction engine 110. For example, a prediction 114 may indicate that a member's total costs are predicted to be over $50,000, or another threshold value, over the next year, and thus indicate that the member 102 is predicted to be a “high cost” member 102. As another example, a prediction 114 may indicate that a member's healthcare costs are predicted to rise by over $1,000, or another threshold value, over the next month relative to the previous month, and thus indicate that the member 102 is predicted to be a “rising risk” member 102. A member 102 may also, or alternately, be considered to be a “rising risk” member 102 if risk metrics in a prediction 114 indicate that the member 102 is predicted to utilize or visit healthcare providers in a future time period more often than the member 102 has in the past, that the member's risk of admission or readmission to a hospital or other healthcare facility is predicted to increase, that the member's suffering levels or mortality risks are expected to increase relative to previous values, that the member's pharmaceutical usage is predicted to advance to more expensive medications and/or to medications with increased side effects, and/or that the member's risk in any other category is predicted to increase during a future time period. - In these examples, the
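threshold checks just described can be reduced to a simple classification step. The following is a minimal sketch using the example dollar amounts from the text; the function and field names are hypothetical:

```python
# Illustrative sketch of the "high cost" / "rising risk" threshold checks.
# The dollar thresholds are the example values from the text; the function
# and field names are hypothetical, not the claimed implementation.
def classify_member(prediction,
                    high_cost_threshold=50000,
                    rising_risk_threshold=1000):
    labels = []
    # Predicted total costs over the next year exceed the threshold.
    if prediction["predicted_annual_cost"] > high_cost_threshold:
        labels.append("high cost")
    # Predicted month-over-month rise in costs exceeds the threshold.
    if prediction["predicted_monthly_cost_rise"] > rising_risk_threshold:
        labels.append("rising risk")
    return labels

labels = classify_member({"predicted_annual_cost": 62000,
                          "predicted_monthly_cost_rise": 1500})
```

A member 102 satisfying both checks would receive both labels, and either label could trigger ticket generation.

- In these examples, the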
ticket manager 112 may generate an outreach ticket 116 associated with the member 102 due to the total costs, or one or more predicted rises in costs, utilizations, and/or risk metrics, exceeding a threshold amount. Based on the outreach ticket 116, member outreach services may be able to proactively assist the member 102 to obtain healthcare services and/or otherwise manage the healthcare issues associated with the predicted costs, utilizations, or risk metrics. For instance, member outreach services may be able to assist the member 102 with proactively seeking medical care, before a condition worsens and more costly healthcare services may be needed to manage the condition. Member outreach services may therefore proactively improve the health of the member 102, reduce the healthcare costs ultimately incurred by the member 102, reduce a number of utilizations associated with the member 102, and/or reduce risk metrics associated with the member 102. Accordingly, in some examples, member outreach services may cause healthcare costs incurred by a member 102 over a period of time to be lower than cost estimates in a prediction 114 for that period of time. - An
outreach ticket 116 can identify a corresponding member 102, for example by including the member's name, contact information, and/or other demographic information. The outreach ticket 116 may also indicate one or more reasons why the member 102 may benefit from member outreach services, based on information in a final prediction 114 and/or base predictions 306. For example, key variables from the healthcare data repository 108 that most influenced the final prediction 114 and/or base predictions 306 can be identified and flagged within an outreach ticket 116. In some examples, a Local Interpretable Model-Agnostic Explanations (LIME) method can be used to identify the variables that most influenced a prediction 114 or its component base predictions 306. For example, a LIME analysis may indicate that a new cancer diagnosis or a change in prescribed medications was one of the primary reasons that caused the prediction engine 110 to generate a prediction 114 indicating that a member 102 is expected to have a rise in costs, a rise in utilizations, and/or increased risk metrics over an upcoming period of time, while a recent procedure for a flu shot is not a key variable that is associated with the predicted rise in costs, utilizations, or risk metrics. The ticket manager 112 can generate an outreach ticket 116 that includes indicators of the identified key variables. For instance, in some examples, the ticket manager 112 can generate an outreach ticket 116 by filling in blanks of a template with language associated with the identified key variables, such as the new cancer diagnosis or the change in prescribed medications, to indicate to a representative 104 why the corresponding member 102 may benefit from member outreach services. - As another example, if
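the LIME-style key-variable analysis is approximated in code, it might look like the following simplified local-surrogate sketch: perturb a member's feature vector, query a black-box cost model on the perturbations, and fit a distance-weighted linear model whose coefficients rank the influential variables. The cost model and feature names below are toy stand-ins, not the actual prediction engine 110:

```python
import numpy as np

def black_box_cost_model(X):
    # Toy stand-in for a trained cost model: cost is driven mostly by a
    # "new cancer diagnosis" flag, somewhat by a "medication change" flag,
    # and barely at all by a "flu shot" flag. Purely illustrative.
    return 40000 * X[:, 0] + 8000 * X[:, 1] + 50 * X[:, 2]

def lime_style_explanation(model, x, n_samples=500, seed=0):
    rng = np.random.default_rng(seed)
    # Perturb the member's feature vector around its original values.
    Z = x + rng.normal(0.0, 0.3, size=(n_samples, x.size))
    y = model(Z)
    # Weight perturbed samples by proximity to the original instance.
    w = np.exp(-np.sum((Z - x) ** 2, axis=1))
    # Weighted least squares fit of a local linear surrogate model.
    A = np.hstack([Z, np.ones((n_samples, 1))])
    W = np.sqrt(w)[:, None]
    coef, *_ = np.linalg.lstsq(A * W, y * W[:, 0], rcond=None)
    return coef[:-1]  # one local coefficient per feature

features = ["new_cancer_diagnosis", "medication_change", "flu_shot"]
x = np.array([1.0, 1.0, 1.0])
coefs = lime_style_explanation(black_box_cost_model, x)
ranked = [features[i] for i in np.argsort(-np.abs(coefs))]
```

Here `ranked` lists variables from most to least influential for this member's prediction, which is the ordering a ticket manager could use when flagging key variables.

- As another example, if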
claim data 204 indicates that the member 102 has been diagnosed with diabetes, but has not recently filled a prescription for insulin, the prediction engine 110 may generate a prediction indicating that the member's costs are predicted to rise due to increased visits to a diabetes specialist to manage conditions resulting from missed medications. In this example, the ticket manager 112 may generate an outreach ticket 116 indicating to a representative 104 that the member 102 may need assistance with insulin medications or other issues associated with diabetes. - In some examples, the
ticket manager 112 may use data in a prediction 114, such as cost predictions, predicted total utilizations across all types of providers, predicted utilizations of one or more specific providers or types of providers, and/or predicted risk metrics, to select corresponding words, phrases, or values for variables in templates for outreach tickets 116. For example, the ticket manager 112 may have one or more templates for outreach tickets 116. A template may or may not have some predefined language, and may include blank spaces where words or phrases can be inserted, variable fields where values can be entered or selected, or other portions that can be otherwise changed based on a prediction 114. For example, the ticket manager 112 may select words, phrases, or other values for corresponding elements of a template based on key variables of predictions 114 generated by the prediction engine 110. - A generated
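ticket of this kind can be produced by straightforward template filling. A minimal sketch, in which the template text and the phrase mapping are illustrative assumptions rather than language from the source:

```python
# Hypothetical sketch of template-based ticket text generation: key
# variables identified for a prediction 114 are mapped to phrases and
# inserted into blanks in an outreach-ticket template.
TEMPLATE = ("Member {name} is predicted to be a {risk_label} member. "
            "Key factors: {factors}. Suggested topic: {topic}.")

# Mapping of key-variable identifiers to human-readable phrases.
PHRASES = {
    "new_cancer_diagnosis": "a new cancer diagnosis",
    "medication_change": "a change in prescribed medications",
}

def render_ticket(name, risk_label, key_variables, topic):
    factors = ", ".join(PHRASES.get(v, v) for v in key_variables)
    return TEMPLATE.format(name=name, risk_label=risk_label,
                           factors=factors, topic=topic)

ticket_text = render_ticket("J. Doe", "rising risk",
                            ["new_cancer_diagnosis", "medication_change"],
                            "care coordination")
```

The rendered text would then be what a representative 104 sees when opening the outreach ticket 116.

- A generated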
outreach ticket 116 may assist a representative 104 in understanding what types of member outreach services should be offered to a member 102. For example, if key variables from predictions 114 indicated that a first member 102 has been diagnosed with a foot problem and also has been diagnosed with diabetes, an outreach ticket 116 may identify those diagnoses. A representative 104 may accordingly understand from the outreach ticket 116 that the first member 102 may need assistance with insulin medications or other diabetes-related issues. However, if a prediction 114 for a second member 102 indicates a diagnosis of a similar foot issue but also indicates that the second member 102 has been visiting a sports therapist, the member's foot issue may have occurred because the second member 102 is a runner. In this example, a representative 104 may understand from a corresponding outreach ticket 116 that the second member 102 may need a different type of member outreach services than the first member 102 who has diabetes. As will be discussed further below, ticket feedback 118 can be received from a representative 104 about how “interpretable” the information in the outreach ticket 116 was to the representative 104, such as a score indicating whether the outreach ticket 116 expressed understandable and/or useful information to the representative 104. - The
ticket manager 112 can provide the outreach tickets 116 to representatives 104. In some examples, the ticket manager 112 can add a generated outreach ticket 116 to a queue of outreach tickets 116 available to representatives 104. For instance, a representative 104 may use a user interface of the ticket manager 112, or other application executing on a computing device 106, to view and/or select outreach tickets 116 from the queue of outreach tickets 116. In other examples, the ticket manager 112 can send a generated outreach ticket 116 to a representative 104 via an email, text message, or other type of electronic notification, and/or otherwise provide the outreach ticket 116 to a representative 104. - In some examples, the
outreach tickets 116 may be associated with priority levels, and the ticket manager 112 can arrange, sort, and/or filter outreach tickets 116 in the queue based on priority levels of the outreach tickets 116. For instance, the ticket manager 112 can arrange a queue such that more urgent outreach tickets 116 are prioritized over, and/or placed closer to the front or to the top of the queue than, less urgent outreach tickets 116. As will be discussed further below, a priority level of an outreach ticket 116 can be based at least in part on cost estimates, utilization estimates, risk metric estimates, or other data in an associated prediction 114. For example, a first outreach ticket 116 associated with a first prediction 114 indicating that a first member's costs are expected to rise by $15,000 may be a higher priority than a second outreach ticket 116 associated with a second prediction 114 indicating that a second member's costs are expected to rise by $10,000. - However, in some examples, a priority level of an
outreach ticket 116 can also be based, at least in part, on ticket feedback 118 from representatives 104. The ticket feedback 118 can indicate how “interpretable” previous similar outreach tickets 116 were to the representatives 104, based on how well the representatives 104 subjectively felt that the previous outreach tickets 116 expressed information about why a member 102 was to be contacted. The ticket feedback 118 can also, or alternately, indicate how “intervenable” members 102 identified by previous similar outreach tickets 116 were, based on subjective input from representatives 104 about whether those members 102 were good candidates for member outreach services. For instance, in the example above, if previous ticket feedback 118 from representatives 104 has indicated that members 102 similar to the second member 102 are more “intervenable” via member outreach services than members 102 similar to the first member 102, the second outreach ticket 116 may be prioritized above the first outreach ticket 116 even though the first outreach ticket 116 is associated with a higher predicted rise in costs. - Based on an
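illustrative priority function, the ordering just described can be sketched as a predicted cost rise scaled by an intervenability factor learned from past ticket feedback 118. The multiplicative scaling rule and the field names below are assumptions, not the claimed method:

```python
# Illustrative sketch of queue ordering: the predicted cost rise is scaled
# by how intervenable similar members have historically been, per past
# ticket feedback 118. Field names and the scaling rule are hypothetical.
tickets = [
    {"id": "T1", "predicted_rise": 15000, "intervenability": 0.2},
    {"id": "T2", "predicted_rise": 10000, "intervenability": 0.9},
]

def priority(ticket):
    # Higher predicted rise and higher historical intervenability both
    # push a ticket toward the front of the queue.
    return ticket["predicted_rise"] * ticket["intervenability"]

queue = sorted(tickets, key=priority, reverse=True)
```

With these illustrative numbers, the $10,000 ticket outranks the $15,000 ticket because its members have historically been far more intervenable, matching the example in the text.

- Based on an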
outreach ticket 116 that identifies a member 102, a representative 104 can at least attempt to contact the member 102 to offer and/or provide member outreach services to the member 102. For example, the representative 104 can place a phone call to a phone number of the member 102, send a text message to the phone number of the member 102, send an email to an email address of the member 102, initiate a video chat with the member 102, and/or otherwise at least attempt to contact the member 102 to provide member outreach services to the member 102. - In some examples, the
computing devices 106 can alternately, or additionally, attempt to contact a member 102 identified by a prediction 114 and/or an outreach ticket 116 via an automated notification 120. For example, the computing devices 106 can be configured to send an email to an email address of the member 102, send a text message to a phone number of the member 102, make a phone call to a phone number of the member 102 to play a prerecorded audio message for the member 102 via the phone call, display a flagged notification in a member portal accessible via a website or mobile application, display a notification on a smartphone via a mobile application, and/or attempt to contact the member 102 using any other automated communication mechanism. An automated notification 120 can, based on a prediction 114, indicate to the member 102 why the member 102 is being contacted. For example, an automated notification 120 can be an automated email that reminds a member 102 to take a medication and/or that provides medication instructions. - In some examples, a
healthcare data repository 108, or other information about members 102 that is available to the computing devices 106, may indicate communication preferences of members 102. For example, communication preferences of a member 102 may indicate that the member 102 prefers phone calls over email communications. In this example, the computing devices 106 may follow the communication preferences of the member 102 by providing an outreach ticket 116 to a representative 104, so that the representative 104 can place a phone call to the member 102. If the communication preferences of the member 102 instead indicate that the member 102 prefers email communications over phone calls, the computing devices 106 may instead attempt to contact the member 102 using an emailed automated notification 120 instead of providing an outreach ticket 116 to a representative 104. In some examples, communication preferences of a member 102 may be based on enrollment data 202, such as from preference information indicated by the member 102 on a health plan enrollment form. In other examples, the computing devices 106 can infer or derive communication preferences of a member 102 based on engagement data 206 or other data about interactions with the member 102 over time. For instance, if a member 102 answers phone calls from administrator personnel more often than the member 102 responds to emails from the administrator, the computing devices 106 can infer that the member 102 prefers to communicate via phone calls and indicate that inferred preference in data about communication preferences of the member 102. - In some examples, the
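inference of a preferred channel from engagement history can be sketched as a comparison of per-channel response rates. All field names below are hypothetical, and a production system would likely also weight recency and sample size:

```python
# Hypothetical sketch of inferring a contact preference from engagement
# history: compare answer/response rates per channel and pick the highest.
def infer_preferred_channel(history):
    rates = {
        channel: stats["responded"] / stats["attempts"]
        for channel, stats in history.items()
        if stats["attempts"] > 0  # skip channels never attempted
    }
    return max(rates, key=rates.get)

# The member answered 7 of 10 calls but replied to only 3 of 12 emails.
history = {
    "phone": {"attempts": 10, "responded": 7},
    "email": {"attempts": 12, "responded": 3},
}
preferred = infer_preferred_channel(history)
```

- In some examples, the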
computing devices 106 may override communication preferences of a member 102. For example, if a prediction 114 and/or outreach ticket 116 is generated with a priority level above a certain threshold, for instance if the member 102 is predicted to incur a sudden rise in costs over a short period of time, the ticket manager 112 may provide an outreach ticket 116 to a representative 104. The outreach ticket 116 may instruct the representative 104 to call the member 102 to address conditions associated with the predicted sudden rise in costs, even if communication preferences of the member 102 indicate that the member 102 prefers email communications. - When the
computing devices 106 provide an outreach ticket 116 for a member 102 to a representative 104, the representative 104 may at least attempt to contact the member 102 to provide member outreach services, as discussed above. The representative 104 can also return ticket feedback 118 associated with the outreach ticket 116 to the computing devices 106. The ticket feedback 118 can include subjective feedback from the representative 104 about an outreach ticket 116. In some examples, the ticket manager 112 can provide a survey, form, user interface, and/or other mechanisms that representatives 104 can use to provide ticket feedback 118. - In some examples,
ticket feedback 118 may indicate a score, based on subjective input from a representative 104, of how “intervenable” a member 102 was via member outreach services. For example, if a representative 104 could not reach a member 102 to offer member outreach services because the member 102 had already been hospitalized for a medical condition, the representative 104 may provide ticket feedback 118 with a score indicating that the member 102 was a relatively poor candidate for member outreach services because no intervention prior to hospitalization was possible. Similarly, if a representative 104 determines that a member 102 has a terminal disease for which little meaningful intervention is possible, the representative 104 may also provide ticket feedback 118 with a score indicating that the member 102 was a relatively poor candidate for member outreach services. However, if a representative 104 was able to successfully contact a member 102, and was able to provide assistance to the member 102 that the representative 104 felt was beneficial, the representative 104 may provide ticket feedback 118 with a score indicating that the member 102 was a relatively good candidate for member outreach services. For instance, if an outreach ticket 116 indicates that a member 102 was recently diagnosed with asthma, the representative 104 may be able to educate the member 102 about how to use an inhaler and thereby reduce the risk of the member 102 visiting an emergency room due to the member's asthma. - In some examples,
ticket feedback 118 may also, or alternately, indicate a score, based on subjective input from a representative 104, of how “interpretable” an outreach ticket 116 was to the representative 104. For example, if an outreach ticket 116 identifies a member 102 but does not indicate why that member 102 is being referred for member outreach services, a representative 104 may not understand from the outreach ticket 116 why the member 102 is to be contacted. The representative 104 may need to call the member 102 without understanding the member's situation, which may impact the quality or types of member outreach services provided to the member 102. In this case, the representative 104 may provide ticket feedback 118 indicating that the outreach ticket 116 itself was unclear and/or unhelpful to the representative 104. - In some examples,
ticket feedback 118 can be based on Likert scale assessments by a representative 104 of one or more aspects of an outreach ticket 116. As an example, a representative 104 can provide a rating on a scale of 0 to 5, or any other scale, for categories such as “intervenability,” “interpretability,” and/or other categories associated with an outreach ticket 116. For instance, the representative 104 can provide an intervenability rating on a 0 to 5 scale based on whether the representative 104 felt that the member 102 identified by the outreach ticket 116 was a suitable candidate for member outreach services, and can also provide an interpretability rating on a 0 to 5 scale based on whether the representative 104 felt that the outreach ticket 116 sufficiently explained a health issue or rise in costs associated with the member 102 that could be addressed using member outreach services. In some examples, scores in different categories can be summed together, or otherwise combined, into a total score for the ticket feedback 118. For example, if a representative 104 provides an intervenability score of “3” and an interpretability score of “4,” the ticket feedback 118 may indicate a total score of “7” for an outreach ticket 116. - In some examples, the
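summing of category ratings into a total score is simple to sketch, using the 0-to-5 scale and the 3 + 4 = 7 example above:

```python
def total_feedback_score(ratings):
    """Combine per-category Likert ratings (0 to 5) into a total score."""
    for category, score in ratings.items():
        if not 0 <= score <= 5:
            raise ValueError(f"{category} rating out of range: {score}")
    return sum(ratings.values())

# Intervenability 3 plus interpretability 4 gives a total score of 7.
score = total_feedback_score({"intervenability": 3, "interpretability": 4})
```

- In some examples, the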
ticket manager 112 can provide a user interface for representatives 104 to enter scores for ticket feedback 118. For example, the user interface can include sliders, radio buttons, dropdown menus, freeform data entry fields, and/or other user interface elements that representatives 104 can use to enter intervenability scores, interpretability scores, and/or other types of ticket feedback 118 associated with outreach tickets 116. -
Ticket feedback 118, for example as submitted to the ticket manager 112, can be provided to, and/or shared with, the prediction engine 110. The prediction engine 110 can use the ticket feedback 118 to adjust weights used within the prediction engine 110 to combine base predictions into a final prediction 114, as discussed further below with respect to FIG. 3 and FIG. 4. - In some examples, the
member outreach system 100 can also, or alternatively, use a prediction 114 to determine if or when a member 102 is suitable for intervention by a healthcare provider that partners with the administrator. For example, in addition to, or instead of, providing an outreach ticket 116 to a representative 104, the member outreach system 100 may generate a referral for a partner provider, such that the partner provider can contact the member 102 to directly offer corresponding healthcare services to the member 102. In this example, the prediction engine 110 can use other types of feedback associated with the partner provider and/or the member 102, in addition to or instead of ticket feedback 118, to adjust predictions 114 or how predictions 114 are generated. For example, if a partner provider indicates that it provided treatment to a member 102, the prediction engine 110 can determine if that treatment ultimately reduces actual healthcare costs incurred by the member 102, reduces predicted healthcare costs during future time periods, and/or improves the member's health over time. If such actual or predicted cost reductions and/or health improvements do occur, the prediction engine 110 can use machine learning and/or other artificial intelligence techniques to learn that similar members 102 should be referred to the partner provider in the future. -
FIG. 3 depicts an example of the prediction engine 110. The prediction engine 110 can include an ensemble of multiple base predictive models 304. The base predictive models 304 can be configured to, based at least in part on healthcare data 302 from one or more healthcare data repositories 108, generate base predictions 306 associated with members 102 of a health plan. In some examples, the healthcare data 302 used by the base predictive models 304 can include enrollment data 202, claim data 204, engagement data 206, demographic data 208, provider data 210, and/or other types of data at a member level and/or at a provider level, as discussed above. Similar to the final predictions 114 output by the prediction engine 110, an individual base prediction 306 produced by an individual base model 304 can include an estimate of costs that a member 102 will incur over a period of time, and/or an estimated change in such costs relative to previous costs incurred by the member 102. A base prediction 306 may also include estimated utilization data, such as predictions of how many times a member 102 will visit providers in one or more categories, and/or in total, over a period of time. - The base
predictive models 304 can be machine learning models, artificial intelligence models, and/or other types of predictive models. For example, the base predictive models 304 can include, or be based on, regularized linear algorithms, Gradient Boosted Machines (GBMs), Random Forest algorithms, deep learning algorithms, recurrent neural networks, other types of neural networks, nearest-neighbor algorithms, support-vector networks, linear regression, logistic regression, other types of regression analysis, decision trees, and/or other types of artificial intelligence or machine learning frameworks. - As an example, a base
predictive model 304 can be a tree-based learning model that generates trees with different branches associated with different factors. For instance, a tree-based learning model can generate a tree with a branch for whether a member 102 has diabetes, a subsequent branch for whether the member 102 is on insulin, a subsequent branch for whether the member 102 is in a low-income ZIP code, and a subsequent branch for whether it is the last week of the month. If data for a member 102 indicates that the answer is “yes” for all of these branches, the tree may indicate a base prediction 306 at an ending node of the branches with predicted costs, utilizations, and/or risk metrics for members 102 who meet that criteria. For example, as discussed above, members 102 who meet the criteria of this example have increased chances of visiting emergency rooms. - In some examples, the ensemble of base
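predictive models 304 may evaluate many decision paths of this kind. A single hard-coded path matching the example above is sketched below; a real tree-based model such as a Random Forest would learn these branches from training data rather than encode them by hand:

```python
# Sketch of one decision path from the example in the text. Field names
# are hypothetical; a trained tree would derive such splits from data.
def emergency_room_risk_path(member):
    if (member.get("has_diabetes")
            and member.get("on_insulin")
            and member.get("low_income_zip")
            and member.get("last_week_of_month")):
        # Ending node reached: all four branch conditions answered "yes".
        return {"predicted_er_visit_risk": "elevated"}
    return {"predicted_er_visit_risk": "baseline"}

member = {"has_diabetes": True, "on_insulin": True,
          "low_income_zip": True, "last_week_of_month": True}
result = emergency_room_risk_path(member)
```

- In some examples, the ensemble of base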
predictive models 304 can include at least two types of predictive models. For example, FIG. 4, discussed further below, depicts a non-limiting example in which the prediction engine 110 includes four types of base predictive models 304: regularized linear models 402, GBM models 404, Random Forest models 406, and deep learning models 408. However, although four types of base predictive models 304 are shown in the example of FIG. 4, the prediction engine 110 may include different types of base predictive models 304 than are shown in FIG. 4. Additionally, although FIG. 4 shows an example with four different types of base predictive models 304, the prediction engine 110 can have two, three, four, five, or any other number of different types of base predictive models 304. - Different types of predictive models may have different strengths and weaknesses, such that some types of predictive models may be better suited to make
base predictions 306 about certain types of situations than other types of predictive models. Accordingly, even if a particular type of predictive model is not well-suited to make base predictions 306 for a certain type of situation associated with a member 102, one or more other types of predictive models in the ensemble may be better suited to make base predictions 306 for that type of situation. As will be discussed below, base predictions 306 from multiple base predictive models 304 can be combined together into a final prediction 114. Accordingly, even if individual base predictive models 304 are relatively imperfect, the combined final prediction 114 can more closely approximate actual results than the individual base predictions 306 taken in isolation. - At least some of the base
predictive models 304 can be machine learning or artificial intelligence models that can be trained based on historical data in the healthcare data 302. For example, a base predictive model 304 can be trained to identify which features within historical healthcare data 302 associated with a member 102 are predictors of past healthcare costs incurred by that member 102, past visits to healthcare providers by that member 102, and/or past risk metrics associated with the member 102. When such base predictive models 304 have been trained, the base predictive models 304 can use current healthcare data 302 corresponding to the identified features to generate base predictions 306 of future costs, future utilizations, and/or future risk metrics associated with members 102. - The ensemble of base
predictive models 304 may include one or more instances of each type of predictive model present in the ensemble. For example, the ensemble of base predictive models 304 can include a first set of instances of a first type of predictive model, and also include a second set of instances of a second type of predictive model. In some examples, each instance of a base predictive model 304 can be trained on a random subset of the healthcare data 302. Accordingly, one instance of a base predictive model 304 may be trained on a different random subset of the healthcare data 302 than another instance of a base predictive model 304. - The
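bagging-style sampling described above can be sketched as follows; the choice of sampling without replacement, and all names, are illustrative assumptions:

```python
import random

# Sketch of training each base-model instance on its own random subset of
# the healthcare data 302. Sampling without replacement is an assumption;
# some ensembles instead sample with replacement (bootstrapping).
def random_training_subsets(records, n_instances, subset_size, seed=0):
    rng = random.Random(seed)
    return [rng.sample(records, subset_size) for _ in range(n_instances)]

records = list(range(1000))  # stand-in for member-level training records
subsets = random_training_subsets(records, n_instances=4, subset_size=600)
```

Each of the four subsets would then train one model instance, so no two instances see exactly the same data.

- The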
prediction engine 110 can also have a prediction combination model 308. The prediction combination model 308 can be configured to combine a set of base predictions 306, generated by multiple base predictive models 304, into a final prediction 114. The prediction combination model 308 can, at least in part, use weights 310 associated with different types of base predictive models 304 and/or individual instances of base predictive models 304 to combine corresponding base predictions 306 into a final prediction 114. For example, if one base prediction 306 predicts that a member's costs will rise by $10,000 and another base prediction 306 predicts that the same member's costs will rise by $20,000, the prediction combination model 308 may combine the two base predictions into a prediction that the member's costs will rise by $15,000, if the base predictive models 304 that made the base predictions 306 are weighted equally. However, weights 310 associated with different base predictive models 304 may not be equal, and numerous base predictions 306 generated by numerous base predictive models 304 with varying weights 310 can be used to generate the final prediction 114. - The
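weighted averaging just described, including the equal-weight $10,000/$20,000 example, can be sketched as:

```python
def combine_base_predictions(base_predictions, weights):
    """Weighted average of base predictions 306 into a final prediction 114."""
    if len(base_predictions) != len(weights) or not base_predictions:
        raise ValueError("predictions and weights must align")
    total_weight = sum(weights)
    return sum(p * w for p, w in zip(base_predictions, weights)) / total_weight

# Two equally weighted base models predicting a $10,000 and a $20,000 rise
# combine to a predicted $15,000 rise, as in the text's example.
final = combine_base_predictions([10000, 20000], [0.5, 0.5])
```

Unequal weights shift the result toward the more heavily weighted model; for instance, weights of 0.75 and 0.25 on the same two predictions yield $12,500.

- The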
prediction combination model 308 can be a machine learning model, artificial intelligence model, or other type of predictive model. In some examples, the prediction combination model 308 may be referred to as a “meta-learner” or a “super learner.” For instance, in some examples, the prediction combination model 308 can use gradient boosting meta-learning and/or a stacking method to determine the weights 310 to use when combining base predictions 306 into a final cost prediction 114. - In some examples, after base
predictive models 304 have been trained on random samples of historical healthcare data 302, the trained base predictive models 304 can generate base predictions 306 on remaining samples of the historical healthcare data 302. Such base cost predictions 306 can be evaluated using an area under a curve statistic, known as a C-statistic, which indicates a probability of correctly predicting which of two members 102 are “high cost” members 102 with healthcare costs above a threshold value. C-statistics associated with different instances of base predictive models 304 can accordingly indicate relative performances of the different instances of the base predictive models 304, and in some examples can be used to determine corresponding weights 310 to select for the different instances of the base predictive models 304. Accordingly, in some examples, a base predictive model 304 can be trained until its C-statistic meets a threshold value. In some examples, base predictive models 304 can also be subject to regularization, such as elastic net regularization, to prevent the base predictive models 304 from overfitting their base predictions 306. - Each instance of base
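predictive model 304 can be scored this way because the C-statistic reduces to a pairwise concordance count: the fraction of (high cost, not high cost) member pairs in which the model scores the high cost member higher, with ties counted as half. A minimal sketch:

```python
# Pairwise concordance computation of the C-statistic (AUC): the
# probability that a model scores a true "high cost" member (label 1)
# above a member who was not high cost (label 0), ties counted as half.
def c_statistic(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    pairs = len(pos) * len(neg)
    concordant = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return concordant / pairs

# Model scores and whether each member actually exceeded the cost threshold;
# the values here are purely illustrative.
auc = c_statistic([0.9, 0.8, 0.3, 0.2], [1, 0, 1, 0])
```

In this toy case three of the four (high cost, not high cost) pairs are ordered correctly, giving a C-statistic of 0.75; a value of 0.5 would indicate chance-level ranking.

- Each instance of base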
predictive model 304 can also be assigned a corresponding weight 310 in the prediction combination model 308. In some examples, the weights 310 for different base predictive models 304 can initially be equal, or be set to a predefined starting value. However, thereafter the prediction combination model 308 can use ticket feedback 118 provided by representatives 104 to continually or periodically adjust the weights 310 associated with different base predictive models 304. In some examples, the prediction combination model 308 may adjust weights 310 for different base predictive models 304 until predictions 114 generated using a weighted average of base predictions 306 maximize a C-statistic in one or more cross-validation samples from training data. - As an example, the
prediction engine 110 may generate a set of predictions 114 for a set of members 102, and the ticket manager 112 can generate corresponding outreach tickets 116 for the members 102 that are prioritized in a queue based on which members 102 are predicted to have the highest rise in costs. However, if ticket feedback 118 indicates that the member 102 associated with the highest-priority outreach ticket 116 was less intervenable, and/or the outreach ticket 116 was less interpretable, than a lower-priority outreach ticket 116, the prediction engine 110 can adjust the weights 310 so that base predictions 306 from base predictive models 304 that indicated the member 102 was a high priority are down-weighted in the future. - In some examples, a loss function in the
prediction combination model 308 can be adjusted to down-weight base predictive models 304 that were associated with lower ticket feedback 118 on a Likert scale or other scale, and up-weight base predictive models 304 that were associated with higher ticket feedback 118 on a Likert scale or other scale. An example of adjusting weights 310 based on ticket feedback 118 is discussed further below with respect to FIG. 5. - In some examples, the
prediction engine 110 can also use other types of feedback to adjust how base predictions 306 and/or final predictions 114 are generated. For example, if a prediction 114 indicates that a member's healthcare costs are expected to rise due to a new diagnosis, the prediction engine 110 can use claims data 206 to determine whether claims associated with that diagnosis are ultimately received. In some examples, the lack of later-submitted claims associated with a diagnosis can be an indication that member outreach services have been successful. As another example, the prediction engine 110 or the ticket manager 112 can track whether outreach tickets 116 are transferred between representatives 104. For instance, if an outreach ticket 116 is provided to a nurse, but the nurse determines that the outreach ticket 116 would be better handled by a pharmacist due to a medication issue and transfers the outreach ticket 116 to the pharmacist, the prediction engine 110 may use transfer data to increase the chances of similar outreach tickets 116 indicating medication issues and/or being directed to pharmacists in the future. - As noted above,
FIG. 4 depicts a non-limiting example of a prediction engine 110 in which the ensemble of base predictive models 304 includes instances of four types of base predictive models 304, including regularized linear models 402, Gradient Boosted Machine (GBM) models 404, Random Forest models 406, and deep learning models 408. However, the example of FIG. 4 is not intended to be limiting, as the prediction engine 110 may include different types of predictive models than are shown in FIG. 4, and/or can include fewer, or more, than four types of base predictive models 304. - A regularized
linear model 402 can be a base predictive model 304 that outputs a linear model prediction 410. A linear model prediction 410 can be a base prediction 306 of costs, utilization, risk metrics, and/or other information about a member 102. A regularized linear model 402 can be based on elastic net regularization, which seeks to minimize a penalty function over a grid of values for a regularization parameter, λ. Such regularization can avoid overfitting to noise. The regularization parameter, λ, can control the strength of a balancing parameter, α, between lasso regression and ridge regression. In some examples, the lasso regression can be L1 regularization, which selects one correlated covariate and removes other covariates from the equation. Additionally, in some examples, the ridge regression can be L2 regularization, which shrinks coefficients of correlated covariates towards each other. - By way of a non-limiting example, for a logistic regression, a regularized
linear model 402 can seek to minimize the negative binomial log-likelihood expressed in the following equation (Equation 1):
- min over (β₀, β): −(1/N) Σ_{i=1}^{N} [ y_i (β₀ + x_iᵀβ) − log(1 + e^{β₀ + x_iᵀβ}) ] + λ [ (1 − α)/2 ‖β‖₂² + α ‖β‖₁ ]. (Eq. 1)
- In this example, the regularization algorithm can perform coordinate descent on a quadratic approximation to the log-likelihood. Generalized linear modeling estimators can be trained on
healthcare data 302 in the prediction engine 110 to find an optimal λ value using Equation 1, at each α value, across internal cross-validations on a training dataset from the healthcare data 302. In particular, a first portion of Equation 1 (ahead of the λ value) can determine x values that are most predictive. For example, a cancer diagnosis can be highly predictive of future costs, while a diagnosis of an eyelid flutter may not be highly predictive of future costs. As another example, an orthopedic surgery diagnosis can indicate that a member's condition may be "intervenable" through member outreach services, while an eyelid flutter diagnosis may not be "intervenable" through member outreach services. The second portion of Equation 1 (multiplied by the λ value) can assist with pulling out noise in the data. The λ value can be determined through machine learning to balance the two portions of the equation. - A
GBM model 404 can be a base predictive model 304 that outputs a GBM prediction 412. A GBM prediction 412 can be a base prediction 306 of costs, utilization, risk metrics, and/or other information about a member 102. -
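The penalty portion of Equation 1 discussed above can be sketched in a few lines. This is a minimal, hypothetical illustration of the elastic net penalty term itself (the coefficient, λ, and α values are invented for the example), not the coordinate-descent fitting procedure:

```python
def elastic_net_penalty(beta, lam, alpha):
    """Elastic net penalty: lam * [ (1 - alpha)/2 * ||beta||_2^2 + alpha * ||beta||_1 ].
    alpha = 1 reduces to the lasso (L1) penalty; alpha = 0 reduces to ridge (L2)."""
    l1 = sum(abs(b) for b in beta)
    l2 = sum(b * b for b in beta)
    return lam * ((1.0 - alpha) / 2.0 * l2 + alpha * l1)

coefficients = [0.5, -0.25, 0.0]
print(elastic_net_penalty(coefficients, lam=0.1, alpha=1.0))  # pure lasso term
print(elastic_net_penalty(coefficients, lam=0.1, alpha=0.0))  # pure ridge term
```

Because the L1 term can drive coefficients exactly to zero while the L2 term shrinks correlated coefficients toward each other, intermediate α values blend the two behaviors described above.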
GBM models 404 can develop regression trees. A GBM model 404 can be an ensemble learning algorithm that sequentially builds classification trees on features of derivation datasets, such as datasets from healthcare data 302. For example, regression trees can be developed in a GBM model 404 by sequentially selecting covariates to parse each derivation dataset into subsets with high between-group and low within-group variance in future predicted annualized cost. Gradient boosting can seek to improve the choice of covariates and their combinations over time to minimize error between observed and predicted outcome rates in each terminal node, while repeatedly sampling from the underlying data to prevent overfitting. - In some examples, a
GBM model 404 can generate a first tree based on a random sample of healthcare data 302, and then generate subsequent trees based in part on previous trees. For example, a second tree may be based on a different random sample than the first tree, and the GBM model 404 can determine if predictions of the first tree or the second tree are more accurate, and adjust trees accordingly. As such, the GBM model 404 can generate trees by learning from preceding trees. - In some examples, a
GBM model 404 can follow Equations 2 through 7, presented below, to build K regression trees with incremental boosts of the function f over M iterations:
- Initialize f_k0(x) = 0, k = 1, 2, . . . , K. (Eq. 2)
- For m = 1 to M, compute class probabilities p_k(x) = exp(f_k(x)) / Σ_{l=1}^{K} exp(f_l(x)), k = 1, 2, . . . , K. (Eq. 3)
- For k = 1 to K, fit a tree to targets r_ikm = y_ik − p_k(x_i), for i = 1, 2, . . . , N, producing terminal regions R_jkm, j = 1, 2, . . . , J_m. (Eq. 4)
- Compute γ_jkm = ((K − 1)/K) · (Σ_{x_i ∈ R_jkm} r_ikm) / (Σ_{x_i ∈ R_jkm} |r_ikm| (1 − |r_ikm|)), j = 1, 2, . . . , J_m. (Eq. 5)
- Update the function f_km(x) = f_{k,m−1}(x) + Σ_{j=1}^{J_m} γ_jkm I(x ∈ R_jkm). (Eq. 6)
- Output f̂_k(x) = f_kM(x). (Eq. 7)
- In some situations, a
GBM model 404 can be tuned, and/or can be sensitive to hyperparameter choices. Accordingly, some hyperparameters can be varied across broad ranges when multiple GBM models 404 are trained. In some examples, GBM estimators can be trained with common values of a learning rate, a maximum tree depth, and a column sample rate. - The learning rate, or shrinkage, can control a weighted average prediction of the trees, and can refer to a rate (for example ranging from 0 to 1) at which the
GBM models 404 change parameters in response to error (i.e., learn) when building an estimator. Lower learning rate values may enable slower learning, but may use more trees to achieve a level of fit than would be used with a higher learning rate. In some examples, lower learning rates can also help to avoid overfitting, even if the lower learning rates may be more computationally expensive. - The maximum tree depth can refer to a number of layers of the tree, and thus can control how many splits can occur to separate a population into subpopulations. For example, in some algorithms, a tree may no longer split if either a maximum depth is reached, or a minimum improvement in prediction (a relative improvement in squared error reduction for a split > 10⁻⁵) is not satisfied. In some situations, deeper trees may provide more accuracy on a derivation dataset, but may also lead to overfitting.
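The learning-rate trade-off described above can be illustrated with a deliberately oversimplified boosting loop, in which each "tree" is just a constant fit to the current residuals. The data and rates are hypothetical, and this sketch is not the patented GBM implementation:

```python
def boost_constant(y, learning_rate, n_rounds):
    """Boost a constant predictor toward the mean of y, shrinking each
    incremental step by the learning rate."""
    prediction = 0.0
    for _ in range(n_rounds):
        # The best constant "tree" for squared error is the mean residual.
        residual_mean = sum(yi - prediction for yi in y) / len(y)
        prediction += learning_rate * residual_mean
    return prediction

y = [10.0, 14.0, 18.0]  # target mean is 14.0
fast = boost_constant(y, learning_rate=0.5, n_rounds=10)
slow = boost_constant(y, learning_rate=0.1, n_rounds=10)
# With the same number of rounds, the lower rate ends farther from the target,
# so it would need more "trees" (rounds) to reach the same level of fit.
assert abs(fast - 14.0) < abs(slow - 14.0)
```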
- The sample rate can refer to a fraction of the data sampled among columns of the data for each tree, which can help improve sampling accuracy. In some examples, the sample rate can be expressed on a 0 to 1 scale, such that, for instance, 0.7 refers to 70% of the data sampled. On this scale, values of the sampling rate that are less than 1 may help improve generalization by preventing overfitting. In some examples, stochastic gradient boosting can be used, such that to fit each GBM estimator, in each learning iteration a subsample of training data can be drawn at random without replacement.
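The column sample rate can be sketched as a simple without-replacement draw over feature columns; the column names below are hypothetical:

```python
import random

def sample_columns(columns, sample_rate, rng):
    """Draw a fraction of the feature columns, without replacement, for one tree."""
    k = max(1, round(len(columns) * sample_rate))
    return rng.sample(columns, k)

columns = ["age", "diagnosis", "medication", "prior_cost", "visits",
           "blood_pressure", "bmi", "smoking", "region", "plan_type"]
subset = sample_columns(columns, 0.7, random.Random(0))
print(len(subset))  # 7 of the 10 columns, matching a 0.7 sample rate
```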
- A
Random Forest model 406 can be a base predictive model 304 that outputs a Random Forest prediction 414. A Random Forest prediction 414 can be a base prediction 306 of costs, utilization, risk metrics, and/or other information about a member 102. - Similar to
GBM models 404, Random Forest models 406 can also be associated with regression trees. In particular, Random Forest models 406 can use a broad range of hyperparameter starting values entered into a random grid search. A tree learning algorithm can apply a modified version of bootstrap aggregating (bagging), in which the tree learning algorithm selects, at each candidate split in the tree learning process, a random subset of covariates to build and train the tree (feature bagging). Trees can be trained in parallel and split, until either a maximum depth is reached or a minimum improvement in prediction (such as a relative improvement in squared error reduction for a split > 10⁻⁵) is not satisfied. Forests can be fit with varied values of a maximum tree depth and column sampling rates. - As an example, a Random Forest algorithm may generate trees based on which factors or variables are most predictive of actual results. For instance, a Random Forest algorithm may determine from
healthcare data 302 that age is the most important predictor of hospitalizations, and that high blood pressure diagnoses are the second-most important predictor of hospitalizations. In some examples, a Random Forest model 406 may generate numerous trees based on different random subsets of data, and generate a final tree by averaging those trees to avoid overfitting. - In some examples,
Random Forest models 406 may use greater maximum tree depth values, and sample from a smaller number of covariates in each tree, than GBM models 404. In some cases, the greater depth of Random Forest models 406 may cause the Random Forest models 406 to be more efficient on longer datasets, while GBM models 404 may be more efficient on wider datasets. In the Random Forest models 406, overfitting can be reduced by randomizing tree building and combining the trees together. In some examples, a majority vote of the resulting trees' predictions can be used to produce a final Random Forest prediction 414 from a forest of underlying trees. - A
deep learning model 408 can be a base predictive model 304 that outputs a deep learning prediction 416. A deep learning prediction 416 can be a base prediction 306 of costs, utilization, risk metrics, and/or other information about a member 102. -
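The majority vote over a forest's trees, described above, can be sketched as follows (the individual tree outputs are hypothetical):

```python
from collections import Counter

def majority_vote(tree_predictions):
    """Return the class predicted by the largest number of trees."""
    return Counter(tree_predictions).most_common(1)[0][0]

# Three of five hypothetical trees predict a high hospitalization risk.
votes = ["high", "low", "high", "high", "low"]
print(majority_vote(votes))  # high
```

For continuous targets such as predicted cost, the same idea uses the average of the trees' outputs instead of a vote.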
Deep learning models 408 can be based on neural networks. For example, a deep learning model 408 can include a multi-layer, feedforward network of neurons, where each neuron takes a weighted combination of input signals as shown in Equation 8: -
α = Σ_{i=1}^{n} w_i x_i + b. (Eq. 8)
- Equation 8 reflects a weighted sum of input covariates, with an additional bias b, the neuron's activation threshold. The neuron can produce an output, f(α), where f is a nonlinear activation function. In the overall neural network, an input layer can match the covariate space, and can be followed by multiple layers of neurons to produce abstractions from input data, ending with a classification layer to match a discrete set of outcomes. Each layer of neurons can provide inputs to the next layer, and weights and bias values can determine the output from the neural network. In some examples, learning can involve adapting such weights to minimize a mean squared error between the predicted outcome and the observed outcome, across individuals in the derivation dataset.
- Deep learning estimators can be developed by training across multiple activation functions and hidden layer sizes. Such activation functions can, for example, include “maxout,” “rectifier,” or “tanh.” In some examples, a maxout activation function can be a generalization of a rectified linear function given by Equation 9:
-
f(α) = max(0, α), f(·) ∈ ℝ₊. (Eq. 9)
- For example, in a maxout activation function, each neuron can choose the largest output of two input channels, where each input channel has a weight and bias value. In some examples, "dropout" can be used, in which the function is constrained so that each neuron in the neural network suppresses its activation with probability P (<0.2 for input neurons, and <0.5 for hidden neurons). This scales network weight values towards 0 and can cause each training iteration to train a different estimator, thereby enabling exponentially larger numbers of estimators to be averaged as an ensemble to prevent overfitting and improve generalization. A rectifier activation function can be a version of the maxout activation function in which the output of one channel is always zero.
- In some examples, a
deep learning model 408 can assign different variables, such as ages of members 102, diagnoses of members 102, prescriptions of members 102, and/or other factors, to different columns. The deep learning model 408 can identify which of those variables, and/or which combinations of variables, are most predictive of actual results in a training dataset, such as incurred healthcare costs and visits to healthcare providers. - As shown in
FIG. 4, a GBM meta-learner 418 can be a prediction combination model 308 that combines one or more linear model predictions 410 from one or more regularized linear models 402, one or more GBM predictions 412 from one or more GBM models 404, one or more Random Forest predictions 414 from one or more Random Forest models 406, and/or one or more deep learning predictions 416 from one or more deep learning models 408 into a final prediction 114. - The GBM meta-
learner 418 may use Gradient Boosting Machine techniques to determine weights 310 for different instances of base predictive models 304, and use those weights 310 to combine corresponding base predictions 306 into a final prediction 114. For example, the GBM meta-learner 418 can select linear model weights 420 for corresponding regularized linear models 402, select GBM weights 422 for corresponding GBM models 404, select Random Forest weights 424 for corresponding Random Forest models 406, and select deep learning weights 426 for corresponding deep learning models 408. The GBM meta-learner 418 can use the linear model weights 420, GBM weights 422, Random Forest weights 424, and deep learning weights 426 to generate the prediction 114 using weighted averages, or other weighted combinations, of associated linear model predictions 410, GBM predictions 412, Random Forest predictions 414, and deep learning predictions 416. - Over time, as
ticket feedback 118 is received from representatives 104, the GBM meta-learner 418 may adjust the linear model weights 420, GBM weights 422, Random Forest weights 424, and deep learning weights 426. Accordingly, if ticket feedback 118 shows over time that one or more of the regularized linear models 402, GBM models 404, Random Forest models 406, or deep learning models 408 generate base predictions 306 with priority levels that more closely match the ticket feedback 118 from representatives 104, the better-performing predictive models can be up-weighted and the worse-performing predictive models can be down-weighted. As a result, future predictions 114 generated based on the adjusted weights 310 may have outreach priorities that are closer to how representatives 104 would rank their outreach priorities, and member outreach services can be offered to corresponding members 102 in a more timely manner and/or in a more effective order. -
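A weighted combination of the four prediction types, using weights corresponding to weights 420, 422, 424, and 426, can be sketched as below. The weight and prediction values are hypothetical, and a real GBM meta-learner 418 would learn the weights rather than take them as fixed inputs:

```python
def meta_combine(predictions, weights):
    """Combine per-family base predictions into a final prediction
    using a normalized weighted average."""
    total = sum(weights[family] for family in predictions)
    return sum(weights[family] * predictions[family] for family in predictions) / total

predictions = {"linear": 12_000.0, "gbm": 15_000.0,
               "random_forest": 14_000.0, "deep_learning": 17_000.0}
weights = {"linear": 2.0, "gbm": 4.0, "random_forest": 3.0, "deep_learning": 1.0}
print(meta_combine(predictions, weights))  # 14300.0
```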
FIG. 5 depicts an example of a curve that can be used by a prediction combination model 308 to determine relative outreach priorities 502 associated with base predictions 306 and/or ticket feedback 118. As discussed above, base predictions 306 and final predictions 114 can include cost estimates, utilization estimates, risk metric estimates, and/or other information associated with members 102. The information in base predictions 306 and final predictions 114 can be used to determine an outreach priority 502 for associated outreach tickets 116. For example, when a first prediction 114 indicates that a first member's costs, utilizations, and/or risk metrics are expected to rise sharply, and a second member's costs, utilizations, and/or risk metrics are expected to rise more slowly, the prediction engine 110 or the ticket manager 112 may determine that an outreach ticket 116 associated with the first member 102 has a greater outreach priority 502 than an outreach ticket 116 associated with the second member 102. Accordingly, the ticket manager 112 may arrange a queue of outreach tickets 116 based on corresponding metrics of outreach priorities 502. - In some examples, the
outreach priority 502 associated with a set of outreach tickets 116 can be scaled to a bell curve as shown in FIG. 5. For example, outreach priorities 502 of a set of outreach tickets 116 can be scaled into a Z-score bell curve with a mean of zero and a standard deviation of 1. On such a bell curve, most outreach tickets 116 may fall near the center of the bell curve. However, some outreach tickets 116 may fall in high-priority ranges that indicate that it may be more critical to contact corresponding members 102 to offer member outreach services than other members 102 whose outreach tickets 116 have lower scaled outreach priorities 502. Although outreach priorities 502 of outreach tickets 116 can be based on final predictions 114 generated using a weighted combination of multiple base predictions 306, scaled outreach priorities 502 that individual base predictions 306 would have had alone can also be plotted on such a bell curve. -
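Scaling a set of outreach priorities 502 to the Z-score bell curve described above can be sketched as follows (the priority values are hypothetical):

```python
import math

def z_scores(values):
    """Rescale values to mean 0 and (population) standard deviation 1."""
    mean = sum(values) / len(values)
    std = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))
    return [(v - mean) / std for v in values]

# Hypothetical predicted cost rises for four outreach tickets.
priorities = [1_000.0, 2_000.0, 3_000.0, 10_000.0]
scaled = z_scores(priorities)
# The outlier ticket lands far out in the high-priority tail of the curve.
print(round(max(scaled), 2))  # 1.7
```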
Ticket feedback 118 can also be scaled to the same bell curve. As discussed above, in some examples, ticket feedback 118 may be based on intervenability scores, interpretability scores, and/or a combination of both scores. For instance, if a representative 104 rates an outreach ticket 116 with an intervenability score of three out of five and an interpretability score of four out of five, the ticket feedback 118 can overall rate the outreach ticket 116 as a seven out of ten on an outreach priority 502 scale. Such scores in ticket feedback 118 can be scaled to the same Z-score bell curve discussed above with respect to outreach tickets 116, even if outreach tickets 116 are prioritized in queues based on predicted rises in costs, utilizations, risk metrics, and/or other outreach priority 502 factors. - The
prediction combination model 308 can determine how closely the scaled ticket feedback 118 matches the scaled outreach priorities 502 associated with base predictions 306 generated by different base predictive models 304. The prediction combination model 308 can adjust weights 310 associated with the base predictive models 304 upward or downward, based on how closely the scaled outreach priority 502 associated with the base predictive models 304 matched the scaled ticket feedback 118. - For example,
FIG. 5 shows that base prediction 306B for a member 102 had a scaled outreach priority 502 that was closer to the scaled outreach priority 502 of actual ticket feedback 118 associated with that member's outreach ticket 116 than base prediction 306A. The prediction combination model 308 may accordingly adjust weights 310 by down-weighting a first base predictive model 304A that generated base prediction 306A and/or up-weighting a second base predictive model 304B that generated better-performing base prediction 306B. Base predictions 306 from both the first base predictive model 304A and the second base predictive model 304B can continue to be used to generate final predictions in the prediction engine 110 according to the adjusted weights 310. - As noted above, machine-learning and/or artificial intelligence can also, or alternately, be used to adjust
weights 310 of different base predictive models 304. For example, as discussed above with respect to FIG. 4, a GBM meta-learner 418 can evaluate how closely elements of base predictions 306 matched elements of later-received ticket feedback 118, and adjust corresponding weights 310 of base predictive models 304 up or down accordingly. -
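One way to picture the up- and down-weighting is a multiplicative update keyed to how far each base model's scaled priority fell from the scaled ticket feedback. The update rule and learning rate below are hypothetical stand-ins; the patent describes the weights as learned through gradient-boosted meta-learning rather than through this fixed formula:

```python
import math

def adjust_weights(weights, scaled_priorities, scaled_feedback, learning_rate=0.5):
    """Shrink each model's weight by its distance from the feedback,
    then renormalize so the weights still sum to 1."""
    adjusted = {
        model: weight * math.exp(-learning_rate * abs(scaled_priorities[model] - scaled_feedback))
        for model, weight in weights.items()
    }
    total = sum(adjusted.values())
    return {model: w / total for model, w in adjusted.items()}

weights = {"model_a": 0.5, "model_b": 0.5}
# model_b's scaled priority (0.9) was closer to the feedback (1.0) than
# model_a's (-0.5), so model_b ends up with the larger weight.
new_weights = adjust_weights(weights, {"model_a": -0.5, "model_b": 0.9}, scaled_feedback=1.0)
print(new_weights["model_b"] > new_weights["model_a"])  # True
```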
FIG. 6 shows a flowchart of an example method for generating predictions 114 and outreach tickets 116, and for adjusting weights 310 in a prediction engine 110 according to ticket feedback 118. At block 602, a prediction engine 110 can generate a set of base predictions 306 using an ensemble of base predictive models 304. As discussed above, the ensemble of base predictive models 304 may include multiple types of base predictive models 304, and/or include multiple instances of each type of base predictive model 304. Different instances of base predictive models 304 can be trained on different random subsets of data from one or more healthcare data repositories 108. At block 602, the ensemble of base predictive models 304 can generate base predictions 306 for a member 102 based on data in one or more healthcare data repositories 108. - At
block 604, a prediction combination model 308 in the prediction engine 110, such as a meta-learner, can combine the base predictions 306 into a final prediction 114 for the member 102, based on weights 310 associated with the base predictive models 304 that generated the base predictions 306. For instance, in some examples, the prediction combination model 308 can generate a final prediction 114 for a member 102 as a weighted average of the base predictions 306, with the contributions of the base predictions 306 to the final prediction 114 being based on corresponding weights 310. - At
block 606, the prediction engine 110 or a ticket manager 112 can determine if the final prediction 114 includes one or more values that meet or exceed corresponding threshold values. For example, the prediction 114 can include a predicted estimate of a rise or decrease in healthcare costs associated with the member 102 over a period of time, relative to previous periods of time. In this example, the prediction engine 110 or the ticket manager 112 may determine if the prediction 114 indicates a rise in costs that exceeds a predefined threshold value. As another example, the prediction 114 can include a predicted utilization estimate of a number of visits the member 102 will make to healthcare providers in total, or by category, over a period of time. In this example, the prediction engine 110 or the ticket manager 112 may determine if the prediction 114 indicates that a predicted number of visits to healthcare providers in total, or in one or more categories, by the member 102 over a period of time exceeds a predefined threshold value. As yet another example, the prediction 114 can include predictions of one or more risk metrics, such as risks of admission or readmission to hospitals or other healthcare facilities, risks of pharmaceutical advancement to more expensive medications and/or to medications with increased side effects, risks of increased suffering levels, risks of increased mortality, and/or predictions of other types of future risks. In this example, the prediction engine 110 or the ticket manager 112 can determine if the prediction 114 indicates a predicted rise in one or more risk metrics that exceeds a predefined threshold value. - If the
prediction engine 110 or the ticket manager 112 determines at block 606 that one or more values in a prediction 114 exceed one or more threshold values, the ticket manager 112 can generate an outreach ticket 116 for the member 102 at block 608. For example, if a prediction 114 indicates that healthcare costs associated with a member 102 are expected to rise more than a threshold amount over the next year (e.g., the prediction 114 meets or exceeds a threshold), an outreach ticket 116 can be generated such that a representative 104 can attempt to contact the member 102 to offer member outreach services. However, if values in a prediction 114 for a member 102 do not meet or exceed one or more threshold values at block 606, the process can return to block 602 and the prediction engine 110 can continue generating base predictions 306 for the same member 102 and/or other members 102. For example, if a prediction 114 indicates that costs, utilizations, or risk metrics associated with a member 102 are expected to decrease relative to previous levels, or are expected to rise at less than a threshold amount, there may be little or no benefit to offering member outreach services to that member 102. - In some examples, blocks 602 through 608 can be performed multiple times for
multiple members 102, such that outreach tickets 116 for different members 102 can be generated at block 608. As discussed above, a set of outreach tickets 116 for different members 102 can be prioritized or ordered for one or more representatives 104, for example in a queue of outreach tickets 116. In some examples, outreach tickets 116 can be prioritized based on corresponding outreach priority 502 metrics. Accordingly, representatives 104 may attempt to contact members 102 in an order based on priorities of corresponding outreach tickets 116, such that members 102 who may benefit more from member outreach services, who are better candidates for member outreach services, and/or whose situations are better explained by outreach tickets 116 may be prioritized to be contacted before other members 102. - At
block 610, after representatives 104 have at least attempted to contact members 102 identified in outreach tickets 116, the ticket manager 112 and/or the prediction engine 110 can receive ticket feedback 118 from the representatives 104. As discussed above, ticket feedback 118 may be received from a representative 104 via a survey, form, user interface, or other mechanism. Ticket feedback 118 may provide scores, rankings, or other feedback from representatives 104 in one or more categories, such as categories for how "intervenable" a member 102 identified in an outreach ticket 116 was, and/or how "interpretable" information in the outreach ticket 116 itself was. For example, intervenability and/or interpretability scores can be provided by representatives on a Likert scale of 0-5, or other scale. In some examples, ticket feedback 118 can be a combination of both intervenability and interpretability scores on a 0-10 scale, or other scale. - At
block 612, the prediction combination model 308 can use the ticket feedback 118 received at block 610 to adjust weights 310 corresponding to different base predictive models 304. For example, an outreach ticket 116 may have been prioritized as the second-highest priority in a set of outreach tickets 116, but ticket feedback 118 associated with the set of outreach tickets 116 may indicate that representatives 104 gave that outreach ticket 116 the tenth-highest score of the set of outreach tickets 116. Accordingly, the prediction combination model 308 may increase weights 310 associated with base predictive models 304 that produced base predictions 306 that, taken alone, were closest to ranking member outreach to the member 102 as the tenth-highest among base predictions 306 for a set of members 102. The prediction combination model 308 may similarly decrease weights 310 associated with base predictive models 304 that produced base predictions 306 that were farther away from ranking a member 102 as being the tenth-highest priority for member outreach services. - As a non-limiting example, a seventh instance of a
GBM model 404 may have produced GBM predictions 412 for a set of members 102, and those GBM predictions 412 may include values indicating that a particular member 102 is the second-highest priority for member outreach services, out of the set of members 102. In this example, a fifty-fifth instance of a deep learning model 408 may have produced deep learning predictions 416 for the same set of members 102. These deep learning predictions 416 may include values indicating that the particular member 102 was the fifteenth-highest priority out of the set of members 102. The GBM predictions 412 and the deep learning predictions 416 can be combined, along with other base predictions 306 from other base predictive models 304 using corresponding weights 310, into final predictions 114 for the set of members 102, and corresponding outreach tickets 116 can be generated. Ticket feedback 118 may then indicate that representatives 104 subjectively felt that, among the set of members 102, the particular member 102 was the third-most suitable for member outreach services. In this example, the prediction combination model 308 may determine that the GBM predictions 412 of the seventh instance of the GBM model 404 (indicating that the member 102 was the second-highest priority) more closely matched the ticket feedback 118 (indicating that the member 102 was the third-highest priority) than the deep learning predictions 416 of the fifty-fifth instance of the deep learning model 408 (indicating that the member 102 was the fifteenth-highest priority). Accordingly, the prediction combination model 308 can increase the weight 310 associated with the seventh instance of the GBM model 404 and decrease the weight 310 associated with the fifty-fifth instance of the deep learning model 408. - Over time, through the example process of
FIG. 6, ticket feedback 118 can be used to increase the weights 310 of base predictive models 304 that tend to produce base predictions 306 that would prioritize members 102 for member outreach services similarly to how representatives 104 would prioritize such members 102. Similarly, ticket feedback 118 can be used to decrease the weights 310 of base predictive models 304 that tend to produce base predictions 306 that would prioritize members 102 for member outreach services in orders that least match how representatives 104 would prioritize such members 102. Base predictions 306 from numerous base predictive models 304 can continue to be combined into final predictions 114. However, the contribution of base predictive models 304 that least match ticket feedback 118 to the final prediction 114 can be lessened according to the adjusted weights 310, and the contribution of base predictive models 304 that most match ticket feedback 118 to the final prediction 114 can be increased according to the adjusted weights 310. This process can accordingly improve the quality, accuracy, and/or priorities of final predictions 114 and corresponding outreach tickets 116, relative to equally weighting different base predictions 306 from different instances of base predictive models 304 and/or different types of base predictive models 304. -
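One pass through blocks 602 to 608 of FIG. 6 can be sketched end to end as below. All names, values, and the threshold are hypothetical scaffolding for illustration, not the patented system:

```python
def run_outreach_cycle(base_predictions, weights, threshold):
    """Combine base predictions into a final prediction (blocks 602-604) and
    open an outreach ticket only if it meets the threshold (blocks 606-608)."""
    total = sum(weights[model] for model in base_predictions)
    final = sum(weights[model] * base_predictions[model] for model in base_predictions) / total
    ticket = {"predicted_rise": final} if final >= threshold else None
    return final, ticket

weights = {"gbm": 2.0, "deep": 1.0}
final, ticket = run_outreach_cycle({"gbm": 9_000.0, "deep": 3_000.0}, weights, threshold=5_000.0)
print(final, ticket)  # 7000.0 {'predicted_rise': 7000.0}

# A member whose predicted rise stays under the threshold yields no ticket,
# and the process would loop back to generating new base predictions.
_, no_ticket = run_outreach_cycle({"gbm": 4_000.0, "deep": 1_000.0}, weights, threshold=5_000.0)
print(no_ticket)  # None
```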
FIG. 7 shows an example system architecture for a computing device 106 associated with the member outreach system 100 described herein. A computing device 106 can be a server, computer, cloud computing device, or other type of computing device that executes at least a portion of the member outreach system 100. In some examples, elements of the member outreach system 100 can be distributed among, and/or be executed by, multiple computing devices 106. For example, the prediction engine 110 may execute on a different computing device 106 than the ticket manager 112. In other examples, the prediction engine 110 can execute on the same computing device 106 as the ticket manager 112. - A
computing device 106 can include memory 702. In various examples, the memory 702 can include system memory, which may be volatile (such as RAM), non-volatile (such as ROM, flash memory, etc.), or some combination of the two. The memory 702 can further include non-transitory computer-readable media, such as volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer-readable instructions, data structures, program modules, or other data. System memory, removable storage, and non-removable storage are all examples of non-transitory computer-readable media. Examples of non-transitory computer-readable media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transitory medium which can be used to store desired information and which can be accessed by one or more computing devices 106 associated with the member outreach system 100. Any such non-transitory computer-readable media may be part of the computing devices 106. - The
memory 702 can store data, computer-readable or computer-executable instructions, and/or other information associated with the member outreach system 100. For example, the memory 702 can store data and/or instructions associated with the healthcare data repository 108, prediction engine 110, and ticket manager 112 discussed above. The memory 702 may also store other modules and data 704 that can be utilized by the computing device 106 to perform or enable performing any action taken by the member outreach system 100. Such other modules and data 704 can include a platform, operating system, and applications, as well as data utilized by the platform, operating system, and applications. - One or
more computing devices 106 of the member outreach system 100 can also have processor(s) 706, communication interfaces 708, displays 710, output devices 712, input devices 714, and/or a drive unit 716 including a machine readable medium 718. - In various examples, the processor(s) 706 can be a central processing unit (CPU), a graphics processing unit (GPU), both a CPU and a GPU, or any other type of processing unit. Each of the one or more processor(s) 706 may have numerous arithmetic logic units (ALUs) that perform arithmetic and logical operations, as well as one or more control units (CUs) that extract instructions and stored content from processor cache memory, and then execute these instructions by calling on the ALUs, as necessary, during program execution. The processor(s) 706 may also be responsible for executing computer applications stored in the
memory 702, which can be associated with common types of volatile (RAM) and/or nonvolatile (ROM) memory. - The communication interfaces 708 can include transceivers, modems, interfaces, antennas, telephone connections, and/or other components that can transmit and/or receive data over networks, telephone lines, or other connections.
- The
display 710 can be a liquid crystal display or any other type of display commonly used in computing devices. For example, a display 710 may be a touch-sensitive display screen, and can then also act as an input device or keypad, such as for providing a soft-key keyboard, navigation buttons, or any other type of input. - The
output devices 712 can include any sort of output devices known in the art, such as a display 710, speakers, a vibrating mechanism, and/or a tactile feedback mechanism. Output devices 712 can also include ports for one or more peripheral devices, such as headphones, peripheral speakers, and/or a peripheral display. - The
input devices 714 can include any sort of input devices known in the art. For example, input devices 714 can include a microphone, a keyboard/keypad, and/or a touch-sensitive display, such as the touch-sensitive display screen described above. A keyboard/keypad can be a push button numeric dialing pad, a multi-key keyboard, or one or more other types of keys or buttons, and can also include a joystick-like controller, designated navigation buttons, or any other type of input mechanism. - The machine readable medium 718 can store one or more sets of instructions, such as software or firmware, that embody any one or more of the methodologies or functions described herein. The instructions can also reside, completely or at least partially, within the
memory 702, processor(s) 706, and/or communication interface(s) 708 during execution thereof by the one or more computing devices 106 of the member outreach system 100. The memory 702 and the processor(s) 706 can also constitute machine readable media 718. - Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example embodiments.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/861,672 US20220359080A1 (en) | 2020-02-03 | 2022-07-11 | Multi-model member outreach system |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202062969573P | 2020-02-03 | 2020-02-03 | |
US16/881,981 US11386999B2 (en) | 2020-02-03 | 2020-05-22 | Multi-model member outreach system |
US17/861,672 US20220359080A1 (en) | 2020-02-03 | 2022-07-11 | Multi-model member outreach system |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/881,981 Continuation US11386999B2 (en) | 2020-02-03 | 2020-05-22 | Multi-model member outreach system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220359080A1 true US20220359080A1 (en) | 2022-11-10 |
Family
ID=77062679
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/881,981 Active US11386999B2 (en) | 2020-02-03 | 2020-05-22 | Multi-model member outreach system |
US17/861,672 Pending US20220359080A1 (en) | 2020-02-03 | 2022-07-11 | Multi-model member outreach system |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/881,981 Active US11386999B2 (en) | 2020-02-03 | 2020-05-22 | Multi-model member outreach system |
Country Status (2)
Country | Link |
---|---|
US (2) | US11386999B2 (en) |
WO (1) | WO2021158379A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022141527A (en) * | 2021-03-15 | 2022-09-29 | 富士通株式会社 | Model building method and model building program |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080015891A1 (en) * | 2006-07-12 | 2008-01-17 | Medai, Inc. | Method and System to Assess an Acute and Chronic Disease Impact Index |
US20140343955A1 (en) * | 2013-05-16 | 2014-11-20 | Verizon Patent And Licensing Inc. | Method and apparatus for providing a predictive healthcare service |
US20160125150A1 (en) * | 2014-10-31 | 2016-05-05 | Cerner Innovation, Inc. | Care management outreach |
US20160358290A1 (en) * | 2012-04-20 | 2016-12-08 | Humana Inc. | Health severity score predictive model |
US20180350461A1 (en) * | 2015-11-30 | 2018-12-06 | Optum, Inc. | System and method for point of care identification of gaps in care |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8200506B2 (en) | 2006-12-19 | 2012-06-12 | Accenture Global Services Limited | Integrated health management platform |
US20080177567A1 (en) * | 2007-01-22 | 2008-07-24 | Aetna Inc. | System and method for predictive modeling driven behavioral health care management |
US20090138285A1 (en) * | 2007-11-26 | 2009-05-28 | Denberg Thomas D | Health Promotion Outreach System |
EP2245568A4 (en) * | 2008-02-20 | 2012-12-05 | Univ Mcmaster | Expert system for determining patient treatment response |
US20100004947A1 (en) * | 2008-07-01 | 2010-01-07 | Michael Nadeau | System and Method for Providing Health Management Services to a Population of Members |
US20160358282A1 (en) * | 2010-12-29 | 2016-12-08 | Humana Inc. | Computerized system and method for reducing hospital readmissions |
US20130096934A1 (en) * | 2011-04-07 | 2013-04-18 | APS Healthcare Bethesda, Inc. | Percolator systems and methods |
US20140100884A1 (en) * | 2012-10-08 | 2014-04-10 | Cerner Innovation, Inc. | Outreach program |
US20160035040A1 (en) * | 2014-08-04 | 2016-02-04 | Cigna Intellectual Property, Inc. | System and method of determining and optimizing healthcare savings opportunities for individuals via alternative engagement channels |
US9697469B2 (en) * | 2014-08-13 | 2017-07-04 | Andrew McMahon | Method and system for generating and aggregating models based on disparate data from insurance, financial services, and public industries |
US20170061091A1 (en) * | 2015-08-26 | 2017-03-02 | Uptake Technologies, Inc. | Indication of Outreach Options for Healthcare Facility to Facilitate Patient Actions |
US20190156955A1 (en) * | 2015-08-31 | 2019-05-23 | Palantir Technologies Inc. | Identifying program member data records for targeted operations |
US20170177801A1 (en) * | 2015-12-18 | 2017-06-22 | Cerner Innovation, Inc. | Decision support to stratify a medical population |
US10546102B2 (en) * | 2016-01-18 | 2020-01-28 | International Business Machines Corporation | Predictive analytics work lists for healthcare |
US20190172564A1 (en) * | 2017-12-05 | 2019-06-06 | International Business Machines Corporation | Early cost prediction and risk identification |
US11074485B2 (en) * | 2018-07-09 | 2021-07-27 | Koninklijke Philips N.V. | System and method for identifying optimal effective communication channel for subject engagement |
US11941513B2 (en) * | 2018-12-06 | 2024-03-26 | Electronics And Telecommunications Research Institute | Device for ensembling data received from prediction devices and operating method thereof |
US11250948B2 (en) * | 2019-01-31 | 2022-02-15 | International Business Machines Corporation | Searching and detecting interpretable changes within a hierarchical healthcare data structure in a systematic automated manner |
Also Published As
Publication number | Publication date |
---|---|
US11386999B2 (en) | 2022-07-12 |
WO2021158379A1 (en) | 2021-08-12 |
US20210241907A1 (en) | 2021-08-05 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: COLLECTIVEHEALTH, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BASU, SANJAY;REEL/FRAME:060473/0314 Effective date: 20200521 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |