US20230162828A1 - Personalized assistance system for user of vision correction device - Google Patents


Info

Publication number
US20230162828A1
Authority
US
United States
Prior art keywords
user
assistance system
controller
personalized assistance
remote computing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US18/160,238
Inventor
Kevin Baker
Ramesh Sarangapani
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alcon Inc
Original Assignee
Alcon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alcon Inc filed Critical Alcon Inc
Priority to US18/160,238
Publication of US20230162828A1
Legal status: Pending

Classifications

    • G16H20/00: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/30: ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • A61B3/0033: Apparatus for testing the eyes; operational features characterised by user input arrangements
    • A61B3/028: Subjective types, i.e. testing apparatus requiring the active assistance of the patient, for testing visual acuity; for determination of refraction, e.g. phoropters
    • A61B3/04: Trial frames; sets of lenses for use therewith
    • A61L12/00: Methods or apparatus for disinfecting or sterilising contact lenses; accessories therefor
    • G02C11/10: Non-optical adjuncts; electronic devices other than hearing aids
    • G06N20/00: Machine learning
    • G16H10/20: ICT specially adapted for the handling or processing of patient-related medical or healthcare data for electronic clinical trials or questionnaires
    • G16H40/40: ICT specially adapted for the management of medical equipment or devices, e.g. scheduling maintenance or upgrades
    • G16H40/67: ICT specially adapted for the operation of medical equipment or devices for remote operation
    • G16H50/20: ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems
    • G16H50/70: ICT specially adapted for mining of medical data, e.g. analysing previous cases of other patients
    • G16H80/00: ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
    • G02C7/04: Contact lenses for the eyes

Definitions

  • the disclosure relates generally to a personalized assistance system for a user of a vision correction device and method.
  • Humans have five basic senses: sight, hearing, smell, taste and touch. Sight gives us the ability to visualize the world around us and connects us to our surroundings. According to some scientific reports, the brain devotes more space to processing and storing visual information than the other four senses combined, underscoring the importance of sight.
  • At least some vision issues may be addressed with vision correction devices, such as spectacles and contact lenses.
  • the personalized assistance system includes a remote computing unit with a controller having a processor and tangible, non-transitory memory on which instructions are recorded.
  • the controller is configured to selectively execute one or more machine learning models.
  • a user device is operable by the user and configured to communicate with the remote computing unit.
  • the user device includes an electronic diary module configured to prompt the user to answer one or more preselected questions at specific intervals.
  • the electronic diary module is configured to store respective answers entered by the user in response to the one or more preselected questions as self-reported data.
  • the one or more preselected questions may include an inquiry into a comfort level of the user, including at least one of a dryness factor and an irritation factor.
  • the one or more preselected questions may include an inquiry into when the user last cleaned the vision correction device.
  • the controller is configured to obtain the self-reported data from the electronic diary module and generate an analysis of the self-reported data, via the one or more machine learning models.
  • the controller is configured to assist the user based in part on the analysis.
  • the vision correction device may include, but is not limited to, a contact lens.
  • the contact lens may be a multi-focal lens having a first zone for distance vision, a second zone for near vision and a third zone for intermediate vision.
  • the remote computing unit may include a first cloud unit and a central server, with the controller being embedded in at least one of the first cloud unit and the central server.
  • the user device may include a query module configured to receive at least one question generated by the user.
  • the controller may be configured to receive the question from the query module, formulate a reply, based in part on a first one of the one or more machine learning models, and post the reply, via the query module, for consumption by the user.
  • a provider device is configured to communicate with the remote computing unit, the provider device being operable by an eye care provider associated with the user.
  • the user device and the provider device include respective message modules.
  • the remote computing unit may be configured to provide two-way communication between the eye care provider and the user via the respective message modules.
  • the remote computing unit may include a first database storing respective information pertaining to the user, including a type of the vision correction device.
  • the remote computing unit may include a second database storing group data pertaining to a group of users, the group data including respective self-reported data of the group of users.
  • the user device includes a comparative tracking module configured to enable the user to compare the self-reported data with the group data. Assisting the user may include at least one of: providing coaching on taking care of the vision correction device and/or an eye of the user; suggesting a follow-up visit with an eye care provider; and suggesting an alternative vision correction product.
  • FIG. 1 is a schematic illustration of a personalized assistance system having a remote computing unit with a controller;
  • FIG. 2 is a schematic flowchart for a method executable by the controller of FIG. 1 ;
  • FIG. 3 is a schematic example of a machine learning model executable by the controller of FIG. 1 .
  • FIG. 1 schematically illustrates a personalized assistance system 10 for assisting a user 12 of a vision correction device 14 .
  • the personalized assistance system 10 may include interfacing the user 12 with an eye care provider 16 associated with the user 12 .
  • the personalized assistance system 10 is configured to address problems leading to a user 12 no longer wearing the vision correction device 14 , i.e. mitigate a drop-off in the use of the vision correction device 14 by the user 12 .
  • the vision correction device 14 is a contact lens having multiple zones with different respective optical powers, such as a first zone 22 for distance vision, a second zone 24 for near vision and a third zone 26 for intermediate vision. It is to be understood that the contact lens may take many different forms and include multiple and/or alternate components. Additionally, any type of vision correction device available to those skilled in the art may be employed.
  • the user 12 may employ the personalized assistance system 10 after being fitted with the vision correction device 14 by the eye care provider 16 to achieve a number of goals, including but not limited to: reporting outcomes over time so that progress can be tracked and monitored, asking questions and getting answers in real-time, and receiving personalized suggestions based on past reported outcomes and past queries. Additionally, the personalized assistance system 10 may be configured to respond to specific actions requested by the user 12 . For example, a user 12 may request setting up of a reminder to remove their vision correction device 14 . As described below, the personalized assistance system 10 leverages both self-reported data and comparative data for optimizing the experience of the user 12 .
  • the personalized assistance system 10 includes a remote computing unit 30 having a controller C.
  • the controller C has at least one processor P and at least one memory M (or non-transitory, tangible computer readable storage medium) on which are recorded instructions for executing a method 100 .
  • Method 100 is shown in and described below with reference to FIG. 2 .
  • the remote computing unit 30 may include one or more cloud units, such as a first cloud unit 32 and a second cloud unit 34 , as well as a central server 36 .
  • the controller C may be embedded in at least one of the cloud units and the central server 36 .
  • the central server 36 may be a private or public source of information maintained by an organization, such as for example, a research institute, a company, a university and/or a hospital.
  • the first cloud unit 32 and the second cloud unit 34 may include one or more servers hosted on the Internet to store, manage, and process data.
  • the controller C has access to and is specifically programmed to selectively execute one or more machine learning models 40 , such as first machine learning model 42 and second machine learning model 44 .
  • the machine learning models 40 may be configured to find parameters, weights or a structure that minimizes a respective cost function.
  • Each of the machine learning models 40 may be a respective regression model.
  • the first machine learning model 42 and the second machine learning model 44 are respectively embedded in the first cloud unit 32 and the second cloud unit 34 .
  • the remote computing unit 30 may include a first database 46 for storing respective information pertaining to the user 12 , including the type of the vision correction device.
  • the remote computing unit 30 may include a second database 48 for storing group data pertaining to a group of users.
  • a user device 50 is operable by the user 12 and configured to communicate with the remote computing unit 30 , i.e. to receive and transmit wireless communication, via a first network 52 .
  • the user device 50 may include a respective processor 54 and a respective memory 56 .
  • the user device 50 may run a first application 58 , which may be a mobile application or “app.”
  • the circuitry and components of servers, networks and mobile applications (“apps”) available to those skilled in the art may be employed.
  • the user device 50 may be a smartphone, laptop, tablet, desktop or other electronic device that the user 12 may operate, for example with a touch screen interface or I/O device such as a keyboard or mouse.
  • the first application 58 may incorporate a plurality of modules 60 , which may be executed in coordination with the remote computing unit 30 .
  • the plurality of modules 60 includes an electronic diary module 62 , a query module 64 , a first messaging module 66 and a suggestion module 68 .
  • the plurality of modules 60 may consume the output of a common machine learning model or of different machine learning models 40 .
  • the electronic diary module 62 is configured to prompt the user 12 to answer one or more preselected questions at specific intervals, e.g. daily.
  • the electronic diary module 62 is configured to store respective answers entered by the user 12 in response to the one or more preselected questions as self-reported data.
  • the one or more preselected questions may include an inquiry into a comfort level of the user 12 , including at least one of a dryness factor and an irritation factor.
  • the one or more preselected questions may include an inquiry into when the user 12 last cleaned the vision correction device 14 .
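  • By way of an illustrative sketch (not part of the original disclosure), the prompt-and-store behavior of such an electronic diary module may be expressed in Python as follows; the class, question wording and field names are hypothetical:

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical preselected questions covering comfort and lens care.
PRESELECTED_QUESTIONS = [
    "Rate today's dryness (0-10):",
    "Rate today's irritation (0-10):",
    "How many days ago did you last clean your lenses?",
]

@dataclass
class ElectronicDiary:
    # date -> {question key: answer}, i.e. the self-reported data.
    entries: dict = field(default_factory=dict)

    def prompt(self):
        # In the app, these would be shown at specific intervals, e.g. daily.
        return PRESELECTED_QUESTIONS

    def record(self, day: date, answers: dict):
        # Store the user's answers as self-reported data keyed by day.
        self.entries[day] = answers

diary = ElectronicDiary()
diary.record(date(2023, 1, 2),
             {"dryness": 3, "irritation": 1, "last_cleaned_days": 0})
```

The controller would then read `entries` to generate its analysis.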
  • the controller C may be configured to obtain the self-reported data from the electronic diary module 62 and generate an analysis of the self-reported data, via the one or more machine learning models 40 .
  • the user 12 may compare the self-reported data with group data (second database 48 ) generated by other users of the same type of vision correction device 14 , via the electronic diary module 62 .
  • the controller C may be configured to assist the user 12 based in part on the analysis. Assisting the user 12 based in part on the analysis may include at least one of the following: providing coaching on taking care of the vision correction device 14 (e.g. cleaning procedures) and/or an eye of the user 12 ; comparing comfort scores and other markers at a specific time period (e.g. one week after being fitted with the vision correction device 14 ) for the user 12 relative to a group of users of the same product; suggesting a follow-up visit with the eye care provider 16 ; and suggesting an alternative vision correction product.
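  • As a minimal sketch of such assistance logic, assuming hypothetical field names, thresholds and comparison margins (the disclosure does not fix exact values):

```python
# Hypothetical threshold; in practice this would be tuned or learned.
IRRITATION_THRESHOLD = 5

def assist_actions(self_reported, group_average):
    """Pick assistance actions from self-reported data and a group comparison."""
    actions = []
    if self_reported["last_cleaned_days"] > 1:
        actions.append("coach: clean lenses daily")
    if self_reported["irritation"] > IRRITATION_THRESHOLD:
        actions.append("suggest: follow-up visit with eye care provider")
    # Compare the user's comfort score against users of the same product.
    if self_reported["comfort"] < group_average["comfort"] - 2:
        actions.append("suggest: alternative vision correction product")
    return actions

acts = assist_actions(
    {"last_cleaned_days": 3, "irritation": 7, "comfort": 4},
    {"comfort": 8},
)
```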
  • the query module 64 in the user device 50 may be configured to receive at least one question entered by the user 12 .
  • the controller C may be configured to receive the at least one question from the query module 64 and formulate a reply, based in part on the one or more machine learning models 40 .
  • the reply may be posted, via the query module 64 , for consumption by the user 12 .
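  • In a minimal sketch, a keyword-based lookup can stand in for the machine-learning reply formulation; the keywords and canned replies below are hypothetical:

```python
# Hypothetical keyword-to-reply table standing in for the trained model.
REPLIES = {
    "dryness": "Try preservative-free lubricating drops; if dryness persists, contact your provider.",
    "cleaning": "Rub and rinse your lenses with fresh solution every night.",
}

def formulate_reply(question: str) -> str:
    """Extract keywords from the question and look up a reply to post."""
    words = question.lower().split()
    for keyword, reply in REPLIES.items():
        if keyword in words:
            return reply
    # Fall back to human assistance when no keyword matches.
    return "Your question has been forwarded to your eye care provider."

reply = formulate_reply("What helps with dryness at night?")
```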
  • the personalized assistance system 10 may be configured to be “adaptive” and may be updated periodically after the collection of additional data.
  • the machine learning models 40 may be configured to be “adaptive machine learning” algorithms that are not static and that improve after additional user data is collected.
  • a provider device 70 is operable by the eye care provider 16 associated with the user.
  • the provider device 70 is configured to communicate with the remote computing unit 30 via a second network 72 .
  • the provider device 70 includes a respective processor 74 and a respective memory 76 . Similar to the user device 50 , the provider device 70 may run a second application 78 (incorporating a plurality of modules 80 ) that is executed in coordination with the remote computing unit 30 .
  • the plurality of modules 80 may include a patient database 82 (stratified by the type of vision correction device 14 ), a user progress tracker module 84 configured to track progress of the user 12 and other users associated with the eye care provider 16 , a second message module 86 and a comparative tracking module 88 configured to provide trend and comparative analyses.
  • the remote computing unit 30 may be configured to provide two-way communication between the eye care provider 16 and the user 12 via the first message module 66 and the second message module 86 .
  • the personalized assistance system 10 may include a broker module 90 for routing the respective messages from the user 12 to the eye care provider 16 , and vice-versa.
  • the broker module 90 may be configured in different ways. While FIG. 1 shows an example implementation of the personalized assistance system 10 , it is understood that other implementations may be carried out.
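  • The two-way routing performed by such a broker module may be sketched as follows; the class and inbox names are illustrative, not from the disclosure:

```python
class Broker:
    """Illustrative broker: routes messages between user and provider devices."""

    def __init__(self):
        self.inboxes = {"user": [], "provider": []}

    def route(self, sender: str, text: str):
        # Two-way: a message from one party lands in the other party's inbox.
        recipient = "provider" if sender == "user" else "user"
        self.inboxes[recipient].append(text)

broker = Broker()
broker.route("user", "My lenses feel dry in the evening.")
broker.route("provider", "Try reducing wear time; let's review at your next visit.")
```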
  • the first network 52 and second network 72 may be wireless or may include physical components and may be a short-range network or a long-range network.
  • the first network 52 and second network 72 may be implemented in the form of a local area network.
  • the local area network may include, but is not limited to, a Controller Area Network (CAN), a Controller Area Network with Flexible Data Rate (CAN-FD), Ethernet, Bluetooth™, Wi-Fi and other forms of data connection.
  • the local area network may be a Bluetooth™ connection, defined as a short-range radio technology (or wireless technology) aimed at simplifying communications among Internet devices and between devices and the Internet.
  • Bluetooth™ is an open wireless technology standard for transmitting fixed and mobile electronic device data over short distances and creates personal networks operating within the 2.4 GHz band.
  • the local area network may be a Wireless Local Area Network (LAN) which links multiple devices using a wireless distribution method, a Wireless Metropolitan Area Network (MAN) which connects several wireless LANs, or a Wireless Wide Area Network (WAN) which covers large areas such as neighboring towns and cities. Other types of connections may be employed.
  • the machine learning models 40 of FIG. 1 may include a neural network algorithm. While a neural network is illustrated herein, it is understood that the machine learning models 40 may be based on different kinds or types of algorithms, including but not limited to, a neural network, support vector regression, linear or logistic regression, k-means clustering, random forest and other types. As understood by those skilled in the art, neural networks are designed to recognize patterns from real-world data (e.g. images, sound, text, time series and others), translate or convert them into numerical form and embed them in vectors or matrices. The neural network may employ deep learning maps to match an input vector x to an output vector y. Stated differently, each of the plurality of machine learning models 40 learns an activation function ƒ such that ƒ(x) maps to y.
  • the training process enables the neural network to correlate the appropriate activation function ƒ(x) for transforming the input vector x to the output vector y.
  • two parameters are learned: a bias and a slope.
  • the bias is the level of the output vector y when the input vector x is 0, and the slope is the rate of predicted increase or decrease in the output vector y for each unit increase in the input vector x.
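  • For this simple linear case, the bias and slope can be recovered in closed form by ordinary least squares, as in the following sketch with illustrative data:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = bias + slope * x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance of x and y divided by variance of x.
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    # Bias: the output level when the input x is 0.
    bias = mean_y - slope * mean_x
    return bias, slope

# Illustrative data lying exactly on the line y = 1 + 2x.
bias, slope = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
```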
  • the example network 200 is a feedforward artificial neural network having at least three layers of nodes N, including an input layer 202 , one or more hidden layers, such as first hidden layer 204 and second hidden layer 206 , and an output layer 208 .
  • Each of the layers is composed of nodes N configured to perform an affine transformation of a linear sum of inputs.
  • the nodes N are neurons characterized by a respective bias and respective weighted links.
  • the nodes N in the input layer 202 receive the inputs, normalize them and forward them to nodes N in the first hidden layer 204 .
  • Each node N in a subsequent layer computes a linear combination of the outputs of the previous layer.
  • the activation function ƒ may be linear for the respective nodes N in the output layer 208 .
  • the activation function ƒ may be a sigmoid for the first hidden layer 204 and the second hidden layer 206 .
  • a linear combination of sigmoids is used to approximate a continuous function characterizing the output vector y.
  • the example network 200 may generate multiple outputs, such as a first output factor 212 and a second output factor 214 , with the controller C being configured to use a weighted average of the multiple outputs to obtain a final output 210 .
  • the inputs to the input layer 202 are various factors (e.g. comfort level, visual acuity score) pertaining to a particular type of vision correction device 14
  • the first output factor 212 and the second output factor 214 may be an objective satisfaction score and a subjective satisfaction score, respectively, for that particular type of vision correction device 14 .
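  • A forward pass through such a network, with sigmoid hidden layers, a linear output layer and a weighted average of two output factors, may be sketched as follows; the weights are arbitrary illustrative values, not trained parameters:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def layer(inputs, weights, biases, activation):
    # Each node performs an affine transformation of a linear sum of inputs.
    return [activation(b + sum(w * x for w, x in zip(ws, inputs)))
            for ws, b in zip(weights, biases)]

def forward(x):
    h1 = layer(x, [[0.5, -0.2], [0.3, 0.8]], [0.1, -0.1], sigmoid)     # first hidden layer
    h2 = layer(h1, [[1.0, -1.0], [0.4, 0.6]], [0.0, 0.2], sigmoid)     # second hidden layer
    out = layer(h2, [[0.7, 0.3], [0.2, 0.9]], [0.0, 0.0], lambda z: z) # linear output layer
    # Weighted average of the two output factors gives the final output.
    return 0.5 * out[0] + 0.5 * out[1]

# Inputs stand in for factors such as comfort level and visual acuity score.
score = forward([0.8, 0.6])
```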
  • Other machine learning models available to those skilled in the art may be employed.
  • Referring now to FIG. 2 , a flow chart of method 100 executable by the controller C of FIG. 1 is shown.
  • Method 100 need not be applied in the specific order recited herein, with the start and end indicated respectively by “S” and “E” in FIG. 2 . It is understood that some blocks may be omitted.
  • the memory M can store controller-executable instruction sets, and the processor P can execute the controller-executable instruction sets stored in the memory M.
  • the controller C is configured to determine if the electronic diary module 62 has been triggered and the user 12 has answered any of the preselected questions. If so, the method 100 proceeds to block 115 , where the controller C is configured to record the respective answers entered by the user 12 . If not, the method 100 loops back to the start S.
  • the controller C is configured to determine if the query module 64 has been triggered and the user 12 has asked a question. If so, per block 125 , the controller C is configured to formulate a reply, based in part on the one or more machine learning models 40 and post the reply, via the query module 64 , for consumption by the user 12 . For example, the controller C may extract keywords from the question and enter them as inputs into the input layer 202 . The reply may be extracted based on the output layer 208 of the example network 200 . If not, the method 100 loops back to the start S.
  • the controller C is configured to determine if one or more enabling conditions has been met.
  • the enabling conditions may include an irritation factor and/or discomfort factor being above a predetermined threshold. If so, per block 135 , the controller C may execute one or more actions, which may include interfacing with the eye care provider 16 via a message sent to the second message module 86 .
  • the actions may include posting a reminder via the suggestion module 68 in the user device 50 , such as “please remember to clean contact lenses daily.”
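  • The decision blocks of method 100 may be sketched as a single dispatch pass; the function below references the recited blocks 115 , 125 and 135 , while the remaining names and thresholds are illustrative:

```python
def method_100(diary_triggered, answers, question, irritation, threshold=5):
    """One pass of method 100: returns the actions the controller would take."""
    actions = []
    if diary_triggered and answers:
        # Block 115: record the respective answers entered by the user.
        actions.append(("record", answers))
    if question:
        # Block 125: formulate and post a reply to the user's question.
        actions.append(("reply", f"reply to: {question}"))
    if irritation > threshold:
        # Block 135: enabling condition met, interface with the provider.
        actions.append(("notify_provider", irritation))
    return actions  # an empty list corresponds to looping back to the start

acts = method_100(True, {"dryness": 2}, "How often should I clean?", irritation=7)
```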
  • the personalized assistance system 10 employs a multi-prong approach utilizing one or more machine learning models 40 .
  • the personalized assistance system 10 may be configured to recognize non-adherence to suggested guidelines, recognize when a follow up visit to the eye care provider 16 makes sense or suggest an alternative contact lens.
  • the personalized assistance system 10 offers an effective two-way communication between the user 12 and the eye care provider 16 .
  • the controller C of FIG. 1 includes a computer-readable medium (also referred to as a processor-readable medium), including a non-transitory (e.g., tangible) medium that participates in providing data (e.g., instructions) that may be read by a computer (e.g., by a processor of a computer).
  • a medium may take many forms, including, but not limited to, non-volatile media and volatile media.
  • Non-volatile media may include, for example, optical or magnetic disks and other persistent memory.
  • Volatile media may include, for example, dynamic random-access memory (DRAM), which may constitute a main memory.
  • Such instructions may be transmitted by one or more transmission media, including coaxial cables, copper wire and fiber optics, including the wires that comprise a system bus coupled to a processor of a computer.
  • Some forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, other magnetic medium, a CD-ROM, DVD, other optical medium, punch cards, paper tape, other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EEPROM, other memory chip or cartridge, or other medium from which a computer can read.
  • Look-up tables, databases, data repositories or other data stores described herein may include various kinds of mechanisms for storing, accessing, and retrieving various kinds of data, including a hierarchical database, a set of files in a file system, an application database in a proprietary format, a relational database management system (RDBMS), etc.
  • Each such data store may be included within a computing device employing a computer operating system such as one of those mentioned above and may be accessed via a network in one or more of a variety of manners.
  • a file system may be accessible from a computer operating system and may include files stored in various formats.
  • An RDBMS may employ the Structured Query Language (SQL) in addition to a language for creating, storing, editing, and executing stored procedures, such as the PL/SQL language.

Abstract

A personalized assistance system for a user of a vision correction device includes a remote computing unit with a controller having a processor and tangible, non-transitory memory on which instructions are recorded. The controller is configured to selectively execute one or more machine learning models. A user device is operable by the user and includes an electronic diary module configured to prompt the user to answer one or more preselected questions at specific intervals. The electronic diary module is configured to store respective answers, entered by the user in response to the one or more preselected questions, as self-reported data. The controller is configured to obtain the self-reported data from the electronic diary module and generate an analysis of the self-reported data, via the one or more machine learning models. The controller is configured to assist the user based in part on the analysis.

Description

  • The disclosure relates generally to a personalized assistance system, and an accompanying method, for a user of a vision correction device. Humans have five basic senses: sight, hearing, smell, taste and touch. Sight gives us the ability to visualize the world around us and connects us to our surroundings. According to some scientific reports, the brain devotes more space to processing and storing visual information than the other four senses combined, underscoring the importance of sight. Many people worldwide have various issues with quality of vision, for example, due to refractive errors. At least some of these issues may be addressed with vision correction devices, such as spectacles and contact lenses.
  • SUMMARY
  • Disclosed herein are a personalized assistance system for a user of a vision correction device, and an accompanying method. The personalized assistance system includes a remote computing unit with a controller having a processor and tangible, non-transitory memory on which instructions are recorded. The controller is configured to selectively execute one or more machine learning models. A user device is operable by the user and configured to communicate with the remote computing unit. The user device includes an electronic diary module configured to prompt the user to answer one or more preselected questions at specific intervals.
  • The electronic diary module is configured to store respective answers entered by the user in response to the one or more preselected questions as self-reported data. The one or more preselected questions may include an inquiry into a comfort level of the user, including at least one of a dryness factor and an irritation factor. The one or more preselected questions may include an inquiry into when the user last cleaned the vision correction device.
  • The controller is configured to obtain the self-reported data from the electronic diary module and generate an analysis of the self-reported data, via the one or more machine learning models. The controller is configured to assist the user based in part on the analysis. The vision correction device may include, but is not limited to, a contact lens. For example, the contact lens may be a multi-focal lens having a first zone for distance vision, a second zone for near vision and a third zone for intermediate vision.
  • The remote computing unit may include a first cloud unit and a central server, with the controller being embedded in at least one of the first cloud unit and the central server. The user device may include a query module configured to receive at least one question generated by the user. The controller may be configured to receive the at least one question from the query module, formulate a reply, based in part on a first one of the one or more machine learning models, and post the reply, via the query module, for consumption by the user.
  • A provider device is configured to communicate with the remote computing unit, the provider device being operable by an eye care provider associated with the user. The user device and the provider device include respective message modules. The remote computing unit may be configured to provide two-way communication between the eye care provider and the user via the respective message modules.
  • The remote computing unit may include a first database storing respective information pertaining to the user, including a type of the vision correction device. The remote computing unit may include a second database storing group data pertaining to a group of users, the group data including respective self-reported data of the group of users. The user device includes a comparative tracking module configured to enable the user to compare the self-reported data with the group data. Assisting the user may include at least one of: providing coaching on taking care of the vision correction device and/or an eye of the user; suggesting a follow-up visit with an eye care provider; and suggesting an alternative vision correction product.
  • The above features and advantages and other features and advantages of the present disclosure are readily apparent from the following detailed description of the best modes for carrying out the disclosure when taken in connection with the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic illustration of a personalized assistance system having a remote computing unit with a controller;
  • FIG. 2 is a schematic flowchart for a method executable by the controller of FIG. 1 ; and
  • FIG. 3 is a schematic example of a machine learning model executable by the controller of FIG. 1 .
  • DETAILED DESCRIPTION
  • Referring to the drawings, wherein like reference numbers refer to like components, FIG. 1 schematically illustrates a personalized assistance system 10 for assisting a user 12 of a vision correction device 14. The personalized assistance system 10 may include interfacing the user 12 with an eye care provider 16 associated with the user 12. The personalized assistance system 10 is configured to address problems leading to a user 12 no longer wearing the vision correction device 14, i.e., mitigate a drop-off in the use of the vision correction device 14 by the user 12. In one example, the vision correction device 14 is a contact lens having multiple zones with different respective optical powers, such as a first zone 22 for distance vision, a second zone 24 for near vision and a third zone 26 for intermediate vision. It is to be understood that the contact lens may take many different forms and include multiple and/or alternate components. Additionally, any type of vision correction device available to those skilled in the art may be employed.
  • The user 12 may employ the personalized assistance system 10 after being fitted with the vision correction device 14 by the eye care provider 16 to achieve a number of goals, including but not limited to: reporting outcomes over time so that progress can be tracked and monitored, asking questions and getting answers in real-time, and receiving personalized suggestions based on past reported outcomes and past queries. Additionally, the personalized assistance system 10 may be configured to respond to specific actions requested by the user 12. For example, a user 12 may request setting up of a reminder to remove their vision correction device 14. As described below, the personalized assistance system 10 leverages both self-reported data and comparative data for optimizing the experience of the user 12.
  • Referring to FIG. 1 , the personalized assistance system 10 includes a remote computing unit 30 having a controller C. The controller C has at least one processor P and at least one memory M (or non-transitory, tangible computer readable storage medium) on which are recorded instructions for executing a method 100. Method 100 is shown in and described below with reference to FIG. 2 .
  • Referring to FIG. 1 , the remote computing unit 30 may include one or more cloud units, such as a first cloud unit 32, a second cloud unit 34 and a central server 36. The controller C may be embedded in at least one of the cloud units and the central server 36. The central server 36 may be a private or public source of information maintained by an organization, such as for example, a research institute, a company, a university and/or a hospital. The first cloud unit 32 and the second cloud unit 34 may include one or more servers hosted on the Internet to store, manage, and process data.
  • The controller C has access to and is specifically programmed to selectively execute one or more machine learning models 40, such as first machine learning model 42 and second machine learning model 44. The machine learning models 40 may be configured to find parameters, weights or a structure that minimizes a respective cost function. Each of the machine learning models 40 may be a respective regression model. In one example, the first machine learning model 42 and the second machine learning model 44 are respectively embedded in the first cloud unit 32 and the second cloud unit 34. The remote computing unit 30 may include a first database 46 for storing respective information pertaining to the user 12, including the type of the vision correction device. The remote computing unit 30 may include a second database 48 for storing group data pertaining to a group of users.
  • Referring to FIG. 1 , a user device 50 is operable by the user 12 and configured to communicate with the remote computing unit 30, i.e., receive and transmit wireless communications, via a first network 52. The user device 50 may include a respective processor 54 and a respective memory 56. The user device 50 may run a first application 58, which may be a mobile application or “app.” The circuitry and components of servers, networks and mobile applications (“apps”) available to those skilled in the art may be employed.
  • The user device 50 may be a smartphone, laptop, tablet, desktop or other electronic device that the user 12 may operate, for example with a touch screen interface or I/O device such as a keyboard or mouse. The first application 58 may incorporate a plurality of modules 60, which may be executed in coordination with the remote computing unit 30. In one example, the plurality of modules 60 includes an electronic diary module 62, a query module 64, a first message module 66 and a suggestion module 68. The plurality of modules 60 may consume the output of a common machine learning model or of different machine learning models 40.
  • The electronic diary module 62 is configured to prompt the user 12 to answer one or more preselected questions at specific intervals, e.g. daily. The electronic diary module 62 is configured to store respective answers entered by the user 12 in response to the one or more preselected questions as self-reported data. The one or more preselected questions may include an inquiry into a comfort level of the user 12, including at least one of a dryness factor and an irritation factor. The one or more preselected questions may include an inquiry into when the user 12 last cleaned the vision correction device 14. The controller C may be configured to obtain the self-reported data from the electronic diary module 62 and generate an analysis of the self-reported data, via the one or more machine learning models 40. The user 12 may compare the self-reported data with group data (second database 48) generated by other users of the same type of vision correction device 14, via the electronic diary module 62.
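  • The diary flow described above may be sketched, for purposes of illustration only, as follows; the question wording, class names and storage format are assumptions and not part of the disclosure:

```python
from dataclasses import dataclass, field
from datetime import date

# Illustrative preselected questions covering the comfort-level and
# cleaning inquiries described above (wording is assumed).
PRESELECTED_QUESTIONS = [
    "On a scale of 1-10, how dry do your eyes feel today?",
    "On a scale of 1-10, how irritated do your eyes feel today?",
    "When did you last clean your contact lenses?",
]

@dataclass
class DiaryModule:
    """Hypothetical stand-in for the electronic diary module 62."""
    entries: list = field(default_factory=list)

    def record_answers(self, answers: list[str]) -> dict:
        # Store one prompt cycle's answers as a self-reported data record.
        record = {
            "date": date.today().isoformat(),
            "answers": dict(zip(PRESELECTED_QUESTIONS, answers)),
        }
        self.entries.append(record)
        return record

diary = DiaryModule()
diary.record_answers(["3", "2", "yesterday"])
```

The accumulated `entries` list is the self-reported data that the controller would obtain for analysis.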
  • The controller C may be configured to assist the user 12 based in part on the analysis. Assisting the user 12 based in part on the analysis may include at least one of the following: providing coaching on taking care of the vision correction device 14 (e.g. cleaning procedures) and/or an eye of the user 12; comparing comfort scores and other markers at a specific time period (e.g. one week after being fitted with the vision correction device 14) for the user 12 relative to a group of users of the same product; suggesting a follow-up visit with the eye care provider 16; and suggesting an alternative vision correction product.
  • Referring to FIG. 1 , the query module 64 in the user device 50 may be configured to receive at least one question entered by the user 12. The controller C may be configured to receive the at least one question from the query module 64 and formulate a reply, based in part on the one or more machine learning models 40. The reply may be posted, via the query module 64, for consumption by the user 12. The personalized assistance system 10 may be configured to be “adaptive” and may be updated periodically after the collection of additional data. In other words, the machine learning models 40 may be configured to be “adaptive machine learning” algorithms that are not static and that improve after additional user data is collected.
  • Referring to FIG. 1 , a provider device 70 is operable by the eye care provider 16 associated with the user. The provider device 70 is configured to communicate with the remote computing unit 30 via a second network 72. The provider device 70 includes a respective processor 74 and a respective memory 76. Similar to the user device 50, the provider device 70 may run a second application 78 (incorporating a plurality of modules 80) that is executed in coordination with the remote computing unit 30. The plurality of modules 80 may include a patient database 82 (stratified by the type of vision correction device 14), a user progress tracker module 84 configured to track progress of the user 12 and other users associated with the eye care provider 16, a second message module 86 and a comparative tracking module 88 configured to provide trend and comparative analyses.
  • The remote computing unit 30 may be configured to provide two-way communication between the eye care provider 16 and the user 12 via the first message module 66 and the second message module 86. Referring to FIG. 1 , the personalized assistance system 10 may include a broker module 90 for routing the respective messages from the user 12 to the eye care provider 16, and vice-versa. The broker module 90 may be configured in different ways. While FIG. 1 shows an example implementation of the personalized assistance system 10, it is understood that other implementations may be carried out.
  • Referring to FIG. 1 , the first network 52 and second network 72 may be wireless or may include physical components and may be a short-range network or a long-range network. For example, the first network 52 and second network 72 may be implemented in the form of a local area network. The local area network may include, but is not limited to, a Controller Area Network (CAN), a Controller Area Network with Flexible Data Rate (CAN-FD), Ethernet, Bluetooth, Wi-Fi and other forms of data connection. The local area network may be a Bluetooth™ connection, defined as being a short-range radio technology (or wireless technology) aimed at simplifying communications among Internet devices and between devices and the Internet. Bluetooth™ is an open wireless technology standard for transmitting fixed and mobile electronic device data over short distances and creates personal networks operating within the 2.4 GHz band. The local area network may be a Wireless Local Area Network (LAN) which links multiple devices using a wireless distribution method, a Wireless Metropolitan Area Network (MAN) which connects several wireless LANs, or a Wireless Wide Area Network (WAN) which covers large areas such as neighboring towns and cities. Other types of connections may be employed.
  • The machine learning models 40 of FIG. 1 may include a neural network algorithm. While a neural network is illustrated herein, it is understood that the machine learning models 40 may be based on different kinds or types of algorithms, including but not limited to, a neural network, support vector regression, linear or logistic regression, k-means clustering, random forest and other types. As understood by those skilled in the art, neural networks are designed to recognize patterns from real-world data (e.g. images, sound, text, time series and others), translate or convert them into numerical form and embed them in vectors or matrices. The neural network may employ deep learning maps to match an input vector x to an output vector y. Stated differently, each of the plurality of machine learning models 40 learns an activation function ƒ such that ƒ(x) maps to y. The training process enables the neural network to correlate the appropriate activation function ƒ(x) for transforming the input vector x to the output vector y. In the case of a simple linear regression model, two parameters are learned: a bias and a slope. The bias is the level of the output vector y when the input vector x is 0 and the slope is the rate of predicted increase or decrease in the output vector y for each unit increase in the input vector x. Once the plurality of machine learning models 40 is respectively trained, estimated values of the output vector y may be computed with given new values of the input vector x.
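  • The simple linear regression case mentioned above, in which only a bias and a slope are learned, can be illustrated with a short sketch; the data is synthetic, generated from y = 2x + 1, so the closed-form least-squares fit recovers a bias of 1 and a slope of 2:

```python
def fit_line(xs, ys):
    """Ordinary least squares for a single-input line: y = bias + slope * x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = cov(x, y) / var(x); bias places the line through the means.
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    bias = mean_y - slope * mean_x
    return bias, slope

xs = [0.0, 1.0, 2.0, 3.0]
ys = [2.0 * x + 1.0 for x in xs]  # exact line, no noise
bias, slope = fit_line(xs, ys)    # bias = 1.0, slope = 2.0
```

The bias is the predicted output at x = 0 and the slope is the predicted change per unit increase of x, matching the description above.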
  • Referring to FIG. 3 , an example network 200 for the machine learning models 40 of FIG. 1 is shown. The example network 200 is a feedforward artificial neural network having at least three layers of nodes N, including an input layer 202, one or more hidden layers, such as first hidden layer 204 and second hidden layer 206, and an output layer 208. Each of the layers is composed of nodes N configured to perform an affine transformation of a linear sum of inputs. The nodes N are neurons characterized by a respective bias and respective weighted links. The nodes N in the input layer 202 receive the inputs, normalize them and forward them to nodes N in the first hidden layer 204. Each node N in a subsequent layer computes a linear combination of the outputs of the previous layer. A network with three layers would form an activation function ƒ(x) = ƒ⁽³⁾(ƒ⁽²⁾(ƒ⁽¹⁾(x))). The activation function ƒ may be linear for the respective nodes N in the output layer 208. The activation function ƒ may be a sigmoid for the first hidden layer 204 and the second hidden layer 206. A linear combination of sigmoids is used to approximate a continuous function characterizing the output vector y.
  • The example network 200 may generate multiple outputs, such as a first output factor 212 and a second output factor 214, with the controller C being configured to use a weighted average of the multiple outputs to obtain a final output 210. For example, if the inputs to the input layer 202 are various factors (e.g. comfort level, visual acuity score) pertaining to a particular type of vision correction device 14, the first output factor 212 and the second output factor 214 may be an objective satisfaction score and a subjective satisfaction score, respectively, for that particular type of vision correction device 14. Other machine learning models available to those skilled in the art may be employed.
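  • A minimal forward pass consistent with the example network 200 may be sketched as follows; the layer sizes, weights and input values are arbitrary assumptions chosen only to show the structure (sigmoid hidden layers, a linear output layer, and a weighted average combining two output factors into a final output):

```python
import math

def sigmoid(vec):
    return [1.0 / (1.0 + math.exp(-v)) for v in vec]

def affine(inputs, weights, biases):
    # Each node computes a linear combination of the previous layer's
    # outputs plus its own bias (the affine transformation described above).
    return [sum(w * x for w, x in zip(row, inputs)) + b
            for row, b in zip(weights, biases)]

def forward(x):
    # Two sigmoid hidden layers followed by a linear output layer.
    h1 = sigmoid(affine(x, [[0.5, -0.3], [0.8, 0.1]], [0.0, 0.0]))
    h2 = sigmoid(affine(h1, [[1.0, -1.0], [0.2, 0.7]], [0.1, -0.1]))
    return affine(h2, [[0.6, 0.4], [0.3, 0.9]], [0.0, 0.0])

# Inputs stand in for factors such as comfort level and visual acuity score;
# the two outputs stand in for objective and subjective satisfaction scores.
objective_score, subjective_score = forward([0.7, 0.4])
final_output = 0.5 * objective_score + 0.5 * subjective_score  # weighted average
```

Equal weights are used in the average here purely for illustration; the controller could weight the output factors differently.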
  • Referring now to FIG. 2 , a flow chart of method 100 executable by the controller C of FIG. 1 is shown. Method 100 need not be applied in the specific order recited herein, with the start and end indicated respectively by “S” and “E” in FIG. 2 . It is understood that some blocks may be omitted. The memory M can store controller-executable instruction sets, and the processor P can execute the controller-executable instruction sets stored in the memory M.
  • Per block 110 of FIG. 2 , the controller C is configured to determine if the electronic diary module 62 has been triggered and the user 12 has answered any of the preselected questions. If so, the method 100 proceeds to block 115, where the controller C is configured to record the respective answers entered by the user 12. If not, the method 100 loops back to the start S.
  • Per block 120 of FIG. 2 , the controller C is configured to determine if the query module 64 has been triggered and the user 12 has asked a question. If so, per block 125, the controller C is configured to formulate a reply, based in part on the one or more machine learning models 40 and post the reply, via the query module 64, for consumption by the user 12. For example, the controller C may extract keywords from the question and enter them as inputs into the input layer 202. The reply may be extracted based on the output layer 208 of the example network 200. If not, the method 100 loops back to the start S.
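  • One simple, hypothetical realization of the reply flow in block 125 is a keyword lookup standing in for the trained network; the keywords, canned replies and fallback message below are illustrative assumptions, not part of the disclosure:

```python
# Assumed keyword-to-reply table; a real system would feed the extracted
# keywords into the trained machine learning model instead.
REPLIES = {
    "dryness": "Try lubricating drops approved for use with contact lenses.",
    "cleaning": "Clean and rinse your lenses daily with the recommended solution.",
}

FALLBACK = "Your question has been forwarded to your eye care provider."

def extract_keywords(question: str) -> set[str]:
    # Lowercase the words and strip trailing punctuation.
    return {w.strip("?.,!").lower() for w in question.split()}

def formulate_reply(question: str) -> str:
    keywords = extract_keywords(question)
    for key, reply in REPLIES.items():
        if key in keywords:
            return reply
    return FALLBACK

reply = formulate_reply("What helps with dryness?")
```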
  • Per block 130 of FIG. 2 , the controller C is configured to determine if one or more enabling conditions has been met. The enabling conditions may include an irritation factor and/or discomfort factor being above a predetermined threshold. If so, per block 135, the controller C may execute one or more actions, which may include interfacing with the eye care provider 16 via a message sent to the second message module 86. The actions may include posting a reminder via the suggestion module 68 in the user device 50, such as “please remember to clean contact lenses daily.”
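  • The enabling-condition check of block 130 may be sketched as follows, assuming a 1-10 self-report scale and an assumed threshold value; both are illustrative choices, not values from the disclosure:

```python
THRESHOLD = 7  # assumed cutoff on an assumed 1-10 self-report scale

def enabling_condition_met(irritation: int, discomfort: int) -> bool:
    # The enabling condition is met when either factor exceeds the threshold.
    return irritation > THRESHOLD or discomfort > THRESHOLD

def actions_for(irritation: int, discomfort: int) -> list[str]:
    if not enabling_condition_met(irritation, discomfort):
        return []
    # Example actions: message the eye care provider and post a reminder.
    return [
        "send message to eye care provider",
        "post reminder: please remember to clean contact lenses daily",
    ]
```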
  • In summary, the personalized assistance system 10 employs a multi-prong approach utilizing one or more machine learning models 40. The personalized assistance system 10 may be configured to recognize non-adherence to suggested guidelines, recognize when a follow-up visit to the eye care provider 16 makes sense, or suggest an alternative contact lens. The personalized assistance system 10 offers effective two-way communication between the user 12 and the eye care provider 16.
  • The controller C of FIG. 1 includes a computer-readable medium (also referred to as a processor-readable medium), including a non-transitory (e.g., tangible) medium that participates in providing data (e.g., instructions) that may be read by a computer (e.g., by a processor of a computer). Such a medium may take many forms, including, but not limited to, non-volatile media and volatile media. Non-volatile media may include, for example, optical or magnetic disks and other persistent memory. Volatile media may include, for example, dynamic random-access memory (DRAM), which may constitute a main memory. Such instructions may be transmitted by one or more transmission media, including coaxial cables, copper wire and fiber optics, including the wires that comprise a system bus coupled to a processor of a computer. Some forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, other magnetic medium, a CD-ROM, DVD, other optical medium, punch cards, paper tape, other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EEPROM, other memory chip or cartridge, or other medium from which a computer can read.
  • Look-up tables, databases, data repositories or other data stores described herein may include various kinds of mechanisms for storing, accessing, and retrieving various kinds of data, including a hierarchical database, a set of files in a file system, an application database in a proprietary format, a relational database management system (RDBMS), etc. Each such data store may be included within a computing device employing a computer operating system such as one of those mentioned above and may be accessed via a network in one or more of a variety of manners. A file system may be accessible from a computer operating system and may include files stored in various formats. An RDBMS may employ the Structured Query Language (SQL) in addition to a language for creating, storing, editing, and executing stored procedures, such as the PL/SQL language mentioned above.
  • The detailed description and the drawings or FIGS. are supportive and descriptive of the disclosure, but the scope of the disclosure is defined solely by the claims. While some of the best modes and other embodiments for carrying out the claimed disclosure have been described in detail, various alternative designs and embodiments exist for practicing the disclosure defined in the appended claims. Furthermore, the embodiments shown in the drawings or the characteristics of various embodiments mentioned in the present description are not necessarily to be understood as embodiments independent of each other. Rather, it is possible that each of the characteristics described in one of the examples of an embodiment can be combined with one or a plurality of other desired characteristics from other embodiments, resulting in other embodiments not described in words or by reference to the drawings. Accordingly, such other embodiments fall within the framework of the scope of the appended claims.

Claims (20)

What is claimed is:
1. A personalized assistance system for a user of a vision correction device, the personalized assistance system comprising:
a remote computing unit including a controller having a processor and tangible, non-transitory memory on which instructions are recorded, the controller being configured to selectively execute one or more machine learning models;
a user device operable by the user and configured to communicate with the remote computing unit;
wherein the user device includes an electronic diary module configured to prompt the user to answer one or more preselected questions at specific intervals;
wherein the electronic diary module is configured to store respective answers, entered by the user in response to the one or more preselected questions, as self-reported data;
wherein the controller is configured to:
obtain the self-reported data from the electronic diary module;
generate an analysis of the self-reported data, via the one or more machine learning models; and
assist the user based in part on the analysis.
2. The personalized assistance system of claim 1, wherein:
the remote computing unit includes a first cloud unit and a central server, the controller being embedded in at least one of the first cloud unit and the central server.
3. The personalized assistance system of claim 1, wherein:
the user device includes a query module configured to receive at least one question generated by the user; and
the controller is configured to:
receive the at least one question from the query module;
formulate a reply, based in part on the one or more machine learning models; and
post the reply, via the query module, for consumption by the user.
4. The personalized assistance system of claim 1, further comprising:
a provider device configured to communicate with the remote computing unit, the provider device being operable by an eye care provider associated with the user;
wherein the user device and the provider device include respective message modules; and
wherein the remote computing unit is configured to provide two-way communication between the eye care provider and the user via the respective message modules.
5. The personalized assistance system of claim 1, wherein:
the remote computing unit includes a first database storing respective information pertaining to the user, including a type of the vision correction device;
the remote computing unit includes a second database storing group data pertaining to a group of users, the group data at least partially including respective self-reported data of the group of users; and
the user device includes a comparative tracking module configured to enable the user to compare the self-reported data with the group data.
6. The personalized assistance system of claim 1, wherein assisting the user based in part on the analysis includes at least one of:
providing coaching on taking care of at least one of the vision correction device and an eye of the user;
suggesting a follow-up visit with an eye care provider; and
suggesting an alternative vision correction product.
7. The personalized assistance system of claim 1, wherein:
the vision correction device is a contact lens.
8. The personalized assistance system of claim 7, wherein:
the contact lens is a multi-focal lens having a first zone for distance vision, a second zone for near vision and a third zone for intermediate vision.
9. The personalized assistance system of claim 8, wherein the one or more preselected questions include:
an inquiry into a comfort level of the user, including at least one of a dryness factor and an irritation factor.
10. The personalized assistance system of claim 8, wherein the one or more preselected questions include:
an inquiry into when the user last cleaned the vision correction device.
11. A personalized assistance system for selectively interfacing a user of a vision correction device with an eye care provider associated with the user, the personalized assistance system comprising:
a remote computing unit including a controller having a processor and tangible, non-transitory memory on which instructions are recorded, the controller being configured to selectively execute one or more machine learning models;
a user device operable by the user and configured to communicate with the remote computing unit;
a provider device configured to communicate with the remote computing unit, the provider device being operable by the eye care provider;
wherein the user device includes an electronic diary module configured to prompt the user to answer one or more preselected questions at specific intervals;
wherein the electronic diary module is configured to store respective answers, entered by the user in response to the one or more preselected questions, as self-reported data;
wherein the controller is configured to obtain the self-reported data from the electronic diary module and generate an analysis of the self-reported data, via the one or more machine learning models; and
wherein the user device and the provider device include respective message modules, the remote computing unit being configured to provide two-way communication between the eye care provider and the user based in part on the analysis, via the respective message modules.
12. The personalized assistance system of claim 11, wherein:
the remote computing unit includes a first cloud unit and a central server, the controller being embedded in at least one of the first cloud unit and the central server.
13. The personalized assistance system of claim 11, wherein:
the user device includes a query module configured to receive at least one question generated by the user; and
the controller is configured to:
receive the at least one question from the query module;
formulate a reply, based in part on the one or more machine learning models; and
post the reply, via the query module, for consumption by the user.
14. The personalized assistance system of claim 11, wherein:
the remote computing unit includes a first database storing respective information pertaining to the user, including a type of the vision correction device;
the remote computing unit includes a second database storing group data pertaining to a group of users, the group data at least partially including respective self-reported data of the group of users; and
the user device includes a comparative tracking module configured to enable the user to compare the self-reported data with the group data.
15. The personalized assistance system of claim 11, wherein assisting the user based in part on the analysis includes at least one of:
providing coaching on taking care of at least one of the vision correction device and an eye of the user;
suggesting a follow-up visit with an eye care provider; and
suggesting an alternative vision correction product.
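The three assistance actions listed in claim 15 can be viewed as a dispatch over the analysis output. The field names and thresholds below are illustrative assumptions, not part of the claims:

```python
def assistance_actions(analysis: dict) -> list:
    """Map an analysis result to the assistance actions of claim 15."""
    actions = []
    if analysis.get("poor_cleaning_habits"):
        actions.append("coaching: lens and eye care tips")
    if analysis.get("avg_dryness", 0) >= 4:
        actions.append("suggest follow-up visit with eye care provider")
    if analysis.get("persistent_discomfort"):
        actions.append("suggest alternative vision correction product")
    return actions

acts = assistance_actions({"avg_dryness": 4.2, "persistent_discomfort": True})
```

A given analysis may trigger any combination of the three actions, which is consistent with the claim's "at least one of" language.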
16. The personalized assistance system of claim 11, wherein:
the vision correction device is a contact lens.
17. The personalized assistance system of claim 16, wherein:
the contact lens is a multi-focal lens having a first zone for distance vision, a second zone for near vision, and a third zone for intermediate vision.
18. The personalized assistance system of claim 16, wherein the one or more preselected questions include:
an inquiry into a comfort level of the user, including at least one of a dryness factor and an irritation factor.
19. The personalized assistance system of claim 16, wherein the one or more preselected questions include:
an inquiry into when the user last cleaned the vision correction device.
20. A method of operating a personalized assistance system for a user of a vision correction device, the personalized assistance system having a remote computing unit with a controller having a processor and tangible, non-transitory memory, the method comprising:
configuring the controller to selectively execute one or more machine learning models;
configuring a user device to communicate with the remote computing unit, the user device being operable by the user;
configuring the user device with an electronic diary module configured to prompt the user to answer one or more preselected questions at specific intervals;
storing respective answers, entered by the user in response to the one or more preselected questions, as self-reported data, via the electronic diary module;
obtaining the self-reported data from the electronic diary module, via the controller, and generating an analysis of the self-reported data, via the one or more machine learning models; and
assisting the user based in part on the analysis, via the controller.
US18/160,238 2019-12-19 2023-01-26 Personalized assistance system for user of vision correction device Pending US20230162828A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/160,238 US20230162828A1 (en) 2019-12-19 2023-01-26 Personalized assistance system for user of vision correction device

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201962950432P 2019-12-19 2019-12-19
US17/127,735 US11587656B2 (en) 2019-12-19 2020-12-18 Personalized assistance system for user of vision correction device
US18/160,238 US20230162828A1 (en) 2019-12-19 2023-01-26 Personalized assistance system for user of vision correction device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US17/127,735 Continuation US11587656B2 (en) 2019-12-19 2020-12-18 Personalized assistance system for user of vision correction device

Publications (1)

Publication Number Publication Date
US20230162828A1 true US20230162828A1 (en) 2023-05-25

Family

ID=74095932

Family Applications (2)

Application Number Title Priority Date Filing Date
US17/127,735 Active 2041-03-31 US11587656B2 (en) 2019-12-19 2020-12-18 Personalized assistance system for user of vision correction device
US18/160,238 Pending US20230162828A1 (en) 2019-12-19 2023-01-26 Personalized assistance system for user of vision correction device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US17/127,735 Active 2041-03-31 US11587656B2 (en) 2019-12-19 2020-12-18 Personalized assistance system for user of vision correction device

Country Status (5)

Country Link
US (2) US11587656B2 (en)
EP (1) EP4078606A1 (en)
JP (1) JP2023506383A (en)
CN (1) CN114830247A (en)
WO (1) WO2021124288A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230214881A1 (en) * 2021-12-31 2023-07-06 Synamedia Limited Methods, Devices, and Systems for Dynamic Targeted Content Processing

Family Cites Families (11)

Publication number Priority date Publication date Assignee Title
WO2003087755A1 (en) * 2002-04-12 2003-10-23 Menicon Co., Ltd. Contact lens user support system and support method
US8628194B2 (en) * 2009-10-13 2014-01-14 Anton Sabeta Method and system for contact lens care and compliance
US20150100342A1 (en) 2013-10-08 2015-04-09 Mobile Doctor Investments, LLC Mobile app for contact lenses
US9706910B1 (en) * 2014-05-29 2017-07-18 Vivid Vision, Inc. Interactive system for vision assessment and correction
JP6548821B2 (en) * 2015-09-30 2019-07-24 株式会社ソニー・インタラクティブエンタテインメント How to optimize the placement of content on the screen of a head mounted display
US10827925B2 (en) * 2016-09-14 2020-11-10 DigitalOptometrics LLC Remote comprehensive eye examination system
SG10201703534XA (en) 2017-04-28 2018-11-29 D Newman Stephen Evaluation of Prescribed Optical Devices
US10799112B2 (en) * 2017-05-02 2020-10-13 Simple Contact, Inc. Techniques for providing computer assisted eye examinations
EP3643081B1 (en) * 2017-06-21 2023-11-01 Healthcare Technologies And Methods, LLC Adaptive and interactive education and communication system for people with hearing loss
WO2019176952A1 (en) 2018-03-13 2019-09-19 Menicon Co., Ltd. Determination System, Computing Device, Determination Method, and Program
JP2022537702A (en) 2019-06-27 2022-08-29 アルコン インコーポレイティド Systems and methods using machine learning to predict contact lens fit

Also Published As

Publication number Publication date
CN114830247A (en) 2022-07-29
US11587656B2 (en) 2023-02-21
EP4078606A1 (en) 2022-10-26
US20210193278A1 (en) 2021-06-24
JP2023506383A (en) 2023-02-16
WO2021124288A1 (en) 2021-06-24


Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION