CN115223232A - Eye health comprehensive management system - Google Patents

Eye health comprehensive management system

Info

Publication number
CN115223232A
Authority
CN
China
Prior art keywords
eye
user
eye health
health state
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210365844.0A
Other languages
Chinese (zh)
Inventor
余克明
庄菁
陈熹
赵可明
何安琪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhongshan Ophthalmic Center
Original Assignee
Zhongshan Ophthalmic Center
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhongshan Ophthalmic Center filed Critical Zhongshan Ophthalmic Center
Priority to CN202210365844.0A priority Critical patent/CN115223232A/en
Publication of CN115223232A publication Critical patent/CN115223232A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 - Eye characteristics, e.g. of the iris
    • G06V40/193 - Preprocessing; Feature extraction
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/18 - Arrangement of plural eye-testing or -examining apparatus
    • A61B3/185 - Arrangement of plural eye-testing or -examining apparatus characterised by modular construction
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/0002 - Inspection of images, e.g. flaw detection
    • G06T7/0012 - Biomedical image inspection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 - Eye characteristics, e.g. of the iris
    • G06V40/197 - Matching; Classification
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Public Health (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Ophthalmology & Optometry (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Surgery (AREA)
  • Pathology (AREA)
  • Epidemiology (AREA)
  • Biophysics (AREA)
  • Primary Health Care (AREA)
  • Molecular Biology (AREA)
  • Databases & Information Systems (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Data Mining & Analysis (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The application discloses an eye health comprehensive management system, which comprises an image acquisition module for acquiring a face image of a user with a camera; an eye parameter measurement module for measuring eye parameters of the user from the acquired face image, the eye parameters including eye-screen distance, eyesight, blink frequency and eye appearance; and an eye health state evaluation module for evaluating the eye health state of the user from the eye parameters based on artificial intelligence algorithms. The system automatically acquires the user's eye parameters, avoiding the tedium and errors of manual input; it monitors eye health comprehensively by measuring the user's eye-screen distance, eyesight, blink frequency and eye appearance; and it establishes an integrated evaluation model based on artificial intelligence algorithms, so that the user's eye health state can be evaluated accurately. The system is simple to operate, accurate in measurement and comprehensive in measurement range, and helps users maintain eye health.

Description

Eye health comprehensive management system
Technical Field
The application relates to the technical field of computer vision, in particular to an eye health comprehensive management system.
Background
With the popularization of electronic products, the prevalence of various eye diseases related to electronic products has increased greatly, and monitoring and comprehensively managing users' eye health with electronic products as the carrier has become a feasible direction. However, existing eye health monitoring systems have their own drawbacks: first, the measurement process requires the user to manually enter body parameters, which makes the process cumbersome; random errors easily arise in the multiple measurement steps, and when several errors accumulate, the reliability of the measurement result is greatly reduced. Second, the types of eye diseases that can be monitored are often limited, and a broader assessment of overall eye health is lacking. Third, the user is easily disturbed during eye monitoring, which degrades the user experience. Therefore, it is desirable to provide an eye health comprehensive management system that solves the above problems.
Disclosure of Invention
The application aims to provide an eye health comprehensive management system to solve the problems that the existing eye health monitoring system is complex in measuring process, single in measuring type and low in measuring result accuracy.
To achieve the above object, the present application provides an eye health integrated management system, comprising:
the image acquisition module is used for acquiring a face image of a user by using the camera;
the eye parameter measuring module is used for measuring eye parameters of a user according to the collected human face image, wherein the eye parameters comprise eye screen distance, eyesight, blinking frequency and eye appearance;
and the eye health state evaluation module is used for evaluating the eye health state of the user according to the eye parameters based on an artificial intelligence algorithm.
Further, preferably, the eye health integrated management system further includes:
the user registration module is used for registering an account for the user according to the user information and storing the front face image uploaded by the user into a user file; the user information includes a name, a gender, and an age of the user.
Further, preferably, the eye health integrated management system further includes:
the identity recognition module is used for acquiring a face image of a user according to a first preset period and matching the face image with a user file;
when the matching is successful, starting each module to measure and monitor the eye using process of the user;
when the matching is unsuccessful, the user's eye-use process is not measured or supervised.
Further, preferably, the eye parameter measuring module includes:
the eye screen distance measuring unit is used for calculating the eye screen distance of the user according to a second preset period, wherein the calculation formula is as follows:
d_t = d_c * n_c / n_t
where d_t is the user's current eye-screen distance, n_t is the pixel distance between the inner canthi of the user's two eyes in the current image, n_c is the pixel distance between the inner canthi of the two eyes in the user's front-face registration photograph, and d_c is the horizontal distance between the eyes and the camera when the front-face image was taken;
the vision measuring unit is used for measuring the vision level of the user according to the standard logarithmic visual chart;
the blink frequency measuring unit is used for counting the blink frequency of the user in a preset time interval and judging whether the blink overfrequency occurs to the user or not according to the blink frequency;
the eye appearance detection unit is used for identifying the eye appearance of the user and judging whether the eye appearance is abnormal; the eye appearance includes the positions of the eyelids relative to the limbus and pupil, as well as conjunctival color, vascular distribution and degree of hyperemia.
Further, preferably, the eye health status evaluation module includes:
the data acquisition unit is used for acquiring historical eye health state parameters of a user as sample data;
the data processing unit is used for carrying out data cleaning and normalization processing on the sample data and dividing the processed sample data into a training set and a test set;
the model construction unit is used for constructing an eye health state evaluation model by using artificial intelligence algorithms, wherein the eye health state evaluation model comprises a plurality of eye health state evaluation submodels, and each eye health state evaluation submodel adopts a different artificial intelligence algorithm;
the training unit is used for training the eye health state evaluation model by utilizing a training set until the testing precision of the trained eye health state evaluation model by utilizing a testing set meets a preset value, and generating a target eye health state evaluation model;
and the evaluation unit is used for acquiring the eye parameters of the tested object, inputting the eye parameters of the tested object into the target eye health state evaluation model and generating the eye health state evaluation result of the user.
Further, preferably, the artificial intelligence algorithm comprises:
Random Forest, Logistic Regression, KNN, SVC, Decision Tree, XGBoost, LightGBM and Gradient Boosting algorithms.
Further, preferably, the training unit is further configured to:
and determining the weight of the eye health state evaluation submodel according to the contribution degree of the eye health state evaluation submodel to the eye health state evaluation model in the training process.
Further, preferably, the eye health status evaluation module further includes:
and the hyper-parameter optimization unit is used for optimizing hyper-parameters of the target eye health state evaluation model according to the Hyperopt technology.
Further, preferably, the eye health integrated management system further includes:
and the eye reminding module is used for reminding the user when any one or more of the eye screen distance, the eye using time length, the blinking frequency and the eye appearance of the user are abnormal.
Further, preferably, the eye health integrated management system further includes:
and the honor rating module is used for rewarding the user when any one or more of the eye screen distance, the eye using time length and the blink frequency of the user meet the preset conditions.
Compared with the prior art, the present application has the following beneficial effects:
the application discloses an eye health comprehensive management system, which comprises an image acquisition module, a management module and a management module, wherein the image acquisition module is used for acquiring a face image of a user by using a camera; the eye parameter measuring module is used for measuring eye parameters of a user according to the collected human face image, wherein the eye parameters comprise eye distance, eyesight, blinking frequency and eye appearance; and the eye health state evaluation module is used for evaluating the eye health state of the user according to the eye parameters based on an artificial intelligence algorithm.
The eye health comprehensive management system can automatically acquire the user's eye parameters, avoiding the tedium and errors of manual input; it can monitor eye health comprehensively by measuring the user's eye-screen distance, eyesight, blink frequency and eye appearance; and it establishes an integrated evaluation model based on artificial intelligence algorithms, so that the user's eye health state can be evaluated accurately. The system has the advantages of simple operation, high measurement accuracy and a comprehensive measurement range, and helps users maintain eye health.
Drawings
In order to more clearly illustrate the technical solution of the present application, the drawings needed in the embodiments are briefly described below. The drawings in the following description are only some embodiments of the present application; other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic structural diagram of an eye health integrated management system according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of an eye health integrated management system according to another embodiment of the present application;
FIG. 3 is a schematic diagram of a pose for a user to take a front photograph during filing according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a sub-unit of the eye parameter measurement module of FIG. 1;
fig. 5 is a schematic structural diagram of an eye health integrated management system according to another embodiment of the present application;
fig. 6 is a schematic structural diagram of an eye health integrated management system according to still another embodiment of the present application;
FIG. 7 is a schematic diagram of a sub-unit of the eye health assessment module of FIG. 1 according to an embodiment;
FIG. 8 is a schematic diagram of an integrated eye health assessment model according to an embodiment;
fig. 9 is a schematic structural diagram of a sub-unit of the eye health status evaluation module in fig. 1 according to yet another embodiment.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only some embodiments of the present application, and not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
It should be understood that the step numbers used herein are only for convenience of description and are not used as limitations on the order in which the steps are performed.
It is to be understood that the terminology used in the description of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this specification and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
The terms "comprises" and "comprising" indicate the presence of the described features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The term "and/or" refers to any and all possible combinations of one or more of the associated listed items and includes such combinations.
For convenience of explanation, the scheme of the application mainly takes a mobile terminal as the carrier, and the eye health comprehensive management system is implemented on the mobile terminal as an APP. The mobile terminal includes smart phones (such as Android and iOS phones), smart glasses, smart watches, smart bracelets, tablet computers, notebook computers, personal digital assistants and other mobile internet devices capable of wireless communication.
Referring to fig. 1, an embodiment of the present application provides an eye health integrated management system. As shown in fig. 1, the eye health integrated management system comprises three modules, each of which is specifically as follows:
and the image acquisition module 01 is used for acquiring a face image of the user by using the camera.
Taking the APP on a mobile terminal as an example, when the user picks up the mobile terminal, such as a smart phone, the eye health comprehensive management APP may be started first. When the APP is in the detection state, a picture of the user is taken at a preset interval (for example, once every 5 minutes), the face image is recognized and then transmitted to the eye parameter measurement module 02; preferably, the image acquisition module 01 can acquire 105 feature points on the face image.
The eye parameter measuring module 02 is used for measuring the eye parameters of the user according to the collected human face image, wherein the eye parameters comprise the eye screen distance, the eyesight, the blinking frequency and the eye appearance.
After the feature points on the face image are obtained, the eye parameter measurement module 02 measures the user's current eye parameters according to the feature points, including eye-screen distance, eyesight, blink frequency and eye appearance; preferably, the eye parameter measurement module 02 also obtains the user's eye-use duration data and sends the eye-use duration data and the eye parameter data to the eye health state evaluation module 03.
And the eye health state evaluation module 03 is configured to evaluate the eye health state of the user according to the eye parameters based on an artificial intelligence algorithm.
This module evaluates the eye health state of the user according to the eye parameters. A plurality of eye health state evaluation submodels are established with different artificial intelligence algorithms and trained on sample data; the trained submodels are integrated according to their respective weights to obtain a target eye health state evaluation model; finally, the eye parameters sent by the eye parameter measurement module 02 are input into the target eye health state evaluation model, and the corresponding evaluation result is output.
According to the embodiment of the application, the eye parameters of the user can be acquired automatically, avoiding the tedium and errors of manual input; eye health can be monitored comprehensively by measuring the user's eye-screen distance, eyesight, blink frequency and eye appearance; and the integrated evaluation model constructed with artificial intelligence algorithms can accurately evaluate the user's eye health state.
Referring to fig. 2, in an embodiment, the eye health integrated management system further includes a user registration module 04 and an identity recognition module 05. Specifically, the functions of these modules are as follows:
the user registration module 04 is used for registering an account for the user according to the user information and storing the front face image uploaded by the user into a user file; the user information includes a name, a gender, and an age of the user.
Taking the eye health integrated management system as an APP as an example: before using the APP, a user first needs to register through the user registration module 04 and build a profile, which includes information such as the user's name, sex and age. Most importantly, the user uploads a picture taken with the phone held vertically in front of the face at a distance of 30 cm, as shown in fig. 3. It should be noted that 30 cm is usually taken as the critical value of the eye-screen distance when using a mobile device: when the eye-screen distance is not less than this value, the user's eye-screen distance is generally considered acceptable and eye health problems are unlikely to be caused; when the eye-screen distance falls below this value, the user's eye-use habit is considered poor, that is, the eye-screen distance is too close and may cause eye health problems.
The identity recognition module 05 is used for acquiring a face image of a user according to a first preset period and matching the face image with a user file;
when the matching is successful, starting each module to measure and monitor the eye using process of the user;
when the matching is unsuccessful, the user's eye-use process is not measured or supervised.
In this embodiment, the user is assumed to be a child and the eye health integrated management system is a mobile phone APP; in a household, a child often uses a parent's mobile phone. Therefore, in order to accurately monitor the child's eye-screen distance and eye-use duration without interfering with the parents' normal use of the phone, the current user of the phone needs to be identified.
As a preferred embodiment, while the phone screen is on, the APP may collect the user's face image according to a first preset period, for example calling the phone's front camera once every 5 minutes to take a picture. The face in the photo is recognized through a face recognition Application Program Interface (API) and matched against the registered user (the child). When the match succeeds, the user is the registered child, and the APP starts to supervise the posture and duration of the child's phone use; if the match fails, no supervision is performed, which prevents supervising the wrong person.
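As an illustrative sketch only (the patent specifies "a face recognition API" without naming a library), the periodic identity check could be prototyped with the open-source face_recognition package; the file names and the 0.6 tolerance below are assumptions, not part of the original disclosure.

```python
# Hypothetical sketch of the identity recognition step using the open-source
# face_recognition library; the actual APP may use any face recognition API.
import face_recognition

# Front-face photo uploaded by the registered child during profile creation
known_image = face_recognition.load_image_file("profile_front_photo.jpg")
known_encoding = face_recognition.face_encodings(known_image)[0]

def is_registered_user(frame_path, tolerance=0.6):
    """Return True if the face captured by the front camera matches the profile."""
    frame = face_recognition.load_image_file(frame_path)
    encodings = face_recognition.face_encodings(frame)
    if not encodings:
        return False  # no face in this capture, do not start supervision
    return face_recognition.compare_faces([known_encoding], encodings[0],
                                          tolerance=tolerance)[0]

# Called once per first preset period (e.g. every 5 minutes while the screen is on):
# if is_registered_user("current_capture.jpg"): start_supervision()
```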
Referring to fig. 4, in one embodiment, the eye parameter measurement module 02 needs to measure several eye parameters, namely eye-screen distance, eyesight, blink frequency and eye appearance, and therefore includes the following sub-units:
an eye-screen distance measuring unit 021, configured to calculate an eye-screen distance of the user according to a second preset period, where the calculation formula is:
d_t = d_c * n_c / n_t
where d_t is the user's current eye-screen distance, n_t is the pixel distance between the inner canthi of the user's two eyes in the current image, n_c is the pixel distance between the inner canthi of the two eyes in the user's front-face registration photograph, and d_c is the horizontal distance between the eyes and the camera when the front-face image was taken;
in this example, the eye-screen distance is measured every 5 minutes. Taking the example that the eye health comprehensive management system is a mobile phone end APP, the APP can call the front lens of the mobile phone every 5 minutes to take a user picture. Let d be the true eye-screen distance of the user in the picture t The pixel distance between the inner canthus of the eyes is n t The value can be obtained by measuring the pixel distance according to App; setting the pixel distance of the angular distance between the inner canthus of the eyes of the uploaded positive photograph during the filing of the user as n c (App measures pixel distance) because the front photograph is taken at 30cm during filing and the angular distance between the inner canthus of the eyes in the photograph is inversely proportional to the distance between the eye screens, d is calculated c The formula is substituted into the formula, and a certain formula for calculating the eye screen distance is as follows:
d t =30*n c /n t
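A minimal numerical sketch of this relationship follows; the function and variable names are illustrative, and the APP itself would obtain the pixel distances from its face-landmark measurements.

```python
def eye_screen_distance_cm(n_t_pixels: float, n_c_pixels: float, d_c_cm: float = 30.0) -> float:
    """d_t = d_c * n_c / n_t: the inner-canthus pixel distance is inversely
    proportional to the true eye-screen distance, with the 30 cm registration
    photo serving as the calibration reference."""
    return d_c_cm * n_c_pixels / n_t_pixels

# Example: 120 px inter-canthus distance in the 30 cm registration photo,
# 90 px in the current frame -> the user is now about 40 cm from the screen.
print(eye_screen_distance_cm(n_t_pixels=90, n_c_pixels=120))  # 40.0
```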
a vision measuring unit 022 for measuring the vision level of the user according to a standard logarithmic eye chart.
With this function, parents can easily check their children's eyesight at home while monitoring their eye health. When the eye health integrated management system is a mobile phone APP, the vision measuring unit 022 can be started and vision is tested with the "E" optotypes of the national standard logarithmic visual acuity chart. For example, with sufficient indoor lighting, the parent places the phone 1 meter in front of the child's visual axis, level with the child's eyes; the letter "E" then appears on the phone screen in successively smaller sizes and in varying orientations, and the child has to identify the direction of its opening. At visual acuity logMAR 0.0 (i.e. 1.0 in decimal notation, or 5.0 in five-point notation), the corresponding optotype stroke width is 0.29 mm and height 1.45 mm, and the optotype size changes from line to line by a factor of 1.2589254 (each larger line is 1.2589 times the previous one). One line of optotypes is shown at a time: when the child names the opening direction correctly the answer is recorded as "right", otherwise as "wrong", and after a line is finished the test automatically jumps to the next line. For a given line, the child must answer at least 3 optotypes correctly to be considered able to recognize that line, which is then recorded as the child's visual acuity. When the child answers 3 or more optotypes of a line incorrectly, the child is judged unable to recognize that line, the visual acuity is recorded as the previous line, and the examination ends.
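As a worked illustration of the 1.2589254 (= 10^0.1) line-to-line progression described above, the optotype dimensions for any logMAR line can be derived from the logMAR 0.0 reference size; this helper is an assumption about how the APP might render the chart, not part of the original text.

```python
# Optotype size for a logarithmic ("E") chart rendered at 1 m viewing distance.
# Reference from the text: logMAR 0.0 -> stroke width 0.29 mm, letter height 1.45 mm.
BASE_WIDTH_MM = 0.29
BASE_HEIGHT_MM = 1.45

def optotype_size_mm(logmar: float) -> tuple[float, float]:
    scale = 10 ** logmar  # each 0.1 logMAR step multiplies the size by 10**0.1, i.e. about 1.2589254
    return BASE_WIDTH_MM * scale, BASE_HEIGHT_MM * scale

# e.g. the logMAR 0.3 line (decimal 0.5) is roughly twice the reference size:
print(optotype_size_mm(0.3))  # (~0.58 mm, ~2.89 mm)
```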
And a blink frequency measurement unit 023 configured to count the number of blinks of the user in a preset time period, and determine whether the user has an excessive blink frequency according to the number of blinks.
Blink frequency is related to various eye diseases such as dry eye, visual fatigue and ametropia; when blinking is too frequent, a detailed ophthalmic examination should be performed as soon as possible to clarify the cause. However, blink overfrequency is not easily noticed or taken seriously by parents. When the eye health comprehensive management system is a mobile phone APP, the blink frequency measurement unit 023 can be used to detect the child's blinking. Preferably, the APP may be set to call the phone's front camera every 10 minutes, record the phone user for 30 seconds, and count the number of blinks during this period. If the number of blinks exceeds 9 in 30 seconds (i.e. more than 3 blinks per 10 seconds on average), the user is judged to have blink overfrequency.
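A minimal sketch of blink counting over a 30-second capture follows. The patent only states that blinks are counted from the front-camera video; the eye-aspect-ratio (EAR) approach and its 0.21 threshold are a commonly used substitute technique and are assumptions, not the patent's prescribed method.

```python
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """eye: 6 (x, y) landmarks of one eye, ordered as in the common 68-point face model."""
    a = np.linalg.norm(eye[1] - eye[5])
    b = np.linalg.norm(eye[2] - eye[4])
    c = np.linalg.norm(eye[0] - eye[3])
    return (a + b) / (2.0 * c)

def count_blinks(ear_per_frame, threshold=0.21, min_consecutive=2):
    """A blink is counted when the EAR stays below the threshold for a few consecutive frames."""
    blinks, below = 0, 0
    for ear in ear_per_frame:
        if ear < threshold:
            below += 1
        else:
            if below >= min_consecutive:
                blinks += 1
            below = 0
    return blinks

def blink_overfrequency(ear_per_frame) -> bool:
    # More than 9 blinks in the 30-second capture, i.e. > 3 blinks per 10 s on average.
    return count_blinks(ear_per_frame) > 9
```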
An eye appearance detection unit 024 configured to identify the eye appearance of the user and determine whether the eye appearance is abnormal; the eye appearance includes the positions of the eyelids relative to the limbus and pupil, as well as conjunctival color, vascular distribution (ciliary hyperemia / conjunctival hyperemia / mixed hyperemia) and degree of hyperemia.
Judging whether a user has an eye disease from the eye appearance is also a fairly direct approach. Changes in eye appearance may include entropion and trichiasis, ptosis manifested as an abnormal eyelid position, conjunctival hyperemia, and so on; when such changes occur, the user needs to be alerted to possible abnormalities of eyelid position, function and development. In this embodiment, the eye appearance detection unit 024 can activate the phone's front camera to examine the child's eye appearance. Specifically, this embodiment can recognize, via the face recognition API, the positions of the eyelids relative to the limbus and pupil, as well as conjunctival color, vascular distribution (ciliary hyperemia / conjunctival hyperemia / mixed hyperemia) and degree of hyperemia. With the child's eyes open naturally, the phone is placed 300 mm in front of the visual axis and a specific fixation target is shown; if the upper eyelid covers the upper corneal limbus by more than 2 mm, the child is judged to have mild ptosis; when the upper eyelid partially or completely covers the pupil, severe ptosis is judged; in addition, the degree of hyperemia is classified into 3 grades according to conjunctival color and vascular distribution.
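The thresholds above can be summarized as a small rule-based classifier; the sketch below is only a schematic restatement of those rules, and the normalized redness score it assumes as input is an illustrative stand-in for whatever the face recognition API actually reports.

```python
def grade_ptosis(upper_lid_cover_mm: float, lid_covers_pupil: bool) -> str:
    """Apply the eyelid-position rules from the text."""
    if lid_covers_pupil:
        return "severe ptosis"   # upper eyelid partially or fully covers the pupil
    if upper_lid_cover_mm > 2.0:
        return "mild ptosis"     # upper eyelid covers the upper limbus by more than 2 mm
    return "normal"

def grade_hyperemia(conjunctival_redness: float) -> int:
    """Map an assumed normalized redness score (0..1, derived from conjunctival
    color and vessel distribution) onto the 3 grades mentioned in the text."""
    if conjunctival_redness < 0.33:
        return 1
    if conjunctival_redness < 0.66:
        return 2
    return 3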
In one embodiment, in order to remind the user to understand his or her eye health condition in time, or to seek medical attention promptly for eye diseases indicated by the data from the eye parameter measurement module 02, the eye health comprehensive management system further includes an eye-use reminding module 06, as shown in fig. 5. Specifically, the eye-use reminding module 06 is configured to remind the user when any one or more of the user's eye-screen distance, eye-use duration, blink frequency and eye appearance is abnormal.
According to the "Core information on myopia prevention and control health education for children and adolescents (2019)" issued by the National Health Commission, when a child uses a mobile phone the eye-screen distance should be greater than 30 cm, each session should be shorter than 30 minutes, and total use over the whole day should not exceed 1 hour. Assuming the user is a child and the eye health comprehensive management system is a mobile phone APP, the eye-use reminding module 06 can be set up as follows to monitor the child's eye-screen distance: when it is detected that the child's eye-screen distance is less than 30 cm while using the phone, a subtitle pops up on the screen to remind the child to increase the eye-screen distance.
Further, regarding eye-use duration: when a single session of phone use by the child exceeds 30 minutes, a subtitle pops up on the screen to remind the child to rest for at least 10 minutes. If the child does not stop and rest, the screen is forcibly turned off after 5 minutes. When the child's phone use over the whole day exceeds 1 hour, a subtitle pops up to remind the child not to continue using the phone that day. If the child does not stop and rest, the screen is forcibly turned off after 5 minutes.
Further, if the child blinks more than 9 times in 30 seconds (i.e. more than 3 times per 10 seconds on average), the user is judged to have blink overfrequency. At this time, the eye-use reminding module 06 can remind the child to rest, relieve fatigue and avoid prolonged phone use; if the child shows frequent blinking on more than 3 days within 1 week, the eye-use reminding module 06 reminds the user that the child should go to a hospital promptly for a detailed ophthalmic examination.
Further, when it is determined that the user has severe ptosis and the conjunctiva is hyperemic, the eye-use reminding module 06 reminds the user to be alert to related eyelid diseases such as entropion and trichiasis, and to be examined promptly at a professional ophthalmic institution.
In one embodiment, assuming that the eye health integrated management system is a mobile phone APP and the user is a child, the system further includes an honor rating module 07 to keep the child from developing resistance to the monitoring APP, as shown in fig. 6. The honor rating module 07 is configured to reward the user when any one or more of the user's eye-screen distance, eye-use duration and blink frequency satisfies a preset condition.
To keep the child from resenting the APP's reminders and to encourage spontaneous eye-protection behaviour, an honor system can be designed based on child psychology. When one or more of the following conditions is met (eye-screen distance greater than 30 cm, total phone use for the day less than 1 hour, or blink frequency less than 3 times per 10 seconds), a virtual medal is issued to the child for that day and displayed in a visible place on the APP home page, giving the child a sense of honor. Through linkage and data sharing with schools, the children who earn the most virtual medals each week can be praised in class by the teacher. The children's sense of honor is thus used to encourage them to develop good eye-use habits of their own accord.
Referring to fig. 7, in an embodiment, the eye health status evaluation module 03 includes:
the data acquisition unit 031 is configured to acquire the historical eye health status parameters of the user as sample data.
Data acquisition: taking the APP as an example, rich, high-quality data is crucial to building a model with reliable performance. In this embodiment, school-age children are randomly sampled, their basic information is collected and collated, and their eye-use habits and behaviour are monitored through the APP for a certain period to obtain each child's eye health state parameters. The children then undergo a comprehensive ophthalmic examination, and professional, experienced ophthalmologists evaluate their eye health state to obtain information on the children's eye-related diseases.
The children's basic information, the eye health state parameters measured by the APP and the ophthalmologists' evaluation of the eye health state are converted into computer-readable text, the structured information of the text records is extracted and stored, and manual verification and secondary screening confirm that the data entries are correct and meet the requirements of the application scenario, finally yielding data entries that can be used for model construction.
Relevant feature data and labels: the raw data comprise two kinds of feature information, namely the children's basic information and the eye health state parameters measured by the APP. The basic information may include age, sex, height, weight, family economic conditions, whether either parent has a related eye disease, and the age at which the child first came into contact with electronic products; the eye health state parameters measured by the APP may include eye-screen distance, duration of phone use, blink frequency and eye appearance. The label of this embodiment is the ophthalmologist's actual evaluation of the eye health state, which may include evaluation information on ametropia, dry eye and eyelid disease.
And the data processing unit 032 is configured to perform data cleaning and normalization processing on the sample data, and divide the processed sample data into a training set and a test set.
The present embodiment specifically includes the following parts:
data preprocessing: and removing partial missing and abnormal data. And if some children and parents thereof are unwilling to provide related basic information, do not use the APP cooperatively or delete the APP from the mobile phone, do not reach the set minimum APP use duration, and refuse to accept a professional ophthalmologist to evaluate the eye health state of the children, the data are regarded as missing data and removed, and the missing data are not included in a training data set of the model. And (5) interpolating the residual missing data by adopting a multiple interpolation method. And carrying out preliminary analysis on the basic information of the children, the eye health state parameters measured by the APP and the eye health state evaluation data of an ophthalmologist, making a frequency distribution histogram, and carrying out normality test on the variables by using a Shapiro-Wilk method. For variables conforming to normal distribution, the average (mu) and standard deviation (sigma) of the variables are calculated, and data in the range of mu +/-3 sigma are regarded as abnormal data; for variables that do not fit in a normal distribution, the first quartile (Q1), the median (Q2), the third quartile (Q3), and the quartile range (IQR) are calculated, and then data of < Q1-1.5 x IQR or > Q3+1.5 x IQR are regarded as abnormal data. And eliminating abnormal data to avoid the influence of bias on model construction.
Segmenting the data set: the data were randomly divided into a training set and a test set in a ratio of 7:3.
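A sketch of the cleaning and splitting steps described above, using pandas, scipy and scikit-learn; the DataFrame and column names are assumptions, and the 7:3 split mirrors the ratio stated in the text.

```python
import pandas as pd
from scipy.stats import shapiro
from sklearn.model_selection import train_test_split

def drop_outliers(df: pd.DataFrame, numeric_cols, alpha=0.05) -> pd.DataFrame:
    """Shapiro-Wilk normality test per column, then mu +/- 3*sigma or 1.5*IQR rule."""
    keep = pd.Series(True, index=df.index)
    for col in numeric_cols:
        x = df[col].dropna()
        _, p_value = shapiro(x)
        if p_value >= alpha:                     # approximately normal
            mu, sigma = x.mean(), x.std()
            ok = df[col].between(mu - 3 * sigma, mu + 3 * sigma)
        else:                                    # non-normal: interquartile-range rule
            q1, q3 = x.quantile(0.25), x.quantile(0.75)
            iqr = q3 - q1
            ok = df[col].between(q1 - 1.5 * iqr, q3 + 1.5 * iqr)
        keep &= ok.fillna(False)
    return df[keep]

# df: one row per child; "label" holds the ophthalmologist's evaluation.
# clean = drop_outliers(df, numeric_feature_columns)
# train_df, test_df = train_test_split(clean, test_size=0.3, random_state=0,
#                                      stratify=clean["label"])
```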
A model building unit 033 configured to build an eye health state assessment model using an artificial intelligence algorithm, where the eye health state assessment model includes a plurality of eye health state assessment submodels, and each eye health state assessment submodel employs a different artificial intelligence algorithm.
This embodiment fits the training data with classification models; candidate models include, but are not limited to, Random Forest, Logistic Regression, KNN, SVC, Decision Tree, XGBoost, LightGBM and Gradient Boosting, and each of these models can be used as an independent classifier that directly outputs an evaluation of the eye health state.
The training unit 034 is configured to train the eye health state assessment model by using a training set, and generate a target eye health state assessment model until the test accuracy of the trained eye health state assessment model by using a test set meets a preset value.
The training data and labels are recorded as a vector [X, y], where X = [x_0, x_1, ..., x_n] contains the children's basic information and the eye health state parameters measured by the APP, including age, sex, height, weight, family economic conditions, whether either parent has a related eye disease, the age at which the child first came into contact with electronic products, eye-screen distance, duration of phone use, blink frequency and eye appearance; y is the label, i.e. the ophthalmologist's actual evaluation of the eye health state, including information on ametropia, dry eye and eyelid disease. In this embodiment, a large amount of training set data is used to train the artificial intelligence model so that the model outputs an evaluation of the eye health state from the children's basic information and the APP-measured parameters, and the model parameters are optimized according to the difference between the output evaluation and the ophthalmologist's actual evaluation.
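A sketch of fitting the candidate sub-models listed above with scikit-learn, XGBoost and LightGBM; X_train and y_train are assumed to be the preprocessed feature matrix and ophthalmologist labels from the previous steps, and the hyper-parameter values shown are placeholders.

```python
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier

sub_models = {
    "random_forest": RandomForestClassifier(n_estimators=200),
    "logistic_regression": LogisticRegression(max_iter=1000),
    "knn": KNeighborsClassifier(),
    "svc": SVC(probability=True),  # probability=True so it can feed the integrated model
    "decision_tree": DecisionTreeClassifier(),
    "xgboost": XGBClassifier(eval_metric="logloss"),
    "lightgbm": LGBMClassifier(),
    "gradient_boosting": GradientBoostingClassifier(),
}

for name, model in sub_models.items():
    model.fit(X_train, y_train)  # each sub-model is trained on the same training set
```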
In one embodiment, the training unit 034 is further configured to:
and determining the weight of the eye health state evaluation submodel according to the contribution degree of the eye health state evaluation submodel to the eye health state evaluation model in the training process.
In this embodiment, several different artificial intelligence models, such as Random Forest, Logistic Regression, KNN, SVC, Decision Tree, XGBoost, LightGBM and Gradient Boosting, are used as sub-models to construct an integrated model, as shown in fig. 8. Specifically, each sub-model can be trained as described above so that it evaluates the eye health state reasonably accurately. Because each sub-model performs differently under different scenarios and data distributions, this embodiment integrates the different sub-models through a shallow neural network containing one hidden layer to optimize the integrated performance. For this integrated model, the input variables are the feature information X in the training data together with the output results of the different sub-models, and the output is the final eye health state evaluation, which can be expressed as: y = F(X, y_0, y_1, ..., y_n | theta), where F is the integrated model, X is the feature information in the training data, y_0, y_1, ..., y_n are the sub-models' evaluations of the eye health state given the input features X, and theta is the parameter variable of model F, which includes the weight corresponding to each sub-model and is the object to be optimized during training. This approach fuses the different sub-models so that the model learns the distribution of the original data and dynamically adjusts the weight of each sub-model's contribution to the final result under different distributions, thereby improving the accuracy of the final eye health evaluation.
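A minimal sketch of this integration step: the sub-model outputs y_0 ... y_n are concatenated with the original features X and fed to a shallow neural network with one hidden layer, whose trainable weights play the role of theta. The hidden-layer size is an assumption; in practice the sub-model outputs would normally be produced out-of-fold to avoid leakage.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def meta_features(X, models):
    """Original features X plus each sub-model's predicted probability (y_0 ... y_n)."""
    probs = [m.predict_proba(X)[:, 1] for m in models.values()]
    return np.column_stack([X] + probs)

# Shallow network with a single hidden layer acts as the integration model F(X, y_0..y_n | theta).
ensemble = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000)
ensemble.fit(meta_features(X_train, sub_models), y_train)

y_pred = ensemble.predict(meta_features(X_test, sub_models))
```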
The evaluation unit 035 is configured to obtain an eye parameter of the object to be measured, input the eye parameter of the object to be measured into the target eye health status evaluation model, and generate an eye health status evaluation result of the user.
In this embodiment, the test set data are fed into the model, the model's evaluation results are compared with the ophthalmologists' actual evaluations, and the model's capability is assessed with the receiver operating characteristic curve (ROC curve) and the area under the curve (AUC). The trained and optimized model is then loaded into the APP to provide the APP's eye health evaluation function. When the evaluation indicates that the child may have an unhealthy condition such as ametropia, dry eye, entropion, trichiasis or ptosis, a warning window can pop up to prompt the parents to take the child to a professional ophthalmic institution for further examination, a definitive diagnosis and a decision on whether intervention is needed.
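Evaluation on the held-out test set could then be sketched as follows, again assuming the variables defined in the previous sketches.

```python
from sklearn.metrics import roc_auc_score, roc_curve

test_probas = ensemble.predict_proba(meta_features(X_test, sub_models))[:, 1]
fpr, tpr, thresholds = roc_curve(y_test, test_probas)  # points of the ROC curve
print(f"test AUC = {roc_auc_score(y_test, test_probas):.3f}")
```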
In a specific embodiment, the eye health status evaluation module further includes a hyper-parameter optimization unit 036, as shown in fig. 9, where the hyper-parameter optimization unit is configured to optimize hyper-parameters of the target eye health status evaluation model according to the Hyperopt technology.
The artificial intelligence models used in this embodiment undergo hyper-parameter optimization, that is, a search for a suitable combination of hyper-parameter values that maximizes model performance within a reasonable time. Most artificial intelligence algorithms have default hyper-parameter values, but these defaults do not always perform well in different kinds of machine learning projects; the optimization step therefore plays a crucial role in the model's prediction accuracy.
Compared with traditional grid search, hyper-parameter optimization with the Hyperopt technique does not need to try every possible parameter combination, which saves the large amount of time spent on an exhaustive search and reduces the computational cost; compared with random search, it largely avoids missing important points (values) in the search space. Hyperopt tunes parameters using a form of Bayesian optimization; the specific optimization process is as follows:
a random search space is defined, i.e. different functions are chosen to specify the range of the input parameters. For example, the hyper-parameters in the Random Forest algorithm include the number of estimators (n _ estimators), the maximum depth (max _ depth), and the criterion (criterion). These parameters are adjustable, and can directly influence the quality of the training model. An objective function (i.e., a minimization function) is selected, and the value of the hyperparametric is randomly searched from the search space as input and returned as a loss value. The objective function may be any effective value return function, such as Accuracy (Accuracy), precision (Precision), recall (Recall), and F1_ score (integrated average [ harmonic mean ] of Precision and Recall) in the classification model, and converts the maximization problem into the minimization problem by a negative sign. In the optimization process, the model is trained using the selected hyper-parameter values and the target features are predicted, and then the prediction error is evaluated and returned. And in the searching process, the output of the function is pre-estimated, and then the searching space is adjusted based on a Bayesian optimization algorithm continuously according to the previous result. And carrying out multiple iterations on different algorithm sets and the hyper-parameters thereof, and finally minimizing the objective function.
In summary, the eye health comprehensive management system provided in the above embodiments of the present application may at least implement the following functions:
1) The eye-screen distance measurement method is simpler and more accurate: user operation is simplified while the accuracy of the measurement result is improved.
2) Eye health screening functions are integrated. The technical scheme makes full use of the face image information collected by the phone's front camera: while a child uses the phone, it can supervise the eye-screen distance and eye-use duration, and can also perform eye health screening such as vision testing and blink frequency checking, enabling early detection and early treatment of children's eye diseases, so that mobile electronic devices such as phones are no longer a killer of children's eyesight but become a powerful tool for protecting children's eye health.
3) The design fits child psychology. The scheme tries to avoid the resistance that purely pop-up reminders, forced screen blackouts and similar methods may provoke in children; instead it uses children's sense of honor to cultivate good eye-use habits of their own accord and to improve their compliance with the APP.
4) Artificial intelligence evaluation of a child's eye health state is realized. The technical scheme collects the children's basic information and the eye health state parameters measured by the APP to construct an artificial intelligence model that evaluates the children's eye health state, forming a comprehensive, intelligent eye health evaluation scheme with the mobile phone as the carrier; it is simple to operate, easy to popularize, and saves a great deal of manpower, material and financial resources.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative; the division into units is only one logical functional division, and there may be other divisions in practice, for example multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed coupling, direct coupling or communication connection between parts may be through interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical or in another form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor (processor) to execute some steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (10)

1. An eye health integrated management system, comprising:
the image acquisition module is used for acquiring a face image of a user by using the camera;
the eye parameter measuring module is used for measuring eye parameters of a user according to the collected human face image, wherein the eye parameters comprise eye screen distance, eyesight, blinking frequency and eye appearance;
and the eye health state evaluation module is used for evaluating the eye health state of the user according to the eye parameters based on an artificial intelligence algorithm.
2. The eye health integrated management system of claim 1, further comprising:
the user registration module is used for registering an account for the user according to the user information and storing the front face image uploaded by the user into a user file; the user information includes a name, a gender, and an age of the user.
3. The eye health integrated management system according to claim 2, further comprising:
the identity recognition module is used for acquiring a face image of a user according to a first preset period and matching the face image with a user file;
when the matching is successful, starting each module to measure and monitor the eye using process of the user;
when the matching is unsuccessful, the user's eye-use process is not measured or supervised.
4. The eye health integrated management system of claim 1, wherein the eye parameter measurement module comprises:
the eye screen distance measuring unit is used for calculating the eye screen distance of the user according to a second preset period, wherein the calculation formula is as follows:
d_t = d_c * n_c / n_t
where d_t is the user's current eye-screen distance, n_t is the pixel distance between the inner canthi of the user's two eyes in the current image, n_c is the pixel distance between the inner canthi of the two eyes in the user's front-face registration photograph, and d_c is the horizontal distance between the eyes and the camera when the front-face image was taken;
the vision measuring unit is used for measuring the vision level of the user according to the standard logarithmic visual chart;
the blink frequency measuring unit is used for counting the blink frequency of the user in a preset time interval and judging whether the blink overfrequency occurs to the user or not according to the blink frequency;
the eye appearance detection unit is used for identifying the eye appearance of the user and judging whether the eye appearance is abnormal; the eye appearance includes the positions of the eyelids relative to the limbus and pupil, as well as conjunctival color, vascular distribution and degree of hyperemia.
5. The eye health integrated management system of claim 1, wherein the eye health status evaluation module comprises:
the data acquisition unit is used for acquiring historical eye health state parameters of the user as sample data;
the data processing unit is used for carrying out data cleaning and normalization processing on the sample data and dividing the processed sample data into a training set and a test set;
the model construction unit is used for constructing an eye health state evaluation model by using artificial intelligence algorithms, wherein the eye health state evaluation model comprises a plurality of eye health state evaluation submodels, and each eye health state evaluation submodel adopts a different artificial intelligence algorithm;
the training unit is used for training the eye health state evaluation model by utilizing a training set until the testing precision of the trained eye health state evaluation model by utilizing a testing set meets a preset value, and generating a target eye health state evaluation model;
and the evaluation unit is used for acquiring the eye parameters of the tested object, inputting the eye parameters of the tested object into the target eye health state evaluation model and generating the eye health state evaluation result of the user.
6. The eye health integrated management system of claim 5, wherein the artificial intelligence algorithm comprises:
Random Forest, Logistic Regression, KNN, SVC, Decision Tree, XGBoost, LightGBM, and Gradient Boosting algorithms.
7. The eye health integrated management system of claim 5, wherein the training unit is further configured to:
and determining the weight of the eye health state evaluation submodel according to the contribution degree of the eye health state evaluation submodel to the eye health state evaluation model in the training process.
8. The eye health integrated management system of claim 5, wherein the eye health status evaluation module further comprises:
and the hyper-parameter optimization unit is used for optimizing hyper-parameters of the target eye health state evaluation model according to the Hyperopt technology.
9. The eye health integrated management system of claim 1, further comprising:
and the eye reminding module is used for reminding the user when any one or more of the eye screen distance, the eye using time length, the blinking frequency and the eye appearance of the user is abnormal.
10. The eye health integrated management system of claim 1, further comprising:
and the honor evaluation module is used for rewarding the user when any one or more of the eye screen distance, the eye using time length and the blink frequency of the user meet the preset conditions.
CN202210365844.0A 2022-04-08 2022-04-08 Eye health comprehensive management system Pending CN115223232A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210365844.0A CN115223232A (en) 2022-04-08 2022-04-08 Eye health comprehensive management system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210365844.0A CN115223232A (en) 2022-04-08 2022-04-08 Eye health comprehensive management system

Publications (1)

Publication Number Publication Date
CN115223232A true CN115223232A (en) 2022-10-21

Family

ID=83606464

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210365844.0A Pending CN115223232A (en) 2022-04-08 2022-04-08 Eye health comprehensive management system

Country Status (1)

Country Link
CN (1) CN115223232A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117095821A (en) * 2023-10-20 2023-11-21 京东方艺云(杭州)科技有限公司 Myopia risk level prediction method and device, electronic equipment and medium
CN117095821B (en) * 2023-10-20 2024-02-20 京东方艺云(杭州)科技有限公司 Myopia risk level prediction method and device, electronic equipment and medium

Similar Documents

Publication Publication Date Title
CN110623629B (en) Visual attention detection method and system based on eyeball motion
CA3085436A1 (en) Digital visual acuity eye examination for remote physician assessment
CN110428908B (en) Eyelid motion function evaluation system based on artificial intelligence
CN112700858B (en) Early warning method and device for myopia of children and teenagers
KR102320580B1 (en) Myopia prediction method and system using deep learning
CN105224285A (en) Eyes open and-shut mode pick-up unit and method
CN113768461B (en) Fundus image analysis method, fundus image analysis system and electronic equipment
CN114648354A (en) Advertisement evaluation method and system based on eye movement tracking and emotional state
CN116028870B (en) Data detection method and device, electronic equipment and storage medium
CN113870239A (en) Vision detection method and device, electronic equipment and storage medium
CN112614583A (en) Depression grade testing system
CN111179258A (en) Artificial intelligence method and system for identifying retinal hemorrhage image
CN115223232A (en) Eye health comprehensive management system
Ogundokun et al. Diagnosis of long sightedness using neural network and decision tree algorithms
CN111738234B (en) Automatic co-situation ability identification method based on individual eye movement characteristics
CN117338234A (en) Diopter and vision joint detection method
CN111860437A (en) Method and device for judging fatigue degree based on facial expression
CN115661101A (en) Premature infant retinopathy detection system based on random sampling and deep learning
CN111513671A (en) Glasses comfort evaluation method based on eye image
CN114468977B (en) Ophthalmologic vision examination data collection and analysis method, system and computer storage medium
CN115526888A (en) Eye pattern data identification method and device, storage medium and electronic equipment
CN113273959B (en) Portable diabetic retinopathy diagnosis and treatment instrument
CN115240849A (en) Method and system for recording and reminding use characteristics and visual fatigue of electronic product
EP4258205A1 (en) Quality control method and quality control system for data annotation on fundus image
CN115100560A (en) Method, device and equipment for monitoring bad state of user and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination