CN110251070B - Eye health condition monitoring method and system - Google Patents


Publication number: CN110251070B (application CN201910511555A)
Authority: CN (China)
Prior art keywords: eye, scene, user, desktop, included angle
Prior art date
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Application number: CN201910511555.5A
Other languages: Chinese (zh)
Other versions: CN110251070A
Inventors: Su Yi, Li Junying, Pan Jize
Current Assignee: Li Junying, Pan Jize, Su Yi (the listed assignees may be inaccurate)
Original Assignee: Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to CN201910511555A
Publication of CN110251070A
Application granted
Publication of CN110251070B
Current legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/10: Terrestrial scenes
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/168: Feature extraction; Face representation
    • G06V 40/171: Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 21/00: Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B 21/18: Status alarms
    • G08B 21/24: Reminder alarms, e.g. anti-loss alarms

Abstract

The invention discloses a method and a system for monitoring eye health. Required scene information and face information are extracted in real time from captured images of the user's head and of the eye-use scene; the user's eye-use scene is first determined according to a preset scene judgment rule, and the user's viewing distance or backlit-reading condition is then judged according to the identified scene and a preset eye-use judgment rule.

Description

Eye health condition monitoring method and system
Technical Field
The invention belongs to the field of eye health monitoring, and particularly relates to a method and a system for monitoring eye health conditions.
Background
In recent years, the myopia rate among students has risen steadily, which is closely related to their everyday study habits, posture and viewing distance. Eye-use habits such as posture strongly affect eye health, so to improve the situation, abnormal eye use should be detected and corrected at an early stage. Many eye-protection devices, such as eye-protection desk lamps, are already on the market to detect viewing distance; they mostly apply the following technologies:
1. A distance sensor is built into the device. In use, the user is alerted when the sensor detects an object closer than a preset distance. However, this method only measures the distance from an object to the device, so the user's actual viewing distance cannot be monitored accurately. For example, while the user is reading a book, the sensor may detect that the book is too close to the desk lamp and raise a false alarm.
2. An external gravity sensor is worn on the user's body. It can detect that the user is leaning forward or sitting crookedly, and feeds back through vibration or a wireless link when an abnormal state is detected; worn on the head, it can also detect head tilt. However, it cannot detect viewing distance: when the user is reading a book, the proximity of the book to the eyes is undetectable. Having to wear and charge the sensor is also inconvenient for the user.
3. A camera is built into the device. While the user works under the desk lamp, the camera performs face recognition, locates the user's eyes, and calculates the distance from the eyes to the desktop; when the distance falls below a set value, a reminder is issued. However, such technology currently only measures the distance from the eyes to a predetermined position and can neither identify the object on the desktop nor judge the user's behaviour, so the reminder mechanism is rigid, the false-alarm rate is high, and good monitoring and correction cannot be achieved.
Therefore, prior-art eye health monitoring generally suffers from a rigid monitoring mechanism, a high false-alarm rate and a low degree of intelligence.
Disclosure of Invention
To overcome these technical defects, the invention provides an eye health monitoring method and system that break through the high false-alarm rate of previous single-sensor eye monitoring, greatly improve monitoring accuracy, and achieve a good monitoring effect.
In order to solve the problems, the invention is realized according to the following technical scheme:
a method of eye health monitoring, comprising the steps of:
acquiring a human head image and an eye scene image;
extracting characteristic points of the eye scene image, and calculating to obtain scene information; the scene information comprises a desktop plane, an object outline and an included angle between the object and the desktop; the object is a paper reading material or an electronic reading device;
extracting characteristic points of the human head image, and calculating to obtain face information; the face information comprises face organ coordinates and face orientation angles;
judging the eye using scene of the user according to the scene information and the face information by using a preset scene judgment rule;
and calculating the eye using distance or backlight reading condition of the user according to the scene information, the face information and the eye using scene by using a preset eye using judgment rule, and judging the eye using health condition of the user.
Compared with the prior art, the method has the beneficial effects that:
the method can be used for monitoring the eye health condition of the user, acquiring required scene information and face information from the acquired human head image and eye scene image in real time, judging the eye scene of the user in advance according to a preset scene judgment rule, and judging the eye distance or backlight reading condition of the user according to different eye scenes and the eye judgment rule.
Further, the scene information is calculated by the following steps:
extracting desktop feature points and object feature points in the eye scene image;
fitting a desktop plane according to the desktop feature points;
identifying an object outline according to the object feature points;
and calculating the included angle between the object and the desktop according to the desktop plane and the object outline.
Through the steps, the desktop plane and the objects on the desk can be identified according to the characteristic points in the real-time image, the included angle between the objects and the desktop is calculated, and the eye health condition of the user can be accurately judged and monitored according to the information.
Further, the face information is obtained by calculation through the following steps:
acquiring characteristic points in the human head image;
determining the coordinates of each organ of the face according to the feature points;
and obtaining the orientation angle of the human face in the three-dimensional space according to the coordinates of each organ by using a fitting algorithm.
Through the steps, the coordinates of each organ in the human face can be obtained and the orientation angle information of the human face can be fitted according to each characteristic point of the real-time image, so that the eye health condition of the user can be accurately judged and monitored in the follow-up process according to the information.
Further, the scene determination rule is:
calculating the sight direction of the user according to the face orientation information and pupil coordinates in face organ coordinates;
judging whether the object outline is identified:
if the object outline is identified, judging whether the object outline is in the sight line direction, and if so, judging that the eye using scene of the user is a reading scene;
if no object outline is identified, it is judged whether the desktop plane lies in the sight-line direction; if so, the user's eye-use scene is judged to be a writing scene.
By the scene judgment rule, the real-time eye use scene of the user can be accurately judged, and different eye use posture information of the user can be conveniently monitored according to the eye use scene in the follow-up process.
Further, the eye use judgment rule is as follows:
when the eye scene is a writing scene:
calculating the writing distance from the eyes to the desktop according to the eye coordinates and the desktop plane; when the writing distance is smaller than a preset writing distance threshold value, judging that the eye use health condition of the user is abnormal;
and/or calculating the head's tilt direction from the coordinates of the nose tip and the eyes, and calculating the angle between that direction and the Z axis of a preset camera coordinate system; when the angle is smaller than a preset tilt threshold, the user's eye health condition is judged to be an eye-use abnormality;
when the eye scene is a reading scene:
calculating the reading distance from the eyes to the object according to the eye coordinates and the object outline, and judging the eye use health condition of the user as abnormal eye use when the reading distance is smaller than a preset reading distance threshold;
and/or calculating a backlight angle from the angle between the object and the desktop and the preset angle between the light and the desktop; the backlight angle is the angle swept when rotating from the light ray (as the initial side) to the object; when the backlight angle is smaller than a preset backlight-angle threshold, the user's eye health condition is judged to be a backlit-reading abnormality.
By the eye use judgment rule, the eye use distance or the backlight reading condition of the user is judged according to different eye use scenes, and the eye use health condition of the user can be comprehensively and timely monitored according to different eye use scenes of the user.
The invention also correspondingly discloses an eye health condition monitoring system, which comprises:
the image acquisition device is used for acquiring a human head image and an eye scene image;
the image processing device is used for extracting the characteristic points of the human head image and the eye scene image and calculating to obtain scene information and face information; the scene information comprises a desktop plane, an object outline and an included angle between the object and the desktop; the object is a paper reading material or an electronic reading device; the face information comprises face organ coordinates and face orientation angles;
the scene judgment device is used for judging the eye use scene of the user according to the scene information and the face information by using a preset scene judgment rule;
and the eye use judgment device is used for calculating the eye use distance or the backlight reading condition of the user according to the scene information, the face information and the eye use scene by using a preset eye use judgment rule, and judging the eye use health condition of the user.
Compared with the prior art, the system has the advantages that:
the system monitors the eye health condition of the user, can acquire required scene information and face information in real time from the acquired human head image and eye scene image, judges the eye scene of the user in advance according to a preset scene judgment rule, and judges the eye distance or backlight reading condition of the user according to different eye scenes and the eye judgment rule.
Further, the image processing apparatus calculates the scene information by:
extracting desktop feature points and object feature points in the eye scene image;
fitting a desktop plane according to the desktop feature points;
identifying an object outline according to the object feature points;
and calculating the included angle between the object and the desktop according to the desktop plane and the object outline.
Through the steps, the image processing device can identify the desktop plane and the objects on the desk according to the characteristic points in the real-time image, and calculate the included angle between the objects and the desktop, so that the eye health condition of the user can be accurately judged and monitored subsequently according to the information.
Further, the image processing apparatus calculates the face information by:
acquiring characteristic points in the human head image;
determining the coordinates of each organ of the face according to the feature points;
and obtaining the orientation angle of the human face in the three-dimensional space according to the coordinates of each organ by using a fitting algorithm.
Through the steps, the image processing device can acquire the coordinates of each organ in the human face and fit the orientation angle information of the human face according to each characteristic point of the real-time image, so that the eye health condition of the user can be judged and monitored more accurately according to the information.
Further, the scene judging means judges the eye scene by:
calculating the sight direction of the user according to the face orientation information and pupil coordinates in face organ coordinates;
judging whether the object outline is identified:
if the object outline is identified, judging whether the object outline is in the sight line direction, and if so, judging that the eye using scene of the user is a reading scene;
if no object outline is identified, it is judged whether the desktop plane lies in the sight-line direction; if so, the user's eye-use scene is judged to be a writing scene.
By the scene judging device, the real-time eye use scene of the user can be accurately judged, and different eye use posture information of the user can be conveniently monitored according to the eye use scene in the follow-up process.
Further, the eye use judgment means judges the eye use health condition by:
when the eye scene is a writing scene, calculating the writing distance from the eyes to the desktop according to the eye coordinates and the desktop plane; when the writing distance is smaller than a preset writing distance threshold value, judging that the eye use health condition of the user is abnormal;
when the eye scene is a reading scene:
calculating the reading distance from the eyes to the object according to the eye coordinates and the object outline, and judging the eye use health condition of the user as abnormal eye use when the reading distance is smaller than a preset reading distance threshold;
and/or calculating a backlight angle from the angle between the object and the desktop and the preset angle between the light and the desktop; the backlight angle is the angle swept when rotating from the light ray (as the initial side) to the object; when the backlight angle is smaller than a preset backlight-angle threshold, the user's eye health condition is judged to be a backlit-reading abnormality.
By the eye use judgment device, the eye use distance or the backlight reading condition of the user is judged according to different eye use scenes, and the eye use health condition of the user can be comprehensively and timely monitored according to different eye use scenes of the user.
Drawings
FIG. 1 is a schematic view of the steps of the eye health monitoring method according to embodiment 1 of the present invention;
FIG. 2 is a schematic view of an illumination scene of the light and the object according to embodiment 1 of the present invention;
FIG. 3 is a schematic view of another illumination scene of the light and the object according to embodiment 1 of the present invention;
FIG. 4 is a block diagram of an eye health monitoring system according to embodiment 1 of the present invention;
FIG. 5 is a schematic structural diagram of the eye health monitoring desk lamp in embodiment 1 of the present invention.
Detailed Description
The preferred embodiments of the present invention will be described in conjunction with the accompanying drawings, and it will be understood that they are described herein for the purpose of illustration and explanation and not limitation.
Example 1
As shown in fig. 1, this embodiment 1 discloses an eye health monitoring method, which includes the following steps:
s1, acquiring a human head image and an eye scene image;
specifically, three-dimensional imaging can be performed by multiple cameras to form a three-dimensional image that contains both the human head and the desktop objects; the cameras can be mounted at the front of lighting equipment such as a desk lamp to capture real-time images of the user;
s2, extracting the feature points of the eye scene image, and calculating to obtain scene information; the scene information comprises a desktop plane, an object outline and an included angle between the object and the desktop; the object is a paper reading material or an electronic reading device; specifically, the object may be a book or a readable electronic display device such as a tablet computer or a mobile phone; specifically, step S2 includes:
S21, extracting desktop feature points and object feature points from the eye-use-scene image; specifically, feature points of the desktop and edge feature points of an object such as a book or an electronic device can be extracted by feature-point detection: a corner detection algorithm extracts the desktop feature points, and a combination of corner detection and edge detection extracts candidate feature points of the object;
s22, fitting a desktop plane according to the desktop feature points; specifically, a plane fitting algorithm is used for fitting and eliminating noise of the desktop feature points to obtain a desktop plane equation;
s23, identifying the object outline according to the object characteristic points; specifically, an object recognition classifier, namely a small neural network classifier, is used for recognizing candidate feature points of the object and determining whether the object is a book or an electronic device;
and S24, calculating the included angle between the object and the desktop according to the desktop plane and the object outline.
Through the steps, the desktop plane and the objects on the desk can be identified according to the characteristic points in the real-time image, the included angle between the objects and the desktop is calculated, and the eye health condition of the user can be accurately judged and monitored according to the information.
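The plane fitting of S22 and the angle calculation of S24 reduce to standard least-squares geometry. The sketch below assumes 3-D feature-point coordinates from the stereo cameras; the patent does not name a specific fitting algorithm, so a centroid-plus-SVD fit is used as one common choice.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane n.x + d = 0 through 3-D points (centroid + SVD)."""
    pts = np.asarray(points, float)
    centroid = pts.mean(axis=0)
    # The singular vector for the smallest singular value of the centred
    # cloud is perpendicular to the best-fit plane, i.e. its normal.
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    return normal, -normal @ centroid

def plane_angle_deg(n1, n2):
    """Dihedral angle (degrees) between two planes, from their normals."""
    c = abs(np.dot(n1, n2)) / (np.linalg.norm(n1) * np.linalg.norm(n2))
    return float(np.degrees(np.arccos(np.clip(c, -1.0, 1.0))))
```

For example, fitting a flat desktop and a book propped at 60 degrees and passing both normals to `plane_angle_deg` recovers the 60-degree object-desktop angle used by the later judgment rules.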
S3, extracting characteristic points of the human head image, and calculating to obtain human face information; the face information comprises face organ coordinates and face orientation angles; specifically, step S3 includes:
S31, obtaining feature points from the human head image; specifically, a neural network extracts multiple feature points of the head and face from the three-dimensional image: LAB and SURF features are extracted from the image, and an MLP (multi-layer perceptron) neural network then locates edge feature points of the head and face;
S32, determining the coordinates of each facial organ from the feature points; specifically, three-dimensional coordinates of feature points of organs including the eyes, eyebrows, nose and lips are determined. If the head image includes an eye image, the eye coordinates are located directly; if the eyes are occluded or cannot be captured directly because of the viewing angle, their position coordinates are estimated from the other reliably recognised organ feature points combined with a standard face model, achieving the same localisation effect.
S33, obtaining the orientation angle of the face in the three-dimensional space according to the coordinates of each organ by using a fitting algorithm; specifically, a fitting algorithm is adopted to filter errors, and the main orientation of the face in the three-dimensional space is obtained.
Through the steps, the coordinates of each organ in the human face can be obtained and the orientation angle information of the human face can be fitted according to each characteristic point of the real-time image, so that the eye health condition of the user can be accurately judged and monitored in the follow-up process according to the information.
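As one minimal illustration of fitting an orientation from organ coordinates, the facing direction can be approximated as the normal of the plane through the two eyes and the nose tip. The patent's actual fitting algorithm is not specified; the three-landmark construction and the sign convention below are assumptions.

```python
import numpy as np

def face_orientation(left_eye, right_eye, nose_tip):
    """Rough facing direction from three 3-D landmarks: the unit normal of
    the plane through both eyes and the nose tip. The sign of the result
    depends on the chosen coordinate handedness (an assumption here)."""
    l, r, n = (np.asarray(p, float) for p in (left_eye, right_eye, nose_tip))
    eye_axis = r - l                   # left eye -> right eye
    down = n - (l + r) / 2.0           # eye midpoint -> nose tip
    normal = np.cross(eye_axis, down)  # perpendicular to the face plane
    return normal / np.linalg.norm(normal)
```

A fuller implementation would fit over many landmarks to filter errors, as the text describes, but the geometric idea is the same.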
S4, judging the eye use scene of the user according to the scene information and the face information by using a preset scene judgment rule; specifically, the scene judgment rule is as follows:
calculating the sight direction of the user according to the face orientation information and the pupil coordinates in the face organ coordinates; specifically, coordinate correction is carried out on the face orientation angle by using pupil coordinates to obtain the sight line direction;
judging whether an object outline is identified:
if the object outline is identified, judging whether the object outline is in the sight line direction, and if so, judging that the eye using scene of the user is a reading scene;
if the object outline is not identified, judging whether the desktop plane is in the sight line direction, and if so, judging that the eye using scene of the user is a writing scene.
By the scene judgment rule, the real-time eye use scene of the user can be accurately judged, and different eye use posture information of the user can be conveniently monitored according to the eye use scene in the follow-up process.
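The scene judgment rule above maps directly onto a short decision function. The gaze-intersection geometry is left abstract as a callable, and the names and return values are illustrative, not from the patent.

```python
def judge_eye_scene(object_outline, gaze_hits):
    """Scene decision per the rule above.

    object_outline: the recognised outline, or None if no object was found.
    gaze_hits(target): True if the user's gaze ray intersects `target`
    (the ray-intersection geometry is abstracted away here).
    Returns "reading", "writing", or None when no scene rule matches.
    """
    if object_outline is not None:
        return "reading" if gaze_hits(object_outline) else None
    return "writing" if gaze_hits("desktop") else None
```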
S5, calculating the eye distance or the backlight reading condition of the user according to the scene information, the face information, and the eye scene using a preset eye judgment rule, and judging the eye health condition of the user, specifically, the eye judgment rule is:
when the eye scene is a writing scene:
calculating the writing distance from the eyes to the desktop according to the eye coordinates and the desktop plane; when the writing distance is smaller than a preset writing distance threshold value, judging that the eye use health condition of the user is abnormal;
and/or calculating the head's tilt direction from the coordinates of the nose tip and the eyes, and calculating the angle between that direction and the Z axis of a preset camera coordinate system; when the angle is smaller than a preset tilt threshold, i.e. the user's head is tilted while reading, the user's eye health condition is judged to be an eye-use abnormality;
when the eye scene is a reading scene:
calculating the reading distance from the eyes to the object according to the eye coordinates and the object outline, and judging the user's eye health condition as an eye-use abnormality when the reading distance is smaller than a preset reading-distance threshold;
and/or calculating a backlight angle from the angle between the object and the desktop and the preset angle between the light and the desktop; the backlight angle is the angle swept when rotating from the light ray (as the initial side) to the object; when the backlight angle is smaller than a preset backlight-angle threshold, the user's eye health condition is judged to be a backlit-reading abnormality.
Specifically, the preset angle between the light and the desktop can be determined from known lighting in the environment. For example, when the image acquisition device is a camera built into the desk-lamp device, the relative position of the camera and the light source can be obtained by calibration, i.e. the lamp's illumination is known information; simple geometry then tells whether the light falls on the front or the back of the object. If the light falls on the back, the backlight angle is negative and the case is judged as backlit, as shown in fig. 2; if the light falls on the front but the angle A between the light and the object is less than 30 degrees, the case is also judged as backlit, as shown in fig. 3;
specifically, the angle between the light and the desktop can also be measured with a sensor mounted at a fixed position, which detects the actual illumination angle. If the position of the light source is unknown, or no point light source is used, the light is assumed to be diffuse and vertically downward; the backlight angle is then computed directly from the angle between the object and the desktop, and the case is judged as backlit when that angle exceeds 45 degrees.
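The two backlight rules above (the 30-degree front-light rule for a known source, and the 45-degree fallback for an assumed vertical source) can be collapsed into one decision function. This is a sketch: the negative-angle case is reduced to a boolean "light hits the back" flag, which is an assumption about how the geometry would be reported.

```python
def is_backlit(light_to_object_deg=None, falls_on_back=False,
               object_desktop_deg=None):
    """Backlight decision using the 30-degree / 45-degree thresholds from
    the text. With a known light source, pass the light/object angle and
    which face the light hits; with an unknown or diffuse source, pass only
    the object/desktop angle (light assumed vertically downward)."""
    if light_to_object_deg is not None:    # light source position is known
        if falls_on_back:
            return True                    # backlight angle would be negative
        return light_to_object_deg < 30.0  # grazing light on the front face
    return object_desktop_deg > 45.0       # vertical-light fallback rule
```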
By the eye use judgment rule, the eye use distance or the backlight reading condition of the user is judged according to different eye use scenes, and the eye use health condition of the user can be comprehensively and timely monitored according to different eye use scenes of the user.
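The writing-distance and head-tilt checks in the rule above reduce to a point-to-plane distance and an angle against the camera Z axis. A minimal numpy sketch, assuming metric 3-D coordinates in the camera frame (the patent does not fix the coordinate conventions):

```python
import numpy as np

def point_plane_distance(point, normal, d):
    """Distance from a 3-D point to the plane n.x + d = 0
    (e.g. eye coordinates vs. the fitted desktop plane)."""
    n = np.asarray(normal, float)
    return abs(n @ np.asarray(point, float) + d) / np.linalg.norm(n)

def head_tilt_deg(nose_tip, eye_midpoint):
    """Angle between the head axis (eye midpoint -> nose tip) and the
    camera Z axis. An upright head gives an angle near 90 degrees; a head
    pitched toward the desk drops below the tilt threshold."""
    axis = np.asarray(nose_tip, float) - np.asarray(eye_midpoint, float)
    cos = abs(axis @ np.array([0.0, 0.0, 1.0])) / np.linalg.norm(axis)
    return float(np.degrees(np.arccos(np.clip(cos, 0.0, 1.0))))
```

Comparing these values against the preset writing-distance and tilt thresholds yields the abnormality flags described above.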
And S6, generating an alarm signal to trigger an alarm when the user's eye use is judged abnormal. Specifically, the alarm may be an audible device such as a buzzer or speaker, a physical device such as a sprayer, or a silent alarm in the form of a lit indicator LED lamp.
The eye health monitoring method disclosed in this embodiment extracts required scene information and face information in real time from the captured head image and eye-use-scene image, first determines the user's eye-use scene by the preset scene judgment rule, and then judges the viewing distance or backlit-reading condition according to the identified scene and the eye-use judgment rule.
Example 2
As shown in fig. 4, the present embodiment discloses an eye health monitoring system corresponding to the eye health monitoring method in embodiment 1, including:
the image acquisition device 1 is used for acquiring a human head image and an eye scene image;
the image processing device 2 is used for extracting the characteristic points of the human head image and the eye scene image and calculating to obtain scene information and face information; the scene information comprises a desktop plane, an object outline and an included angle between the object and the desktop; the object is a paper reading material or an electronic reading device; the face information comprises face organ coordinates and face orientation angles;
specifically, the image processing apparatus 2 is provided with an execution code for executing the following steps to calculate scene information:
extracting desktop feature points and object feature points in the eye scene image;
fitting a desktop plane according to the desktop feature points;
identifying the object outline according to the object characteristic points;
and calculating the included angle between the object and the desktop according to the desktop plane and the object outline.
Through the steps, the image processing device can identify the desktop plane and the objects on the desk according to the characteristic points in the real-time image, and calculate the included angle between the objects and the desktop, so that the eye health condition of the user can be accurately judged and monitored subsequently according to the information.
Specifically, the image processing apparatus 2 is provided with an execution code therein, which is used for executing the following steps to calculate the face information:
acquiring characteristic points in a human head image;
determining the coordinates of each organ of the face according to the feature points;
and obtaining the orientation angle of the human face in the three-dimensional space according to the coordinates of each organ by using a fitting algorithm.
The scene judgment device 3 is used for judging the eye use scene of the user according to the scene information and the face information by using a preset scene judgment rule;
specifically, the scene determination device 3 is provided with an execution code therein, and is configured to perform the following steps to determine the eye scene:
calculating the sight direction of the user according to the face orientation information and the pupil coordinates in the face organ coordinates;
judging whether an object outline is identified:
if the object outline is identified, judging whether the object outline is in the sight line direction, and if so, judging that the eye using scene of the user is a reading scene;
if the object outline is not identified, judging whether the desktop plane is in the sight line direction, and if so, judging that the eye using scene of the user is a writing scene.
The eye-use judgment device 4 is used for calculating the user's eye-use distance or backlight reading condition from the scene information, the face information, and the eye-use scene using a preset eye-use judgment rule, and judging the user's eye-use health condition.
Specifically, the eye-use judgment device 4 contains executable code for performing the following steps:
when the eye-use scene is a writing scene:
calculating the writing distance from the eyes to the desktop from the eye coordinates and the desktop plane; when the writing distance is smaller than a preset writing distance threshold, the user's eye-use health condition is judged to be abnormal;
and/or calculating the head tilt from the nose tip and eye coordinates of the face, and calculating the included angle between the head tilt direction and the Z axis of a preset camera coordinate system; when this included angle is smaller than a preset tilt angle threshold, the user's eye-use health condition is judged to be abnormal;
when the eye-use scene is a reading scene:
calculating the reading distance from the eyes to the object from the eye coordinates and the object outline; when the reading distance is smaller than a preset reading distance threshold, the user's eye-use health condition is judged to be abnormal;
and/or calculating a backlight angle from the included angle between the object and the desktop and the preset included angle between the light and the desktop; the backlight angle is the angle swept when rotating from the light ray, taken as the initial side, to the object; when the backlight angle is smaller than a preset backlight angle threshold, the user's eye-use health condition is judged to be a backlight abnormality.
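The distance and angle tests above can be sketched as follows. Thresholds (30 cm, 25 degrees) are illustrative assumptions, not values from the patent, and the backlight angle is approximated as the difference between the object-desktop and light-desktop angles, which matches the description only up to the unstated sign convention.

```python
import math

def point_plane_distance(point, normal, d):
    """Distance from a 3-D point to the plane n . x = d (n need not be unit length)."""
    n_norm = math.sqrt(sum(c * c for c in normal))
    return abs(sum(p * c for p, c in zip(point, normal)) - d) / n_norm

def writing_abnormal(eye_point, desk_normal, desk_d, min_dist_cm=30.0):
    """Writing-scene rule: abnormal when the eye-to-desktop distance
    drops below the preset writing distance threshold."""
    return point_plane_distance(eye_point, desk_normal, desk_d) < min_dist_cm

def backlight_abnormal(object_desk_deg, light_desk_deg, min_backlight_deg=25.0):
    """Reading-scene rule: the backlight angle (rotation from the light ray,
    as initial side, to the object) is approximated as the angle difference;
    abnormal when it falls below the preset threshold."""
    return (object_desk_deg - light_desk_deg) < min_backlight_deg
```

For instance, with the desktop as the plane z = 0 (in centimeters), an eye at height 25 cm triggers the writing-distance alarm while one at 40 cm does not.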
Specifically, the eye-use judgment device 4 generates an alarm signal when it judges the user's eye-use health condition to be abnormal.
Specifically, the eye health condition monitoring system disclosed in this embodiment further includes an alarm device, which alarms automatically upon receiving the alarm signal. The alarm device may be an audible device such as a buzzer or a speaker, or a physical device such as a sprayer, or an indicator LED lamp configured to give a silent prompt by lighting up.
The eye health monitoring system disclosed in embodiment 2 corresponds to the eye health monitoring method disclosed in embodiment 1; its specific technical details and technical effects are similar and are not repeated here.
Example 3
As shown in fig. 5, this embodiment corresponds to the technical solutions disclosed in embodiments 1 and 2 and discloses an eye health condition monitoring desk lamp, comprising a desk lamp base 1, a desk lamp support 2, an adjustable supporting component 3, and an illuminating component 4. Specifically, four cameras 5 are arranged on the support 2 in the vertical direction; the cameras 5 acquire real-time images of the user and perform three-dimensional imaging to form a three-dimensional image, which includes an image of the human head and an image of the desktop objects.
Specifically, a processor connected to the cameras is arranged in the desk lamp, and the image processing device, scene judgment device, and eye-use judgment device described in embodiment 2 are implemented in the processor in code form, so that the method disclosed in embodiment 1 can be executed: the images acquired by the cameras are processed in real time and the user's eye-use condition is judged. The desk lamp is further provided with a loudspeaker; when the user's eye-use condition is judged to be abnormal, the desk lamp sounds an alarm to remind the user and correct bad eye-use habits in time.
The above embodiments are only preferred embodiments of the present invention and do not limit it in any way; any modification, equivalent change, or refinement of the above embodiments made according to the technical essence of the present invention falls within the scope of the present invention.

Claims (6)

1. A method of monitoring eye health, comprising the steps of:
acquiring a human head image and an eye scene image;
extracting feature points of the eye scene image, and calculating to obtain scene information; the scene information comprises a desktop plane, an object outline and an included angle between the object and the desktop; the object is a paper reading material or an electronic reading device;
extracting feature points of the human head image, and calculating to obtain face information; the face information comprises face organ coordinates and face orientation angles;
judging the eye use scene of the user according to the scene information and the face information by using a preset scene judgment rule, wherein the scene judgment rule is as follows:
calculating the sight direction of the user according to the face orientation information and pupil coordinates in face organ coordinates;
judging whether the object outline is identified:
if the object outline is identified, judging whether the object outline is in the sight line direction, and if so, judging that the eye using scene of the user is a reading scene;
if the object outline is not identified, judging whether the desktop plane is in the sight line direction, if so, judging that the eye using scene of the user is a writing scene;
calculating the eye using distance or backlight reading condition of the user according to the scene information, the face information and the eye using scene by using a preset eye using judgment rule, and judging the eye using health condition of the user, wherein the eye using judgment rule is as follows:
when the eye scene is a writing scene:
calculating the writing distance from the eyes to the desktop according to the eye coordinates and the desktop plane; when the writing distance is smaller than a preset writing distance threshold value, judging that the eye use health condition of the user is abnormal;
and/or calculating the inclination angle of the head by using the coordinates of the nose tip and the eyes of the human face, and calculating the included angle between the inclination angle of the head and the Z axis of a preset camera coordinate system; when the included angle is smaller than a preset inclined included angle threshold value, judging that the eye use health condition of the user is abnormal;
when the eye scene is a reading scene:
calculating the reading distance from the eyes to the object according to the eye coordinates and the object outline, and judging the eye use health condition of the user as abnormal eye use when the reading distance is smaller than a preset reading distance threshold;
and/or calculating a backlight included angle according to the included angle between the object and the desktop and the included angle between the preset light and the desktop; the backlight included angle is the angle swept when rotating from the light ray, taken as the initial side, to the object; and when the backlight included angle is smaller than a preset backlight included angle threshold value, judging that the eye use health condition of the user is a backlight abnormality.
2. The eye health monitoring method of claim 1, wherein the context information is calculated by:
extracting desktop feature points and object feature points in the eye scene image;
fitting a desktop plane according to the desktop feature points;
identifying an object outline according to the object feature points;
and calculating the included angle between the object and the desktop according to the desktop plane and the object outline.
3. The eye health monitoring method of claim 1, wherein the face information is calculated by:
acquiring feature points in the human head image;
determining the coordinates of each organ of the face according to the feature points;
and obtaining the orientation angle of the human face in the three-dimensional space according to the coordinates of each organ by using a fitting algorithm.
4. An eye health monitoring system, comprising:
the image acquisition device is used for acquiring a human head image and an eye scene image;
the image processing device is used for extracting the feature points of the human head image and the eye scene image and calculating to obtain scene information and face information; the scene information comprises a desktop plane, an object outline and an included angle between the object and the desktop; the object is a paper reading material or an electronic reading device; the face information comprises face organ coordinates and face orientation angles;
the scene judgment device is used for judging the eye use scene of the user according to the scene information and the face information by using a preset scene judgment rule;
the scene judging device judges the eye scene through the following steps:
calculating the sight direction of the user according to the face orientation information and pupil coordinates in face organ coordinates;
judging whether the object outline is identified:
if the object outline is identified, judging whether the object outline is in the sight line direction, and if so, judging that the eye using scene of the user is a reading scene;
if the object outline is not identified, judging whether the desktop plane is in the sight line direction, if so, judging that the eye using scene of the user is a writing scene;
the eye use judgment device judges the eye use health condition by the following steps:
when the eye scene is a writing scene:
calculating the writing distance from the eyes to the desktop according to the eye coordinates and the desktop plane; when the writing distance is smaller than a preset writing distance threshold value, judging that the eye use health condition of the user is abnormal;
and/or calculating the inclination angle of the head by using the coordinates of the nose tip and the eyes of the human face, and calculating the included angle between the inclination angle of the head and the Z axis of a preset camera coordinate system; when the included angle is smaller than a preset inclined included angle threshold value, judging that the eye use health condition of the user is abnormal;
when the eye scene is a reading scene:
calculating the reading distance from the eyes to the object according to the eye coordinates and the object outline, and judging the eye use health condition of the user as abnormal eye use when the reading distance is smaller than a preset reading distance threshold;
and/or calculating a backlight included angle according to the included angle between the object and the desktop and the included angle between the preset light and the desktop; the backlight included angle is the angle swept when rotating from the light ray, taken as the initial side, to the object; when the backlight included angle is smaller than a preset backlight included angle threshold value, judging that the eye use health condition of the user is a backlight abnormality;
and the eye use judgment device is used for calculating the eye use distance or the backlight reading condition of the user according to the scene information, the face information and the eye use scene by using a preset eye use judgment rule, and judging the eye use health condition of the user.
5. The eye health monitoring system of claim 4, wherein the image processing device calculates the scene information by:
extracting desktop feature points and object feature points in the eye scene image;
fitting a desktop plane according to the desktop feature points;
identifying an object outline according to the object feature points;
and calculating the included angle between the object and the desktop according to the desktop plane and the object outline.
6. The eye health monitoring system of claim 4, wherein the image processing device calculates the face information by:
acquiring feature points in the human head image;
determining the coordinates of each organ of the face according to the feature points;
and obtaining the orientation angle of the human face in the three-dimensional space according to the coordinates of each organ by using a fitting algorithm.
CN201910511555.5A 2019-06-13 2019-06-13 Eye health condition monitoring method and system Active CN110251070B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910511555.5A CN110251070B (en) 2019-06-13 2019-06-13 Eye health condition monitoring method and system


Publications (2)

Publication Number Publication Date
CN110251070A CN110251070A (en) 2019-09-20
CN110251070B true CN110251070B (en) 2021-08-03

Family

ID=67918084

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910511555.5A Active CN110251070B (en) 2019-06-13 2019-06-13 Eye health condition monitoring method and system

Country Status (1)

Country Link
CN (1) CN110251070B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111539912B (en) * 2020-03-23 2021-09-28 中国科学院自动化研究所 Health index evaluation method and equipment based on face structure positioning and storage medium
CN111857329B (en) * 2020-05-26 2022-04-26 北京航空航天大学 Method, device and equipment for calculating fixation point
CN114138103A (en) * 2020-09-03 2022-03-04 司马苗 Intelligent eye protection equipment or software
CN116959214B (en) * 2023-07-18 2024-04-02 北京至真互联网技术有限公司 Method and system for reminding user of eye protection through intelligent glasses
CN117334023A (en) * 2023-12-01 2024-01-02 四川省医学科学院·四川省人民医院 Eye behavior monitoring method and system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001283244A (en) * 2000-03-29 2001-10-12 Konami Co Ltd Three-dimensional image compositing device, its method, information storage medium, program distributing device and its method
JP2006081835A (en) * 2004-09-17 2006-03-30 Topcon Corp Ophthalmologic information processing system
CN1774207A (en) * 2003-04-11 2006-05-17 博士伦公司 System and method for acquiring data and aligning and tracking of an eye
CN105354825A (en) * 2015-09-30 2016-02-24 李乔亮 Intelligent device for automatically identifying position of reading material in read-write scene and application of intelligent device
CN108491792A (en) * 2018-03-21 2018-09-04 安徽大学 Office scene human-computer interaction Activity recognition method based on electro-ocular signal
CN108874122A (en) * 2018-04-28 2018-11-23 深圳市赛亿科技开发有限公司 Intelligent glasses and its control method, computer readable storage medium
CN109587875A (en) * 2018-11-16 2019-04-05 厦门盈趣科技股份有限公司 A kind of intelligent desk lamp and its adjusting method

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7963652B2 (en) * 2003-11-14 2011-06-21 Queen's University At Kingston Method and apparatus for calibration-free eye tracking
CN101285556A (en) * 2007-04-10 2008-10-15 上海外国语大学附属大境中学 Electronic intelligent table lamp for protecting eyes
KR20080099376A (en) * 2007-05-09 2008-11-13 오세영 Hand book supporter
CN101327075A (en) * 2007-06-21 2008-12-24 李庆伟 Posture-correcting eye-protecting instrument
CN100556337C (en) * 2008-07-18 2009-11-04 孙紫薇 All directional reading shelf
CN101642328A (en) * 2009-05-18 2010-02-10 张为敏 Environment-friendly and health-care adjustable desk and chair
CN102217843A (en) * 2010-04-15 2011-10-19 陈建辉 Office table
CN101833663B (en) * 2010-04-21 2012-10-10 北方工业大学 Binocular electronic reader
CN102679262A (en) * 2011-03-12 2012-09-19 沈卫军 Grille lampshade
US9971172B2 (en) * 2014-01-15 2018-05-15 Carl Zeiss Vision International Gmbh Method for determining the far visual point for a spectacle lens and system therefor
CN204695520U (en) * 2015-05-18 2015-10-07 深圳市奇胜隆实业有限公司 A kind of student desk lamp
CN207394480U (en) * 2017-05-27 2018-05-22 欧普照明股份有限公司 Intelligent lamp and the User Status system for prompting based on Intelligent lamp



Similar Documents

Publication Publication Date Title
CN110251070B (en) Eye health condition monitoring method and system
US10878237B2 (en) Systems and methods for performing eye gaze tracking
WO2017152649A1 (en) Method and system for automatically prompting distance from human eyes to screen
JP5109922B2 (en) Driver monitoring device and program for driver monitoring device
CN108665687B (en) Sitting posture monitoring method and device
JP2014515291A5 (en)
NL2029643B1 (en) Ai eye protection desk lamp with sitting posture correction reminder and implementation method thereof
TW201405434A (en) Pupil detection device
CN110148092B (en) Method for analyzing sitting posture and emotional state of teenager based on machine vision
US10402996B2 (en) Distance measuring device for human body features and method thereof
JP2015107155A (en) Measuring device and measurement method
KR20130043366A (en) Gaze tracking apparatus, display apparatus and method therof
CN111488775A (en) Device and method for judging degree of fixation
CN108537103B (en) Living body face detection method and device based on pupil axis measurement
JP2018045386A (en) Line-of-sight measurement device
Mollaret et al. Perceiving user's intention-for-interaction: A probabilistic multimodal data fusion scheme
CN113589296A (en) Human body sitting posture detection device and method
WO2023030109A1 (en) Head-mounted display device and method for adjusting display brightness thereof
Heo et al. Object recognition and selection method by gaze tracking and SURF algorithm
CN112149527A (en) Wearable device detection method and device, electronic device and storage medium
CN115316939A (en) Control method and system of vision screening instrument
JP2023549865A (en) Method and system for measuring binocular distance for children
CN104951081B (en) The method of automatic identification read-write posture and intelligent early-warning device thereof
CN111582003A (en) Sight tracking student classroom myopia prevention system
JP6546329B1 (en) Eye gaze vector detection device, visual function inspection device, visual function training device and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Su Yi

Inventor after: Li Junying

Inventor after: Pan Jize

Inventor before: Su Yi

Inventor before: Li Junying

Inventor before: Pan Jize

Inventor before: Wen Kangwei

TA01 Transfer of patent application right

Effective date of registration: 20210519

Address after: 510000 room 507, Jinshan building, 246 Wushan Road, Tianhe District, Guangzhou City, Guangdong Province

Applicant after: Su Yi

Applicant after: Li Junying

Applicant after: Pan Jize

Address before: 510000 room 507, Jinshan building, 246 Wushan Road, Tianhe District, Guangzhou City, Guangdong Province

Applicant before: Su Yi

Applicant before: Li Junying

Applicant before: Wen Kangwei

Applicant before: Pan Jize

GR01 Patent grant