Disclosure of Invention
To overcome the above technical defects, the invention provides an eye health condition monitoring method and an eye health condition monitoring system, which break through the limitation of the high false-alarm rate of previous single-sensor eye monitoring, greatly improve monitoring accuracy, and achieve a good eye-monitoring effect.
To solve the above problems, the invention adopts the following technical scheme:
a method of eye health monitoring, comprising the steps of:
acquiring a human head image and an eye scene image;
extracting feature points of the eye scene image and calculating scene information, the scene information comprising a desktop plane, an object outline, and the included angle between the object and the desktop, the object being a paper reading material or an electronic reading device;
extracting feature points of the human head image and calculating face information, the face information comprising face organ coordinates and a face orientation angle;
determining the eye-use scene of the user from the scene information and the face information by using a preset scene judgment rule; and
calculating the eye-use distance or backlit-reading condition of the user from the scene information, the face information, and the eye-use scene by using a preset eye-use judgment rule, and judging the eye-use health condition of the user.
Compared with the prior art, the method has the beneficial effects that:
the method monitors the eye-use health condition of the user: it extracts the required scene information and face information in real time from the acquired human head image and eye scene image, first determines the eye-use scene of the user according to the preset scene judgment rule, and then judges the eye-use distance or backlit-reading condition according to that scene and the preset eye-use judgment rule.
Further, the scene information is calculated by the following steps:
extracting desktop feature points and object feature points in the eye scene image;
fitting a desktop plane according to the desktop feature points;
identifying an object outline according to the object feature points;
and calculating the included angle between the object and the desktop according to the desktop plane and the object outline.
Through the above steps, the desktop plane and the objects on the desk can be identified from the feature points in the real-time image and the included angle between the object and the desktop can be calculated, so that the eye-use health condition of the user can subsequently be judged and monitored accurately from this information.
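The geometry of these steps can be sketched in Python. As a minimal illustration, the desktop and the object are each represented by three feature points, each plane normal is obtained by a cross product (a real system would use the least-squares or noise-rejecting fit described above over many points), and the object-desktop angle is the angle between the two normals. All function names are illustrative.

```python
import math

def plane_normal(p0, p1, p2):
    """Unit normal of the plane through three non-collinear 3-D points."""
    u = [p1[i] - p0[i] for i in range(3)]
    v = [p2[i] - p0[i] for i in range(3)]
    n = [u[1]*v[2] - u[2]*v[1],
         u[2]*v[0] - u[0]*v[2],
         u[0]*v[1] - u[1]*v[0]]
    norm = math.sqrt(sum(c * c for c in n))
    return [c / norm for c in n]

def object_desktop_angle(desk_pts, obj_pts):
    """Included angle (degrees) between the object plane and the desktop plane,
    taken from three feature points on each surface."""
    n_desk = plane_normal(*desk_pts)
    n_obj = plane_normal(*obj_pts)
    cos_a = abs(sum(a * b for a, b in zip(n_desk, n_obj)))  # ignore normal sign
    return math.degrees(math.acos(min(1.0, cos_a)))

# Horizontal desktop vs. a book propped up at 45 degrees
desk = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
book = [(0, 0, 0), (1, 0, 0), (0, 1, 1)]  # rises 1 unit over 1 unit of depth
print(round(object_desktop_angle(desk, book)))  # 45
```

In practice the three points per surface would be replaced by the full fitted plane equations, but the angle computation between the two normals is the same.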
Further, the face information is obtained by calculation through the following steps:
acquiring characteristic points in the human head image;
determining the coordinates of each organ of the face according to the feature points;
and obtaining the orientation angle of the human face in the three-dimensional space according to the coordinates of each organ by using a fitting algorithm.
Through the above steps, the coordinates of each facial organ can be obtained from the feature points of the real-time image and the face orientation angle can be fitted, so that the eye-use health condition of the user can subsequently be judged and monitored accurately from this information.
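A minimal sketch of the orientation fit, under the assumption that three organ coordinates (both eyes and the mouth) are enough to span a face plane; the fitting algorithm described above would use many more landmarks and filter out errors. Coordinates are in a camera frame with X right, Y up, and Z pointing from the face toward the camera.

```python
import math

def face_orientation(left_eye, right_eye, mouth):
    """Yaw/pitch (degrees) of the face from three organ coordinates."""
    u = [right_eye[i] - left_eye[i] for i in range(3)]   # across the face
    v = [mouth[i] - left_eye[i] for i in range(3)]       # down the face
    # n = v x u points out of the face, toward the camera for a frontal face
    n = [v[1]*u[2] - v[2]*u[1],
         v[2]*u[0] - v[0]*u[2],
         v[0]*u[1] - v[1]*u[0]]
    yaw = math.degrees(math.atan2(n[0], n[2]))
    pitch = math.degrees(math.atan2(n[1], math.hypot(n[0], n[2])))
    return yaw, pitch

# Frontal face: eyes level, mouth below, looking straight at the camera
yaw, pitch = face_orientation((-30, 0, 0), (30, 0, 0), (0, -50, 0))
print(round(yaw), round(pitch))  # 0 0
```

Rotating the same landmark triple 30 degrees about the vertical axis yields a yaw of roughly 30 degrees, which is the orientation-angle information the later judgment rules consume.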
Further, the scene determination rule is:
calculating the line-of-sight direction of the user from the face orientation angle and the pupil coordinates among the face organ coordinates;
judging whether an object outline has been identified:
if an object outline has been identified, judging whether it lies in the line-of-sight direction, and if so, judging that the eye-use scene of the user is a reading scene;
if no object outline has been identified, judging whether the desktop plane lies in the line-of-sight direction, and if so, judging that the eye-use scene of the user is a writing scene.
With this scene judgment rule, the real-time eye-use scene of the user can be judged accurately, which makes it convenient to subsequently monitor the different eye-use postures of the user according to the scene.
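The rule can be expressed as a small decision function. The sketch below reduces "lies in the line-of-sight direction" to a ray-plane intersection test against the fitted desktop plane, or against a plane fitted to the object outline; this reduction and all names are assumptions, since the text does not fix the intersection test.

```python
def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def ray_hits_plane(origin, direction, plane_point, plane_normal):
    """True if a ray cast forward from `origin` along `direction` meets the plane."""
    denom = _dot(direction, plane_normal)
    if abs(denom) < 1e-9:
        return False                      # gaze parallel to the plane
    t = _dot([p - o for p, o in zip(plane_point, origin)], plane_normal) / denom
    return t > 0                          # intersection lies in front of the eyes

def classify_eye_use_scene(eye_pos, gaze_dir, object_contour, desk_point, desk_normal):
    """Scene-judgment rule from the text. `object_contour` is None when no
    outline was identified; otherwise it carries a plane fitted to the outline."""
    if object_contour is not None:
        if ray_hits_plane(eye_pos, gaze_dir,
                          object_contour["point"], object_contour["normal"]):
            return "reading"
    elif ray_hits_plane(eye_pos, gaze_dir, desk_point, desk_normal):
        return "writing"
    return None                           # scene undetermined

# Eyes 40 cm above the desk (z = 0 plane), gaze angled down, no object outline found
print(classify_eye_use_scene((0, 0, 40), (0, 0.2, -1), None, (0, 0, 0), (0, 0, 1)))
# writing
```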
Further, the eye-use judgment rule is as follows:
when the eye-use scene is a writing scene:
calculating the writing distance from the eyes to the desktop from the eye coordinates and the desktop plane, and judging the eye-use health condition of the user as abnormal when the writing distance is smaller than a preset writing distance threshold;
and/or calculating the head tilt direction from the coordinates of the nose tip and the eyes, calculating the included angle between that direction and the Z axis of a preset camera coordinate system, and judging the eye-use health condition of the user as abnormal when the included angle is smaller than a preset tilt angle threshold;
when the eye-use scene is a reading scene:
calculating the reading distance from the eyes to the object from the eye coordinates and the object outline, and judging the eye-use health condition of the user as abnormal when the reading distance is smaller than a preset reading distance threshold;
and/or calculating a backlight angle from the included angle between the object and the desktop and the preset included angle between the light and the desktop, the backlight angle being the angle swept when rotating from the light ray, taken as the initial side, to the object; and judging the eye-use health condition of the user as abnormal backlit reading when the backlight angle is smaller than a preset backlight angle threshold.
With this eye-use judgment rule, the eye-use distance or backlit-reading condition of the user is judged according to the particular eye-use scene, so that the eye-use health condition of the user can be monitored comprehensively and promptly across different scenes.
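The writing-scene branch of the rule can be sketched as follows. The thresholds are illustrative defaults, not values given by the source, and the head axis is approximated by the vector from the nose tip to the eye midpoint, which is one plausible reading of "the coordinates of the nose tip and the eyes".

```python
import math

def point_plane_distance(p, plane_point, unit_normal):
    """Perpendicular distance from point p to a plane given by a point and a unit normal."""
    return abs(sum((p[i] - plane_point[i]) * unit_normal[i] for i in range(3)))

def check_writing_scene(eye, nose_tip, desk_point, desk_normal,
                        distance_threshold_cm=30.0, tilt_threshold_deg=20.0):
    alerts = []
    # Writing distance: eyes to desktop plane, abnormal when below the threshold
    if point_plane_distance(eye, desk_point, desk_normal) < distance_threshold_cm:
        alerts.append("writing distance too small")
    # Head tilt: angle between the nose-tip-to-eyes vector and the camera Z axis,
    # abnormal when below the threshold, as in the rule above
    head = [eye[i] - nose_tip[i] for i in range(3)]
    cos_a = head[2] / math.sqrt(sum(h * h for h in head))
    if math.degrees(math.acos(max(-1.0, min(1.0, cos_a)))) < tilt_threshold_deg:
        alerts.append("head tilted")
    return alerts

# Eyes only 25 cm above the desk (z = 0 plane), head roughly upright
print(check_writing_scene((0, 0, 25), (0, -5, 24), (0, 0, 0), (0, 0, 1)))
# ['writing distance too small']
```

The reading-scene distance check is structurally identical, with the object outline's fitted plane in place of the desktop plane.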
The invention correspondingly discloses an eye health condition monitoring system, comprising:
an image acquisition device for acquiring a human head image and an eye scene image;
an image processing device for extracting feature points of the human head image and the eye scene image and calculating scene information and face information, the scene information comprising a desktop plane, an object outline, and the included angle between the object and the desktop, the object being a paper reading material or an electronic reading device, and the face information comprising face organ coordinates and a face orientation angle;
a scene judgment device for determining the eye-use scene of the user from the scene information and the face information by using a preset scene judgment rule; and
an eye-use judgment device for calculating the eye-use distance or backlit-reading condition of the user from the scene information, the face information, and the eye-use scene by using a preset eye-use judgment rule, and judging the eye-use health condition of the user.
Compared with the prior art, the system has the advantages that:
the system monitors the eye-use health condition of the user: it extracts the required scene information and face information in real time from the acquired human head image and eye scene image, first determines the eye-use scene of the user according to the preset scene judgment rule, and then judges the eye-use distance or backlit-reading condition according to that scene and the preset eye-use judgment rule.
Further, the image processing device calculates the scene information through the following steps:
extracting desktop feature points and object feature points in the eye scene image;
fitting a desktop plane according to the desktop feature points;
identifying an object outline according to the object feature points;
and calculating the included angle between the object and the desktop according to the desktop plane and the object outline.
Through these steps, the image processing device can identify the desktop plane and the objects on the desk from the feature points in the real-time image and calculate the included angle between the object and the desktop, so that the eye-use health condition of the user can subsequently be judged and monitored accurately from this information.
Further, the image processing device calculates the face information through the following steps:
acquiring characteristic points in the human head image;
determining the coordinates of each organ of the face according to the feature points;
and obtaining the orientation angle of the human face in the three-dimensional space according to the coordinates of each organ by using a fitting algorithm.
Through these steps, the image processing device can obtain the coordinates of each facial organ and fit the face orientation angle from the feature points of the real-time image, so that the eye-use health condition of the user can be judged and monitored more accurately from this information.
Further, the scene judgment device judges the eye-use scene by:
calculating the line-of-sight direction of the user from the face orientation angle and the pupil coordinates among the face organ coordinates;
judging whether an object outline has been identified:
if an object outline has been identified, judging whether it lies in the line-of-sight direction, and if so, judging that the eye-use scene of the user is a reading scene;
if no object outline has been identified, judging whether the desktop plane lies in the line-of-sight direction, and if so, judging that the eye-use scene of the user is a writing scene.
With this scene judgment device, the real-time eye-use scene of the user can be judged accurately, which makes it convenient to subsequently monitor the different eye-use postures of the user according to the scene.
Further, the eye-use judgment device judges the eye-use health condition as follows:
when the eye-use scene is a writing scene, calculating the writing distance from the eyes to the desktop from the eye coordinates and the desktop plane, and judging the eye-use health condition of the user as abnormal when the writing distance is smaller than a preset writing distance threshold;
when the eye-use scene is a reading scene:
calculating the reading distance from the eyes to the object from the eye coordinates and the object outline, and judging the eye-use health condition of the user as abnormal when the reading distance is smaller than a preset reading distance threshold;
and/or calculating a backlight angle from the included angle between the object and the desktop and the preset included angle between the light and the desktop, the backlight angle being the angle swept when rotating from the light ray, taken as the initial side, to the object; and judging the eye-use health condition of the user as abnormal backlit reading when the backlight angle is smaller than a preset backlight angle threshold.
With this eye-use judgment device, the eye-use distance or backlit-reading condition of the user is judged according to the particular eye-use scene, so that the eye-use health condition of the user can be monitored comprehensively and promptly across different scenes.
Detailed Description
The preferred embodiments of the present invention will now be described in conjunction with the accompanying drawings; it should be understood that they are presented here for illustration and explanation only, not for limitation.
Example 1
As shown in fig. 1, embodiment 1 discloses an eye health condition monitoring method, which includes the following steps:
S1, acquiring a human head image and an eye scene image;
specifically, three-dimensional imaging can be performed with a plurality of cameras to form a three-dimensional image; specifically, the three-dimensional image includes an image of the human head and an image of the desktop and the objects on it; specifically, the cameras can be arranged on a lighting device such as a desk lamp so as to acquire real-time images of the user;
S2, extracting feature points of the eye scene image and calculating scene information; the scene information comprises a desktop plane, an object outline, and the included angle between the object and the desktop; the object is a paper reading material or an electronic reading device; specifically, the object may be a book or a readable electronic display device such as a tablet computer or a mobile phone; specifically, step S2 includes:
S21, extracting desktop feature points and object feature points from the eye scene image; specifically, feature points of the desktop and edge feature points of an object such as a book or an electronic device can be extracted from the three-dimensional image by a feature-point detection algorithm; specifically, a corner detection algorithm is used to extract the desktop feature points from the image; specifically, a corner detection algorithm is combined with an edge detection algorithm to extract candidate feature points of the object in the image;
S22, fitting the desktop plane from the desktop feature points; specifically, a plane-fitting algorithm fits the desktop feature points and rejects noise, yielding the desktop plane equation;
S23, identifying the object outline from the object feature points; specifically, an object recognition classifier, namely a small neural network classifier, is used to classify the candidate feature points of the object and to determine whether the object is a book or an electronic device;
S24, calculating the included angle between the object and the desktop from the desktop plane and the object outline.
Through the above steps, the desktop plane and the objects on the desk can be identified from the feature points in the real-time image and the included angle between the object and the desktop can be calculated, so that the eye-use health condition of the user can subsequently be judged and monitored accurately from this information.
S3, extracting feature points of the human head image and calculating face information; the face information comprises face organ coordinates and a face orientation angle; specifically, step S3 includes:
S31, acquiring the feature points in the human head image; specifically, a neural network algorithm is used to extract a plurality of feature points of the human head and face from the three-dimensional image; specifically, LAB and SURF features are extracted from the image, and an MLP (multi-layer perceptron) neural network is then used to locate a plurality of edge feature points of the head and face;
S32, determining the coordinates of each facial organ from the feature points; specifically, the three-dimensional coordinates of the feature points of organs including the eyes, eyebrows, nose, and lips are determined; specifically, if the human head image contains an image of the eyes, the eye coordinates are located directly; if the eyes are occluded or cannot be captured directly because of the viewing angle, the eye coordinates are calculated from the other validly recognized organ feature points in combination with a standard human face model, thereby achieving the positioning.
S33, obtaining the orientation angle of the face in three-dimensional space from the coordinates of each organ by using a fitting algorithm; specifically, the fitting algorithm filters out errors and yields the principal orientation of the face in three-dimensional space.
Through the above steps, the coordinates of each facial organ can be obtained from the feature points of the real-time image and the face orientation angle can be fitted, so that the eye-use health condition of the user can subsequently be judged and monitored accurately from this information.
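The occluded-eye case in S32 can be sketched as follows. The "standard face model" here stores only an eye-midpoint offset from the nose tip at a reference nose-mouth distance, scaled to the observed face; a real standard-model fit would align many landmarks, so treat the model contents and all names as assumptions.

```python
import math

# Hypothetical "standard face" entries: eye-midpoint offset from the nose tip,
# expressed at a reference nose-mouth distance (units are arbitrary).
STANDARD_FACE = {"nose_mouth_dist": 3.0, "eye_offset": (0.0, 4.0, -1.0)}

def estimate_eye_midpoint(nose_tip, mouth, model=STANDARD_FACE):
    """Estimate the (occluded) eye midpoint from two visible organs by scaling
    the standard-model offset to this face's nose-mouth distance."""
    scale = math.dist(nose_tip, mouth) / model["nose_mouth_dist"]
    return tuple(nose_tip[i] + model["eye_offset"][i] * scale for i in range(3))

# A face 1.5x larger than the standard model
print(estimate_eye_midpoint((0, 0, 0), (0, -4.5, 0)))  # (0.0, 6.0, -1.5)
```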
S4, determining the eye-use scene of the user from the scene information and the face information by using a preset scene judgment rule; specifically, the scene judgment rule is as follows:
calculating the line-of-sight direction of the user from the face orientation angle and the pupil coordinates among the face organ coordinates; specifically, the face orientation angle is corrected with the pupil coordinates to obtain the line-of-sight direction;
judging whether an object outline has been identified:
if an object outline has been identified, judging whether it lies in the line-of-sight direction, and if so, judging that the eye-use scene of the user is a reading scene;
if no object outline has been identified, judging whether the desktop plane lies in the line-of-sight direction, and if so, judging that the eye-use scene of the user is a writing scene.
With this scene judgment rule, the real-time eye-use scene of the user can be judged accurately, which makes it convenient to subsequently monitor the different eye-use postures of the user according to the scene.
S5, calculating the eye-use distance or backlit-reading condition of the user from the scene information, the face information, and the eye-use scene by using a preset eye-use judgment rule, and judging the eye-use health condition of the user; specifically, the eye-use judgment rule is as follows:
when the eye-use scene is a writing scene:
calculating the writing distance from the eyes to the desktop from the eye coordinates and the desktop plane, and judging the eye-use health condition of the user as abnormal when the writing distance is smaller than a preset writing distance threshold;
and/or calculating the head tilt direction from the coordinates of the nose tip and the eyes, and calculating the included angle between that direction and the Z axis of a preset camera coordinate system; when the included angle is smaller than a preset tilt angle threshold, that is, when the user's head is tilted while writing, judging the eye-use health condition of the user as abnormal;
when the eye-use scene is a reading scene:
calculating the reading distance from the eyes to the object from the eye coordinates and the object outline, and judging the eye-use health condition of the user as abnormal when the reading distance is smaller than a preset reading distance threshold;
and/or calculating a backlight angle from the included angle between the object and the desktop and the preset included angle between the light and the desktop, the backlight angle being the angle swept when rotating from the light ray, taken as the initial side, to the object; and judging the eye-use health condition of the user as abnormal backlit reading when the backlight angle is smaller than a preset backlight angle threshold.
Specifically, the preset included angle between the light and the desktop can be determined from known light information in the environment. For example, when a camera built into a desk lamp device serves as the image acquisition device, the relative position of the camera and the light source can be obtained by calibration, i.e., the illumination of the desk lamp is known information. Whether the light falls on the front or the back of the object is then easily obtained by geometric calculation: if the light falls on the back, the backlight angle is negative and the situation is judged to be backlit, as shown in fig. 2; if the light falls on the front and the included angle A between the light and the object is smaller than 30 degrees, the situation is likewise judged to be backlit, as shown in fig. 3;
specifically, the preset included angle between the light and the desktop can also be determined by detection with a sensor: a sensor arranged at a fixed position detects the light and determines the actual illumination angle. Specifically, if the position of the light source is unknown or no light source is present, the light is assumed by default to be diffuse light cast vertically downward; in that case the backlight angle is calculated directly from the included angle between the object and the desktop, and the situation is judged to be backlit if that angle is larger than 45 degrees.
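The backlight determination can be sketched as follows. The text gives no explicit formula for the backlight angle, so its computation here, the signed difference between the light-desktop angle and the object-desktop angle in a side view, is an assumption; the negative-angle back-face case, the 30-degree grazing threshold, and the 45-degree no-source fallback follow the description above.

```python
def backlight_angle(light_desk_deg, object_desk_deg):
    """Signed angle (degrees) from the light ray (initial side) to the object page,
    read as the difference of the two side-view angles with the desktop."""
    return light_desk_deg - object_desk_deg

def is_backlit(light_desk_deg=None, object_desk_deg=0.0,
               grazing_threshold_deg=30.0, no_source_threshold_deg=45.0):
    if light_desk_deg is None:
        # Unknown or absent light source: assume vertical diffuse light and
        # fall back to the object-desktop angle test from the text.
        return object_desk_deg > no_source_threshold_deg
    a = backlight_angle(light_desk_deg, object_desk_deg)
    if a < 0:
        return True                        # light falls on the back of the object
    return a < grazing_threshold_deg       # light grazes the front: still backlit

# Lamp light at 50 degrees to the desk, book propped up at 60 degrees
print(is_backlit(50.0, 60.0))   # True
# Same lamp, book lying nearly flat at 10 degrees: lit squarely from the front
print(is_backlit(50.0, 10.0))   # False
```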
With this eye-use judgment rule, the eye-use distance or backlit-reading condition of the user is judged according to the particular eye-use scene, so that the eye-use health condition of the user can be monitored comprehensively and promptly across different scenes.
S6, generating an alarm signal to trigger an alarm when the eye-use health condition of the user is judged to be abnormal. Specifically, the alarm may be a sound alarm device such as a buzzer or a speaker, or a physical alarm device such as a sprayer or an LED indicator lamp; with an indicator lamp, a silent prompt can be given by lighting the lamp.
The eye health condition monitoring method disclosed in this embodiment extracts the required scene information and face information in real time from the acquired human head image and eye scene image, first determines the eye-use scene of the user according to the preset scene judgment rule, and then judges the eye-use distance or backlit-reading condition of the user according to that scene and the preset eye-use judgment rule.
Example 2
As shown in fig. 4, the present embodiment discloses an eye health monitoring system corresponding to the eye health monitoring method in embodiment 1, including:
the image acquisition device 1 is used for acquiring a human head image and an eye scene image;
the image processing device 2 is used for extracting feature points of the human head image and the eye scene image and calculating scene information and face information; the scene information comprises a desktop plane, an object outline, and the included angle between the object and the desktop; the object is a paper reading material or an electronic reading device; the face information comprises face organ coordinates and a face orientation angle;
specifically, the image processing device 2 is provided with executable code for performing the following steps to calculate the scene information:
extracting desktop feature points and object feature points in the eye scene image;
fitting a desktop plane according to the desktop feature points;
identifying the object outline according to the object characteristic points;
and calculating the included angle between the object and the desktop according to the desktop plane and the object outline.
Through these steps, the image processing device can identify the desktop plane and the objects on the desk from the feature points in the real-time image and calculate the included angle between the object and the desktop, so that the eye-use health condition of the user can subsequently be judged and monitored accurately from this information.
Specifically, the image processing device 2 is further provided with executable code for performing the following steps to calculate the face information:
acquiring characteristic points in a human head image;
determining the coordinates of each organ of the face according to the feature points;
and obtaining the orientation angle of the human face in the three-dimensional space according to the coordinates of each organ by using a fitting algorithm.
The scene judgment device 3 is used for judging the eye use scene of the user according to the scene information and the face information by using a preset scene judgment rule;
specifically, the scene judgment device 3 is provided with executable code for performing the following steps to judge the eye-use scene:
calculating the sight direction of the user according to the face orientation information and the pupil coordinates in the face organ coordinates;
judging whether an object outline is identified:
if the object outline is identified, judging whether the object outline is in the sight line direction, and if so, judging that the eye using scene of the user is a reading scene;
if the object outline is not identified, judging whether the desktop plane is in the sight line direction, and if so, judging that the eye using scene of the user is a writing scene.
And the eye use judgment device 4 is used for calculating the eye use distance or the backlight reading condition of the user according to the scene information, the face information and the eye use scene by using a preset eye use judgment rule, and judging the eye use health condition of the user.
Specifically, the eye-use judgment device 4 is provided with executable code for performing the following steps:
when the eye-use scene is a writing scene:
calculating the writing distance from the eyes to the desktop from the eye coordinates and the desktop plane, and judging the eye-use health condition of the user as abnormal when the writing distance is smaller than a preset writing distance threshold;
and/or calculating the head tilt direction from the coordinates of the nose tip and the eyes, calculating the included angle between that direction and the Z axis of a preset camera coordinate system, and judging the eye-use health condition of the user as abnormal when the included angle is smaller than a preset tilt angle threshold;
when the eye-use scene is a reading scene:
calculating the reading distance from the eyes to the object from the eye coordinates and the object outline, and judging the eye-use health condition of the user as abnormal when the reading distance is smaller than a preset reading distance threshold;
and/or calculating a backlight angle from the included angle between the object and the desktop and the preset included angle between the light and the desktop, the backlight angle being the angle swept when rotating from the light ray, taken as the initial side, to the object; and judging the eye-use health condition of the user as abnormal backlit reading when the backlight angle is smaller than a preset backlight angle threshold.
Specifically, the eye-use judgment device 4 generates an alarm signal when it judges the eye-use health condition of the user to be abnormal.
Specifically, the eye health condition monitoring system disclosed in this embodiment further includes an alarm device, which automatically gives an alarm when it receives the alarm signal. Specifically, the alarm device may be a sound alarm device such as a buzzer or a speaker, or a physical alarm device such as a sprayer or an LED indicator lamp; with an indicator lamp, it can be configured to give a silent prompt by lighting the lamp.
The eye health monitoring system disclosed in embodiment 2 corresponds to the eye health monitoring method disclosed in embodiment 1, and the specific technical details and technical effects thereof are also similar, and are not described herein again.
Example 3
As shown in fig. 5, this embodiment corresponds to the technical solutions disclosed in embodiments 1 and 2 and discloses an eye health condition monitoring desk lamp, which includes a desk lamp base 1, a desk lamp support 2, an adjustable supporting component 3, and an illuminating component 4; specifically, four cameras 5 are arranged along the vertical direction of the support 2; the cameras 5 acquire real-time images of the user and perform three-dimensional imaging to form a three-dimensional image; specifically, the three-dimensional image includes an image of the human head and an image of the desktop and the objects on it;
specifically, a processor is arranged in the desk lamp and connected to the cameras; the image processing device, scene judgment device, and eye-use judgment device described in embodiment 2 are implemented in the processor as code, so that the method disclosed in embodiment 1 can be executed, the images acquired by the cameras processed in real time, and the eye-use health condition of the user judged; specifically, the desk lamp is further provided with a speaker, and when the eye-use health condition of the user is judged to be abnormal, the desk lamp can sound an alarm to remind the user and promptly correct bad eye-use habits.
The above embodiments are merely preferred embodiments of the present invention and in no way limit it; any modification, equivalent replacement, or improvement made to the above embodiments in accordance with the technical essence of the present invention falls within the scope of the present invention.