CN116228748B - Balance function analysis method and system based on eye movement tracking

Publication number: CN116228748B (application CN202310483424.7A)
Authority: CN (China)
Prior art keywords: eye movement, visual angle image, stc, time
Legal status: Active
Application number: CN202310483424.7A
Other languages: Chinese (zh)
Other versions: CN116228748A
Inventors: 庄建华, 李斐, 罗旭, 屈寅弘
Assignees: Tianjin Zhiting Medical Technology Co., Ltd.; Shanghai Changzheng Hospital
Application filed by Tianjin Zhiting Medical Technology Co., Ltd. and Shanghai Changzheng Hospital; priority to CN202310483424.7A; published as CN116228748A, then granted and published as CN116228748B.

Classifications

    • G06T 7/0012 - Physics; Computing; Image data processing or generation; Image analysis; Inspection of images, e.g. flaw detection; Biomedical image inspection
    • G06T 7/246 - Image analysis; Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G16H 50/30 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; for calculating health indices; for individual health risk assessment
    • G06T 2207/30041 - Indexing scheme for image analysis or image enhancement; Subject of image: biomedical image processing; Eye; Retina; Ophthalmic

Abstract

The invention discloses a balance function analysis method and system based on eye movement tracking. The method detects and outputs eye movement data and visual angle image data in real time; judges whether the visual angle image responds to the eye movement, the visual angle image taking a continuous following operation if it does and issuing a balance risk early warning if it does not; further anticipates whether the eye movement takes real-time positive feedback, both when the visual angle image takes the continuous following operation and when it does not; and determines the matching balance degree of the eye movement and the visual angle image from the judgment of whether the visual angle image takes the continuous following operation and the anticipation of whether the eye movement takes real-time positive feedback. The method and system can collect and analyze the user's eye movement characteristics in real time, track them accurately, and thereby evaluate the user's movement balance efficiently and simply and guide the user to take emergency measures.

Description

Balance function analysis method and system based on eye movement tracking
Technical Field
The application belongs to the field of biological signal detection, and particularly relates to a balance function analysis method and system based on eye movement tracking.
Background
The balance of the human body is a very precise system involving the complex cooperation of four organs: vision, proprioception, the vestibule of the inner ear, and the cerebellum. Vision provides our notion of spatial location, so when we close our eyes, or when vision is poor, the sense of balance worsens. Proprioception arises from nerve endings in the limb muscles, which sense stimuli such as position, posture and balance and feed that information back to the muscle tissue so that movement stays coordinated. The inner ear is a system comprising the cochlea, the semicircular canals and the vestibular system; the cochlea is the auditory organ, while the semicircular canals and the vestibular system form the balance system of the human body.
When these balance organs work in concert, everything is normal; once their signals conflict, discomfort may result. Dizziness is an illusion of motion or position, a conflict between a pathological or physiological positional stimulus in the body and the higher sensory centers of the brain, and an expression of dysfunction of the human balance system. It includes a sensation that the patient, or the surrounding scene, is rotating, swaying, floating, lifting or tilting. A heavy head, faintness, a sinking sensation and syncope, by contrast, do not fall into the category of dizziness.
Current dizziness detection generally relies on either the visual image alone or the eye movement alone. In reality, neither the visual image nor the eye movement by itself can accurately identify a person's real-time dizziness or predict subsequent dizziness, so detection distortion occurs very easily.
Disclosure of Invention
In order to solve the problem that current detection of a user's balance is inaccurate, according to a first aspect of the present invention, the present invention provides a balance function analysis method based on eye movement tracking, characterized by comprising:
detecting and outputting eye movement data and visual angle image data in real time;
judging whether the visual angle image responds to eye movement, if the visual angle image responds to eye movement, the visual angle image adopts continuous following operation, and if the visual angle image does not respond to eye movement, the visual angle image carries out balance risk early warning;
further anticipating whether the eye movement takes real-time positive feedback, both in the case where the visual angle image takes the continuous following operation and in the case where it does not; and
determining the matching balance degree of the eye movement and the visual angle image according to the judgment result of whether the visual angle image takes the continuous following operation and the anticipated result of whether the eye movement takes real-time positive feedback.
Further, the visual angle image taking a continuous following operation includes: viewing-angle rotation following or viewing-angle translation following;
the step of determining the matching balance degree of the eye movement and the visual angle image according to the judgment result of whether the visual angle image takes the continuous following operation and the anticipated result of whether the eye movement takes real-time positive feedback comprises:
determining, in the case where the visual angle image continuously follows, an STC envelope curve on which the balance of the eye movement and the visual angle image is not matched;
further determining the matching balance degree of the eye movement and the visual angle image on the basis of the STC envelope curve;
wherein the step of determining the STC envelope curve on which the eye movement and the visual angle image balance are not matched comprises:
determining the real-time detection distance range of the eye movement;
calculating the fastest time required for the visual angle image to safely follow the eye movement, according to the viewing-angle rotation following speed of the visual angle image, the viewing-angle translation following speed of the visual angle image, and the eye movement distance;
calculating, from the initial speed of the visual angle image and the fastest time required for the visual angle image to safely follow the eye movement, the shortest time required for the visual angle image to avoid dizziness;
and calculating, from the time at which the eye movement reaches the predicted dizziness point and the shortest time required for the visual angle image to avoid dizziness, the time safety boundaries at which the eye movement and the visual angle image respectively reach the predicted dizziness point, the time safety boundaries being the STC envelope curve.
Further, the step of determining the matching balance degree of the eye movement and the visual angle image according to whether the visual angle image takes the continuous following operation and the anticipated result of whether the eye movement takes real-time positive feedback comprises:
determining a first balance mismatch interval in the case where the visual angle image does not take the continuous following operation and the eye movement does not take real-time positive feedback;
determining a second balance mismatch interval in the case where the visual angle image does not take the continuous following operation while the eye movement takes an instantaneous correction;
determining a third balance mismatch interval in the case where the visual angle image takes the continuous following operation and the eye movement does not take real-time positive feedback; and
determining a fourth balance mismatch interval in the case where the visual angle image takes the continuous following operation while the eye movement takes an instantaneous correction.
Further, after the step of detecting and outputting the eye movement data and the visual angle image data in real time, the method further includes:
determining a fifth balance mismatch interval in the case of an eye movement instantaneous correction together with an eye movement instantaneous steering.
Further, the STC envelope curve on which the eye movement and the visual angle image balance are not matched is calculated with a set of formulas that appear only as images in the original publication and are therefore not reproduced here. The quantities they relate are:
• the furthest distance that the eye movement detects in real time;
• the follow-up movement speed of the eye movement;
• the time for the eye movement to reach the furthest real-time detection position at the current follow-up movement speed;
• the time at which the eye movement reaches the predicted dizziness point (a variable);
• the following speed of the visual angle image under viewing-angle rotation;
• the following speed of the visual angle image under variable-speed motion;
• the fastest time needed to reach the center of the eye movement when the visual angle image safely follows the eye movement under viewing-angle rotation;
• the fastest time needed from the edge of the eye movement when the visual angle image safely follows the eye movement under variable-speed motion;
• eyeMargin, the eye movement distance;
• the initial velocity of the visual angle image before it begins to follow;
• the shortest time needed for the visual angle image to safely follow the eye movement under viewing-angle rotation;
• the shortest time needed for the visual angle image to safely follow the eye movement under variable-speed motion;
• the STC envelope curve representing the mismatch of the eye movement and visual angle image balance.
Further, in the case where the visual angle image does not take the continuous following operation and the eye movement does not take real-time positive feedback, the first matching balance degree is determined by a formula that appears only as an image in the original publication; it bounds the first matching balance degree from above in terms of:
• the follow-up movement speed of the eye movement;
• the time for the eye movement to reach the predicted dizziness point;
• eyeSpeed, the current motion speed of the visual angle image detected in real time by the eye movement in the actual motion scene;
• the first following motion range of the eye movement in the follow-up movement direction;
• eyeMargin, the eye movement distance;
• the first fastest time, from the center of the eye movement, required for the visual angle image to safely follow the eye movement when moving at eyeSpeed;
• the second fastest time, from the edge of the eye movement, required for the visual angle image to safely follow the eye movement when moving at eyeSpeed.
Further, in the case where the visual angle image does not take the continuous following operation and the eye movement takes an instantaneous correction, the second matching balance degree is determined by a formula that appears only as an image in the original publication; it bounds the second matching balance degree from above in terms of:
• the follow-up movement speed of the eye movement;
• the time for the eye movement to reach the predicted dizziness point;
• eyeSpeed, the current motion speed of the visual angle image detected in real time by the eye movement in the actual motion scene;
• pauseTime-max, the maximum correction follow-up movement stagnation time of the eye movement;
• the second following motion range of the eye movement in the follow-up movement direction;
• eyeMargin, the eye movement distance;
• the third fastest time, from the center of the eye movement, required for the visual angle image to safely follow the eye movement when moving at eyeSpeed;
• the fourth fastest time, from the edge of the eye movement, required for the visual angle image to safely follow the eye movement when moving at eyeSpeed.
According to a second aspect of the present invention, the present invention provides a balance function analysis system based on eye movement tracking, comprising:
the real-time detection module is used for detecting and outputting the eye movement data and the visual angle image data in real time;
the response detection module is used for judging whether the visual angle image responds to eye movement, if the visual angle image responds to eye movement, the visual angle image adopts continuous following operation, and if the visual angle image does not respond to eye movement, the visual angle image carries out balance risk early warning;
the feedback module is used for further anticipating whether the eye movement takes real-time positive feedback, both in the case where the visual angle image takes the continuous following operation and in the case where it does not;
and the balance matching module is used for determining the matching balance degree of the eye movement and the visual angle image according to the judgment result of whether the visual angle image takes the continuous following operation and the anticipated result of whether the eye movement takes real-time positive feedback.
Further, the visual angle image taking a continuous following operation includes: viewing-angle rotation following or viewing-angle translation following;
the step of determining the matching balance degree of the eye movement and the visual angle image according to the judgment result of whether the visual angle image takes the continuous following operation and the anticipated result of whether the eye movement takes real-time positive feedback comprises:
determining, in the case where the visual angle image continuously follows, an STC envelope curve on which the balance of the eye movement and the visual angle image is not matched;
further determining the matching balance degree of the eye movement and the visual angle image on the basis of the STC envelope curve;
wherein the step of determining the STC envelope curve on which the eye movement and the visual angle image balance are not matched comprises:
determining the real-time detection distance range of the eye movement;
calculating the fastest time required for the visual angle image to safely follow the eye movement, according to the viewing-angle rotation following speed of the visual angle image, the viewing-angle translation following speed of the visual angle image, and the eye movement distance;
calculating, from the initial speed of the visual angle image and the fastest time required for the visual angle image to safely follow the eye movement, the shortest time required for the visual angle image to avoid dizziness;
calculating, from the time at which the eye movement reaches the predicted dizziness point and the shortest time required for the visual angle image to avoid dizziness, the time safety boundaries at which the eye movement and the visual angle image respectively reach the predicted dizziness point, the time safety boundaries being the STC envelope curve;
the step of determining the matching balance degree of the eye movement and the visual angle image according to whether the visual angle image takes the continuous following operation and the anticipated result of whether the eye movement takes real-time positive feedback comprises:
determining a first balance mismatch interval in the case where the visual angle image does not take the continuous following operation and the eye movement does not take real-time positive feedback;
determining a second balance mismatch interval in the case where the visual angle image does not take the continuous following operation while the eye movement takes an instantaneous correction;
determining a third balance mismatch interval in the case where the visual angle image takes the continuous following operation and the eye movement does not take real-time positive feedback; and
determining a fourth balance mismatch interval in the case where the visual angle image takes the continuous following operation while the eye movement takes an instantaneous correction;
after the step of detecting and outputting the eye movement data and the visual angle image data in real time, the method further comprises:
determining a fifth balance mismatch interval in the case of an eye movement instantaneous correction together with an eye movement instantaneous steering.
Further, the STC envelope curve on which the eye movement and the visual angle image balance are not matched is calculated with a set of formulas that appear only as images in the original publication and are therefore not reproduced here. The quantities they relate are:
• the furthest distance that the eye movement detects in real time;
• the follow-up movement speed of the eye movement;
• the time for the eye movement to reach the furthest real-time detection position at the current follow-up movement speed;
• the time at which the eye movement reaches the predicted dizziness point (a variable);
• the following speed of the visual angle image under viewing-angle rotation;
• the following speed of the visual angle image under variable-speed motion;
• the fastest time needed to reach the center of the eye movement when the visual angle image safely follows the eye movement under viewing-angle rotation;
• the fastest time needed from the edge of the eye movement when the visual angle image safely follows the eye movement under variable-speed motion;
• eyeMargin, the eye movement distance;
• the initial velocity of the visual angle image before it begins to follow;
• the shortest time needed for the visual angle image to safely follow the eye movement under viewing-angle rotation;
• the shortest time needed for the visual angle image to safely follow the eye movement under variable-speed motion;
• the STC envelope curve representing the mismatch of the eye movement and visual angle image balance.
In the case where the visual angle image does not take the continuous following operation and the eye movement does not take real-time positive feedback, the first matching balance degree is determined by a formula that appears only as an image in the original publication; it bounds the first matching balance degree from above in terms of:
• the follow-up movement speed of the eye movement;
• the time for the eye movement to reach the predicted dizziness point;
• eyeSpeed, the current motion speed of the visual angle image detected in real time by the eye movement in the actual motion scene;
• the first following motion range of the eye movement in the follow-up movement direction;
• eyeMargin, the eye movement distance;
• the first fastest time, from the center of the eye movement, required for the visual angle image to safely follow the eye movement when moving at eyeSpeed;
• the second fastest time, from the edge of the eye movement, required for the visual angle image to safely follow the eye movement when moving at eyeSpeed.
In the case where the visual angle image does not take the continuous following operation and the eye movement takes an instantaneous correction, the second matching balance degree is determined by a formula that appears only as an image in the original publication; it bounds the second matching balance degree from above in terms of:
• the follow-up movement speed of the eye movement;
• the time for the eye movement to reach the predicted dizziness point;
• eyeSpeed, the current motion speed of the visual angle image detected in real time by the eye movement in the actual motion scene;
• pauseTime-max, the maximum correction follow-up movement stagnation time of the eye movement;
• the second following motion range of the eye movement in the follow-up movement direction;
• eyeMargin, the eye movement distance;
• the third fastest time, from the center of the eye movement, required for the visual angle image to safely follow the eye movement when moving at eyeSpeed;
• the fourth fastest time, from the edge of the eye movement, required for the visual angle image to safely follow the eye movement when moving at eyeSpeed.
The invention provides a balance function analysis method and system based on eye movement tracking, which detect and output eye movement data and visual angle image data in real time; judge whether the visual angle image responds to the eye movement, the visual angle image taking a continuous following operation if it does and issuing a balance risk early warning if it does not; further anticipate whether the eye movement takes real-time positive feedback, both when the visual angle image takes the continuous following operation and when it does not; and determine the matching balance degree of the eye movement and the visual angle image from the judgment of whether the visual angle image takes the continuous following operation and the anticipation of whether the eye movement takes real-time positive feedback. The method and system can collect and analyze the user's eye movement characteristics in real time, track them accurately, and thereby evaluate the user's movement balance efficiently and simply and guide the user to take emergency measures.
Drawings
FIG. 1 is a workflow diagram of a balance function analysis method based on eye tracking according to the present invention;
FIG. 2 is a second workflow diagram of a balance function analysis method based on eye tracking according to the present invention;
FIG. 3 is a block diagram of a balance function analysis system based on eye tracking according to the present invention;
FIG. 4 is a workflow diagram of an evaluation method of eye movement and visual angle image balance mismatch in accordance with the present invention;
fig. 5 is a block diagram of a system for evaluating eye movement and view angle image balance mismatch according to the present invention.
Detailed Description
In order to facilitate an understanding of the present application, a more complete description of the present application will now be provided with reference to the relevant figures. Preferred embodiments of the present application are shown in the accompanying drawings. This application may, however, be embodied in many different forms and is not limited to the embodiments described herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
It will be understood that when an element is referred to as being "fixed to" another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present. The terms "vertical," "horizontal," "left," "right," and the like are used herein for illustrative purposes only and are not meant to be the only embodiment.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein in the description of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
The application relates to the field of biological signal detection and provides a method for determining the matching balance degree during eye movement / visual angle image interaction, based on the continuous following capability of the eye movement and the visual angle image. Compared with traditional dizziness estimation/evaluation methods, the method obtains a more accurate matching balance degree for the eye movement / visual angle image interaction and provides a basis for risk judgment and decision operations of automatic eye movement.
The eye movement and the visual angle image are treated like care participants sharing a road: during use they interact and can collide, their motion states are influenced by inertia, and when a dangerous environment appears their motion states cannot be controlled instantaneously to avoid dizziness. Direct conflict between the eye movement and the visual angle image is therefore unavoidable within a certain range of time and space of human-body interaction, and the region of relative eye movement / visual angle image positions in which balance is not matched is defined as the eye movement / visual angle image matching balance. One further concept, the stun occurrence time (swoon time concur, STC for short), is also needed to understand the present application. STC is a parameter widely used in the evaluation of eye movement balance mismatch; it generally refers to the time until dizziness occurs over the eye movement distance, obtained as the ratio of the relative distance between the eye movement and a dangerous place to the current following movement speed.
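As described, STC reduces to a ratio; written out (the symbol names d_rel and v_follow are illustrative, not from the original):

STC = d_rel / v_follow

where d_rel is the relative distance between the eye movement and the dangerous place, and v_follow is the current following movement speed. For example, d_rel = 25 m at v_follow = 5 m/s gives STC = 5 s.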
Next consider the following capability of the visual angle image while an at-risk person is walking. The visual angle image following capability refers to the continuous following operation the visual angle image takes when a hazard is found. According to the examination and test results of nursing accidents of the elderly, the following capability of the visual angle image when danger is encountered can reduce the risk of a dangerous accident. In this application, based on the inventors' experimental results, the following capability of the visual angle image is therefore defined by quantifying the speed of the visual angle image during the following process.
Referring to fig. 1, fig. 1 is a schematic diagram of a balance function analysis method based on eye tracking.
The balance function analysis method based on eye movement tracking comprises the following steps:
s100, detecting and outputting eye movement data and visual angle image data in real time.
The eye movement data include: the eye movement position, eye movement speed, eye movement direction, maximum correction follow-up movement stagnation time, and maximum non-blinking time. The visual angle image data include: the visual angle image position, speed, direction, and line-of-sight direction. Both can be obtained by an object-borne perception system.
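For illustration only, the two records enumerated above can be modeled as follows; the field names are assumptions of this sketch, not identifiers from the original:

from dataclasses import dataclass

@dataclass
class EyeMovementData:
    position: tuple              # eye movement position (x, y)
    speed: float                 # eye movement speed
    direction: float             # eye movement direction (heading angle, degrees)
    max_correction_pause: float  # maximum correction follow-up movement stagnation time (s)
    max_no_blink: float          # maximum non-blinking time (s)

@dataclass
class ViewImageData:
    position: tuple              # visual angle image position (x, y)
    speed: float                 # visual angle image speed
    direction: float             # visual angle image direction
    sight_direction: float       # line-of-sight direction (degrees)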
S200, judging whether the visual angle image responds to eye movement.
In this step, activation of the visual angle image's following capability depends on the line-of-sight direction of the visual angle image: if the eye movement giving feedback information is detected in that direction, the following capability is confirmed as activated. If it is not detected, the following capability is not activated and the visual angle image keeps its normal movement operation. If the visual angle image responds to the eye movement, it takes a continuous following operation; if it does not respond, it issues a balance risk early warning. In one embodiment, the continuous following operation taken by the visual angle image includes viewing-angle rotation following or viewing-angle translation following. Of course, based on the core design idea of the application, further visual angle image following operations can be designed into the scheme to determine a more accurate matching balance degree.
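A minimal sketch of this judgment, reusing the records sketched above (the detects_feedback helper and its 15-degree threshold are assumptions of this sketch):

import math

def detects_feedback(view: ViewImageData, eye: EyeMovementData) -> bool:
    # Hypothetical check: feedback counts as detected when the visual angle
    # image's line of sight points within 15 degrees of the eye movement.
    dx = eye.position[0] - view.position[0]
    dy = eye.position[1] - view.position[1]
    bearing = math.degrees(math.atan2(dy, dx))
    return abs((bearing - view.sight_direction + 180.0) % 360.0 - 180.0) < 15.0

def judge_response(view: ViewImageData, eye: EyeMovementData) -> str:
    """Step S200: decide which operation the visual angle image takes."""
    if detects_feedback(view, eye):
        return "continuous_following"   # rotation or translation following
    return "balance_risk_warning"       # no response: issue the early warning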
S300, further anticipating whether the eye movement takes real-time positive feedback, both in the case where the visual angle image takes the continuous following operation and in the case where it does not.
In this step, the real-time positive feedback taken by the eye movement includes, but is not limited to, normal follow-up motion, instantaneous correction and instantaneous steering. Real-time positive feedback of the eye movement may also include deceleration corrections achieved by other means.
S400, determining the matching balance degree of the eye movement and the visual angle image according to the judgment result of whether the visual angle image takes the continuous following operation and the anticipated result of whether the eye movement takes real-time positive feedback.
In this step, the matching balance degree of the eye movement and the visual angle image differs according to whether the visual angle image takes the continuous following operation and whether the eye movement takes real-time positive feedback. It will be appreciated that when the visual angle image takes the continuous following operation while the eye movement takes real-time positive feedback, the matching balance degree of the eye movement and the visual angle image will be smaller.
In this embodiment, an object-based real-time detection system is first used to detect eye movement data and view angle image data in real time. Further judging whether the visual angle image adopts continuous following operation or whether the eye movement adopts real-time positive feedback. The matching balance degree of the eye movement and the visual angle image is calculated respectively under different conditions. In this embodiment, the balance function analysis method based on eye movement tracking considers the continuous following capability of the visual angle image and the real-time positive feedback of eye movement, so that the identification of the mismatch of the balance between the eye movement and the visual angle image is more sufficient. In the application, the effective matching balance degree is determined under different conditions, so that the safety of the visual angle image and the comfort of the eye movement following motion in the interaction process of the eye movement and the visual angle image can be effectively improved. By judging whether or not the view angle image is responsive to eye movement, it is possible to classify and quantify effective following operations when the view angle image faces a dangerous environment. In addition, the embodiment identifies the matching balance degree in the eye movement-visual angle image interaction process based on the visual angle image continuous following operation, and has important significance for improving the safety of automatic traveling eye movement.
In one embodiment, the eye tracking based balance function analysis method, the visual angle image taking continuous following operation includes: viewing angle rotation following or viewing angle translation following.
Referring to fig. 2, the step of determining the matching balance degree of the eye movement and the visual angle image according to whether the visual angle image takes the continuous following operation and the anticipated result of whether the eye movement takes real-time positive feedback includes:
S410, in the case where the visual angle image continuously follows, calculating the time safety boundaries at which the eye movement and the visual angle image respectively reach the predicted dizziness point, the time safety boundaries being the STC envelope curve.
In this embodiment, the STC envelope curve of eye movement and visual angle image dizziness is obtained in the case where the visual angle image continuously follows, without considering the following capability of the eye movement. If necessary, the following capability of the eye movement can also be taken into account, and the STC envelope curve of eye movement and visual angle image dizziness obtained in the case where the visual angle image continuously follows.
In one embodiment, the step of determining the STC envelope curve on which the eye movement and the visual angle image balance are not matched comprises:
s411, determining the eye movement real-time detection distance range.
In the step, the eye movement real-time detection distance range can be obtained through real-time detection by an object-borne real-time detection system.
S412, calculating the fastest time required for the visual angle image to safely follow the eye movement, according to the viewing-angle rotation following speed of the visual angle image, the viewing-angle translation following speed of the visual angle image, and the eye movement distance.
In this step, the calculation is performed for two cases. The fastest time required for the visual angle image to safely follow the eye movement is calculated from the viewing-angle rotation following speed of the visual angle image and the eye movement distance, and recorded as the first duration. The fastest time required for the visual angle image to safely follow the eye movement is likewise calculated from the viewing-angle translation following speed of the visual angle image and the eye movement distance, and recorded as the second duration.
S413, calculating the safety boundary of the visual angle image dizziness time according to the initial speed of the visual angle image and the fastest time required for the visual angle image to safely follow the eye movement.
According to the initial speed of the visual angle image, and combining the first duration and the second duration calculated in step S412, the first safety boundary and the second safety boundary of the visual angle image dizziness time are calculated respectively. The first safety boundary and the second safety boundary are safety boundaries of the dizziness time.
S414, calculating the STC envelope curve on which the eye movement and the visual angle image balance are not matched, according to the eye movement dizziness time and the safety boundary of the visual angle image dizziness time.
In this step, the STC envelope curve on which the eye movement and the visual angle image balance are not matched is determined according to the first safety boundary, the second safety boundary and the eye movement / visual angle image dizziness time range.
This embodiment thus gives the specific steps for determining or calculating the STC safety envelope range under the continuous following capability of the visual angle image when the following capability of the eye movement is not considered. Of course, the STC safety envelope range under the continuous following capability of the visual angle image can also be determined or calculated while taking the following capability of the eye movement into account.
In one embodiment, in steps S411 to S414, the STC envelope curve on which the eye movement and the visual angle image balance are not matched may be calculated with a set of formulas that appear only as images in the original publication and are not reproduced here. The quantities they relate are:
• the furthest distance that the eye movement detects in real time;
• the follow-up movement speed of the eye movement;
• the time for the eye movement to reach the furthest real-time detection position at the current follow-up movement speed;
• the time at which the eye movement reaches the predicted dizziness point (a variable);
• the following speed of the visual angle image under viewing-angle rotation;
• the following speed of the visual angle image under variable-speed motion;
• the fastest time needed to reach the center of the eye movement when the visual angle image safely follows the eye movement under viewing-angle rotation;
• the fastest time needed from the edge of the eye movement when the visual angle image safely follows the eye movement under variable-speed motion;
• eyeMargin, the eye movement distance;
• the initial velocity of the visual angle image before it begins to follow;
• the shortest time needed for the visual angle image to safely follow the eye movement under viewing-angle rotation;
• the shortest time needed for the visual angle image to safely follow the eye movement under variable-speed motion;
• the STC envelope curve representing the mismatch of the eye movement and visual angle image balance.
The abbreviation ne appearing in these formulas stands for near-end, the side of the eye movement near the visual angle image, called the center. The abbreviation fe stands for far-end, the side of the eye movement away from the visual angle image, called the edge.
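Because the original formulas are published only as images, the following is a structural sketch, not the patented computation: it assumes each "fastest time" is eyeMargin over the corresponding following speed, that the speed change from the initial speed v0 happens at an assumed constant rate a_follow, and that the envelope takes the tighter of the ne and fe boundaries.

def stc_envelope(eye_margin: float, v_rot: float, v_trans: float,
                 v0: float, a_follow: float) -> float:
    # S412: fastest time for the visual angle image to safely follow the eye
    # movement, one value per following mode (assumed: distance over speed).
    t_ne = eye_margin / v_rot    # viewing-angle rotation, to the center (ne)
    t_fe = eye_margin / v_trans  # variable-speed motion, from the edge (fe)
    # S413: shortest time to avoid dizziness, adding the assumed time needed
    # to change from the initial speed v0 to the following speed.
    t_rot = t_ne + abs(v_rot - v0) / a_follow
    t_trans = t_fe + abs(v_trans - v0) / a_follow
    # S414: take the tighter time safety boundary as the envelope value.
    return min(t_rot, t_trans)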
Whether or not the eye movement triggers continuous following measures, considering the visual angle image and the eye movement in a dangerous-state environment, is used to calculate the relative safe distance between the eye movement and the visual angle image in an actual elderly-care environment. Because the interaction between the eye movement and the visual angle image is a dynamic process, the mismatch of their balance depends on data such as their follow-up movement speeds, follow-up movement directions and relative positions at the given moment. This section therefore briefly explains a method for generating the two-dimensional eye movement / visual angle image matching balance, taking a known human interaction environment as an example and calculating the two-dimensional matching balance in that environment. The assumed environment is: the follow-up movement velocity of the eye movement is v = 60 km/h, and the visual angle image detected in real time by the eye movement is to the front right of the eye movement.
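As a purely illustrative calculation in this environment (the 50 m detection distance is an assumed value, not from the original): v = 60 km/h ≈ 16.7 m/s, so an eye movement 50 m from the predicted dizziness point gives STC = 50 / 16.7 ≈ 3.0 s, meaning the matching balance must be established well within three seconds.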
In one embodiment, step S400 of determining the matching balance degree of the eye movement and the visual angle image according to whether the visual angle image takes the continuous following operation and the anticipated result of whether the eye movement takes real-time positive feedback includes:
First, determining a first balance mismatch interval in the case where the visual angle image does not take the continuous following operation while the eye movement does not take real-time positive feedback.
In one embodiment, where the visual angle image does not take the continuous following operation and the eye movement does not take real-time positive feedback, the first matching balance degree is determined by a formula that appears only as an image in the original publication; it bounds the first matching balance degree from above in terms of:
• the follow-up movement speed of the eye movement;
• the time for the eye movement to reach the predicted dizziness point;
• eyeSpeed, the current motion speed of the visual angle image detected in real time by the eye movement in the actual motion scene;
• the first following motion range of the eye movement in the follow-up movement direction;
• eyeMargin, the eye movement distance;
• the first fastest time, from the center of the eye movement, required for the visual angle image to safely follow the eye movement when moving at eyeSpeed;
• the second fastest time, from the edge of the eye movement, required for the visual angle image to safely follow the eye movement when moving at eyeSpeed.
Second, determining a second balance mismatch interval in the case where the visual angle image does not take the continuous following operation while the eye movement takes an instantaneous correction.
In one embodiment, where the visual angle image does not take the continuous following operation and the eye movement takes an instantaneous correction, the second matching balance degree is determined by a formula that appears only as an image in the original publication; it bounds the second matching balance degree from above in terms of:
• the follow-up movement speed of the eye movement;
• the time for the eye movement to reach the predicted dizziness point;
• eyeSpeed, the current motion speed of the visual angle image detected in real time by the eye movement in the actual motion scene;
• pauseTime-max, the maximum correction follow-up movement stagnation time of the eye movement;
• the second following motion range of the eye movement in the follow-up movement direction;
• eyeMargin, the eye movement distance;
• the third fastest time, from the center of the eye movement, required for the visual angle image to safely follow the eye movement when moving at eyeSpeed;
• the fourth fastest time, from the edge of the eye movement, required for the visual angle image to safely follow the eye movement when moving at eyeSpeed.
Third, determining a third balance mismatch interval in the case where the visual angle image takes the continuous following operation and the eye movement does not take real-time positive feedback.
Fourth, determining a fourth balance mismatch interval in the case where the visual angle image takes the continuous following operation while the eye movement takes an instantaneous correction.
In one embodiment, after the step of detecting and outputting the eye movement data and the visual angle image data in real time, the method further comprises: determining a fifth balance mismatch interval in the case of an eye movement instantaneous correction together with an eye movement instantaneous steering.
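The five intervals can be read as a simple decision table; a minimal sketch (the feedback labels are illustrative encodings of the cases above):

def classify_mismatch_interval(following: bool, feedback: str) -> int:
    """Map the S200/S300 outcomes to the five balance mismatch intervals.

    feedback is one of "none", "correction", "correction_and_steering".
    """
    if feedback == "correction_and_steering":
        return 5  # instantaneous correction together with instantaneous steering
    if not following and feedback == "none":
        return 1  # no following operation, no real-time positive feedback
    if not following and feedback == "correction":
        return 2  # no following operation, instantaneous correction
    if following and feedback == "none":
        return 3  # continuous following, no real-time positive feedback
    return 4      # continuous following, instantaneous correction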
In one embodiment, when the following capability of the eye movement is considered, the eye movement real-time positive feedback includes normal following movement and instantaneous correction (the latter comprising stationary instantaneous correction and instantaneous correction while traveling). The corrected distance of the eye movement may be determined from the eye movement follow-up movement speed and the maximum correction follow-up movement stagnation time. Alternatively, the minimum running radius of the eye movement can be determined from the eye movement follow-up movement speed and the maximum non-blinking time.
Specifically, the eye movement correction follow-up movement stagnation period refers to the ability of the eye movement to rapidly reduce the follow-up movement speed until the followed object comes to a stop.
The maximum non-blinking time of the eye movement is obtained by recording the subject's blinking actions during eye movement detection and taking the longest interval between two blinks; it is used to indicate whether the user is distracted or the eyes' vision is wandering.
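Written out, the corrected distance just described reduces to a product (an interpretation of the text; the symbols are illustrative):

d_corr = v_follow × pauseTime-max

where v_follow is the eye movement follow-up movement speed; the minimum running radius is obtained analogously from v_follow and the maximum non-blinking time.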
In this embodiment the following capability of the eye movement is considered: the real-time positive feedback of the eye movement includes normal following movement and instantaneous correction, where the instantaneous correction may occur both under stationary conditions and while traveling. Taking the following ability of the eye movement into account makes the determination of the matching balance degree of the eye movement and the visual angle image more accurate.
In this embodiment, in the process of calculating the fifth balance mismatch interval, the application further clarifies the effectiveness of steering in reducing risk. The eye movement usually takes a corrective operation in a dangerous environment, and the dizziness risks of correction and of steering cannot be judged separately. Taking the eye movement real-time positive feedback (instantaneous correction under stationary conditions and instantaneous correction while traveling) into account, together with the following capability of the eye movement, makes the determination of the matching balance degree of the eye movement and the visual angle image more accurate.
In the above description, the effective following operations taken by the visual angle image when facing a dangerous environment are classified and quantified, and the application comprehensively considers the following ability of both person and object in dividing the risk area. The matching balance degree determined during eye movement / visual angle image interaction based on identifying the continuous following operation of the visual angle image is of great significance for improving the safety of automatic traveling eye movement.
In a specific embodiment, the application scenario of the present application is eye movement with an active real-time detection capability. The data the system can detect about the user's eye movement include: the eye movement speed, the correction follow-up movement stagnation time, the non-blinking time, and so on. Meanwhile, the visual angle image within the real-time detection range can be identified, and its position, speed, moving direction and line-of-sight direction detected in real time. With the data obtained from the eye movement as input, and the following capabilities of the eye movement and the visual angle image in a dangerous-state environment as calculation parameters, the existing balance mismatch in the eye movement / visual angle image interaction can be calculated in real time during the eye movement follow-up motion.
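In the spirit of this scenario, a hypothetical invocation of the sketches above (all values illustrative):

eye = EyeMovementData(position=(12.0, 5.0), speed=1.4, direction=30.0,
                      max_correction_pause=0.8, max_no_blink=4.0)
view = ViewImageData(position=(0.0, 0.0), speed=1.2, direction=28.0,
                     sight_direction=22.0)
# The bearing from the visual angle image to the eye movement is about 22.6
# degrees, within the assumed 15-degree threshold of the 22-degree sight line:
print(judge_response(view, eye))  # -> "continuous_following"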
In one embodiment, the present application further provides a balance function analysis system based on eye movement tracking. Referring to fig. 3, the system comprises a real-time detection module, a first analysis and judgment module, a second analysis and judgment module, and an operation module.
And the real-time detection module is used for detecting and outputting the eye movement data and the visual angle image data in real time.
The first analysis and judgment module is used for judging whether the visual angle image responds to eye movement. If the visual angle image responds to the eye movement, the visual angle image adopts continuous following operation, and if the visual angle image does not respond to the eye movement, the visual angle image carries out balance risk early warning.
And the second analysis and judgment module is used for further anticipating whether the eye movement takes real-time positive feedback, both in the case where the visual angle image takes the continuous following operation and in the case where it does not.
And the operation module is used for determining the matching balance degree of the eye movement and the visual angle image according to whether the visual angle image takes the continuous following operation and the anticipated result of whether the eye movement takes real-time positive feedback.
In this embodiment, the balance function analysis system based on eye movement tracking considers the continuous following capability of the visual angle image and the real-time positive feedback of eye movement, so that the recognition of the mismatch between the eye movement and the balance of the visual angle image is more sufficient. In the application, the effective matching balance degree is determined under different conditions, so that the safety of the visual angle image and the comfort of the eye movement following motion in the interaction process of the eye movement and the visual angle image can be effectively improved. By judging whether or not the view angle image is responsive to eye movement, it is possible to classify and quantify effective following operations when the view angle image faces a dangerous environment. In addition, the embodiment identifies the matching balance degree in the eye movement-visual angle image interaction process based on the visual angle image continuous following operation, and has important significance for improving the safety of automatic traveling eye movement.
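The four modules of this embodiment compose naturally; a skeletal sketch (class and method names are illustrative, not from the original):

class BalanceFunctionAnalysisSystem:
    """Sketch of the eye-tracking balance analysis system of Fig. 3."""

    def __init__(self, detector, responder, anticipator, calculator):
        self.detector = detector        # real-time detection module
        self.responder = responder      # first analysis and judgment module
        self.anticipator = anticipator  # second analysis and judgment module
        self.calculator = calculator    # operation module

    def run_once(self):
        eye, view = self.detector.detect()                   # S100
        following = self.responder.responds(view, eye)       # S200
        feedback = self.anticipator.anticipate(eye)          # S300
        return self.calculator.balance(following, feedback)  # S400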
Referring to fig. 4, the present application further provides an evaluation method for balancing mismatch between eye movement and visual angle images, including:
s10, detecting and outputting eye movement data and visual angle image data in real time.
The eye movement data include: the eye movement position, eye movement speed, eye movement direction, maximum correction follow-up movement stagnation time, and maximum non-blinking time. The visual angle image data include: the visual angle image position, speed, direction, and line-of-sight direction.
S20, judging whether the visual angle image responds to eye movement, if the visual angle image responds to eye movement, the visual angle image adopts continuous following operation, and if the visual angle image does not respond to eye movement, the visual angle image carries out balance risk early warning.
S30, respectively determining the matching balance degree of the eye movement and the visual angle image under the condition that the visual angle image takes the continuous following operation and the visual angle image does not take the continuous following operation.
In one embodiment, the continuous follow-up operation taken by the perspective image includes: viewing angle rotation following or viewing angle translation following. Of course, based on the core design thought of the application, more visual angle image following operations can be designed into the scheme so as to determine more accurate matching balance degree.
S40, judging whether the visual angle image is in the range of the matching balance degree.
The judgment is carried out according to the position of the currently real-time detected visual angle image, the position of the eye movement and the determined matching balance degree range.
S50, evaluating the mismatch of the balance of the eye movement and the visual angle image according to the judgment result of whether the visual angle image is within the matching balance degree range.
If the visual angle image is not within the matching balance degree range, the risk that the balance of the eye movement and the visual angle image is mismatched is lower. If the visual angle image is within the range, the risk of balance mismatch is higher. Specifically, the probability of a balance mismatch can be determined in combination with the five balance mismatch intervals obtained in the above embodiments. For example, the fifth balance mismatch interval has the highest probability of balance mismatch, because no measure can be taken to avoid dizziness when the visual angle image falls in the fifth interval.
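Steps S40 and S50 amount to a point-in-region test followed by a risk grading; a minimal sketch (the contains() test on the balance region is an assumed interface):

def evaluate_mismatch(view_pos, eye_pos, balance_region, interval: int) -> str:
    """S40/S50: judge whether the visual angle image lies inside the
    matching balance degree range and grade the mismatch risk."""
    if not balance_region.contains(view_pos, eye_pos):   # S40 (assumed API)
        return "low balance-mismatch risk"               # outside the range
    # Inside the range: risk rises with the interval; the fifth interval
    # carries the highest probability of balance mismatch.
    return ("highest balance-mismatch risk" if interval == 5
            else "elevated balance-mismatch risk")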
In one embodiment of the evaluation method for balance mismatch between eye movement and the visual angle image, the step of determining the matching balance degree of the eye movement and the visual angle image, for the case where the visual angle image takes the continuous following operation and the case where it does not, further includes:
anticipating whether the eye movement takes real-time positive feedback, which includes normal follow-up movement of the eye and transient correction of the eye movement, the transient correction further comprising stationary transient correction and transient steering; and
determining the matching balance degree of the eye movement and the visual angle image by combining whether the visual angle image takes the continuous following operation with whether the eye movement takes real-time positive feedback.
In this embodiment, the specific manner of determining the matching balance degree follows the steps of the balance function analysis method based on eye movement tracking described above, and is not repeated here.
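As an illustrative sketch of this combination, the selection among the five balance mismatch intervals described in this application can be expressed as a decision table over the two judgments; the function name and boolean encoding are assumptions, not the disclosed implementation.

```python
def select_mismatch_interval(continuous_following: bool,
                             transient_correction: bool,
                             transient_steering: bool) -> int:
    """Map the judgments onto the first..fifth balance mismatch intervals.
    Transient correction is one form of real-time positive feedback; the
    fifth interval covers simultaneous transient correction and steering."""
    if transient_correction and transient_steering:
        return 5  # no measure can be taken to avoid dizziness
    if continuous_following:
        return 4 if transient_correction else 3
    return 2 if transient_correction else 1
```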
This evaluation method for balance mismatch between eye movement and the visual angle image makes explicit the continuous following capability of the visual angle image in a dangerous environment, and the resulting eye movement-visual angle image matching balance degree accounts for the coupled influence of factors such as the position, speed, and continuous following capability of the visual angle image, so that balance mismatches between the eye movement and the visual angle image are recognized more fully.
The method further considers both the correction and steering capability of the eye movement and the following capability of the visual angle image, and provides a way of generating the eye movement-visual angle image matching balance degree in multiple environments based on the continuous following capability of the visual angle image. This is of practical significance for improving the recognition of risks to the visual angle image by intelligent eye movement, and can effectively improve the safety of the visual angle image and the comfort of the eye movement following motion during eye movement-visual angle image interaction.
Referring to fig. 5, the present application further provides an evaluation system 100 for balance mismatch between eye movement and the visual angle image, comprising a real-time detection module 10, an analysis and judgment module 20, an operation module 30, and an evaluation module 40.
The real-time detection module 10 is used for detecting the eye movement data and the visual angle image data in real time.
The analysis and judgment module 20 is connected with the real-time detection module 10 and is used for judging whether the visual angle image responds to the eye movement: if it does, the visual angle image takes the continuous following operation; if it does not, the visual angle image issues a balance risk early warning.
The operation module 30 is connected to the analysis and judgment module 20 and is configured to determine the matching balance degree between the eye movement and the visual angle image, respectively for the case where the visual angle image takes the continuous following operation and the case where it does not.
The evaluation module 40 is connected to the operation module 30 and is configured to evaluate the balance mismatch between the eye movement and the visual angle image according to whether the visual angle image is within the range of the matching balance degree.
In this embodiment, the above modules may be implemented by means of a computer program; their specific hardware structure is not limited, provided the above functions can be realized. The evaluation system 100 can perform all steps of the evaluation method for balance mismatch between eye movement and the visual angle image. Like the method, it considers both the correction and steering capability of the eye movement and the following capability of the visual angle image, and generates the eye movement-visual angle image matching balance degree in multiple environments based on the continuous following capability of the visual angle image, which can effectively improve the safety of the visual angle image and the comfort of the eye movement following motion during eye movement-visual angle image interaction.
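A minimal sketch of how the four modules could be wired together as a computer program, consistent with the data flow described above; every interface name here is an assumption, since the disclosure does not fix a specific software structure.

```python
class BalanceMismatchEvaluationSystem:
    """Illustrative wiring: detection -> analysis/judgment -> operation -> evaluation."""

    def __init__(self, detector, analyzer, operator, evaluator):
        self.detector = detector    # real-time detection module 10
        self.analyzer = analyzer    # analysis and judgment module 20
        self.operator = operator    # operation module 30
        self.evaluator = evaluator  # evaluation module 40

    def step(self):
        eye, view = self.detector.sample()            # detect eye movement / view image data
        responds = self.analyzer.responds(view, eye)  # does the view image respond to eye movement?
        if not responds:
            self.analyzer.raise_balance_risk_warning()
        balance = self.operator.matching_balance(eye, view, responds)
        return self.evaluator.assess(view, eye, balance)
```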
Those skilled in the art will appreciate that various modifications and improvements can be made to the disclosure. For example, the various devices or components described above may be implemented in hardware, or may be implemented in software, firmware, or a combination of some or all of the three.
A flowchart is used in this disclosure to describe the steps of a method according to an embodiment of the present disclosure. It should be understood that the steps need not be performed in the exact order shown; rather, the various steps may be processed in reverse order or in parallel, and other operations may be added to these processes.
Those of ordinary skill in the art will appreciate that all or a portion of the steps of the methods described above may be implemented by a computer program instructing related hardware, and the program may be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disk. Alternatively, all or part of the steps of the above embodiments may be implemented using one or more integrated circuits. Accordingly, each module/unit in the above embodiments may be implemented in the form of hardware or in the form of a software functional module. The present disclosure is not limited to any specific combination of hardware and software.
Unless defined otherwise, all terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The foregoing is illustrative of the present disclosure and is not to be construed as limiting thereof. Although a few exemplary embodiments of this disclosure have been described, those skilled in the art will readily appreciate that many modifications are possible in the exemplary embodiments without materially departing from the novel teachings and advantages of this disclosure. Accordingly, all such modifications, as well as other embodiments, are intended to be included within the scope of this disclosure as defined in the appended claims and their equivalents.
In the description of the present specification, reference to the terms "one embodiment," "some embodiments," "illustrative embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the present invention have been shown and described, it will be understood by those of ordinary skill in the art that: many changes, modifications, substitutions and variations may be made to the embodiments without departing from the spirit and principles of the invention, the scope of which is defined by the claims and their equivalents.

Claims (8)

1. A balance function analysis method based on eye movement tracking, comprising:
detecting and outputting eye movement data and visual angle image data in real time;
judging whether the visual angle image responds to the eye movement, wherein if the visual angle image responds to the eye movement, the visual angle image takes a continuous following operation, and if the visual angle image does not respond to the eye movement, the visual angle image issues a balance risk early warning;
further anticipating, respectively for the case where the visual angle image takes the continuous following operation and the case where it does not, whether the eye movement takes real-time positive feedback; and
determining the matching balance degree of the eye movement and the visual angle image according to the result of judging whether the visual angle image takes the continuous following operation and the anticipated result of whether the eye movement takes real-time positive feedback;
wherein the continuous following operation taken by the visual angle image includes viewing angle rotation following or viewing angle translation following;
wherein the step of determining the matching balance degree of the eye movement and the visual angle image according to the result of judging whether the visual angle image takes the continuous following operation and the anticipated result of whether the eye movement takes real-time positive feedback comprises the following steps:
in the case where the visual angle image takes the continuous following operation, determining the STC embroidery curve at which the eye movement and visual angle image balance are mismatched; and
further determining the matching balance degree of the eye movement and the visual angle image on the basis of the STC embroidery curve;
wherein the step of determining the STC embroidery curve at which the eye movement and visual angle image balance are mismatched comprises the following steps:
determining a real-time detection distance range of eye movement;
calculating the fastest time required for the visual angle image to safely follow the eye movement, according to the viewing angle rotation following speed of the visual angle image, the viewing angle translation following speed of the visual angle image, and the eye movement distance;
calculating, according to the initial speed of the visual angle image and the fastest time required for the visual angle image to safely follow the eye movement, the time the visual angle image needs in order to follow up to the dizziness point, thereby obtaining the shortest time required for the visual angle image to avoid dizziness; and
calculating, according to the time at which the eye movement reaches the predicted dizziness point and the shortest time required for the visual angle image to avoid dizziness, the time safety boundaries at which the eye movement and the visual angle image respectively reach the predicted dizziness point, the time safety boundaries constituting the STC embroidery curve.
2. The balance function analysis method based on eye movement tracking according to claim 1, wherein the step of determining the matching balance degree of the eye movement and the visual angle image according to the result of judging whether the visual angle image takes the continuous following operation and the anticipated result of whether the eye movement takes real-time positive feedback comprises:
determining a first balance mismatch interval in the case where the visual angle image does not take the continuous following operation and the eye movement does not take real-time positive feedback;
determining a second balance mismatch interval in the case where the visual angle image does not take the continuous following operation while the eye movement takes instantaneous correction;
determining a third balance mismatch interval in the case where the visual angle image takes the continuous following operation and the eye movement does not take real-time positive feedback; and
determining a fourth balance mismatch interval in the case where the visual angle image takes the continuous following operation while the eye movement takes instantaneous correction.
3. The balance function analysis method based on eye movement tracking according to claim 2, further comprising, after the step of detecting and outputting the eye movement data and the visual angle image data in real time:
determining a fifth balance mismatch interval in the case of an eye movement transient correction together with an eye movement transient steering.
4. The balance function analysis method based on eye movement tracking according to claim 3, wherein the STC embroidery curve of the eye movement and visual angle image balance mismatch is calculated using the following formulas:
STC_vr = D_vr / V_v (first equation rendered as an image in the source; reconstructed from the variable definitions below);
{0 ≤ STC_vd ≤ STC_vr};
D_pb-ne(STC_vd): (defining equation rendered as an image in the source and not recoverable from the text);
D_pf-fe(STC_vd): (defining equation rendered as an image in the source and not recoverable from the text);
STC_pb-ne(STC_vd) = D_pb-ne(STC_vd) / V_pw;
STC_pf-fe(STC_vd) = D_pf-fe(STC_vd) / V_pw;
[STC_vd, STC_pf-fe] ≤ STC_noMatch-area ≤ [STC_vd, STC_pb-ne];
wherein D_vr represents the furthest distance detected by the eye movement in real time; V_v represents the follow-up movement speed of the eye movement; STC_vr represents the time for the eye movement to reach the furthest real-time detection position at the current follow-up movement speed; STC_vd represents the time for the eye movement to reach the predicted dizziness point, STC_vd being a variable; V_pb represents the viewing angle rotation following speed of the visual angle image; V_pf represents the viewing angle translation (variable-speed) following speed of the visual angle image; D_pb-ne represents the distance, to the center of the eye movement, that the visual angle image covers in the fastest time needed to safely follow the eye movement under viewing angle rotation following; D_pf-fe represents the distance, from the edge of the eye movement, that the visual angle image covers in the fastest time needed to safely follow the eye movement under variable-speed movement; eyeMargin represents the eye movement distance; V_pw represents the initial speed of the visual angle image before it begins to follow; STC_pb-ne represents the shortest time required for the visual angle image to safely follow the eye movement under viewing angle rotation following; STC_pf-fe represents the shortest time required for the visual angle image to safely follow the eye movement under variable-speed movement; and STC_noMatch-area represents the STC embroidery curve of the eye movement and visual angle image balance mismatch.
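For illustration, the recoverable part of the above formulas can be sketched as follows. This assumes STC_vr = D_vr / V_v (consistent with its definition as the time to cover the furthest detection distance D_vr at speed V_v) and leaves the image-only distance functions D_pb-ne and D_pf-fe as caller-supplied inputs; all code names are hypothetical.

```python
from typing import Callable

def stc_boundary(stc_vd: float, d_vr: float, v_v: float, v_pw: float,
                 d_pb_ne: Callable[[float], float],
                 d_pf_fe: Callable[[float], float]) -> tuple[float, float]:
    """Return the (lower, upper) time safety boundaries of the STC embroidery
    curve at a given time-to-predicted-dizziness STC_vd."""
    stc_vr = d_vr / v_v                 # time to reach the furthest real-time detection position
    if not 0.0 <= stc_vd <= stc_vr:     # constraint {0 <= STC_vd <= STC_vr}
        raise ValueError("STC_vd out of range")
    stc_pb_ne = d_pb_ne(stc_vd) / v_pw  # STC_pb-ne(STC_vd) = D_pb-ne(STC_vd) / V_pw
    stc_pf_fe = d_pf_fe(stc_vd) / v_pw  # STC_pf-fe(STC_vd) = D_pf-fe(STC_vd) / V_pw
    # [STC_vd, STC_pf-fe] <= STC_noMatch-area <= [STC_vd, STC_pb-ne]
    return stc_pf_fe, stc_pb_ne
```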
5. The balance function analysis method based on eye movement tracking according to claim 4, wherein, in the case where the visual angle image does not take the continuous following operation and the eye movement does not take real-time positive feedback, the first matching balance degree is determined according to the following formula:
[D_v-1, D_p-1-fe] ≤ first matching balance degree ≤ [D_v-1, D_p-1-ne];
(the defining equations for D_v-1, D_p-1-ne, and D_p-1-fe are rendered as images in the source and are not recoverable from the text)
wherein V_v represents the follow-up movement speed of the eye movement; STC_vd represents the time for the eye movement to reach the predicted dizziness point; eyeSpeed represents the current movement speed of the visual angle image detected by the eye movement in real time in the actual movement scene; D_v-1 represents the first following movement range of the eye movement in the following movement direction within the STC_vd range; eyeMargin represents the eye movement distance; D_p-1-ne represents the first fastest period, measured from the center of the eye movement, required for the visual angle image to safely follow the eye movement when moving at the eye speed; and D_p-1-fe represents the second fastest period, measured from the edge of the eye movement, required for the visual angle image to safely follow the eye movement when moving at the eye speed.
6. The balance function analysis method based on eye movement tracking according to claim 5, wherein, in the case where the visual angle image does not take the continuous following operation and the eye movement takes instantaneous correction, the second matching balance degree is determined according to the following formula:
[D_v-2, D_p-2-fe] ≤ second matching balance degree ≤ [D_v-2, D_p-2-ne];
(the defining equations for D_v-2, D_p-2-ne, and D_p-2-fe are rendered as images in the source and are not recoverable from the text)
wherein V_v represents the follow-up movement speed of the eye movement; STC_vd represents the time for the eye movement to reach the predicted dizziness point; eyeSpeed represents the current movement speed of the visual angle image detected by the eye movement in real time in the actual movement scene; pauseTime-max represents the maximum correction-tracking movement stagnation period of the eye movement; D_v-2 represents the second following movement range of the eye movement in the following movement direction within the STC_vd range; eyeMargin represents the eye movement distance; D_p-2-ne represents the third fastest period, measured from the center of the eye movement, required for the visual angle image to safely follow the eye movement when moving at the eye speed; and D_p-2-fe represents the fourth fastest period, measured from the edge of the eye movement, required for the visual angle image to safely follow the eye movement when moving at the eye speed.
7. An eye movement tracking based balance function analysis system, comprising:
the real-time detection module is used for detecting and outputting the eye movement data and the visual angle image data in real time;
the response detection module, configured to judge whether the visual angle image responds to the eye movement, wherein if the visual angle image responds to the eye movement, the visual angle image takes a continuous following operation, and if the visual angle image does not respond to the eye movement, the visual angle image issues a balance risk early warning;
the feedback module, configured to further anticipate, respectively for the case where the visual angle image takes the continuous following operation and the case where it does not, whether the eye movement takes real-time positive feedback; and
the balance matching module, configured to determine the matching balance degree of the eye movement and the visual angle image according to the result of judging whether the visual angle image takes the continuous following operation and the anticipated result of whether the eye movement takes real-time positive feedback;
wherein the continuous following operation taken by the visual angle image includes viewing angle rotation following or viewing angle translation following;
wherein the step of determining the matching balance degree of the eye movement and the visual angle image according to the result of judging whether the visual angle image takes the continuous following operation and the anticipated result of whether the eye movement takes real-time positive feedback comprises the following steps:
in the case where the visual angle image takes the continuous following operation, determining the STC embroidery curve at which the eye movement and visual angle image balance are mismatched; and
further determining the matching balance degree of the eye movement and the visual angle image on the basis of the STC embroidery curve;
wherein the step of determining the STC embroidery curve at which the eye movement and visual angle image balance are mismatched comprises the following steps:
determining the real-time detection distance range of the eye movement;
calculating the fastest time required for the visual angle image to safely follow the eye movement, according to the viewing angle rotation following speed of the visual angle image, the viewing angle translation following speed of the visual angle image, and the eye movement distance;
calculating, according to the initial speed of the visual angle image and the fastest time required for the visual angle image to safely follow the eye movement, the time the visual angle image needs in order to follow up to the dizziness point, thereby obtaining the shortest time required for the visual angle image to avoid dizziness; and
calculating, according to the time at which the eye movement reaches the predicted dizziness point and the shortest time required for the visual angle image to avoid dizziness, the time safety boundaries at which the eye movement and the visual angle image respectively reach the predicted dizziness point, the time safety boundaries constituting the STC embroidery curve;
wherein the step of determining the matching balance degree of the eye movement and the visual angle image according to whether the visual angle image takes the continuous following operation and the anticipated result of whether the eye movement takes real-time positive feedback comprises:
determining a first balance mismatch interval in the case where the visual angle image does not take the continuous following operation and the eye movement does not take real-time positive feedback;
determining a second balance mismatch interval in the case where the visual angle image does not take the continuous following operation while the eye movement takes instantaneous correction;
determining a third balance mismatch interval in the case where the visual angle image takes the continuous following operation and the eye movement does not take real-time positive feedback; and
determining a fourth balance mismatch interval in the case where the visual angle image takes the continuous following operation while the eye movement takes instantaneous correction;
wherein, after the step of detecting and outputting the eye movement data and the visual angle image data in real time, the following is further performed:
determining a fifth balance mismatch interval in the case of an eye movement transient correction together with an eye movement transient steering.
8. The balance function analysis system based on eye movement tracking according to claim 7, wherein the STC embroidery curve of the eye movement and visual angle image balance mismatch is calculated using the following formulas:
STC_vr = D_vr / V_v (first equation rendered as an image in the source; reconstructed from the variable definitions below);
{0 ≤ STC_vd ≤ STC_vr};
D_pb-ne(STC_vd): (defining equation rendered as an image in the source and not recoverable from the text);
D_pf-fe(STC_vd): (defining equation rendered as an image in the source and not recoverable from the text);
STC_pb-ne(STC_vd) = D_pb-ne(STC_vd) / V_pw;
STC_pf-fe(STC_vd) = D_pf-fe(STC_vd) / V_pw;
[STC_vd, STC_pf-fe] ≤ STC_noMatch-area ≤ [STC_vd, STC_pb-ne];
wherein D_vr represents the furthest distance detected by the eye movement in real time; V_v represents the follow-up movement speed of the eye movement; STC_vr represents the time for the eye movement to reach the furthest real-time detection position at the current follow-up movement speed; STC_vd represents the time for the eye movement to reach the predicted dizziness point, STC_vd being a variable; V_pb represents the viewing angle rotation following speed of the visual angle image; V_pf represents the viewing angle translation (variable-speed) following speed of the visual angle image; D_pb-ne represents the distance, to the center of the eye movement, that the visual angle image covers in the fastest time needed to safely follow the eye movement under viewing angle rotation following; D_pf-fe represents the distance, from the edge of the eye movement, that the visual angle image covers in the fastest time needed to safely follow the eye movement under variable-speed movement; eyeMargin represents the eye movement distance; V_pw represents the initial speed of the visual angle image before it begins to follow; STC_pb-ne represents the shortest time required for the visual angle image to safely follow the eye movement under viewing angle rotation following; STC_pf-fe represents the shortest time required for the visual angle image to safely follow the eye movement under variable-speed movement; and STC_noMatch-area represents the STC embroidery curve of the eye movement and visual angle image balance mismatch;
wherein, in the case where the visual angle image does not take the continuous following operation and the eye movement does not take real-time positive feedback, the first matching balance degree is determined according to the following formula:
[D_v-1, D_p-1-fe] ≤ first matching balance degree ≤ [D_v-1, D_p-1-ne];
(the defining equations for D_v-1, D_p-1-ne, and D_p-1-fe are rendered as images in the source and are not recoverable from the text)
wherein V_v represents the follow-up movement speed of the eye movement; STC_vd represents the time for the eye movement to reach the predicted dizziness point; eyeSpeed represents the current movement speed of the visual angle image detected by the eye movement in real time in the actual movement scene; D_v-1 represents the first following movement range of the eye movement in the following movement direction within the STC_vd range; eyeMargin represents the eye movement distance; D_p-1-ne represents the first fastest period, measured from the center of the eye movement, required for the visual angle image to safely follow the eye movement when moving at the eye speed; and D_p-1-fe represents the second fastest period, measured from the edge of the eye movement, required for the visual angle image to safely follow the eye movement when moving at the eye speed;
wherein, in the case where the visual angle image does not take the continuous following operation and the eye movement takes instantaneous correction, the second matching balance degree is determined according to the following formula:
[D_v-2, D_p-2-fe] ≤ second matching balance degree ≤ [D_v-2, D_p-2-ne];
(the defining equations for D_v-2, D_p-2-ne, and D_p-2-fe are rendered as images in the source and are not recoverable from the text)
wherein V_v represents the follow-up movement speed of the eye movement; STC_vd represents the time for the eye movement to reach the predicted dizziness point; eyeSpeed represents the current movement speed of the visual angle image detected by the eye movement in real time in the actual movement scene; pauseTime-max represents the maximum correction-tracking movement stagnation period of the eye movement; D_v-2 represents the second following movement range of the eye movement in the following movement direction within the STC_vd range; eyeMargin represents the eye movement distance; D_p-2-ne represents the third fastest period, measured from the center of the eye movement, required for the visual angle image to safely follow the eye movement when moving at the eye speed; and D_p-2-fe represents the fourth fastest period, measured from the edge of the eye movement, required for the visual angle image to safely follow the eye movement when moving at the eye speed.
CN202310483424.7A 2023-05-04 2023-05-04 Balance function analysis method and system based on eye movement tracking Active CN116228748B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310483424.7A CN116228748B (en) 2023-05-04 2023-05-04 Balance function analysis method and system based on eye movement tracking

Publications (2)

Publication Number Publication Date
CN116228748A (en) 2023-06-06
CN116228748B (en) 2023-07-14

Family

ID=86584625


Country Status (1)

Country Link
CN (1) CN116228748B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015058246A (en) * 2013-09-20 2015-03-30 大日本印刷株式会社 Sight line analysis system
CN107656613A (en) * 2017-09-08 2018-02-02 国网山东省电力公司电力科学研究院 A kind of man-machine interactive system and its method of work based on the dynamic tracking of eye
CN111949131A (en) * 2020-08-17 2020-11-17 陈涛 Eye movement interaction method, system and equipment based on eye movement tracking technology

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101769177B1 (en) * 2012-10-09 2017-08-18 한국전자통신연구원 Apparatus and method for eye tracking
CA2957384C (en) * 2014-08-04 2023-06-20 New York University Methods and kits for diagnosing, assessing or quantitating drug use, drug abuse and narcosis, internuclear ophthalmoplegia, attention deficit hyperactivity disorder (adhd), chronic traumatic encephalopathy, schizophrenia spectrum disorders and alcohol consumption
CN112578903A (en) * 2019-09-30 2021-03-30 托比股份公司 Eye tracking method, eye tracker, and computer program
CN114615484B (en) * 2022-03-08 2022-11-01 常山县亿思达电子有限公司 Vision tracking and positioning system based on retina monitoring


Also Published As

Publication number Publication date
CN116228748A (en) 2023-06-06


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant