Disclosure of Invention
In order to solve the problem of inaccurate detection of the current user's balance, according to a first aspect of the present invention, the present invention provides a balance function analysis method based on eye tracking, characterized by comprising:
detecting and outputting eye movement data and visual angle image data in real time;
judging whether the visual angle image responds to the eye movement: if it responds, the visual angle image takes a continuous following operation; if it does not respond, the visual angle image issues a balance risk early warning;
in the cases where the visual angle image takes a continuous following operation and where it does not, respectively further predicting whether the eye movement takes real-time positive feedback;
and determining the matching balance degree of the eye movement and the visual angle image according to the judgment result of whether the visual angle image takes a continuous following operation and the predicted result of whether the eye movement takes real-time positive feedback.
Further, the view image taking a continuous following operation includes: viewing angle rotation following or viewing angle translation following;
the step of determining the matching balance degree of the eye movement and the visual angle image according to the judgment result of whether the visual angle image takes a continuous following operation and the predicted result of whether the eye movement takes real-time positive feedback comprises:
in the case where the visual angle image continuously follows, determining the STC envelope curve on which the balance of the eye movement and the visual angle image is mismatched;
on the basis of the STC envelope curve, further determining the matching balance degree of the eye movement and the visual angle image;
wherein the step of determining the STC envelope curve of the eye movement and visual angle image balance mismatch comprises:
determining a real-time detection distance range of eye movement;
calculating the fastest time length required for the visual angle image to safely follow the eye movement, according to the visual angle rotation following speed of the visual angle image, the visual angle translation following speed of the visual angle image, and the eye movement distance;
calculating, according to the initial speed of the visual angle image and the fastest time length required for the visual angle image to safely follow the eye movement, the shortest time required for the visual angle image to avoid dizziness;
and calculating, according to the time for the eye movement to reach the predicted dizziness point and the shortest time required for the visual angle image to avoid dizziness, the time safety boundaries at which the eye movement and the visual angle image respectively reach the predicted dizziness point, wherein the time safety boundary is the STC envelope curve.
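The boundary computation described in these steps can be sketched as follows. This is a minimal illustration, not the patent's actual formula (which appears only as an image); all function names, units, and the linear speed-adaptation model are assumptions.

```python
# Hedged sketch of the STC envelope boundary check described above.
# Names, units, and the linear-adaptation model are illustrative only.

def fastest_follow_time(rotation_speed: float,
                        translation_speed: float,
                        eye_margin: float) -> float:
    """Fastest time for the visual angle image to safely follow the eye
    movement, taking the quicker of rotation following and translation
    following over the eye movement distance (eyeMargin)."""
    t_rotate = eye_margin / rotation_speed
    t_translate = eye_margin / translation_speed
    return min(t_rotate, t_translate)

def shortest_avoid_dizzy_time(initial_speed: float,
                              follow_speed: float,
                              t_follow: float) -> float:
    """Shortest time the visual angle image needs to avoid dizziness,
    modeled here (an assumption) as the follow time plus a term for
    ramping from the initial speed to the follow speed."""
    return t_follow * (1.0 + initial_speed / max(follow_speed, 1e-9))
```

The time safety boundary is then the comparison of these durations against the time at which the eye movement reaches the predicted dizziness point.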
Further, the step of determining the matching balance degree of the eye movement and the visual angle image according to whether the visual angle image takes a continuous following operation and the predicted result of whether the eye movement takes real-time positive feedback comprises:
under the condition that the visual angle image does not adopt continuous following operation and the eye movement does not adopt real-time forward feedback, determining a first balance mismatch interval;
determining a second balance mismatch interval in the case that the view angle image does not take continuous follow-up operation while the eye movement takes instantaneous correction;
under the condition that the visual angle image adopts continuous following operation and the eye movement does not adopt real-time forward feedback, determining a third balance mismatch interval; and
in the case where the view angle image takes continuous follow-up operation while the eye movement takes instantaneous correction, a fourth balance mismatch interval is determined.
Further, after the step of detecting and outputting the eye movement data and the angle-of-view image data in real time, it further includes:
in the case of an eye movement transient correction and an eye movement transient steering, a fifth balance mismatch interval is determined.
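Taken together, the five enumerated cases amount to a simple classification. The sketch below encodes that mapping with hypothetical flag names; the interval numbering is the claim's, the encoding is an assumption.

```python
# Map observed behaviors to the five balance mismatch intervals above.
# Flag names are illustrative, not taken from the disclosure.

def mismatch_interval(image_follows: bool,
                      eye_instant_correction: bool,
                      eye_instant_steering: bool = False) -> int:
    """Return the index (1-5) of the balance mismatch interval."""
    if eye_instant_correction and eye_instant_steering:
        return 5  # eye movement transient correction and transient steering
    if not image_follows:
        # no continuous following operation by the visual angle image
        return 2 if eye_instant_correction else 1
    # visual angle image takes continuous following operation
    return 4 if eye_instant_correction else 3
```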
Further, the STC envelope curve on which the balance of the eye movement and the visual angle image is mismatched is calculated using the following formula (the formula and its symbols appear as images in the original), whose symbols denote:
the farthest distance detected by the eye movement in real time;
the following movement speed of the eye movement;
the time for the eye movement to reach the farthest real-time detection position at the current following movement speed;
the time at which the eye movement reaches the predicted dizziness point, which is a variable;
the visual angle rotation following speed of the visual angle image;
the following speed of the visual angle image in variable-speed motion;
the fastest time length, from the center of the eye movement, required for the visual angle image to safely follow the eye movement under visual angle rotation following;
the fastest time length, from the edge of the eye movement, required for the visual angle image to safely follow the eye movement under variable-speed following;
eyeMargin, the eye movement distance;
the initial speed of the visual angle image before it begins to follow;
the shortest time required for the visual angle image to safely follow the eye movement under visual angle rotation;
the shortest time required for the visual angle image to safely follow the eye movement under variable-speed motion; and
the STC envelope curve of the eye movement and visual angle image balance mismatch.
Further, in the case where the visual angle image does not take a continuous following operation and the eye movement does not take real-time positive feedback, the first matching balance degree is determined according to the following formula (shown as an image in the original), which bounds the first matching balance degree from above; the symbols denote:
the following movement speed of the eye movement;
the time for the eye movement to reach the predicted dizziness point;
eyeSpeed, the current motion speed of the visual angle image detected in real time by the eye movement in the actual motion scene;
the first following motion range of the eye movement in the following motion direction, within the range given by the formula;
eyeMargin, the eye movement distance;
the first fastest time length, from the center of the eye movement, required for the visual angle image to safely follow the eye movement when the visual angle image moves at the speed eyeSpeed;
the second fastest time length, from the edge of the eye movement, required for the visual angle image to safely follow the eye movement when the visual angle image moves at the speed eyeSpeed.
Further, in the case where the visual angle image does not take a continuous following operation and the eye movement takes an instantaneous correction, the second matching balance degree is determined according to the following formula (shown as an image in the original), which bounds the second matching balance degree from above; the symbols denote:
the following movement speed of the eye movement;
the time for the eye movement to reach the predicted dizziness point;
eyeSpeed, the current motion speed of the visual angle image detected in real time by the eye movement in the actual motion scene;
pauseTime-max, the maximum stagnation time of the eye movement's corrected following movement;
the second following motion range of the eye movement in the following motion direction, within the range given by the formula;
eyeMargin, the eye movement distance;
the third fastest time length, from the center of the eye movement, required for the visual angle image to safely follow the eye movement when the visual angle image moves at the speed eyeSpeed;
the fourth fastest time length, from the edge of the eye movement, required for the visual angle image to safely follow the eye movement when the visual angle image moves at the speed eyeSpeed.
According to a second aspect of the present invention, the present invention provides a balance function analysis system based on eye tracking, comprising:
the real-time detection module is used for detecting and outputting the eye movement data and the visual angle image data in real time;
the response detection module, used for judging whether the visual angle image responds to the eye movement: if it responds, the visual angle image takes a continuous following operation; if it does not respond, the visual angle image issues a balance risk early warning;
the feedback module, used for further predicting whether the eye movement takes real-time positive feedback in the cases where the visual angle image takes a continuous following operation and where it does not, respectively;
and the balance matching module, used for determining the matching balance degree of the eye movement and the visual angle image according to the judgment result of whether the visual angle image takes a continuous following operation and the predicted result of whether the eye movement takes real-time positive feedback.
Further, the view image taking a continuous following operation includes: viewing angle rotation following or viewing angle translation following;
the step of determining the matching balance degree of the eye movement and the visual angle image according to the judgment result of whether the visual angle image takes a continuous following operation and the predicted result of whether the eye movement takes real-time positive feedback comprises:
in the case where the visual angle image continuously follows, determining the STC envelope curve on which the balance of the eye movement and the visual angle image is mismatched;
on the basis of the STC envelope curve, further determining the matching balance degree of the eye movement and the visual angle image;
wherein the step of determining the STC envelope curve of the eye movement and visual angle image balance mismatch comprises:
determining a real-time detection distance range of eye movement;
calculating the fastest time length required for the visual angle image to safely follow the eye movement, according to the visual angle rotation following speed of the visual angle image, the visual angle translation following speed of the visual angle image, and the eye movement distance;
calculating, according to the initial speed of the visual angle image and the fastest time length required for the visual angle image to safely follow the eye movement, the shortest time required for the visual angle image to avoid dizziness;
calculating, according to the time for the eye movement to reach the predicted dizziness point and the shortest time required for the visual angle image to avoid dizziness, the time safety boundaries at which the eye movement and the visual angle image respectively reach the predicted dizziness point, wherein the time safety boundaries are the STC envelope curves;
The step of determining the degree of matching balance of eye movement and view angle image based on whether the view angle image takes continuous follow-up operation and whether the eye movement takes an expected result of real-time positive feedback comprises:
under the condition that the visual angle image does not adopt continuous following operation and the eye movement does not adopt real-time forward feedback, determining a first balance mismatch interval;
determining a second balance mismatch interval in the case that the view angle image does not take continuous follow-up operation while the eye movement takes instantaneous correction;
under the condition that the visual angle image adopts continuous following operation and the eye movement does not adopt real-time forward feedback, determining a third balance mismatch interval; and
determining a fourth balance mismatch interval under the condition that the visual angle image adopts continuous following operation and the eye movement adopts instantaneous correction;
after the step of detecting and outputting the eye movement data and the angle-of-view image data in real time, further comprising:
in the case of an eye movement transient correction and an eye movement transient steering, a fifth balance mismatch interval is determined.
Further, the STC envelope curve on which the balance of the eye movement and the visual angle image is mismatched is calculated using the following formula (the formula and its symbols appear as images in the original), whose symbols denote:
the farthest distance detected by the eye movement in real time;
the following movement speed of the eye movement;
the time for the eye movement to reach the farthest real-time detection position at the current following movement speed;
the time at which the eye movement reaches the predicted dizziness point, which is a variable;
the visual angle rotation following speed of the visual angle image;
the following speed of the visual angle image in variable-speed motion;
the fastest time length, from the center of the eye movement, required for the visual angle image to safely follow the eye movement under visual angle rotation following;
the fastest time length, from the edge of the eye movement, required for the visual angle image to safely follow the eye movement under variable-speed following;
eyeMargin, the eye movement distance;
the initial speed of the visual angle image before it begins to follow;
the shortest time required for the visual angle image to safely follow the eye movement under visual angle rotation;
the shortest time required for the visual angle image to safely follow the eye movement under variable-speed motion; and
the STC envelope curve of the eye movement and visual angle image balance mismatch;
in the case where the visual angle image does not take a continuous following operation and the eye movement does not take real-time positive feedback, the first matching balance degree is determined according to the following formula (shown as an image in the original), which bounds the first matching balance degree from above; the symbols denote:
the following movement speed of the eye movement;
the time for the eye movement to reach the predicted dizziness point;
eyeSpeed, the current motion speed of the visual angle image detected in real time by the eye movement in the actual motion scene;
the first following motion range of the eye movement in the following motion direction, within the range given by the formula;
eyeMargin, the eye movement distance;
the first fastest time length, from the center of the eye movement, required for the visual angle image to safely follow the eye movement when the visual angle image moves at the speed eyeSpeed;
the second fastest time length, from the edge of the eye movement, required for the visual angle image to safely follow the eye movement when the visual angle image moves at the speed eyeSpeed;
in the case where the visual angle image does not take a continuous following operation and the eye movement takes an instantaneous correction, the second matching balance degree is determined according to the following formula (shown as an image in the original), which bounds the second matching balance degree from above; the symbols denote:
the following movement speed of the eye movement;
the time for the eye movement to reach the predicted dizziness point;
eyeSpeed, the current motion speed of the visual angle image detected in real time by the eye movement in the actual motion scene;
pauseTime-max, the maximum stagnation time of the eye movement's corrected following movement;
the second following motion range of the eye movement in the following motion direction, within the range given by the formula;
eyeMargin, the eye movement distance;
the third fastest time length, from the center of the eye movement, required for the visual angle image to safely follow the eye movement when the visual angle image moves at the speed eyeSpeed;
the fourth fastest time length, from the edge of the eye movement, required for the visual angle image to safely follow the eye movement when the visual angle image moves at the speed eyeSpeed.
The invention claims a balance function analysis method and system based on eye movement tracking, which detect and output eye movement data and visual angle image data in real time; judge whether the visual angle image responds to the eye movement, the visual angle image taking a continuous following operation if it responds and issuing a balance risk early warning if it does not; further predict, in the cases where the visual angle image takes a continuous following operation and where it does not, whether the eye movement takes real-time positive feedback; and determine the matching balance degree of the eye movement and the visual angle image according to the judgment result of whether the visual angle image takes a continuous following operation and the predicted result of whether the eye movement takes real-time positive feedback. The method and system can collect and analyze the user's eye movement characteristics in real time, track them accurately, and finally evaluate the user's movement balance efficiently and simply, guiding the user to take emergency measures.
Detailed Description
In order to facilitate an understanding of the present application, a more complete description of the present application will now be provided with reference to the relevant figures. Preferred embodiments of the present application are shown in the accompanying drawings. This application may, however, be embodied in many different forms and is not limited to the embodiments described herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
It will be understood that when an element is referred to as being "fixed to" another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present. The terms "vertical," "horizontal," "left," "right," and the like are used herein for illustrative purposes only and are not meant to be the only embodiment.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein in the description of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
The application relates to the field of biological signal detection, and provides a method for determining matching balance degree in the process of identifying eye movement-visual angle image interaction based on continuous following capability of eye movement and visual angle images. Compared with the traditional dizzy estimation/evaluation method, the method can obtain more accurate matching balance degree of eye movement-visual angle image interaction, and provides basis for risk judgment and decision operation of automatic eye movement.
The eye movement and the visual angle image are regarded as participants in elderly road care. Interaction and collision can occur between them during road use; the movement state of the eye movement and visual angle image is influenced by inertia during motion, and when a dangerous environment appears, that movement state cannot be controlled instantaneously so as to avoid dizziness. Direct conflict between the eye movement and visual angle image is therefore unavoidable within a certain range of time and space of human interaction, and the region of relative eye movement-visual angle image positions in which the balance is mismatched is defined as the eye movement-visual angle image matching balance region. Another concept, the stun occurrence time (swoon time concur, STC for short), is also needed to understand the present application. STC is a parameter widely used in evaluating eye movement balance mismatch; it generally refers to the time until dizziness occurs over the eye movement distance, and is obtained as the ratio of the relative distance between the eye movement and a dangerous location to the current following movement speed.
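The STC ratio defined above can be expressed directly. The function below is a minimal sketch; metric units are an assumption, as the disclosure does not fix units.

```python
# STC (time until dizziness occurs): ratio of the relative distance to
# a dangerous location over the current following movement speed.
# Units assumed: metres and metres per second.

def stc_seconds(relative_distance_m: float, follow_speed_mps: float) -> float:
    if follow_speed_mps <= 0:
        return float("inf")  # no closing motion: the point is never reached
    return relative_distance_m / follow_speed_mps

# Example: 50 m of separation at 60 km/h (about 16.67 m/s) gives STC of 3 s.
```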
The following capability of the visual angle image while the at-risk population walks is considered. Visual angle image following capability refers to the continuous following capability that the visual angle image takes when a hazard is found. According to investigation and test results of elderly-care accidents, the following capability of the visual angle image when danger is encountered can reduce the risk of accidents. In this application, therefore, the following capability of the visual angle image is defined, based on the inventors' experimental results, by quantifying the speed of the visual angle image during the following process.
Referring to fig. 1, fig. 1 is a schematic diagram of a balance function analysis method based on eye tracking.
The balance function analysis method based on eye movement tracking comprises the following steps:
s100, detecting and outputting eye movement data and visual angle image data in real time.
The eye movement data includes: eye movement position, eye movement speed, eye movement direction, eye movement maximum correction follow-up movement stagnation time period, and eye movement maximum non-blinking time period. The perspective image data includes: view image position, view image speed, view image direction, and view image line direction. Both the eye movement data and the perspective image data may be obtained by an object-borne perception system.
S200, judging whether the visual angle image responds to eye movement.
In this step, activation of the following capability of the visual angle image depends on the line-of-sight direction of the visual angle image: if feedback information given by the eye movement is detected in the line-of-sight direction, the following capability of the visual angle image is confirmed as activated. If no feedback information given by the eye movement is detected in the line-of-sight direction of the visual angle image, the following capability is not activated and the visual angle image continues its normal movement operation. If the visual angle image responds to the eye movement, the visual angle image takes a continuous following operation; if it does not respond, the visual angle image issues a balance risk early warning. In one embodiment, the continuous following operation taken by the visual angle image includes visual angle rotation following or visual angle translation following. Of course, based on the core design idea of this application, more visual angle image following operations can be designed into the scheme to determine a more accurate matching balance degree.
S300, further anticipating whether the eye movement takes real-time positive feedback in the case where the view image takes continuous following operation and the view image does not take continuous following operation, respectively.
In this step, the real-time positive feedback taken by eye movement includes, but is not limited to, normal follow-up motion, transient correction, and transient steering. Real-time positive feedback, such as eye movement, may also include deceleration corrections by other means on the road.
S400, determining the matching balance degree of the eye movement and the visual angle image according to the expected result of whether the visual angle image takes continuous following operation and whether the eye movement takes real-time forward feedback.
In this step, the degree of matching balance between the eye movement and the view angle image is different in the case of whether the view angle image adopts the continuous following operation and whether the eye movement adopts the real-time forward feedback. It will be appreciated that in the case where the view image employs continuous follow-up operation while the eye movement employs real-time positive feedback, the degree of matching balance of the eye movement and the view image will be smaller.
In this embodiment, an object-based real-time detection system is first used to detect eye movement data and view angle image data in real time. Further judging whether the visual angle image adopts continuous following operation or whether the eye movement adopts real-time positive feedback. The matching balance degree of the eye movement and the visual angle image is calculated respectively under different conditions. In this embodiment, the balance function analysis method based on eye movement tracking considers the continuous following capability of the visual angle image and the real-time positive feedback of eye movement, so that the identification of the mismatch of the balance between the eye movement and the visual angle image is more sufficient. In the application, the effective matching balance degree is determined under different conditions, so that the safety of the visual angle image and the comfort of the eye movement following motion in the interaction process of the eye movement and the visual angle image can be effectively improved. By judging whether or not the view angle image is responsive to eye movement, it is possible to classify and quantify effective following operations when the view angle image faces a dangerous environment. In addition, the embodiment identifies the matching balance degree in the eye movement-visual angle image interaction process based on the visual angle image continuous following operation, and has important significance for improving the safety of automatic traveling eye movement.
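The S100-S400 flow described in this embodiment can be outlined as follows. The data structure and return labels are placeholders for illustration, not part of the disclosed system.

```python
# Hypothetical outline of the S100-S400 flow: detect, check the visual
# angle image's response, anticipate eye feedback, pick a balance branch.

from dataclasses import dataclass

@dataclass
class Frame:
    image_responds_to_eye: bool   # S200: feedback seen in line-of-sight
    eye_positive_feedback: bool   # S300: follow / correction / steering

def analyze(frame: Frame) -> str:
    if not frame.image_responds_to_eye:
        # S200: no response to eye movement -> balance risk early warning
        return "balance-risk-early-warning"
    # S400: with continuous following plus real-time positive feedback,
    # the degree of mismatch is smallest; otherwise evaluate the envelope.
    if frame.eye_positive_feedback:
        return "matched:low-mismatch"
    return "matched:evaluate-envelope"
```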
In one embodiment, the eye tracking based balance function analysis method, the visual angle image taking continuous following operation includes: viewing angle rotation following or viewing angle translation following.
Referring to fig. 2, the step of determining the matching balance degree between the eye movement and the view angle image according to the expected result of whether the view angle image takes the continuous following operation and whether the eye movement takes the real-time forward feedback includes:
s410, under the condition that the visual angle image continuously follows, calculating time safety boundaries when the eye movement and the visual angle image respectively reach the predicted dizzy point, wherein the time safety boundaries are STC embroidery curves.
In this embodiment, it is determined that the following capability of the eye movement is not considered, and under the condition that the visual angle image is continuously followed, an STC embroidery curve of the eye movement and the visual angle image dizzy is obtained. In the application, if necessary, the following capability of eye movement can be determined, and under the condition that the visual angle image continuously follows, an STC embroidery curve of eye movement and visual angle image dizzy can be obtained.
In one embodiment, the step of determining the STC envelope curve on which the eye movement and the visual angle image balance is mismatched comprises:
s411, determining the eye movement real-time detection distance range.
In the step, the eye movement real-time detection distance range can be obtained through real-time detection by an object-borne real-time detection system.
S412, calculating the fastest time required for safely following the eye movement of the visual angle image according to the visual angle rotation following speed of the visual angle image, the visual angle translation following speed of the visual angle image and the eye movement distance.
In this step, the calculation is performed for two cases. The fastest time length required for the visual angle image to safely follow the eye movement is calculated from the visual angle rotation following speed of the visual angle image and the eye movement distance, as the first duration. The fastest time length required for the visual angle image to safely follow the eye movement is calculated from the visual angle translation following speed of the visual angle image and the eye movement distance, as the second duration.
S413, calculating the safety boundary of the visual angle image dizziness time according to the initial speed of the visual angle image and the fastest time length required for the visual angle image to safely follow the eye movement.
According to the initial speed of the visual angle image, the first safety boundary and the second safety boundary of the visual angle image dizziness time are calculated by combining the first duration and the second duration obtained in step S412. The first safety boundary and the second safety boundary are safety boundaries of the dizziness time.
S414, calculating the STC envelope curve on which the eye movement and visual angle image balance is mismatched, according to the eye movement dizziness time and the safety boundary of the visual angle image dizziness time.
In this step, the STC envelope curve of the eye movement and visual angle image balance mismatch is determined according to the first safety boundary, the second safety boundary, and the eye movement-visual angle image dizziness time range.
This embodiment gives the specific steps of determining or calculating the STC safety envelope range under the continuous following capability of the visual angle image when the following capability of the eye movement is not considered. Of course, the specific steps of determining or calculating the STC safety envelope range under the visual angle image continuous following capability may also be given while considering the following capability of the eye movement.
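Steps S411 to S414 might be sketched as below, under the assumption of uniform following speeds and a linear speed-adaptation term; the disclosed formula itself appears only as an image and is not reproduced here, so this model is illustrative.

```python
# Hedged sketch of S411-S414: two-case computation for rotation
# following and translation (variable-speed) following. The boundary
# model is an assumption, not the patent's formula.

def safety_boundaries(rotation_speed: float, translation_speed: float,
                      eye_margin: float, image_initial_speed: float):
    # S412: fastest follow durations for the two following modes
    first_duration = eye_margin / rotation_speed       # rotation case
    second_duration = eye_margin / translation_speed   # translation case
    # S413: dizziness-time safety boundaries, modeled as the follow
    # duration scaled by the assumed speed-adaptation factor
    first_boundary = first_duration * (1 + image_initial_speed / rotation_speed)
    second_boundary = second_duration * (1 + image_initial_speed / translation_speed)
    return first_boundary, second_boundary

def envelope_ok(eye_dizzy_time: float, boundaries) -> bool:
    # S414: balance stays matched while the eye movement reaches the
    # predicted dizziness point no sooner than either boundary
    return eye_dizzy_time >= max(boundaries)
```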
In one embodiment, in steps S411 to S414, the STC envelope curve on which the eye movement and visual angle image balance is mismatched may be calculated using the following formula (the formula and its symbols appear as images in the original), whose symbols denote:
the farthest distance detected by the eye movement in real time;
the following movement speed of the eye movement;
the time for the eye movement to reach the farthest real-time detection position at the current following movement speed;
the time at which the eye movement reaches the predicted dizziness point, which is a variable;
the visual angle rotation following speed of the visual angle image;
the following speed of the visual angle image in variable-speed motion;
the fastest time length, from the center of the eye movement, required for the visual angle image to safely follow the eye movement under visual angle rotation following;
the fastest time length, from the edge of the eye movement, required for the visual angle image to safely follow the eye movement under variable-speed following;
eyeMargin, the eye movement distance;
the initial speed of the visual angle image before it begins to follow;
the shortest time required for the visual angle image to safely follow the eye movement under visual angle rotation;
the shortest time required for the visual angle image to safely follow the eye movement under variable-speed motion; and
the STC envelope curve of the eye movement and visual angle image balance mismatch.
The ne appearing in the above formula is an abbreviation of near-end, which indicates the side of the eye movement near the view angle image, called the center. Fe appearing in the above formula is an abbreviation for far-end, which indicates the side of the eye movement away from the view angle image, called the edge.
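The formula itself does not survive in the text above, so the sketch below is purely illustrative: it assumes the near-end ("ne", center) and far-end ("fe", edge) fastest times follow a distance/speed relation over the quantities named above. All names and relations are assumptions, not the patent's formula:

```python
def stc_envelope_times(furthest_distance, eye_margin,
                       rotation_speed, variable_speed):
    """Illustrative sketch only: assumed fastest times for the two
    following modes of the visual angle image.

    t_ne: fastest time to reach the center (near-end) of the eye movement
          by visual angle rotation, crossing the detection range minus
          the eye movement margin.
    t_fe: fastest time from the edge (far-end) of the eye movement by
          variable-speed motion over the full detection distance.
    """
    t_ne = (furthest_distance - eye_margin) / rotation_speed
    t_fe = furthest_distance / variable_speed
    return t_ne, t_fe
```

The pair (t_ne, t_fe) would then bound the dizzy-time range used when tracing the envelope curve.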
Considering the visual angle image and the eye movement in a dangerous-state environment, whether or not the eye movement triggers the continuous following measure is used to calculate the relative safe distance between the eye movement and the visual angle image in an actual aged-care environment. Because the interaction between the eye movement and the visual angle image is a dynamic process, the mismatch of their balance depends on data such as the follow-up movement speed, the follow-up movement direction and the relative position of the eye movement and the visual angle image at that moment. Therefore, this section briefly explains a method for generating the two-dimensional matching balance degree between the eye movement and the visual angle image, taking a known human interaction environment as an example. The environment is assumed to be: the follow-up movement velocity of the eye movement is v = 60 km/h, and the visual angle image detected in real time by the eye movement is in front of and to the right of the eye movement.
In one embodiment, step S400, determining the matching balance degree of the eye movement and the visual angle image according to whether the visual angle image takes continuous following operation and the expected result of whether the eye movement takes real-time positive feedback, includes:
first, a first balance mismatch interval is determined in the case where the view angle image does not take continuous follow-up operation while the eye movement does not take real-time positive feedback.
In one embodiment, where the visual angle image does not take continuous following operation and the eye movement does not take real-time positive feedback, the first matching balance degree is determined by a formula involving the following quantities: the follow-up movement speed of the eye movement; the time for the eye movement to reach the predicted dizziness point; eyeSpeed, the current motion speed of the visual angle image detected by the eye movement in real time in the actual motion scene; the first following motion range of the eye movement in the follow-up movement direction; eyeMargin, the eye movement distance; the first fastest time, required from the center of the eye movement, for the visual angle image to safely follow the eye movement when moving at eye speed; and the second fastest time, required from the edge of the eye movement, for the visual angle image to safely follow the eye movement when moving at eye speed.
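Because the first-balance formula is not reproduced in the text, the sketch below only illustrates one plausible interval construction from the quantities listed above; every relation and name here is an assumption:

```python
def first_matching_balance(eye_follow_speed, dizzy_time, eye_speed,
                           eye_margin, t1_center, t2_edge):
    """Hedged sketch of the first balance-mismatch interval (no continuous
    following, no real-time positive feedback).

    Assumption: the interval is bounded below by the distance the visual
    angle image covers toward the center within the first fastest time,
    and above by the distance the eye movement covers before dizziness
    plus the visual angle image's travel to the edge plus the margin.
    """
    near = eye_speed * t1_center
    far = eye_follow_speed * dizzy_time + eye_speed * t2_edge + eye_margin
    return near, far
```

A detected visual angle image falling between `near` and `far` would then lie in the first balance-mismatch interval under these assumptions.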
Second, a second balance mismatch interval is determined in the case where the visual angle image does not take continuous following operation while the eye movement takes transient correction.
In one embodiment, where the visual angle image does not take continuous following operation and the eye movement takes transient correction, the second matching balance degree is determined by a formula involving the following quantities: the follow-up movement speed of the eye movement; the time for the eye movement to reach the predicted dizziness point; eyeSpeed, the current motion speed of the visual angle image detected by the eye movement in real time in the actual motion scene; pauseTime-max, the maximum correction follow-up movement dead time of the eye movement; the second following motion range of the eye movement in the follow-up movement direction; eyeMargin, the eye movement distance; the third fastest time, required from the center of the eye movement, for the visual angle image to safely follow the eye movement when moving at eye speed; and the fourth fastest time, required from the edge of the eye movement, for the visual angle image to safely follow the eye movement when moving at eye speed.
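Mirroring the previous sketch, the second interval can be illustrated by additionally discounting the distance the eye movement covers during its maximum correction dead time (pauseTime-max); again, every relation here is an assumption, not the patent's formula:

```python
def second_matching_balance(eye_follow_speed, dizzy_time, eye_speed,
                            pause_time_max, eye_margin, t3_center, t4_edge):
    """Hedged sketch of the second balance-mismatch interval (no continuous
    following, eye movement applies transient correction).

    Assumption: the transient correction shortens the reachable distance
    by the portion of the dizzy time consumed as correction dead time.
    """
    effective_time = max(dizzy_time - pause_time_max, 0.0)
    near = eye_speed * t3_center
    far = eye_follow_speed * effective_time + eye_speed * t4_edge + eye_margin
    return near, far
```

Compared with the first interval, a longer pauseTime-max narrows the far boundary, reflecting the corrective pause.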
Third, under the condition that the visual angle image adopts continuous following operation and the eye movement does not adopt real-time forward feedback, a third balance mismatch interval is determined.
Fourth, a fourth balance mismatch interval is determined in the case where the visual angle image takes continuous following operation while the eye movement takes transient correction.
In one embodiment, after the step of detecting and outputting the eye movement data and the view angle image data in real time, further comprising: in the case of an eye movement transient correction and an eye movement transient steering, a fifth balance mismatch interval is determined.
In one embodiment, when the following capability of the eye movement is considered, the real-time positive feedback of the eye movement includes normal following movement of the eye movement and transient correction of the eye movement (transient correction includes stationary transient correction and transient correction while traveling). The correction distance of the eye movement may be determined from the eye movement follow-up movement speed and the maximum correction follow-up movement dead time of the eye movement. Alternatively, the minimum travel radius of the eye movement may be determined from the eye movement follow-up movement speed and the maximum non-blinking duration of the eye movement.
Specifically, the correction follow-up movement dead time of the eye movement refers to the period during which the eye movement can rapidly decrease its follow-up movement speed until the followed object stops.
The maximum non-blinking duration of the eye movement is obtained by recording the blinking actions of a pedestrian during eye movement detection and taking the longest interval between two successive blinks; it is used to indicate whether the user is distracted or visually inattentive.
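The relations named in this passage can be sketched directly; the proportional forms for the correction distance and travel radius are assumptions where the text only names the inputs, while the maximum non-blinking duration follows the text's own definition:

```python
def correction_distance(follow_speed, max_dead_time):
    """Correction distance of the eye movement: follow-up movement speed
    times maximum correction dead time (linear relation assumed)."""
    return follow_speed * max_dead_time

def min_travel_radius(follow_speed, max_no_blink_time):
    """Minimum travel radius of the eye movement; a simple proportional
    relation is assumed, as the patent only names the two inputs."""
    return follow_speed * max_no_blink_time

def max_no_blink_duration(blink_times):
    """Longest interval between two successive recorded blinks, per the
    text's definition of the maximum non-blinking duration."""
    gaps = [b - a for a, b in zip(blink_times, blink_times[1:])]
    return max(gaps) if gaps else 0.0
```

For example, blinks logged at 0 s, 1 s, 3.5 s and 4 s yield a maximum non-blinking duration of 2.5 s.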
In this embodiment, the following capability of the eye movement is taken into consideration: the real-time positive feedback of the eye movement includes normal following movement of the eye movement and transient correction of the eye movement, and the transient correction may include both correction in a stationary condition and correction while traveling. Taking the following capability of the eye movement into account makes the determination of the matching balance degree between the eye movement and the visual angle image more accurate.
In this embodiment, in the process of calculating the fifth balance mismatch interval, the effectiveness of steering in reducing risk is further clarified. The eye movement usually applies a correction operation in a dangerous environment, and the relative dizziness risk of correction and steering cannot otherwise be distinguished. By considering both forms of real-time positive feedback (transient correction in a stationary condition and transient correction while traveling) together with the following capability of the eye movement, the determination of the matching balance degree between the eye movement and the visual angle image becomes more accurate.
In the above description, the present application classifies and quantifies the effective following operations of the visual angle image when facing a dangerous environment, and comprehensively considers the following abilities of both the person and the object when dividing the risk areas. Identifying the matching balance degree in the eye movement and visual angle image interaction process based on the continuous following operation of the visual angle image is of great significance for improving the safety of automatic traveling eye movement.
In a specific embodiment, the application scenario of the present application is eye movement with active real-time detection capability. The data that the eye movement can detect about the user includes the speed of the eye movement, the correction follow-up movement dead time, the non-blinking duration, and the like. Meanwhile, the eye movement can identify the visual angle image within the real-time detection range and detect its position, speed, moving direction and sight direction in real time. Taking the data obtained by the eye movement as input and the following capabilities of the eye movement and the visual angle image in a dangerous-state environment as calculation parameters, the existing balance mismatch in the interaction between the eye movement and the visual angle image can be calculated in real time during the eye movement follow-up motion.
In one embodiment, the present application further provides an eye-tracking based balance function analysis system, referring to fig. 3, the eye-tracking based balance function analysis system includes: the device comprises a real-time detection module, a first analysis judging module, a second analysis judging module and an operation module.
And the real-time detection module is used for detecting and outputting the eye movement data and the visual angle image data in real time.
The first analysis and judgment module is used for judging whether the visual angle image responds to eye movement. If the visual angle image responds to the eye movement, the visual angle image adopts continuous following operation, and if the visual angle image does not respond to the eye movement, the visual angle image carries out balance risk early warning.
And the second analysis judging module is used for respectively further expecting whether the eye movement adopts real-time forward feedback or not under the condition that the visual angle image adopts continuous following operation and the visual angle image does not adopt continuous following operation.
And the operation module is used for determining the matching balance degree of the eye movement and the visual angle image according to whether the visual angle image takes continuous following operation or not and whether the eye movement takes an expected result of real-time forward feedback or not.
In this embodiment, the balance function analysis system based on eye movement tracking considers the continuous following capability of the visual angle image and the real-time positive feedback of eye movement, so that the recognition of the mismatch between the eye movement and the balance of the visual angle image is more sufficient. In the application, the effective matching balance degree is determined under different conditions, so that the safety of the visual angle image and the comfort of the eye movement following motion in the interaction process of the eye movement and the visual angle image can be effectively improved. By judging whether or not the view angle image is responsive to eye movement, it is possible to classify and quantify effective following operations when the view angle image faces a dangerous environment. In addition, the embodiment identifies the matching balance degree in the eye movement-visual angle image interaction process based on the visual angle image continuous following operation, and has important significance for improving the safety of automatic traveling eye movement.
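The four-module flow described above can be sketched as a minimal class, assuming simple boolean inputs and qualitative interval labels; the module interfaces are assumptions for illustration, not the patent's concrete design:

```python
class BalanceAnalysisSystem:
    """Sketch of the four-module system of Fig. 3: real-time detection,
    first analysis judgment, second analysis judgment, and operation."""

    def detect(self, eye_data, view_data):
        # Real-time detection module: pass measurements through.
        return eye_data, view_data

    def judge_following(self, view_responds_to_eye):
        # First analysis module: continuous following vs. balance risk warning.
        return "continuous_following" if view_responds_to_eye else "risk_warning"

    def expect_feedback(self, eye_gives_feedback):
        # Second analysis module: expected real-time positive feedback.
        return bool(eye_gives_feedback)

    def matching_balance(self, following, feedback):
        # Operation module: one of four qualitative mismatch intervals.
        table = {
            (False, False): "first_interval",
            (False, True): "second_interval",
            (True, False): "third_interval",
            (True, True): "fourth_interval",
        }
        return table[(following, feedback)]
```

The two judgments feed the operation module, matching the four interval cases enumerated earlier in the text.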
Referring to fig. 4, the present application further provides an evaluation method for balancing mismatch between eye movement and visual angle images, including:
s10, detecting and outputting eye movement data and visual angle image data in real time.
The eye movement data includes: eye movement position, eye movement speed, eye movement direction, eye movement maximum correction tracking movement stagnation time period and eye movement maximum non-blinking time period; the perspective image data includes: view image position, view image speed, view image direction, and view image line direction.
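The two data records listed above can be modeled as plain data structures; the field names and units are assumptions chosen for illustration:

```python
from dataclasses import dataclass

@dataclass
class EyeMovementData:
    """Eye movement data from step S10 (field names are assumptions)."""
    position: tuple            # (x, y), assumed planar coordinates
    speed: float               # eye movement speed
    direction: float           # movement direction, assumed radians
    max_correction_dead_time: float   # maximum correction follow-up dead time
    max_no_blink_time: float   # maximum non-blinking duration

@dataclass
class ViewImageData:
    """Visual angle image data from step S10 (field names are assumptions)."""
    position: tuple
    speed: float
    direction: float
    line_of_sight: float       # sight direction of the visual angle image
```

These records would be produced by the real-time detection module and consumed by the downstream judgment steps.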
S20, judging whether the visual angle image responds to eye movement, if the visual angle image responds to eye movement, the visual angle image adopts continuous following operation, and if the visual angle image does not respond to eye movement, the visual angle image carries out balance risk early warning.
S30, respectively determining the matching balance degree of the eye movement and the visual angle image under the condition that the visual angle image takes the continuous following operation and the visual angle image does not take the continuous following operation.
In one embodiment, the continuous follow-up operation taken by the perspective image includes: viewing angle rotation following or viewing angle translation following. Of course, based on the core design thought of the application, more visual angle image following operations can be designed into the scheme so as to determine more accurate matching balance degree.
S40, judging whether the visual angle image is in the range of the matching balance degree.
The judgment is carried out according to the position of the currently real-time detected visual angle image, the position of the eye movement and the determined matching balance degree range.
S50, evaluating that the balance of the eye movement and the visual angle image is not matched according to the judging result of whether the visual angle image is in the range of the matching balance degree.
If the view angle image is not within the matching balance range, the balance mismatch between the eye movement and the view angle image is lower. If the view angle image is within the range of the matching balance degree, the balance mismatch between the eye movement and the view angle image is higher. Specifically, the probability of a balance mismatch can be determined in combination with the five different degrees of matching balance obtained in the above embodiments. For example, the fifth equilibrium mismatch interval has the highest probability of equilibrium mismatch because no measures can be taken to avoid dizziness when the view angle image is in the fifth matching equilibrium.
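Steps S40 and S50 can be sketched as follows, assuming a Euclidean distance between the eye movement position and the visual angle image position and a (near, far) form for the matching balance range; the metric and function names are assumptions:

```python
import math

def within_balance_range(eye_pos, view_pos, near, far):
    """Step S40 sketch: is the detected visual angle image inside the
    matching-balance range? Euclidean distance is assumed."""
    d = math.dist(eye_pos, view_pos)
    return near <= d <= far

def evaluate_mismatch(in_range):
    """Step S50 sketch: per the text, mismatch is higher when the visual
    angle image lies inside the range, lower when it lies outside."""
    return "higher" if in_range else "lower"
```

For example, a visual angle image at (3, 4) seen from (0, 0) lies at distance 5, so it falls inside a (2, 6) range and is evaluated as a higher mismatch.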
In one embodiment, in the method for evaluating the mismatch of balance between eye movement and view angle image, the step of determining the matching balance between eye movement and view angle image in the case that the view angle image takes continuous following operation and the view angle image does not take continuous following operation, further includes:
Whether eye movement is expected to take real-time positive feedback including normal follow-up movement of the eye and transient correction of the eye movement. Wherein the eye movement transient correction further comprises: stationary transient correction and transient travel.
And determining the matching balance degree of the eye movement and the visual angle image by combining whether the visual angle image adopts continuous following operation and whether the eye movement adopts real-time positive feedback.
In this embodiment, the specific method for determining the matching balance degree may be determined by referring to the steps in the balance function analysis method based on eye tracking, which is not described herein.
According to the evaluation method for the unmatched eye movement and visual angle image balance, the continuous following capability of the visual angle image in a dangerous environment is clarified, the generated eye movement-visual angle image matching balance degree considers the coupling influence of factors such as the position, speed and continuous following capability of the visual angle image, and the unmatched eye movement-visual angle image balance is recognized more fully.
According to the evaluation method for the unmatched balance between the eye movement and the visual angle image, the correction and steering capability of the eye movement and the following capability of the visual angle image are comprehensively considered, and the generation method for the matched balance between the eye movement and the visual angle image under multiple environments based on the continuous following capability of the visual angle image is provided. The method has important significance for improving the recognition of the risk of the intelligent eye movement to the visual angle image, and can effectively improve the safety of the visual angle image and the comfort of the eye movement following movement in the eye movement-visual angle image interaction process.
Referring to fig. 5, the present application further provides an evaluation system 100 for eye movement and visual angle image balance mismatch, including: the system comprises a real-time detection module 10, an analysis judging module 20, an operation module 30 and an evaluation module 40.
The real-time detection module 10 is used for detecting the eye movement data and the visual angle image data in real time.
The analysis and judgment module 20 is connected with the real-time detection module 10. The analysis and judgment module 20 is used for judging whether the visual angle image responds to eye movement. If the visual angle image responds to the eye movement, the visual angle image adopts continuous following operation, and if the visual angle image does not respond to the eye movement, the visual angle image carries out balance risk early warning.
The operation module 30 is connected to the analysis and judgment module 20. The operation module 30 is configured to determine a matching balance degree between the eye movement and the view image in a case where the view image takes the continuous following operation and the view image does not take the continuous following operation, respectively.
The evaluation module 40 is connected to the arithmetic module 30. The evaluation module 40 is configured to evaluate whether the balance between the eye movement and the view image is not matched according to whether the view image is within the range of the matching balance.
In this embodiment, the above modules may be implemented by means of a computer program, and the specific hardware structure of the modules is not limited specifically, so that the above functions may be implemented. The evaluation system 100 for eye movement and visual angle image balance mismatch provided in this embodiment may perform all steps in the evaluation method for eye movement and visual angle image balance mismatch. The evaluation system 100 for unbalanced eye movement and visual angle image also comprehensively considers the correction and steering capability of eye movement and the following capability of visual angle image, and provides a method for generating balanced eye movement-visual angle image under multiple environments based on the continuous following capability of visual angle image. The method has important significance for improving the recognition of the risk of the intelligent eye movement to the visual angle image, and can effectively improve the safety of the visual angle image and the comfort of the eye movement following movement in the eye movement-visual angle image interaction process.
Those skilled in the art will appreciate that various modifications and improvements can be made to the disclosure. For example, the various devices or components described above may be implemented in hardware, or may be implemented in software, firmware, or a combination of some or all of the three.
A flowchart is used in this disclosure to describe the steps of a method according to an embodiment of the present disclosure. It should be understood that the steps that follow or before do not have to be performed in exact order. Rather, the various steps may be processed in reverse order or simultaneously. Also, other operations may be added to these processes.
Those of ordinary skill in the art will appreciate that all or a portion of the steps of the methods described above may be implemented by a computer program to instruct related hardware, and the program may be stored in a computer readable storage medium, such as a read only memory, a magnetic disk, or an optical disk. Alternatively, all or part of the steps of the above embodiments may be implemented using one or more integrated circuits. Accordingly, each module/unit in the above embodiment may be implemented in the form of hardware, or may be implemented in the form of a software functional module. The present disclosure is not limited to any specific form of combination of hardware and software.
Unless defined otherwise, all terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The foregoing is illustrative of the present disclosure and is not to be construed as limiting thereof. Although a few exemplary embodiments of this disclosure have been described, those skilled in the art will readily appreciate that many modifications are possible in the exemplary embodiments without materially departing from the novel teachings and advantages of this disclosure. Accordingly, all such modifications are intended to be included within the scope of this disclosure as defined in the claims. It is to be understood that the foregoing is illustrative of the present disclosure and is not to be construed as limited to the specific embodiments disclosed, and that modifications to the disclosed embodiments, as well as other embodiments, are intended to be included within the scope of the appended claims. The disclosure is defined by the claims and their equivalents.
In the description of the present specification, reference to the terms "one embodiment," "some embodiments," "illustrative embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the present invention have been shown and described, it will be understood by those of ordinary skill in the art that: many changes, modifications, substitutions and variations may be made to the embodiments without departing from the spirit and principles of the invention, the scope of which is defined by the claims and their equivalents.