WO2019206145A1 - Visual fatigue recognition method, visual fatigue recognition apparatus, virtual reality device and storage medium - Google Patents

Visual fatigue recognition method, visual fatigue recognition apparatus, virtual reality device and storage medium

Info

Publication number
WO2019206145A1
WO2019206145A1 (PCT/CN2019/083902)
Authority
WO
WIPO (PCT)
Prior art keywords
average
fatigue
threshold
value
visual
Application number
PCT/CN2019/083902
Other languages
English (en)
French (fr)
Inventor
孙建康
张浩
陈丽莉
楚明磊
薛鸿臻
闫桂新
李茜
Original Assignee
京东方科技集团股份有限公司
北京京东方光电科技有限公司
Application filed by 京东方科技集团股份有限公司 and 北京京东方光电科技有限公司
Priority to US16/612,996 (US11132544B2)
Publication of WO2019206145A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • G06T5/70
    • G06T5/90
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/30Noise filtering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/20Scenes; Scene-specific elements in augmented reality scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/193Preprocessing; Feature extraction
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/18Status alarms
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/18Status alarms
    • G08B21/24Reminder alarms, e.g. anti-loss alarms
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/011Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10048Infrared image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20076Probabilistic image processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30268Vehicle interior

Definitions

  • Embodiments of the present disclosure relate to a visual fatigue recognition method, a visual fatigue recognition device, a virtual reality device, and a storage medium.
  • Virtual Reality (VR) technology is a computer simulation technology for creating and experiencing virtual worlds. It combines computer technology and display technology to construct a virtual environment in which users can immerse themselves, providing a strong sense of immersion.
  • Virtual reality (VR) devices are typically head-mounted and can be used in fields such as video games and video interaction. However, a head-mounted VR device is usually worn in a dim and relatively enclosed environment, the user's eyes are relatively close to the display screen, and the image observed through the VR device may suffer from problems such as image distortion and excessive binocular disparity, all of which can cause visual fatigue when the user uses the virtual reality device.
  • Some embodiments of the present disclosure provide a visual fatigue recognition method, including: acquiring an eye image of a user; acquiring a visual feature from the eye image and calculating a visual fatigue value according to the visual feature; comparing the visual fatigue value with a fatigue level threshold and determining a visual fatigue level based on the comparison result; and generating a corresponding alert signal according to the visual fatigue level.
  • the method further includes pre-processing the ocular image prior to acquiring the visual feature from the ocular image.
  • the pre-processing includes: increasing brightness of the ocular image, increasing contrast of the ocular image, and/or performing denoising processing on the ocular image.
  • the visual features include: an average saccade speed, an average saccade angular velocity, an average eye-closure duration, and/or an average blink frequency.
  • extracting the visual feature from the eye image includes: acquiring a pupil position, a pupil area, and/or a number of blinks from each frame of consecutive eye images; and then, corresponding to the visual feature: calculating the average saccade speed according to the pupil positions within a first preset time period; calculating the average saccade angular velocity according to the pupil positions within a second preset time period; calculating the average eye-closure duration according to the pupil areas within a third preset time period; and/or calculating the average blink frequency according to the number of blinks within a fourth preset time period.
  • calculating the visual fatigue value according to the visual feature includes: comparing the average saccade speed with a saccade average speed level threshold to obtain a first visual fatigue value, comparing the average saccade angular velocity with a saccade average angular velocity level threshold to obtain a second visual fatigue value, comparing the average eye-closure duration with an eye-closure duration level threshold to obtain a third visual fatigue value, and/or comparing the average blink frequency with a blink frequency level threshold to obtain a fourth visual fatigue value.
  • the saccade average speed level threshold includes a saccade average speed mild fatigue threshold, a saccade average speed moderate fatigue threshold, and a saccade average speed severe fatigue threshold;
  • the saccade average angular velocity level threshold includes a saccade average angular velocity mild fatigue threshold, a saccade average angular velocity moderate fatigue threshold, and a saccade average angular velocity severe fatigue threshold;
  • the eye-closure duration level threshold includes an eye-closure duration mild fatigue threshold, an eye-closure duration moderate fatigue threshold, and an eye-closure duration severe fatigue threshold;
  • the blink frequency level threshold includes a blink frequency mild fatigue threshold, a blink frequency moderate fatigue threshold, and a blink frequency severe fatigue threshold. Comparing the average saccade speed with the saccade average speed level threshold to obtain the first visual fatigue value includes: the first visual fatigue value is a first speed value when the average saccade speed is less than the saccade average speed mild fatigue threshold, a second speed value when it lies between the mild and moderate fatigue thresholds, a third speed value when it lies between the moderate and severe fatigue thresholds, and a fourth speed value when it is greater than the saccade average speed severe fatigue threshold, wherein the mild fatigue threshold is less than the moderate fatigue threshold, which in turn is less than the severe fatigue threshold; the second, third, and fourth visual fatigue values are obtained from their respective level thresholds in the same way.
  • calculating the visual fatigue value based on the visual feature further comprises: determining the visual fatigue value according to the first visual fatigue value, the second visual fatigue value, the third visual fatigue value, and/or the fourth visual fatigue value.
  • the fatigue level threshold includes a mild fatigue threshold, a moderate fatigue threshold, and a severe fatigue threshold; comparing the visual fatigue value with the fatigue level threshold and determining the visual fatigue level based on the comparison result includes: when the visual fatigue value is greater than or equal to the mild fatigue threshold and less than the moderate fatigue threshold, determining that the visual fatigue level is a mild fatigue level; when the visual fatigue value is greater than or equal to the moderate fatigue threshold and less than the severe fatigue threshold, determining that the visual fatigue level is a moderate fatigue level; and when the visual fatigue value is greater than or equal to the severe fatigue threshold, determining that the visual fatigue level is a severe fatigue level.
  • the method also includes generating a corresponding alert signal based on the visual fatigue level.
  • generating the corresponding alert signal according to the visual fatigue level includes: generating an image flicker signal of a corresponding color and/or a vibration signal of a corresponding frequency according to the visual fatigue level, so that an image of the corresponding color is displayed on the screen of the virtual reality device and flashes at a set frequency, and/or the virtual reality device vibrates at a preset frequency.
  • the method is for a virtual reality device.
  • Some embodiments of the present disclosure further provide a visual fatigue recognition apparatus, including: an eye image acquisition unit configured to acquire an eye image of a user; a visual fatigue value acquisition unit configured to acquire a visual feature from the eye image and calculate a visual fatigue value according to the visual feature; and a visual fatigue level determining unit configured to compare the visual fatigue value with a fatigue level threshold and determine a visual fatigue level according to the comparison result.
  • the apparatus further includes: a reminder signal generating unit configured to generate a corresponding alert signal according to the visual fatigue level.
  • the apparatus further includes an image pre-processing unit.
  • the image pre-processing unit includes a brightness enhancement unit, a contrast enhancement unit, and/or a filtering unit; the brightness enhancement unit is configured to increase the brightness of the eye image; the contrast enhancement unit is configured to increase the contrast of the eye image; and the filtering unit is configured to perform denoising processing on the eye image.
  • the visual features include: an average saccade speed, an average saccade angular velocity, an average eye-closure duration, and/or an average blink frequency.
  • the visual fatigue value acquisition unit includes: a visual feature acquisition subunit, configured to obtain a pupil position, a pupil area, and/or a number of blinks from each frame of consecutive eye images; a saccade average speed calculation subunit, configured to calculate the average saccade speed according to the pupil positions within a first preset time period; a saccade average angular velocity calculation subunit, configured to calculate the average saccade angular velocity according to the pupil positions within a second preset time period; an eye-closure duration calculation subunit, configured to calculate the average eye-closure duration according to the pupil areas within a third preset time period; and/or a blink frequency calculation subunit, configured to calculate the average blink frequency according to the number of blinks within a fourth preset time period.
  • the visual fatigue value acquisition unit further includes: a first visual fatigue value determining subunit, a second visual fatigue value determining subunit, a third visual fatigue value determining subunit, and/or a fourth visual fatigue value determining subunit, and a visual fatigue value determining subunit; the first visual fatigue value determining subunit is configured to compare the average saccade speed with a saccade average speed level threshold to obtain a first visual fatigue value, the second visual fatigue value determining subunit is configured to compare the average saccade angular velocity with a saccade average angular velocity level threshold to obtain a second visual fatigue value, the third visual fatigue value determining subunit is configured to compare the average eye-closure duration with an eye-closure duration level threshold to obtain a third visual fatigue value, and the fourth visual fatigue value determining subunit is configured to compare the average blink frequency with a blink frequency level threshold to obtain a fourth visual fatigue value.
  • the visual fatigue value determining subunit is configured to determine the visual fatigue value according to the first visual fatigue value, the second visual fatigue value, the third visual fatigue value, and/or the fourth visual fatigue value.
  • the saccade average speed level threshold includes a saccade average speed mild fatigue threshold, a saccade average speed moderate fatigue threshold, and a saccade average speed severe fatigue threshold;
  • the saccade average angular velocity level threshold includes a saccade average angular velocity mild fatigue threshold, a saccade average angular velocity moderate fatigue threshold, and a saccade average angular velocity severe fatigue threshold;
  • the eye-closure duration level threshold includes an eye-closure duration mild fatigue threshold, an eye-closure duration moderate fatigue threshold, and an eye-closure duration severe fatigue threshold;
  • the blink frequency level threshold includes a blink frequency mild fatigue threshold, a blink frequency moderate fatigue threshold, and a blink frequency severe fatigue threshold;
  • the first visual fatigue value determining subunit is configured such that the first visual fatigue value is a first speed value when the average saccade speed is less than the saccade average speed mild fatigue threshold, a second speed value when it lies between the mild and moderate fatigue thresholds, a third speed value when it lies between the moderate and severe fatigue thresholds, and a fourth speed value when it is greater than the saccade average speed severe fatigue threshold, wherein the saccade average speed mild fatigue threshold is less than the moderate fatigue threshold, which in turn is less than the severe fatigue threshold;
  • the second visual fatigue value determining subunit is configured such that the second visual fatigue value is a first, second, third, or fourth angular velocity value according to how the average saccade angular velocity compares with the saccade average angular velocity mild, moderate, and severe fatigue thresholds, wherein the mild fatigue threshold is less than the moderate fatigue threshold, which in turn is less than the severe fatigue threshold;
  • the third visual fatigue value determining subunit is configured such that the third visual fatigue value is a first, second, third, or fourth time value according to how the average eye-closure duration compares with the eye-closure duration mild, moderate, and severe fatigue thresholds; and the fourth visual fatigue value determining subunit is configured such that the fourth visual fatigue value is a first, second, third, or fourth frequency value according to how the average blink frequency compares with the blink frequency mild, moderate, and severe fatigue thresholds.
  • the fatigue level threshold includes a mild fatigue threshold, a moderate fatigue threshold, and a severe fatigue threshold.
  • the visual fatigue level determining unit is configured to: when the visual fatigue value is greater than or equal to the mild fatigue threshold and less than the moderate fatigue threshold, determine that the visual fatigue level is a mild fatigue level; when the visual fatigue value is greater than or equal to the moderate fatigue threshold and less than the severe fatigue threshold, determine that the visual fatigue level is a moderate fatigue level; and when the visual fatigue value is greater than or equal to the severe fatigue threshold, determine that the visual fatigue level is a severe fatigue level.
  • Some embodiments of the present disclosure also provide a virtual reality device including the above-described visual fatigue recognition device.
  • Some embodiments of the present disclosure also provide another virtual reality device, including a processor and a machine readable storage medium.
  • the machine readable storage medium stores machine executable instructions executable by the processor, and the machine executable instructions, when executed by the processor, implement the visual fatigue recognition method described above.
  • Some embodiments of the present disclosure also provide a storage medium that non-transitorily stores computer readable instructions which, when executed by a computer, can perform the visual fatigue recognition method described above.
  • FIG. 1 is a workflow diagram of a visual fatigue recognition method provided in accordance with some embodiments of the present disclosure
  • FIG. 2 is a schematic illustration of a change in position of a pupil of a human eye provided in accordance with some embodiments of the present disclosure
  • FIG. 3 is a schematic diagram of a change in position of a pupil of a human eye according to further embodiments of the present disclosure
  • FIG. 4 is a schematic diagram of a change in the pupil area of a human eye according to some embodiments of the present disclosure.
  • FIG. 5A is a block diagram of a visual fatigue recognition device provided in accordance with some embodiments of the present disclosure
  • FIG. 5B is a schematic block diagram of a visual fatigue recognition device provided by other embodiments of the present disclosure
  • FIG. 6 is a schematic block diagram of a visual fatigue value acquisition unit provided by some embodiments of the present disclosure.
  • FIG. 7 is a schematic block diagram of a visual fatigue recognition apparatus provided by some embodiments of the present disclosure.
  • FIG. 8A is a schematic block diagram of a virtual reality device according to some embodiments of the present disclosure
  • FIG. 8B is a schematic block diagram of a virtual reality device according to another embodiment of the present disclosure
  • FIG. 8C is a schematic block diagram of a virtual reality device provided by still another embodiment of the present disclosure.
  • An embodiment of the present disclosure provides a visual fatigue recognition method.
  • the method includes: step S10, acquiring an eye image of a user; step S20, acquiring a visual feature from the eye image and calculating a visual fatigue value according to the visual feature; step S30, comparing the visual fatigue value with a fatigue level threshold and determining a visual fatigue level based on the comparison result; and step S40, generating a corresponding alert signal according to the visual fatigue level (a minimal sketch of this flow is given below).
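  • The following Python sketch illustrates how steps S10–S40 could be wired together as a simple loop. It is an illustrative reading of the method, not part of the original disclosure; the helper callables (capture_eye_image, extract_visual_features, compute_fatigue_value, classify_fatigue_level, emit_alert) are hypothetical names for the operations described in the surrounding text and are elaborated in later sketches.

```python
# Hypothetical top-level flow mirroring steps S10-S40 of the method.
# All helpers are placeholders for the operations described in the text.

def visual_fatigue_monitor(capture_eye_image, extract_visual_features,
                           compute_fatigue_value, classify_fatigue_level,
                           emit_alert):
    """Run one pass of the S10-S40 pipeline and return the fatigue level."""
    eye_images = capture_eye_image()                      # S10: acquire eye image(s)
    features = extract_visual_features(eye_images)        # S20: visual features
    fatigue_value = compute_fatigue_value(features)       # S20: visual fatigue value
    level = classify_fatigue_level(fatigue_value)         # S30: compare with thresholds
    if level is not None:
        emit_alert(level)                                  # S40: corresponding alert
    return level
```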
  • From a hardware perspective, a virtual reality (VR) device may include a modeling component (e.g., a 3D scanner), a three-dimensional visual display component (e.g., a 3D display device, a projection device, etc.), a head-mounted stereoscopic display (e.g., a binocular omnidirectional display), sound components (e.g., three-dimensional sound devices), interactive devices (e.g., position trackers, data gloves, etc.), 3D input devices (e.g., three-dimensional mice), motion capture devices, and other interactive devices.
  • the VR device may further include an image capture device; for example, the image capture device includes an infrared light source and an infrared camera. The infrared camera may be disposed under a lens in the three-dimensional visual display component of the VR device, and the infrared light source supplements the illumination of the eye region so that the infrared camera can acquire an eye image with rich detail, especially in the pupil region.
  • the image capture device can also be set independently of the VR device.
  • the eye image may include both the left-eye image and the right-eye image, or only the left-eye image or the right-eye image; since the states of a person's two eyes are usually the same or similar, the eye state can also be judged from the image of a single eye.
  • for example, a deep learning-based image recognition algorithm may be employed to extract visual features from the eye image, such as the saccade speed, the saccade angular velocity, the eye-closure duration, and/or the blink frequency, and the visual fatigue value is then calculated according to these visual features.
  • the visual fatigue value indicates the degree of fatigue of the human eye: the greater the visual fatigue value, the higher the degree of fatigue. For example, if the blink frequency is low and the eye-closure duration is long, the calculated visual fatigue value is large, and the user's eyes can be considered to be in a relatively fatigued state.
  • the visual fatigue value is compared with the fatigue level threshold, and the visual fatigue level is further determined according to the comparison result.
  • the visual fatigue level classifies the degree of fatigue of the human eye, and may include, for example, mild fatigue, moderate fatigue, and severe fatigue.
  • the fatigue level threshold includes a mild fatigue threshold, a moderate fatigue threshold, and a severe fatigue threshold.
  • the mild fatigue threshold, the moderate fatigue threshold, and the severe fatigue threshold respectively correspond to mild fatigue, moderate fatigue, and severe fatigue.
  • the visual fatigue level can be determined based on the comparison between the visual fatigue value and the fatigue level threshold, the fatigue level threshold serving as a reference value for judging the degree of visual fatigue. For example, if the visual fatigue value is greater than the mild fatigue threshold, it can be determined that the user's eyes are in a mild fatigue state; if the visual fatigue value is greater than the moderate fatigue threshold, it can be determined that the user's eyes are in a moderate fatigue state; and if the visual fatigue value is greater than the severe fatigue threshold, it can be determined that the user's eyes are in a severe fatigue state.
  • the visual fatigue level is used to generate a corresponding alert signal to remind the user of the current degree of visual fatigue, so that the user can take an appropriate rest according to the alert, which helps relieve eye fatigue.
  • for example, the alert signal can be an alarm sound emitted by a speaker of the VR device, a reminder image displayed on the display of the VR device, or a controlled vibration of the VR device, or the like.
  • in this way, the VR-device-based visual fatigue recognition method calculates the visual fatigue value from the visual features extracted from the eye image, compares the visual fatigue value with the fatigue level threshold, determines the visual fatigue level according to the comparison result, and generates a corresponding alert signal based on that level. The alert informs the user of the current degree of eye fatigue so that the user can take an appropriate rest, which reduces the risk of problems such as myopia and helps protect the user's eyesight.
  • for example, the acquired original eye image may be pre-processed before the visual feature is acquired from it; the pre-processing may include one or more of the following: increasing the brightness of the eye image, increasing the contrast of the eye image, and denoising the eye image.
  • for example, the brightness and contrast of the eye image may be improved by an image processing algorithm, such as a gray-scale transformation method or a histogram adjustment method; since the acquired original eye image usually contains some noise, the eye image can further be filtered to remove the noise, which helps to extract visual features from a high-quality eye image (one possible realization is sketched below).
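  • The sketch below shows one way such pre-processing could be implemented with OpenCV. It is only an illustration under assumed parameter values: the linear gray-scale transform, the kernel size, and the choice of Gaussian filtering are examples, not choices prescribed by the text.

```python
import cv2
import numpy as np

def preprocess_eye_image(gray_eye_image: np.ndarray) -> np.ndarray:
    """Illustrative pre-processing of a grayscale eye image: brightness and
    contrast enhancement followed by denoising."""
    # Linear gray-scale transform out = alpha * in + beta:
    # alpha > 1 raises contrast, beta > 0 raises brightness.
    enhanced = cv2.convertScaleAbs(gray_eye_image, alpha=1.3, beta=20)
    # Alternative histogram-adjustment method:
    # enhanced = cv2.equalizeHist(gray_eye_image)
    # Denoise with a small Gaussian filter (a median filter would also work
    # for speckle-like sensor noise in infrared images).
    denoised = cv2.GaussianBlur(enhanced, (5, 5), 0)
    return denoised
```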
  • for example, the visual feature may include a saccade speed, a saccade angular velocity, an eye-closure duration, and/or a blink frequency.
  • extracting the visual feature from the eye image described in step S20 above may correspondingly include: step S201, obtaining a pupil position, a pupil area, and/or a number of blinks from each frame of consecutive eye images; and then, corresponding to the visual feature: step S202, calculating the average saccade speed according to the pupil positions within a first preset time period; step S203, calculating the average saccade angular velocity according to the pupil positions within a second preset time period; step S204, calculating the average eye-closure duration according to the pupil areas within a third preset time period; and/or step S205, calculating the average blink frequency according to the number of blinks within a fourth preset time period.
  • the first preset time period, the second preset time period, the third preset time period, and the fourth preset time period may be the same time period or different time periods, which is not limited by the embodiments of the present disclosure.
  • for example, the image capture device may acquire multiple frames of eye images at a certain frame rate, e.g., 240 frames per second, i.e., 240 eye images can be acquired per second.
  • visual feature extraction may be performed on every frame or, to reduce the computation load, on every few frames; the pupil position, pupil area, and so on are obtained separately from the successive frames.
  • the saccade velocity can be calculated from two pupil positions of two consecutive eye images.
  • the pupil position can be the pupil center position. Referring to FIG. 2, the dotted line indicates the horizontal line through the pupil center. Suppose the pupil center obtained from the eye image at time t is located at point A with two-dimensional coordinates (x_A, y_A), the pupil center obtained from the eye image at time t+1 is located at point B with two-dimensional coordinates (x_B, y_B), and the frame rate of the image capture device is rate; then the saccade speed v_{t+1} at time t+1 can be calculated as v_{t+1} = rate × √((x_B − x_A)² + (y_B − y_A)²), i.e., the displacement from A to B divided by the frame interval (the sketch below follows this reconstruction).
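  • A short sketch of the per-frame saccade-speed computation, assuming the reconstructed formula above (displacement between consecutive pupil centers multiplied by the frame rate); the result is in pixels per second unless the coordinates are first converted to angles.

```python
import math

def saccade_speed(prev_center, curr_center, frame_rate):
    """Saccade speed at the current frame: pupil-center displacement between
    two consecutive frames divided by the frame interval, i.e. v = rate * |AB|."""
    (x_a, y_a), (x_b, y_b) = prev_center, curr_center
    displacement = math.hypot(x_b - x_a, y_b - y_a)   # Euclidean distance A -> B
    return displacement * frame_rate                   # pixels per second

# Example: pupil center moves from (120, 84) to (123, 85) at 240 frames/s,
# giving roughly 759 px/s.
```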
  • the saccade angular velocity can be calculated from the pupil positions of three consecutive eye images. For example, as shown in FIG. 3, the dotted line indicates the horizontal direction through the pupil center. Suppose the pupil center obtained from the eye image at time t−1 is located at point C with two-dimensional coordinates (x_C, y_C); the pupil center obtained from the eye image at time t is located at point D with two-dimensional coordinates (x_D, y_D), and the motion angle of the pupil center at time t is θ_t; the pupil center obtained from the eye image at time t+1 is located at point E with two-dimensional coordinates (x_E, y_E), and the motion angle of the pupil center at time t+1 is θ_{t+1}. Then the saccade angular velocity ω_{t+1} at time t+1 can be calculated as ω_{t+1} = rate × (θ_{t+1} − θ_t), where θ_t and θ_{t+1} are determined from the coordinates of points C, D and D, E respectively (see the sketch below).
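  • The following sketch computes the angular velocity under the reading that θ_t and θ_{t+1} are the angles of the displacement vectors C→D and D→E relative to the horizontal dotted line in FIG. 3; that interpretation, and the angle-wrapping step, are assumptions rather than details stated in the text.

```python
import math

def saccade_angular_velocity(c, d, e, frame_rate):
    """Angular velocity at time t+1 from three consecutive pupil centers
    C (t-1), D (t), E (t+1): omega = rate * (theta_{t+1} - theta_t)."""
    theta_t = math.atan2(d[1] - c[1], d[0] - c[0])    # motion angle of C -> D
    theta_t1 = math.atan2(e[1] - d[1], e[0] - d[0])   # motion angle of D -> E
    delta = theta_t1 - theta_t
    # Wrap the difference to (-pi, pi] so a small direction change across the
    # horizontal is not counted as a near-full rotation.
    delta = math.atan2(math.sin(delta), math.cos(delta))
    return delta * frame_rate                          # radians per second
```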
  • for example, the eye-closure duration can be calculated from the pupil areas of consecutive frames of the eye image: when the pupil area obtained from the eye image drops to 1/4 of the full pupil area at time t2 and returns to 1/4 of the full pupil area at time t3, the eye can be regarded as closed from time t2 to time t3, and the eye-closure duration is the interval from t2 to t3.
  • the blink frequency f can be calculated from the number of blinks n within a unit time t (for example, 1 second) as f = n / t (a combined sketch of the eye-closure and blink measurements is given below).
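  • A minimal sketch that derives eye-closure durations and a blink count from a sequence of per-frame pupil areas, using the 1/4-of-full-area criterion from the example above. Treating each closure episode as one blink, and the exact thresholding, are assumptions made for illustration.

```python
def closure_and_blink_stats(pupil_areas, full_pupil_area, frame_rate,
                            closed_ratio=0.25):
    """Return (list of eye-closure durations in seconds, blink count) from
    per-frame pupil areas; a frame counts as 'closed' when the pupil area is
    at or below closed_ratio * full_pupil_area."""
    threshold = closed_ratio * full_pupil_area
    durations, blinks, closed_frames = [], 0, 0
    for area in pupil_areas:
        if area <= threshold:
            closed_frames += 1
        elif closed_frames:
            durations.append(closed_frames / frame_rate)   # one closure episode
            blinks += 1
            closed_frames = 0
    if closed_frames:                                       # closure still ongoing
        durations.append(closed_frames / frame_rate)
        blinks += 1
    return durations, blinks

# Blink frequency over a unit time t: f = n / t, e.g. blinks per second.
```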
  • the above method computes a single saccade speed, a single saccade angular velocity, a single eye-closure duration, and the blink frequency per unit time from consecutive frames of eye images; since these individual values may have large errors, the average of each value over a preset time period may be calculated and used as the visual feature.
  • for example, the average saccade speed over a preset time period T can be calculated from the multiple saccade speeds obtained in the above manner within that period; likewise, the average saccade angular velocity can be calculated from the multiple saccade angular velocities, the average eye-closure duration from the multiple eye-closure durations, and the average blink frequency from the multiple blink frequencies within the preset time period (see the sketch below).
  • the preset time period may be selected according to requirements, for example, 60 s.
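  • The windowed averaging is plain arithmetic; the sketch below collects the four averages into one feature record. The 60 s window and the field names are illustrative choices, not fixed by the text.

```python
from statistics import mean

def windowed_visual_features(speeds, angular_velocities, closure_durations,
                             blink_frequencies):
    """Average the per-measurement values collected over a preset window
    (e.g. T = 60 s) into the four visual features used for fatigue scoring.
    An empty input yields None for the corresponding feature."""
    def _avg(values):
        return mean(values) if values else None

    return {
        "avg_saccade_speed": _avg(speeds),
        "avg_saccade_angular_velocity": _avg(angular_velocities),
        "avg_eye_closure_duration": _avg(closure_durations),
        "avg_blink_frequency": _avg(blink_frequencies),
    }
```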
  • for example, calculating the visual fatigue value according to the visual feature includes: comparing the average saccade speed with a saccade average speed level threshold to obtain a first visual fatigue value, comparing the average saccade angular velocity with a saccade average angular velocity level threshold to obtain a second visual fatigue value, comparing the average eye-closure duration with an eye-closure duration level threshold to obtain a third visual fatigue value, and/or comparing the average blink frequency with a blink frequency level threshold to obtain a fourth visual fatigue value.
  • the saccade average speed level threshold includes a saccade average speed mild fatigue threshold, a saccade average speed moderate fatigue threshold, and a saccade average speed severe fatigue threshold;
  • the saccade average angular velocity level threshold includes a saccade average angular velocity mild fatigue threshold, a saccade average angular velocity moderate fatigue threshold, and a saccade average angular velocity severe fatigue threshold;
  • the eye-closure duration level threshold includes an eye-closure duration mild fatigue threshold, an eye-closure duration moderate fatigue threshold, and an eye-closure duration severe fatigue threshold;
  • the blink frequency level threshold includes a blink frequency mild fatigue threshold, a blink frequency moderate fatigue threshold, and a blink frequency severe fatigue threshold.
  • calculating the visual fatigue value based on the visual features described in step S20 above may include the following steps S206, S207, S208, and/or S209.
  • Step S206: the first visual fatigue value is a first speed value when the average saccade speed is less than the saccade average speed mild fatigue threshold, a second speed value when it lies between the saccade average speed mild and moderate fatigue thresholds, a third speed value when it lies between the moderate and severe fatigue thresholds, and a fourth speed value when it is greater than the saccade average speed severe fatigue threshold.
  • the saccade average speed mild fatigue threshold, moderate fatigue threshold, and severe fatigue threshold can be preset according to empirical values.
  • Step S207: the second visual fatigue value is a first angular velocity value when the average saccade angular velocity is less than the saccade average angular velocity mild fatigue threshold, a second angular velocity value when it lies between the mild and moderate fatigue thresholds, a third angular velocity value when it lies between the moderate and severe fatigue thresholds, and a fourth angular velocity value when it is greater than the saccade average angular velocity severe fatigue threshold.
  • the saccade average angular velocity mild fatigue threshold, moderate fatigue threshold, and severe fatigue threshold can be preset according to empirical values.
  • Step S208: the third visual fatigue value is a first time value when the average eye-closure duration is less than the eye-closure duration mild fatigue threshold, a second time value when it lies between the mild and moderate fatigue thresholds, a third time value when it lies between the moderate and severe fatigue thresholds, and a fourth time value when it is greater than the eye-closure duration severe fatigue threshold.
  • the eye-closure duration mild fatigue threshold, moderate fatigue threshold, and severe fatigue threshold can be preset according to empirical values.
  • Step S209: the fourth visual fatigue value is a first frequency value when the average blink frequency is less than the blink frequency mild fatigue threshold, a second frequency value when it lies between the mild and moderate fatigue thresholds, a third frequency value when it lies between the moderate and severe fatigue thresholds, and a fourth frequency value when it is greater than the blink frequency severe fatigue threshold.
  • in each case the mild fatigue threshold is less than the moderate fatigue threshold, which in turn is less than the severe fatigue threshold; the blink frequency mild, moderate, and severe fatigue thresholds can likewise be preset according to empirical values.
  • the first speed value, the second speed value, the third speed value, and the fourth speed value here may be an artificially set first sequence having a certain regularity; the first angular velocity value, the second angular velocity value, the third angular velocity value, and the fourth angular velocity value may be an artificially set second sequence having a certain regularity; the first time value, the second time value, the third time value, and the fourth time value may be an artificially set third sequence having a certain regularity; and the first frequency value, the second frequency value, the third frequency value, and the fourth frequency value may be an artificially set fourth sequence having a certain regularity.
  • for example, the first sequence, the second sequence, the third sequence, and the fourth sequence may be the same (a grading sketch using such a sequence is given below).
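  • The per-feature grading of steps S206–S209 amounts to binning a measured average against its three thresholds. The sketch below uses the sequence (1, 2, 3, 4) as an example of an "artificially set sequence with a certain regularity"; the concrete sequence and the threshold values are assumptions, since the patent only says they are preset empirically.

```python
def grade_feature(value, mild, moderate, severe, levels=(1, 2, 3, 4)):
    """Map a measured average onto one of four fatigue values using its
    mild/moderate/severe thresholds (mild < moderate < severe)."""
    if value < mild:
        return levels[0]
    if value < moderate:
        return levels[1]
    if value < severe:
        return levels[2]
    return levels[3]

# e.g. the first visual fatigue value m1 from the average saccade speed,
# with v_mild, v_moderate, v_severe being empirically preset thresholds:
# m1 = grade_feature(avg_saccade_speed, v_mild, v_moderate, v_severe)
```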
  • calculating the visual fatigue value according to the visual feature further comprises: determining the visual fatigue value according to the first visual fatigue value, the second visual fatigue value, the third visual fatigue value, and/or the fourth visual fatigue value.
  • for example, the visual fatigue value may be calculated in step S210 by taking the sum of the first visual fatigue value, the second visual fatigue value, the third visual fatigue value, and/or the fourth visual fatigue value as the visual fatigue value; that is, the visual fatigue value m is the sum of the first visual fatigue value m1, the second visual fatigue value m2, the third visual fatigue value m3, and/or the fourth visual fatigue value m4, i.e., m = m1 + m2 + m3 + m4, which may also be referred to as a composite visual fatigue value.
  • alternatively, the first visual fatigue value m1, the second visual fatigue value m2, the third visual fatigue value m3, and/or the fourth visual fatigue value m4 may first be normalized and then summed to obtain the visual fatigue value (see the sketch below).
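  • A sketch of step S210 as described above. The optional normalization shown here (dividing each component by the largest level value) is just one possible scheme, assumed for illustration; the patent does not specify one.

```python
def visual_fatigue_value(m1, m2, m3, m4, normalize=False, max_level=4):
    """Composite visual fatigue value m = m1 + m2 + m3 + m4 (step S210),
    optionally normalizing each component to [0, 1] first."""
    parts = [m1, m2, m3, m4]
    if normalize:
        parts = [p / max_level for p in parts]
    return sum(parts)
```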
  • for example, the visual fatigue value can be compared with the fatigue level threshold, and the visual fatigue level can be determined based on the comparison result.
  • visual fatigue levels can be classified into mild fatigue, moderate fatigue, and severe fatigue
  • fatigue level thresholds can also include mild fatigue thresholds, moderate fatigue thresholds, and severe fatigue thresholds.
  • for example, the mild fatigue threshold, the moderate fatigue threshold, and the severe fatigue threshold may be determined based on the first, second, third, and fourth sequences.
  • for example, comparing the visual fatigue value with the fatigue level threshold and determining the visual fatigue level according to the comparison result includes: when the visual fatigue value is greater than or equal to the mild fatigue threshold and less than the moderate fatigue threshold, determining that the visual fatigue level is a mild fatigue level; when the visual fatigue value is greater than or equal to the moderate fatigue threshold and less than the severe fatigue threshold, determining that the visual fatigue level is a moderate fatigue level; and when the visual fatigue value is greater than or equal to the severe fatigue threshold, determining that the visual fatigue level is a severe fatigue level.
  • for example, the mild fatigue threshold is 4, the moderate fatigue threshold is 7, and the severe fatigue threshold is 10; if 4 ≤ m < 7, the user's eye fatigue degree can be determined to be mild fatigue; if 7 ≤ m < 10, it can be determined to be moderate fatigue; and if m ≥ 10, it can be determined to be severe fatigue (see the sketch below).
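  • A direct translation of the example thresholds 4 / 7 / 10 into code; values below the mild threshold simply produce no fatigue level.

```python
def classify_fatigue_level(m, mild=4, moderate=7, severe=10):
    """Map the composite fatigue value m onto a fatigue level:
    4 <= m < 7 -> mild, 7 <= m < 10 -> moderate, m >= 10 -> severe."""
    if m >= severe:
        return "severe"
    if m >= moderate:
        return "moderate"
    if m >= mild:
        return "mild"
    return None   # below the mild fatigue threshold: no alert
```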
  • for example, the visual fatigue recognition method may further include generating a corresponding alert signal according to the visual fatigue level, which may be done as follows.
  • for example, the alert signal may include a visual signal and/or a haptic signal.
  • the visual signal may specifically be an image flicker signal.
  • the virtual reality device may display an image of the corresponding color on the screen after receiving the image flicker signal.
  • the haptic signal can be a vibration signal of a corresponding frequency.
  • the VR device vibrates at a preset frequency after receiving the vibration signal; together, the image flicker signal and the vibration signal provide a clearer reminder, so that the user can know the current degree of eye fatigue.
  • for example, for mild fatigue, a green flashing triangle image appears on the screen of the VR device and the VR device vibrates slightly (at a lower frequency);
  • for moderate fatigue, an orange flashing triangle image appears on the screen of the VR device and the VR device vibrates moderately (at a higher frequency);
  • for severe fatigue, a red flashing triangle image appears on the screen of the VR device and the VR device vibrates strongly (at a still higher frequency); a sketch of this mapping is given below.
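  • The color/vibration mapping above can be expressed as a small dispatch table. The display_flicker and vibrate callables, and the concrete vibration frequencies, are hypothetical stand-ins for whatever drawing and haptics interface the VR runtime actually exposes.

```python
# Hypothetical alert dispatch following the green/orange/red examples.
ALERTS = {
    "mild":     {"color": "green",  "vibration_hz": 1.0},   # example frequencies
    "moderate": {"color": "orange", "vibration_hz": 2.0},
    "severe":   {"color": "red",    "vibration_hz": 4.0},
}

def emit_alert(level, display_flicker, vibrate):
    """Show a flashing triangle in the level's color and vibrate the headset
    at the level's frequency."""
    if level not in ALERTS:
        return
    cfg = ALERTS[level]
    display_flicker(shape="triangle", color=cfg["color"])
    vibrate(frequency_hz=cfg["vibration_hz"])
```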
  • the embodiment of the present disclosure further provides a visual fatigue recognition device.
  • the visual fatigue recognition device 05 includes: an eye image acquisition unit 501 for acquiring an eye image of the user; a visual fatigue value acquisition unit 502 for acquiring a visual feature from the eye image and calculating a visual fatigue value according to the visual feature; and a visual fatigue level determining unit 503 for comparing the visual fatigue value with a fatigue level threshold and determining a visual fatigue level according to the comparison result.
  • the visual fatigue recognition device 05 may further include a reminder signal generating unit 504 for generating a corresponding reminder signal based on the visual fatigue level.
  • the visual fatigue recognition apparatus may further include an image pre-processing unit 505, which may include one or more of the following components: a brightness enhancement unit 551, a contrast enhancement unit 552, and filtering. Unit 553.
  • the brightness enhancement unit 551 is configured to increase the brightness of the eye image
  • the contrast enhancement unit 552 is configured to increase the contrast of the eye image
  • the filtering unit 553 is configured to perform a denoising process on the eye image.
  • for example, the visual features include: an average saccade speed, an average saccade angular velocity, an average eye-closure duration, and/or an average blink frequency.
  • for example, the visual fatigue value acquisition unit 502 includes: a visual feature acquisition subunit 521, configured to obtain a pupil position, a pupil area, and/or a number of blinks from each frame of consecutive eye images; a saccade average speed calculation subunit 522, configured to calculate the average saccade speed according to the pupil positions within a first preset time period; a saccade average angular velocity calculation subunit 523, configured to calculate the average saccade angular velocity according to the pupil positions within a second preset time period; an eye-closure duration calculation subunit 524, configured to calculate the average eye-closure duration according to the pupil areas within a third preset time period; and/or a blink frequency calculation subunit 525, configured to calculate the average blink frequency according to the number of blinks within a fourth preset time period.
  • the first preset time period, the second preset time period, the third preset time period, and the fourth preset time period may be the same time period or different time periods, which is not limited by the embodiments of the present disclosure.
  • for example, the visual fatigue value acquisition unit 502 may further include a first visual fatigue value determining subunit 526, a second visual fatigue value determining subunit 527, a third visual fatigue value determining subunit 528, and/or a fourth visual fatigue value determining subunit 529.
  • the first visual fatigue value determining subunit 526 is configured to compare the average saccade speed with a saccade average speed level threshold to obtain a first visual fatigue value.
  • the second visual fatigue value determining subunit 527 is configured to compare the average saccade angular velocity with a saccade average angular velocity level threshold to obtain a second visual fatigue value.
  • the third visual fatigue value determining subunit 528 is configured to compare the average eye-closure duration with an eye-closure duration level threshold to obtain a third visual fatigue value.
  • the fourth visual fatigue value determining subunit 529 is configured to compare the average blink frequency with a blink frequency level threshold to obtain a fourth visual fatigue value.
  • for example, the visual fatigue value acquisition unit 502 may further include a visual fatigue value determining subunit 530, configured to determine the visual fatigue value according to the first visual fatigue value, the second visual fatigue value, the third visual fatigue value, and/or the fourth visual fatigue value.
  • for example, the saccade average speed level threshold includes a saccade average speed mild fatigue threshold, a saccade average speed moderate fatigue threshold, and a saccade average speed severe fatigue threshold.
  • the saccade average angular velocity level threshold includes a saccade average angular velocity mild fatigue threshold, a saccade average angular velocity moderate fatigue threshold, and a saccade average angular velocity severe fatigue threshold; the eye-closure duration level threshold includes an eye-closure duration mild fatigue threshold, an eye-closure duration moderate fatigue threshold, and an eye-closure duration severe fatigue threshold.
  • the blink frequency level threshold includes a blink frequency mild fatigue threshold, a blink frequency moderate fatigue threshold, and a blink frequency severe fatigue threshold.
  • for example, the first visual fatigue value determining subunit 526 is configured such that: when the average saccade speed is less than the saccade average speed mild fatigue threshold, the first visual fatigue value is a first speed value; when the average saccade speed is greater than the saccade average speed mild fatigue threshold and less than the saccade average speed moderate fatigue threshold, the first visual fatigue value is a second speed value; when the average saccade speed is greater than the saccade average speed moderate fatigue threshold and less than the saccade average speed severe fatigue threshold, the first visual fatigue value is a third speed value; and when the average saccade speed is greater than the saccade average speed severe fatigue threshold, the first visual fatigue value is a fourth speed value, wherein the saccade average speed mild fatigue threshold is less than the saccade average speed moderate fatigue threshold, and the saccade average speed moderate fatigue threshold is less than the saccade average speed severe fatigue threshold.
  • the second visual fatigue value determining subunit 527 is configured such that: when the average saccade angular velocity is less than the saccade average angular velocity mild fatigue threshold, the second visual fatigue value is a first angular velocity value; when the average saccade angular velocity is greater than the saccade average angular velocity mild fatigue threshold and less than the saccade average angular velocity moderate fatigue threshold, the second visual fatigue value is a second angular velocity value; when the average saccade angular velocity is greater than the saccade average angular velocity moderate fatigue threshold and less than the saccade average angular velocity severe fatigue threshold, the second visual fatigue value is a third angular velocity value; and when the average saccade angular velocity is greater than the saccade average angular velocity severe fatigue threshold, the second visual fatigue value is a fourth angular velocity value, wherein the saccade average angular velocity mild fatigue threshold is less than the saccade average angular velocity moderate fatigue threshold, and the saccade average angular velocity moderate fatigue threshold is less than the saccade average angular velocity severe fatigue threshold.
  • the third visual fatigue value determining subunit 528 is configured such that: when the average eye-closure duration is less than the eye-closure duration mild fatigue threshold, the third visual fatigue value is a first time value; when the average eye-closure duration is greater than the eye-closure duration mild fatigue threshold and less than the eye-closure duration moderate fatigue threshold, the third visual fatigue value is a second time value; when the average eye-closure duration is greater than the eye-closure duration moderate fatigue threshold and less than the eye-closure duration severe fatigue threshold, the third visual fatigue value is a third time value; and when the average eye-closure duration is greater than the eye-closure duration severe fatigue threshold, the third visual fatigue value is a fourth time value, wherein the eye-closure duration mild fatigue threshold is less than the eye-closure duration moderate fatigue threshold, and the eye-closure duration moderate fatigue threshold is less than the eye-closure duration severe fatigue threshold.
  • the fourth visual fatigue value determining subunit 529 is configured such that: when the average blink frequency is less than the blink frequency mild fatigue threshold, the fourth visual fatigue value is a first frequency value; when the average blink frequency is greater than the blink frequency mild fatigue threshold and less than the blink frequency moderate fatigue threshold, the fourth visual fatigue value is a second frequency value; when the average blink frequency is greater than the blink frequency moderate fatigue threshold and less than the blink frequency severe fatigue threshold, the fourth visual fatigue value is a third frequency value; and when the average blink frequency is greater than the blink frequency severe fatigue threshold, the fourth visual fatigue value is a fourth frequency value, wherein the blink frequency mild fatigue threshold is less than the blink frequency moderate fatigue threshold, and the blink frequency moderate fatigue threshold is less than the blink frequency severe fatigue threshold.
  • the first speed value, the second speed value, the third speed value, and the fourth speed value here may be an artificially set first sequence having a certain regularity; the first angular velocity value, the second angular velocity value, the third angular velocity value, and the fourth angular velocity value may be an artificially set second sequence having a certain regularity; the first time value, the second time value, the third time value, and the fourth time value may be an artificially set third sequence having a certain regularity; and the first frequency value, the second frequency value, the third frequency value, and the fourth frequency value may be an artificially set fourth sequence having a certain regularity.
  • for example, the first sequence, the second sequence, the third sequence, and the fourth sequence may be the same.
  • calculating the visual fatigue value according to the visual feature further comprises: determining the visual fatigue value according to the first visual fatigue value, the second visual fatigue value, the third visual fatigue value, and/or the fourth visual fatigue value.
  • for example, the visual fatigue value determining subunit 530 may take the sum of the first visual fatigue value, the second visual fatigue value, the third visual fatigue value, and/or the fourth visual fatigue value as the visual fatigue value.
  • alternatively, the visual fatigue value determining subunit 530 may first normalize the first visual fatigue value m1, the second visual fatigue value m2, the third visual fatigue value m3, and/or the fourth visual fatigue value m4, and then sum them to obtain the visual fatigue value.
  • for example, the fatigue level threshold includes a mild fatigue threshold, a moderate fatigue threshold, and a severe fatigue threshold.
  • the visual fatigue level determining unit 503 is configured to: when the visual fatigue value is greater than or equal to the mild fatigue threshold and less than the moderate fatigue threshold, determine that the visual fatigue level is a mild fatigue level; when the visual fatigue value is greater than or equal to the moderate fatigue threshold and less than the severe fatigue threshold, determine that the visual fatigue level is a moderate fatigue level; and when the visual fatigue value is greater than or equal to the severe fatigue threshold, determine that the visual fatigue level is a severe fatigue level.
  • the visual fatigue recognition device provided by the present disclosure can reduce the risk of the user developing problems such as myopia and helps protect the user's eyesight.
  • the device embodiments described above are merely illustrative; the units described as separate components may or may not be physically separate, i.e., they may be located in one place or distributed over multiple network elements, and the above units may be combined into one unit or further split into a plurality of subunits.
  • the units in the apparatus of the embodiments of the present disclosure may be implemented by software, by a combination of software and hardware, or by hardware. Based on this understanding, the technical solution of the present disclosure may essentially be embodied in the form of a software product; taking a software implementation as an example, the processor of the device in which the apparatus is located reads the corresponding computer program instructions from a non-volatile memory into memory and executes them.
  • FIG. 7 is a schematic block diagram of another visual fatigue recognition device 06 provided by at least one embodiment of the present disclosure.
  • the visual fatigue recognition device 06 includes a processor 210, a machine readable storage medium 220, and one or more computer program modules 221.
  • processor 210 is coupled to machine readable storage medium 220 via bus system 230.
  • one or more computer program modules 221 are stored in machine readable storage medium 220.
  • one or more computer program modules 221 include instructions for performing the visual fatigue recognition method provided by any of the embodiments of the present disclosure.
  • instructions in one or more computer program modules 221 can be executed by processor 210.
  • for example, the bus system 230 can be a conventional serial or parallel communication bus, etc.; embodiments of the present disclosure do not limit this.
  • for example, the processor 210 can be a central processing unit (CPU), a graphics processing unit (GPU), or another form of processing unit having data processing capabilities and/or instruction execution capabilities; it can be a general purpose processor or a dedicated processor, and it can control other components in the visual fatigue recognition device 06 to perform the desired functions.
  • CPU central processing unit
  • GPU image processor
  • Other components in the device 06 can be visually fatigued to perform the desired function.
  • Machine-readable storage medium 220 can include one or more computer program products, which can include various forms of computer-readable storage media, such as volatile memory and/or nonvolatile memory.
  • the volatile memory may include, for example, random access memory (RAM) and/or cache or the like.
  • the nonvolatile memory may include, for example, a read only memory (ROM), a hard disk, a flash memory, or the like.
  • one or more computer program instructions can be stored on the computer readable storage medium, and the processor 210 can execute the program instructions to implement the functions (implemented by the processor 210) in the embodiments of the present disclosure and/or other desired functions, for example, the visual fatigue recognition method.
  • various applications and various data, such as a face image sequence and the data used and/or generated by the applications, may also be stored in the computer readable storage medium.
  • an embodiment of the present disclosure further provides a virtual reality device 07, including the visual fatigue recognition device 05 or the visual fatigue recognition device 06 described above.
  • FIG. 8B is a schematic block diagram of the virtual reality device 08.
  • the virtual reality device 08 includes a machine readable storage medium 102 and a processor 101, and may further include a nonvolatile storage medium 103, a communication interface 104, and a bus 105.
  • the machine readable storage medium 102, the processor 101, the nonvolatile storage medium 103, and the communication interface 104 communicate with each other via the bus 105.
  • the processor 101 can perform the visual fatigue recognition method described above by reading and executing machine executable instructions in the machine readable storage medium 102 corresponding to the control logic of the visual fatigue recognition method.
  • the communication interface 104 is coupled to a communication device (not shown).
  • the communication device can communicate with networks and other devices via wireless communication; the network is, for example, the Internet, an intranet, and/or a wireless network such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN).
  • wireless communication can use any of a variety of communication standards, protocols, and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Bluetooth, Wi-Fi (e.g., based on the IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n standards), Voice over Internet Protocol (VoIP), Wi-MAX, protocols for email, instant messaging, and/or Short Message Service (SMS), or any other suitable communication protocol.
  • the machine-readable storage medium referred to herein may be any electronic, magnetic, optical, or other physical storage device that can contain or store information such as executable instructions, data, and so forth.
  • the machine-readable storage medium may be: RAM (Random Access Memory), volatile memory, non-volatile memory, flash memory, a storage drive (such as a hard disk drive), any type of storage disk (such as a compact disc, DVD, etc.), or a similar storage medium, or a combination thereof.
  • the non-volatile medium can be a non-volatile memory, a flash memory, a storage drive (such as a hard disk drive), any type of storage disk (such as a compact disc, DVD, etc.), or a similar non-volatile storage medium, or a combination thereof.
  • the above VR device may also include other existing components, and details are not described herein again.
  • the virtual reality device 07/08 can be worn on the eyes of a person, thereby implementing a visual fatigue recognition function for the user as needed.
  • Embodiments of the present disclosure also provide a storage medium.
  • the storage medium non-transitorily stores computer readable instructions, and the non-transitory computer readable instructions, when executed by a computer (including a processor), can perform the visual fatigue recognition method provided by any of the embodiments of the present disclosure.
  • the storage medium may be any combination of one or more computer readable storage media; for example, one computer readable storage medium contains computer readable program code for obtaining an image of a user's eye, and another computer readable storage medium contains computer readable program code for acquiring the visual feature from the eye image.
  • the computer can execute the program code stored in the computer storage medium to perform, for example, the visual fatigue recognition method provided by any of the embodiments of the present disclosure.
  • the storage medium may include a memory card of a smart phone, a storage unit of a tablet, a hard disk of a personal computer, a random access memory (RAM), a read only memory (ROM), an erasable programmable read only memory (EPROM), a portable compact disk read only memory (CD-ROM), a flash memory, any combination of the above storage media, or another suitable storage medium.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Ophthalmology & Optometry (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Eye Examination Apparatus (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

A visual fatigue recognition method, a visual fatigue recognition device, a virtual reality apparatus and a storage medium are provided. The visual fatigue recognition method includes: obtaining an eye image of a user; acquiring visual features from the eye image, and calculating a visual fatigue value according to the visual features; comparing the visual fatigue value with a fatigue level threshold and determining a visual fatigue level according to the comparison result; and using the visual fatigue level to generate a corresponding reminder signal. The visual fatigue recognition method helps protect the user's eyesight.

Description

视觉疲劳识别方法、视觉疲劳识别装置、虚拟现实设备和存储介质
本申请要求于2018年4月26日递交的中国专利申请第201810384239.1号的优先权,在此全文引用上述中国专利申请公开的内容以作为本申请的一部分。
技术领域
本公开实施例涉及视觉疲劳识别方法、视觉疲劳识别装置、虚拟现实设备和存储介质。
背景技术
虚拟现实(Virtual Reality,简称VR)技术是一种可以创建和体验虚拟世界的计算机仿真技术,其结合了计算机技术和显示技术,构造出的虚拟环境,使用户沉浸到该虚拟环境中,具有很强的沉浸感。
虚拟现实VR设备通常为头戴式,其可应用于电子游戏和视频交互等领域,但头戴式虚拟现实设备的通常在一个昏暗和相对封闭的环境中使用,用于佩戴VR设备时,通常与观看的显示屏的距离较近,并且,通过VR设备观察显示屏显示的图像时,可能存在图像畸变和两眼视差过大等问题,因此,使得用户在使用虚拟现实设备时产生视觉疲劳。
发明内容
本公开一些实施例提供一种视觉疲劳识别方法,包括:获取用户的眼部图像;从所述眼部图像获取视觉特征,并根据所述视觉特征计算视觉疲劳值;将所述视觉疲劳值与疲劳等级阈值进行比较并根据比较结果判定视觉疲劳等级;将所述视觉疲劳等级用于生成对应的提醒信号。
在一些示例中,该方法还包括:在从所述眼部图像获取视觉特征之前对所述眼部图像进行预处理。所述预处理包括:提高所述眼部图像的亮度、提高所述眼部图像的对比度和/或对所述眼部图像进行去噪处理。
在一些示例中,所述视觉特征包括:眼跳平均速度、眼跳平均角速度、闭眼持续平均时间和/或眨眼平均频率。
在一些示例中,所述从所述眼部图像提取视觉特征包括:从连续的各帧所述眼部图像中分别获取瞳孔位置、瞳孔面积和/或眨眼次数;然后,与所述视觉特征相对应地,根据第一预设时间段内的各所述瞳孔位置计算所述眼跳平均速度;根据第二预设时间段内的各所述瞳孔位置计算所述眼跳平均角速度;根据第三预设时间段内的各所述瞳孔面积计算所述闭眼平均持续时间;和/或根据第四预设时间段内的眨眼次数计算所述眨眼平均频率。
在一些示例中,与提取所述时间特征相对应地,所述根据所述视觉特征计算视觉疲劳值包括:将所述眼跳平均速度与眼跳平均速度等级阈值进行比较获得第一视觉疲劳值,将所述眼跳平均角速度与眼跳平均角速度等级阈值进行比较获得第二视觉疲劳值,将所述闭眼平均持续时间与闭眼持续平均时间等级阈值进行比较获得第三视觉疲劳值,和/或将所述眨眼平均频率与眨眼平均频率等级阈值进行比较获得第四视觉疲劳值。
在一些示例中,所述眼跳平均速度等级阈值包括眼跳平均速度轻度疲劳阈值、眼跳平均速度中度疲劳阈值和眼跳平均速度重度疲劳阈值;所述眼跳平均角速度等级阈值包括眼跳平均角速度轻度疲劳阈值、眼跳平均角速度中度疲劳阈值和眼跳平均角速度重度疲劳阈值;所述闭眼持续平均时间等级阈值包括眼跳平均角速度轻度疲劳阈值、眼跳平均角速度中度疲劳阈值和眼跳平均角速度重度疲劳阈值;所述眨眼平均频率等级阈值包括眨眼平均频率轻度疲劳阈值、眨眼平均频率中度疲劳阈值和眨眼平均频率重度疲劳阈值;将所述眼跳平均速度与眼跳平均速度等级阈值进行比较获得第一视觉疲劳值包括:当所述眼跳平均速度小于所述眼跳平均速度轻度疲劳阈值时,则使得所述第一视觉疲劳值为第一速度数值,当所述眼跳平均速度大于所述眼跳平均速度轻度疲劳阈值且小于所述眼跳平均速度中度疲劳阈值时,则使得所述第一视觉疲劳值为第二速度数值,当所述眼跳平均速度大于所述眼跳平均速度中度疲劳阈值且小于所述眼跳平均速度重度疲劳阈值时,则使得所述第一视觉疲劳值为第三速度数值,当所述眼跳平均速度大于所述眼跳平均速度重度疲劳阈值时,则使得所述第一视觉疲劳值为第四速度数值,其中,所述眼跳平速度轻度疲劳阈值小于所述眼跳平均速度中度疲劳阈值,所述眼跳平均速度 中度疲劳阈值小于所述眼跳平均速度重度疲劳阈值;将所述眼跳平均角速度与眼跳平均角速度等级阈值进行比较获得第二视觉疲劳值包括:当所述眼跳平均角速度小于所述眼跳平均角速度轻度疲劳阈值时,则使得所述第二视觉疲劳值为第一角速度数值,当所述眼跳平均角速度大于所述眼跳平均角速度轻度疲劳阈值且小于所述眼跳平均角速度中度疲劳阈值时,则使得所述第二视觉疲劳值为第二角速度数值,当所述眼跳平均角速度大于所述眼跳平均角速度中度疲劳阈值且小于所述眼跳平均角速度重度疲劳阈值时,则使得所述第二视觉疲劳值为第三角速度数值,当所述眼跳平均角速度大于所述眼跳平均角速度重度疲劳阈值时,则使得所述第二视觉疲劳值为第四角速度数值,其中,所述眼跳平角速度轻度疲劳阈值小于所述眼跳平均角速度中度疲劳阈值,所述眼跳平均角速度中度疲劳阈值小于所述眼跳平均角速度重度疲劳阈值;将所述闭眼平均持续时间与闭眼持续平均时间等级阈值进行比较获得第三视觉疲劳值包括:当所述闭眼持续平均时间小于闭眼持续平均时间轻度疲劳阈值时,则使得所述第三视觉疲劳值为第一时间数值,当所述闭眼持续平均时间大于所述闭眼持续平均时间轻度疲劳阈值且小于所述闭眼持续平均时间中度疲劳阈值时,则使得所述第三视觉疲劳值为第二时间数值,当所述闭眼持续平均时间大于所述闭眼持续平均时间中度疲劳阈值且小于所述闭眼持续平均时间重度疲劳阈值时,则使得所述第三视觉疲劳值为第三时间数值,当所述闭眼持续平均时间大于所述闭眼持续平均时间重度疲劳阈值时,则使得所述第三视觉疲劳值为第四时间数值,其中,所述闭眼持续平均时间轻度疲劳阈值小于所述闭眼持续平均时间中度疲劳阈值,所述闭眼持续平均时间中度疲劳阈值小于所述闭眼持续平均时间重度疲劳阈值;将所述眨眼平均频率与眨眼平均频率等级阈值进行比较获得第四视觉疲劳值包括:当所述眨眼平均频率小于所述眨眼平均频率轻度疲劳阈值时,则使得所述第四视觉疲劳值为第一频率数值,当所述眨眼平均频率大于所述眨眼平均频率轻度疲劳阈值且小于所述眨眼平均频率中度疲劳阈值时,则使得所述第四视觉疲劳值为第二频率数值,当所述眨眼平均频率大于所述眨眼平均频率中度疲劳阈值且小于眨眼平均频率重度疲劳阈值时,则使得所述第四视觉疲劳值为第三频率数值,当所述眨眼平均频率大于 所述眨眼平均频率重度疲劳阈值时,则使得所述第四视觉疲劳值为第四频率数值,其中,所述眨眼平均频率轻度疲劳阈值小于所述眨眼平均频率中度疲劳阈值,所述眨眼平均频率中度疲劳阈值小于所述眨眼平均频率重度疲劳阈值。
在一些示例中,根据所述视觉特征计算视觉疲劳值还包括:根据所述第一视觉疲劳值、所述第二视觉疲劳值、所述第三视觉疲劳值和/或所述第四视觉疲劳值确定所述视觉疲劳值。
在一些示例中,所述疲劳等级阈值包括轻度疲劳阈值、中度疲劳阈值和重度疲劳阈值,将所述视觉疲劳值与疲劳等级阈值进行比较并根据比较结果判定视觉疲劳等级包括:当所述视觉疲劳值大于或等于所述轻度疲劳阈值并小于所述中度疲劳阈值时,判断所述视觉疲劳等级为轻度疲劳等级;当所述视觉疲劳值大于或等于所述中度疲劳阈值并小于所述重度疲劳阈值时,判断所述视觉疲劳等级为中度疲劳等级;当所述视觉疲劳值大于或等于所述重度疲劳阈值时,判断所述视觉疲劳等级为重度疲劳等级。
在一些示例中,该方法还包括根据所述视觉疲劳等级生成对应的提醒信号。所述根据所述视觉疲劳等级生成对应的提醒信号包括:根据所述视觉疲劳等级生成对应颜色的图像闪烁信号和/或对应频率的振动信号,以使虚拟现实设备的屏幕上显示对应颜色的图像且所述图像以设定频率闪烁和/或使虚拟现实设备以预设频率震动。
在一些示例中,该方法用于虚拟现实设备。
本公开一些实施例还提供一种视觉疲劳识别装置,包括:眼部图像获取单元,用于获取用户的眼部图像;视觉疲劳值获取单元,用于从所述眼部图像获取视觉特征,并根据所述视觉特征计算视觉疲劳值;视觉疲劳等级判定单元,用于将所述视觉疲劳值与疲劳等级阈值进行比较并根据比较结果判定视觉疲劳等级。
在一些示例中,该装置还包括:提醒信号生成单元,用于根据所述视觉疲劳等级生成对应的提醒信号。
在一些示例中,该装置还包括图像预处理单元。所述图像预处理单元包括亮度提高单元、对比度提高单元和/或滤波单元;所述亮度提高单元 用于提高所述眼部图像的亮度;所述对比度提高单元用于提高所述眼部图像的对比度;所述滤波单元用于对所述眼部图像进行去噪处理。
在一些示例中,所述视觉特征包括:眼跳平均速度、眼跳平均角速度、闭眼持续平均时间和/或眨眼平均频率,相应地,所述视觉疲劳值获取单元包括:视觉特征获取子单元,用于从连续的各帧所述眼部图像中分别获取瞳孔位置、瞳孔面积和/或眨眼次数;眼跳平均速度计算子单元,用于根据第一预设时间段内的各所述瞳孔位置计算所述眼跳平均速度;眼跳平均角速度计算子单元,用于根据第二预设时间段内的各所述瞳孔位置计算所述眼跳平均角速度;闭眼平均持续时间计算子单元,用于根据第三预设时间段内的各所述瞳孔面积计算所述闭眼平均持续时间;和/或眨眼平均频率计算子单元,用于根据第四预设时间段内的眨眼次数计算所述眨眼平均频率。
在一些示例中,所述视觉疲劳值获取单元还包括:第一视觉疲劳值确定子单元、第二视觉疲劳值确定子单元、第三视觉疲劳值确定子单元和/或第四视觉疲劳值确定子单元,以及视觉疲劳值确定子单元,所述第一视觉疲劳值确定子单元用于将所述眼跳平均速度与眼跳平均速度等级阈值进行比较获得第一视觉疲劳值,所述第二视觉疲劳值确定子单元用于将所述眼跳平均角速度与眼跳平均角速度等级阈值进行比较获得第二视觉疲劳值,所述第三视觉疲劳值确定子单元用于将所述闭眼平均持续时间与闭眼持续平均时间等级阈值进行比较获得第三视觉疲劳值,所述第四视觉疲劳值确定子单元用于将所述眨眼平均频率与眨眼平均频率等级阈值进行比较获得第四视觉疲劳值,所述视觉疲劳值确定子单元用于根据所述第一视觉疲劳值、所述第二视觉疲劳值、所述第三视觉疲劳值和/或所述第四视觉疲劳值确定所述视觉疲劳值。
在一些示例中,所述眼跳平均速度等级阈值包括眼跳平均速度轻度疲劳阈值、眼跳平均速度中度疲劳阈值和眼跳平均速度重度疲劳阈值;所述眼跳平均角速度等级阈值包括眼跳平均角速度轻度疲劳阈值、眼跳平均角速度中度疲劳阈值和眼跳平均角速度重度疲劳阈值;所述闭眼持续平均时间等级阈值包括眼跳平均角速度轻度疲劳阈值、眼跳平均角速度中度疲劳阈值和眼跳平均角速度重度疲劳阈值;所述眨眼平均频率等级 阈值包括眨眼平均频率轻度疲劳阈值、眨眼平均频率中度疲劳阈值和眨眼平均频率重度疲劳阈值;所述第一视觉疲劳值确定子单元用于当所述眼跳平均速度小于所述眼跳平均速度轻度疲劳阈值时,使得所述第一视觉疲劳值为第一速度数值,当所述眼跳平均速度大于所述眼跳平均速度轻度疲劳阈值且小于所述眼跳平均速度中度疲劳阈值时,使得所述第一视觉疲劳值为第二速度数值,当所述眼跳平均速度大于所述眼跳平均速度中度疲劳阈值且小于所述眼跳平均速度重度疲劳阈值时,使得所述第一视觉疲劳值为第三速度数值,当所述眼跳平均速度大于所述眼跳平均速度重度疲劳阈值时,使得所述第一视觉疲劳值为第四平均速度数值,其中,所述眼跳平速度轻度疲劳阈值小于所述眼跳平均速度中度疲劳阈值,所述眼跳平均速度中度疲劳阈值小于所述眼跳平均速度重度疲劳阈值;所述第二视觉疲劳值确定子单元用于当所述眼跳平均角速度小于眼跳平均角速度轻度疲劳阈值时,使得所述第二视觉疲劳值为第一角速度数值,当所述眼跳平均角速度大于所述眼跳平均角速度轻度疲劳阈值且小于所述眼跳平均角速度中度疲劳阈值时,使得所述第二视觉疲劳值为第二角速度数值,当所述眼跳平均角速度大于所述眼跳平均角速度中度疲劳阈值且小于所述眼跳平均角速度重度疲劳阈值时,使得所述第二视觉疲劳值为第三角速度数值,当所述眼跳平均角速度大于眼跳平均角速度重度疲劳阈值时,使得所述第二视觉疲劳值为第四角速度数值,其中,所述眼跳平角速度轻度疲劳阈值小于所述眼跳平均角速度中度疲劳阈值,所述眼跳平均角速度中度疲劳阈值小于所述眼跳平均角速度重度疲劳阈值;所述第三视觉疲劳值确定子单元用于当所述闭眼持续平均时间小于所述闭眼持续平均时间轻度疲劳阈值时,使得所述第三视觉疲劳值为第一时间数值,当所述闭眼持续平均时间大于所述闭眼持续平均时间轻度疲劳阈值且小于所述闭眼持续平均时间中度疲劳阈值时,使得所述第三视觉疲劳值为第二时间数值,当所述闭眼持续平均时间大于所述闭眼持续平均时间中度疲劳阈值且小于所述闭眼持续平均时间重度疲劳阈值时,使得所述第三视觉疲劳值为第三时间数值,当所述闭眼持续平均时间大于所述闭眼持续平均时间重度疲劳阈值时,使得所述第三视觉疲劳值为第四时间数值,其中,所述闭眼持续平均时间轻度疲劳阈值小于所述闭眼持 续平均时间中度疲劳阈值,所述闭眼持续平均时间中度疲劳阈值小于所述闭眼持续平均时间重度疲劳阈值;所述第四视觉疲劳值确定子单元用于当所述眨眼平均频率小于所述眨眼平均频率轻度疲劳阈值时,使得所述第四视觉疲劳值为第一频率数值,当所述眨眼平均频率大于所述眨眼平均频率轻度疲劳阈值且小于所述眨眼平均频率中度疲劳阈值时,使得所述第四视觉疲劳值为第二频率数值,当所述眨眼平均频率大于所述眨眼平均频率中度疲劳阈值且小于所述眨眼平均频率重度疲劳阈值时,使得所述第四视觉疲劳值为第三频率数值,当所述眨眼平均频率大于所述眨眼平均频率重度疲劳阈值时,使得所述第四视觉疲劳值为第四频率数值,其中,所述眨眼平均频率轻度疲劳阈值小于所述眨眼平均频率中度疲劳阈值,所述眨眼平均频率中度疲劳阈值小于所述眨眼平均频率重度疲劳阈值。
在一些示例中,所述疲劳等级阈值包括轻度疲劳阈值、中度疲劳阈值和重度疲劳阈值,所述视觉疲劳等级判定单元用于当所述视觉疲劳值大于或等于所述轻度疲劳阈值并小于所述中度疲劳阈值时,判断所述视觉疲劳等级为轻度疲劳等级;当所述视觉疲劳值大于或等于所述中度疲劳阈值并小于所述重度疲劳阈值时,判断所述视觉疲劳等级为中度疲劳等级;当所述视觉疲劳值大于或等于所述重度疲劳阈值时,判断所述视觉疲劳等级为重度疲劳等级。
本公开一些实施例还提供一种虚拟现实设备,包括上述视觉疲劳识别装置。
本公开一些实施例还提供另一种虚拟现实设备,包括处理器和机器可读存储介质。所述机器可读存储介质存储有可适于被所述处理器执行的机器可执行指令,所述机器可执行指令被所述处理器执行时实施上述视觉疲劳识别方法。
本公开一些实施例还提供一种存储介质,非暂时性地存储计算机可读指令,当所述非暂时性计算机可读指令由计算机执行时可以执行上述视觉疲劳识别方法。
附图说明
图1是根据本公开一些实施例提供的视觉疲劳识别方法的工作流程图;
图2是根据本公开一些实施例提供的人眼瞳孔位置变化的示意图;
图3是根据本公开另一些实施例提供的人眼瞳孔位置变化的示意图;
图4是根据本公开一些实施例提供的人脸瞳孔面积变化的示意图;
图5A是根据本公开一些实施例提供的视觉疲劳识别装置的方框图;图5B是本公开另一些实施例提供的时间疲劳识别装置的示意性框图;
图6是本公开一些实施例提供的视觉疲劳获取单元的示意性框图;
图7是本公开一些实施例提供的视觉疲劳识别装置的示意性框图;
图8A是本公开一些实施例提供的虚拟现实设备的示意性框图;图8B是本公开另一些实施例提供的虚拟现实设备的示意性框图;图8C是本公开又一些实施例提供的虚拟现实设备的示意图。
具体实施方式
为了更清楚地说明本发明实施例的技术方案,下面将对实施例的附图作简单地介绍,显而易见地,下面描述中的附图仅仅涉及本发明的一些实施例,而非对本发明的限制。
本公开实施例提供一种视觉疲劳识别方法,如图1所示,该方法包括:步骤S10、获取用户的眼部图像;步骤S20、从眼部图像获取视觉特征,并根据视觉特征计算视觉疲劳值;步骤S30、将视觉疲劳值与疲劳等级阈值进行比较并根据比较结果判定视觉疲劳等级;步骤S40、将视觉疲劳等级用于生成对应的提醒信号。
以下以该方法应用于虚拟现实VR设备为例进行具体说明,然而本公开实施例对该方法的应用场景不作限制。
虚拟现实(VR)设备从硬件角度看,可以包括建模部件(例如,3D扫描仪),三维视觉显示部件(例如,3D展示装置、投影装置等),头戴式立体显示器(例如,双目全方位显示器),发声部件(例如,三维的声音装置),交互设备(例如,包括位置追踪仪、数据手套等),3D输入设备(例如,三维鼠标),动作捕捉设备以及其他交互设备等。
在一些示例中,该VR设备中可以进一步包括图像采集装置,例如该 图像采集装置包括红外光源和红外相机,红外相机可设置在VR设备的三维视觉现实部件中的透镜下方,红外光源可对眼部进行补光,然后采用红外相机获取眼部图像,可有助于获取眼部图像的细节信息,特别是瞳孔部位图像的细节信息。该图像采集装置也可以独立于该VR设备设置。
眼部图像可同时包括左眼图像和右眼图像,或者只采集左眼图像或者右眼图像,由于人的两只眼睛的状态通常是相同或者类似的,通过一只眼睛的图像也可以判断人眼状态。
在一些示例中,可采用基于深度学习的图像识别算法从眼部图像中提取视觉特征,视觉特征例如包括眼跳速度、眼跳角速度、闭眼持续时间和/或眨眼频率等,根据视觉特征可计算视觉疲劳值,视觉疲劳值用于表示人眼疲劳程度,视觉疲劳值越大代表人眼疲劳程度越高,例如,如果在一定时间内眨眼频率低和闭眼持续时间长,计算出视觉疲劳值较大,可认为用户人眼处于较为疲劳状态,
例如,将视觉疲劳值与疲劳等级阈值进行比较,根据比较结果进一步的判断视觉疲劳等级,视觉疲劳等级是根据人眼疲劳程度对其进行的分级,视觉疲劳等级例如可包括轻度疲劳、中度疲劳和重度疲劳等。
例如,疲劳等级阈值包括轻度疲劳阈值、中度疲劳阈值和重度疲劳阈值。例如,该轻度疲劳阈值、中度疲劳阈值和重度疲劳阈值分别对应于轻度疲劳程度、中度疲劳程度和重度疲劳程度。
例如,根据视觉疲劳值与疲劳等级阈值进行比较的结果可以判断视觉疲劳等级,疲劳阈值可以为判断视觉疲劳程度的参考值,例如,如果视觉疲劳值大于轻度疲劳时的疲劳等级阈值(如轻度疲劳阈值),可判定用户人眼处于轻度疲劳状态;如果视觉疲劳值大于中度疲劳时的疲劳等级阈值(如中度疲劳阈值),可判定用户人眼处于中度疲劳状态;如果视觉疲劳值大于重度疲劳时的疲劳等级阈值(如重度疲劳阈值),可判定用户人眼处于重度疲劳状态。例如,将该视觉疲劳等级用于生成对应的提醒信号以提醒信号用于提醒用户当前处于何种视觉疲劳程度,用户根据该提醒信号进行适当的休息,有助于缓解人眼疲劳,提醒信号可以为多种形式,例如,利用VR设备中的扬声器发出报警声音,或者在VR设备的显示器中显示提醒图像,或者控制VR设备振动等。
由上述描述可知,该基于VR设备的视觉疲劳识别方法,可以从眼部图像中提取的视觉特征计算视觉疲劳值,将该视觉疲劳值与疲劳等级阈值进行比较并根据比较结果判定视觉疲劳等级进而判定视觉疲劳等级,并将视觉疲劳等级用于生成对应的提醒信号,通过提醒信息对用户的人眼疲劳程度进行提醒,使用户获知当前人眼疲劳程度,用户可进行适当休息,降低用户发生近视等问题的风险,有助于保护用户视力。
为了提高获取眼部图像的图像质量,可在从所述眼部图像获取视觉特征之前对获取的原始眼部图像进行预处理,例如,该预处理可以包括以下一种或多种处理:提高所述眼部图像的亮度、提高所述眼部图像的对比度、对该眼部图像进行去噪处理。
在一些示例中,可通过图像处理算法,例如,灰度变换法或者直方图调整法等提高眼部图像的亮度和对比度,并且,由于获取的原始眼部图像通常具有一定的噪声,为进一步改善眼部图像质量,进一步可采用滤波算法对眼部图像进行滤波处理后可去除眼部图像中的噪声,有助于从高质量的眼部图像中提取视觉特征。
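As one possible concrete form of the preprocessing described above, the following Python sketch uses OpenCV's histogram equalization and median filtering on a grayscale infrared eye image; both the choice of these particular algorithms and the use of OpenCV itself are illustrative assumptions, not requirements of the disclosure.

```python
import cv2
import numpy as np


def preprocess_eye_image(eye_image: np.ndarray) -> np.ndarray:
    """Enhance brightness/contrast and denoise one eye-image frame (8-bit image assumed)."""
    # Convert to grayscale if the camera delivers a colour image.
    gray = eye_image if eye_image.ndim == 2 else cv2.cvtColor(eye_image, cv2.COLOR_BGR2GRAY)
    # Histogram equalization raises brightness and contrast.
    enhanced = cv2.equalizeHist(gray)
    # Median filtering removes sensor noise while preserving pupil edges.
    return cv2.medianBlur(enhanced, 5)
```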
在一些示例中,视觉特征可以包括眼跳速度、眼跳角速度、闭眼持续时间和/或眨眼频率,上述步骤S20所述的从眼部图像提取视觉特征相应地可以包括:步骤S201、从连续的各帧所述眼部图像中分别获取瞳孔位置、瞳孔面积和/或眨眼次数;然后,与该视觉特征对应地,步骤S202、根据第一预设时间段内的各瞳孔位置计算眼跳平均速度;步骤S203、根据第二预设时间段内的各瞳孔位置计算眼跳平均角速度;步骤S204、根据第三预设时间段内的各瞳孔面积计算闭眼平均持续时间;和/或步骤S205、根据第四预设时间段内的眨眼次数计算眨眼平均频率。
例如,上述第一预设时间段、第二预设时间段、第三预设时间段和第四预设时间段可以为相同的时间段,也可以为不同的时间段,本公开实施例对此不作限制。
例如,在通过图像采集装置(例如红外相机)采集用户的眼部图像时,图像采集装置可采集到多帧眼部图像,图像采集装置以一定的帧速采集图像,例如,帧速为240帧/s,即每秒可采集到240帧眼部图像。
对于连续采集的各帧眼部图像,可对每帧图像进行视觉特征提取,或 者为了降低视觉特征提取的计算量,可每隔若干帧图像进行视觉特征提取,可从连续的各帧图像中分别获取瞳孔位置和瞳孔面积等。
例如,可根据连续两帧眼部图像的两个瞳孔位置计算眼跳速度,例如,为了计算方便,瞳孔位置可以为瞳孔中心位置,参照图2所示,图中虚线表示瞳孔中心所经过的水平方向,对于连续的两帧图像而言,假设从t时刻的眼部图像获取的瞳孔中心位于A点,t时刻瞳孔中心位置的二维坐标为(x A,y A);从t+1时刻的眼部图像获取的瞳孔中心位于B点,t+1时刻瞳孔中心位置的二维坐标为(x B,y B);图像采集装置的帧速为rate;则t+1时刻的眼跳速度v t+1可以通过下述公式计算得出:
$$v_{t+1} = \sqrt{(x_B - x_A)^2 + (y_B - y_A)^2} \times rate$$
例如,可根据连续三帧眼部图像的三个瞳孔位置计算眼跳角速度,例如,参照图3所示,图中虚线表示瞳孔中心所经过的水平方向,假设从t-1时刻的眼部图像获取的瞳孔中心位于C点,t-1时刻的瞳孔中心位置的二维坐标为(x C,y C);从t时刻的眼部图像获取的瞳孔中心位于D点,t时刻瞳孔中心位置的二维坐标为(x D,y D),且t时刻瞳孔中心的运动角为θ t;从t+1时刻的眼部图像获取的瞳孔中心位于E点,t+1时刻瞳孔中心位置的二维坐标为(x E,y E),且t+1时刻瞳孔中心的运动角为θ t+1;则t+1时刻的眼跳角速度v t+1可以通过下述公式计算得出:
$$v_{t+1} = (\theta_{t+1} - \theta_t) \times rate$$
其中,
$$\theta_t = \arctan\frac{y_D - y_C}{x_D - x_C}, \qquad \theta_{t+1} = \arctan\frac{y_E - y_D}{x_E - x_D}$$
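A small Python transcription of the two formulas above, assuming pupil-centre coordinates in pixels and `rate` given in frames per second; expressing the motion angles through `atan2` of the pupil-centre displacement follows the reconstruction above and is an assumption about the original formula images.

```python
import math


def saccade_speed(point_a, point_b, rate):
    """v_{t+1} = sqrt((x_B - x_A)^2 + (y_B - y_A)^2) * rate, from pupil centres at t and t+1."""
    (x_a, y_a), (x_b, y_b) = point_a, point_b
    return math.hypot(x_b - x_a, y_b - y_a) * rate


def saccade_angular_speed(point_c, point_d, point_e, rate):
    """v_{t+1} = (theta_{t+1} - theta_t) * rate, from pupil centres at t-1, t and t+1."""
    (x_c, y_c), (x_d, y_d), (x_e, y_e) = point_c, point_d, point_e
    theta_t = math.atan2(y_d - y_c, x_d - x_c)    # motion angle at time t
    theta_t1 = math.atan2(y_e - y_d, x_e - x_d)   # motion angle at time t+1
    return (theta_t1 - theta_t) * rate


# Example with the 240 frame/s camera mentioned above.
print(saccade_speed((100, 120), (103, 124), 240))
print(saccade_angular_speed((100, 120), (103, 124), (107, 127), 240))
```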
例如,可根据连续三帧眼部图像的各瞳孔面积计算闭眼持续时间,例如,参照图4所示,根据t1时刻的眼部图像获取瞳孔面积并非完整的瞳孔面积,例如为完整瞳孔面积的1/4,可以认为此时用户正在闭眼,将t1时刻作为闭眼初始时间,到t2时刻,根据t2时刻的眼部图像检测不到瞳孔,认为此时瞳孔面积为零,从t1时刻到t2时刻为闭眼持续时间的半个周期,时长为t2-t1=T1,到t3时刻,根据t3时刻的眼部图像获取的瞳孔面积为完整瞳孔面积的1/4,从t2时刻到t3时刻为闭眼持续时间的另外半个周期, 时长为t3-t2=T2,则闭眼持续时间t close等于t close=T 1+T 2
可根据单位时间t(例如1秒钟)内的眨眼次数n计算眨眼频率f为:
$$f = \frac{n}{t}$$
上述方式为根据连续的多帧眼部图像计算一次眼跳速度、一次眼跳角速度、一次闭眼持续时间和单位时间内的眨眼频率,一次计算出的上述各值可能存在较大的误差,因此,可计算预设时间段内的上述各值的平均值,将平均值作为视觉特征,具体而言,可采用上述方式计算出的预设时间段T内的多个眼跳速度计算该预设时间段内的眼跳平均速度
$$\bar{v} = \frac{1}{N}\sum_{i=1}^{N} v_i$$
根据多个眼跳角速度计算该预设时间段内的眼跳平均角速度
$$\bar{\omega} = \frac{1}{N}\sum_{i=1}^{N} \omega_i$$
根据多个闭眼持续时间计算该预设时间段内的闭眼持续平均时间
$$\bar{t}_{close} = \frac{1}{N}\sum_{i=1}^{N} t_{close,i}$$
根据多个眨眼频率计算该预设时间段内的眨眼平均频率
$$\bar{f} = \frac{1}{N}\sum_{i=1}^{N} f_i$$
上述预设时间可根据需要选择,例如,该预设时间段为60s。
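To make the per-window aggregation concrete, the following Python sketch computes the blink frequency, the eye-closure duration and the four window averages from the per-event values collected during one preset window; the function names are illustrative only, and each average is the plain arithmetic mean described above.

```python
from statistics import mean


def blink_frequency(blink_count, seconds=1.0):
    """f = n / t: number of blinks n counted within a unit time t (e.g. 1 s)."""
    return blink_count / seconds


def eye_closure_duration(t1, t2, t3):
    """t_close = T1 + T2, with T1 = t2 - t1 and T2 = t3 - t2 (timestamps in seconds)."""
    return (t2 - t1) + (t3 - t2)


def window_averages(speeds, angular_speeds, closure_times, blink_frequencies):
    """Average each per-event measurement over one preset window (e.g. 60 s)."""
    def safe_mean(values):
        return mean(values) if values else 0.0
    return (safe_mean(speeds), safe_mean(angular_speeds),
            safe_mean(closure_times), safe_mean(blink_frequencies))


# Example: averages over one 60 s window of collected measurements.
print(window_averages([320.0, 280.5], [1.2, 0.9], [0.25, 0.31, 0.28], [0.3, 0.4]))
```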
在一些示例中,与提取时间特征相对应地,所述根据视觉特征计算视觉疲劳值包括:将该眼跳平均速度与眼跳平均速度等级阈值进行比较获得第一视觉疲劳值,将该眼跳平均角速度与眼跳平均角速度等级阈值进行比较获得第二视觉疲劳值,将该闭眼平均持续时间与闭眼持续平均时间等级阈值进行比较获得第三视觉疲劳值,和/或将该眨眼平均频率与眨眼平均频率等级阈值进行比较获得第四视觉疲劳值。例如,该眼跳平均速度等级阈值包括眼跳平均速度轻度疲劳阈值、眼跳平均速度中度疲劳阈值和眼跳平均速度重度疲劳阈值;该眼跳平均角速度等级阈值包括眼跳平均角速度轻度疲劳阈值、眼跳平均角速度中度疲劳阈值和眼跳平均角速度重度疲劳阈值;该闭眼持续平均时间等级阈值包括眼跳平均角速度轻度疲劳阈值、眼跳平均角速度中度疲劳阈值和眼跳平均角速度重度疲劳阈值;该眨眼平均频率等级阈值包括眨眼平均频率轻度疲劳阈值、眨眼平均频率中度疲劳阈值和眨眼平均频率重度疲劳阈值。
在一些例子中,上述步骤S20中所述的根据视觉特征计算视觉疲劳值可以包括如下步骤S206、S207、S208和/或S209。
步骤S206、当眼跳平均速度小于眼跳平均速度轻度疲劳阈值时,则第一视觉疲劳值m1为第一速度数值,例如m1=0;当眼跳平均速度大于眼跳平均速度轻度疲劳阈值且小于眼跳平均速度中度疲劳阈值时,则第一视觉疲劳值m1为第二速度数值,例如m1=1;当眼跳平均速度大于眼跳平均速度中度疲劳阈值且小于眼跳平均速度重度疲劳阈值时,则第一视觉疲劳值m1为第三速度数值,例如m1=2;当眼跳平均速度大于眼跳平均速度重度疲劳阈值时,则第一视觉疲劳值m1为第四速度数值,例如m1=3。上述的眼跳平均速度轻度疲劳阈值小于眼跳平均速度中度疲劳阈值,眼跳平均速度中度疲劳阈值小于眼跳平均速度重度疲劳阈值。例如,上述的眼跳平均速度轻度疲劳阈值、眼跳平均速度中度疲劳阈值和眼跳平均速度重度疲劳阈值可根据经验值预先设定。
步骤S207、当眼跳平均角速度小于眼跳平均角速度轻度疲劳阈值时,则第二视觉疲劳值m2为第一角速度数值,例如m2=0;当眼跳平均角速度大于眼跳平均角速度轻度疲劳阈值且小于眼跳平均角速度中度疲劳阈值时,则第二视觉疲劳值m2为第二角速度数值,例如m2=1;当眼跳平均角速度大于眼跳平均角速度中度疲劳阈值且小于眼跳平均角速度重度疲劳阈值时,则第二视觉疲劳值m2为第三角速度数值,例如m2=2;当眼跳平均角速度大于眼跳平均角速度重度疲劳阈值时,则第二视觉疲劳值m2为第四角速度数值,例如m2=3。上述的眼跳平均角速度轻度疲劳阈值小于眼跳平均角速度中度疲劳阈值,眼跳平均角速度中度疲劳阈值小于眼跳平均角速度重度疲劳阈值。上述的眼跳平均角速度轻度疲劳阈值、眼跳平均角速度中度疲劳阈值和眼跳平均角速度重度疲劳阈值可根据经验值预先设定。
步骤S208、当闭眼持续平均时间小于闭眼持续平均时间轻度疲劳阈值时,则第三视觉疲劳值m3为第一时间数值,例如m3=0;当闭眼持续平均时间大于闭眼持续平均时间轻度疲劳阈值且小于闭眼持续平均时间中度疲劳阈值时,则第三视觉疲劳值m3为第二时间数值,例如m3=1;当闭眼持续平均时间大于闭眼持续平均时间中度疲劳阈值且小于闭眼持续平均时间重度疲劳阈值时,则第三视觉疲劳值m3为第三时间数值,例如m3=2;当闭眼持续平均时间大于闭眼持续平均时间重度疲劳阈值时,则第三视觉疲劳值m3为第四时间数值,例如m3=3。上述的闭眼持续平均时间轻度疲劳阈值小于闭眼持续平均时间中度疲劳阈值,闭眼持续平均时间中度疲劳阈值小于闭眼持续平均时间重度疲劳阈值。上述的闭眼持续平均时间轻度疲劳阈值、闭眼持续平均时间中度疲劳阈值和闭眼持续平均时间重度疲劳阈值可根据经验值预先设定。
步骤S209、当眨眼平均频率小于眨眼平均频率轻度疲劳阈值时,则第四视觉疲劳值m4为第一频率数值,例如m4=0;当眨眼平均频率大于眨眼平均频率轻度疲劳阈值且小于眨眼平均频率中度疲劳阈值时,则第四视觉疲劳值m4为第二频率数值,例如m4=1;当眨眼平均频率大于眨眼平均频率中度疲劳阈值且小于眨眼平均频率重度疲劳阈值时,则第四视觉疲劳值m4为第三频率数值,例如m4=2;当眨眼平均频率大于眨眼平均频率重度疲劳阈值时,则第四视觉疲劳值m4为第四频率数值,例如m4=3。眨眼平均频率轻度疲劳阈值小于眨眼平均频率中度疲劳阈值,眨眼平均频率中度疲劳阈值小于眨眼平均频率重度疲劳阈值。上述的眨眼平均频率轻度疲劳阈值、眨眼平均频率中度疲劳阈值和眨眼平均频率重度疲劳阈值可根据经验值预先设定。
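Steps S206 to S209 above all follow the same pattern: one averaged visual feature is compared with its three empirically preset thresholds and mapped to a sub-value. A minimal Python sketch of that pattern follows; the function name and the example thresholds are illustrative assumptions, since the description only states that the thresholds are preset from empirical values.

```python
def fatigue_sub_value(average_feature, mild_threshold, moderate_threshold, severe_threshold):
    """Map one averaged visual feature to the 0..3 sub-value used in steps S206-S209.

    Requires mild_threshold < moderate_threshold < severe_threshold.
    """
    if average_feature < mild_threshold:
        return 0
    if average_feature < moderate_threshold:
        return 1
    if average_feature < severe_threshold:
        return 2
    return 3


# Hypothetical thresholds for the average eye-closure duration (seconds), for illustration only.
print(fatigue_sub_value(0.45, 0.30, 0.50, 0.80))  # -> 1
```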
需要说明的是,这里的第一速度数值、第二速度数值、第三速度数值和第四速度数值可以为人为设定的具有一定规律的第一数列;第一角速度数值、第二角速度数值、第三角速度数值和第四角速度数值可以为人为设 定的具有一定规律的第二数列;第一时间数值、第二时间数值、第三时间数值和第四时间数值可以为人为设定的具有一定规律的第三数列;第一频率数值、第二频率数值、第三频率数值和第四频率数值可以为人为设定的具有一定规律的第四数列。
例如,该第一数列、第二数列、第三数列和第四数列可以相同。
例如,根据所述视觉特征计算视觉疲劳值还包括:根据上述第一视觉疲劳值、第二视觉疲劳值、第三视觉疲劳值和/或第四视觉疲劳值确定该视觉疲劳值。
例如,当该第一数列、第二数列、第三数列和第四数列相同时,可以通过步骤S210计算该视觉疲劳值:计算上述第一视觉疲劳值、第二视觉疲劳值、第三视觉疲劳值和/或第四视觉疲劳值的总和作为视觉疲劳值,即视觉疲劳值m为将上述第一视觉疲劳值m 1、第二视觉疲劳值m 2、第三视觉疲劳值m 3和/或第四视觉疲劳值m 4进行相加得到的总和,即m=m 1+m 2+m 3+m 4,也可称为视觉疲劳综合值。
在另一示例中,该第一数列、第二数列、第三数列和第四数列不同时,可以先对上述第一视觉疲劳值m 1、第二视觉疲劳值m 2、第三视觉疲劳值m 3和/或第四视觉疲劳值m 4进行归一化处理然后再求和得到该视觉疲劳值。得到上述视觉疲劳值后,可将该视觉疲劳值与疲劳等级值进行比较,根据比较结果可判断出视觉疲劳等级。例如,视觉疲劳等级可分为轻度疲劳、中度疲劳和重度疲劳,疲劳等级阈值也可以包括轻度疲劳阈值、中度疲劳阈值和重度疲劳阈值。例如,该轻度疲劳阈值、中度疲劳阈值和重度疲劳阈值可以根据上述第一数列、第二数列、第三数列和第四数列来确定。
例如,将所述视觉疲劳值与疲劳等级阈值进行比较并根据比较结果判定视觉疲劳等级包括:当所述视觉疲劳值大于或等于所述轻度疲劳阈值并小于所述中度疲劳阈值时,判断所述视觉疲劳等级为轻度疲劳等级;当所述视觉疲劳值大于或等于所述中度疲劳阈值并小于所述重度疲劳阈值时,判断所述视觉疲劳等级为中度疲劳等级;当所述视觉疲劳值大于或等于所述重度疲劳阈值时,判断所述视觉疲劳等级为重度疲劳等级。
例如,对应于上述第一数列、第二数列、第三数列和第四数列,轻度 疲劳阈值例如为4,中度疲劳阈值例如为7,重度疲劳阈值例如为10,据此,若4≤m<7,则可以判定用户的人眼疲劳程度为轻度疲劳,若7≤m<10,则可以判定用户的人眼疲劳程度为中度疲劳,若10≤m,则可以判定用户的人眼疲劳程度为重度疲劳。
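As a worked numeric example under the thresholds just given (4, 7 and 10): if the four sub-values from steps S206-S209 happened to be m1 = 1, m2 = 2, m3 = 2 and m4 = 1, the summed visual fatigue value would be m = 6, which satisfies 4 ≤ m < 7 and would therefore be judged as mild fatigue; these particular sub-values are chosen only for illustration.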
在另一示例中,上述视觉疲劳识别方法还可以包括根据上述视觉疲劳等级生成对应的提醒信号,根据视觉疲劳等级生成对应的提醒信号例如包括:
根据视觉疲劳等级生成对应颜色的图像闪烁信号和/或对应频率的振动信号,以使虚拟现实设备的屏幕上显示对应颜色的图像且以设定频率闪烁和/或虚拟现实设备以预设频率振动。
本实施例中,提醒信号可以包括视觉信号和/或触觉信号,视觉信号具体可以为图像闪烁信号,虚拟现实设备接收到该图像闪烁信号后可在其屏幕上显示对应颜色的图像,且该图像以设定频率闪烁;触觉信号可以为对应频率的振动信号,VR设备接收到该振动信号后以预设频率振动,通过图像闪烁信号和振动信号可更好地提醒用户,用户可据此获知当前人眼疲劳程度。
例如,若判定出视觉疲劳等级为轻度疲劳,可在VR设备的屏幕上方出现绿色闪烁三角图像,并且使VR设备轻度振动(以某一较低的频率振动),以进行轻度疲劳提醒;若判定出视觉疲劳等级为中度疲劳,可在VR设备的屏幕上方出现橙色闪烁三角图像,并且使VR设备中度振动(以某一较高的频率振动),以进行轻度疲劳提醒;若判定出视觉疲劳等级为重度疲劳,可在VR设备的屏幕上方出现红色闪烁三角图像,并且使VR设备重度振动(以某一更高的频率振动),以进行重度疲劳提醒。
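The level-to-reminder mapping described above can be sketched in Python as follows; the colours follow the green/orange/red example in the text, while the flash rate and the vibration frequencies are placeholder values, since the description leaves the concrete frequencies open.

```python
# Illustrative mapping from the judged fatigue level to the reminder signal.
# The colours follow the example above; the flash rate and vibration
# frequencies (Hz) are placeholders, not values taken from the disclosure.
REMINDER_SIGNALS = {
    "mild":     {"flash_color": "green",  "flash_hz": 1.0, "vibration_hz": 2.0},
    "moderate": {"flash_color": "orange", "flash_hz": 1.0, "vibration_hz": 4.0},
    "severe":   {"flash_color": "red",    "flash_hz": 1.0, "vibration_hz": 8.0},
}


def build_reminder_signal(level):
    """Return the flashing-image and vibration parameters for a judged fatigue level."""
    return REMINDER_SIGNALS.get(level)


print(build_reminder_signal("moderate"))
```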
本公开实施例还提供一种视觉疲劳识别装置,如图5A所示,该视觉疲劳识别装置05包括:眼部图像获取单元501,用于获取用户的眼部图像;视觉疲劳值获取单元502,用于从所述眼部图像获取视觉特征,并根据所述视觉特征计算视觉疲劳值;视觉疲劳等级判定单元503,用于将所述视觉疲劳值与疲劳等级阈值进行比较并根据比较结果判定视觉疲劳等级。
在一些示例中,例如,如图5A所示,该视觉疲劳识别装置05还可以包括提醒信号生成单元504,用于根据所述视觉疲劳等级生成对应的提醒信 号。
在一些示例,如图5B所示,该视觉疲劳识别装置还可以包括图像预处理单元505,该图像处理单元505可以包括如下一种或多种部件:亮度提高单元551、对比度提高单元552、滤波单元553。该亮度提高单元551用于提高所述眼部图像的亮度,该对比度提高单元552用于提高该眼部图像的对比度,该滤波单元553用于对该眼部图像进行去噪处理。
例如,所述视觉特征包括:眼跳平均速度、眼跳平均角速度、闭眼持续平均时间和/或眨眼平均频率,相应地,如图6所示,所述视觉疲劳值获取单元502包括:视觉特征获取子单元521,用于从连续的各帧所述眼部图像中分别获取瞳孔位置、瞳孔面积和/或眨眼次数;眼跳平均速度计算子单元522,用于根据第一预设时间段内的各所述瞳孔位置计算眼跳平均速度;眼跳平均角速度计算子单元523,用于根据第二预设时间段内的各所述瞳孔位置计算眼跳平均角速度;闭眼平均持续时间计算子单元524,用于根据第三预设时间段内的各所述瞳孔面积计算闭眼平均持续时间;和/或眨眼平均频率计算子单元525,用于根据第四预设时间段内的眨眼次数计算眨眼平均频率。
例如,上述第一预设时间段、第二预设时间段、第三预设时间段和第四预设时间段可以为相同的时间段,也可以为不同的时间段,本公开实施例对此不作限制。
例如,如图6所示,所述视觉疲劳值获取单元502还可以包括第一视觉疲劳值确定子单元526、第二视觉疲劳值确定子单元527、第三视觉疲劳值确定子单元528和/或第四视觉疲劳值确定子单元529。第一视觉疲劳值确定子单元526用于将所述眼跳平均速度与眼跳平均速度等级阈值进行比较获得第一视觉疲劳值。第二视觉疲劳值确定子单元527用于将所述眼跳平均角速度与眼跳平均角速度等级阈值进行比较获得第二视觉疲劳值。第三视觉疲劳值确定子单元528用于将所述闭眼平均持续时间与闭眼持续平均时间等级阈值进行比较获得第三视觉疲劳值。第四视觉疲劳值确定子单元529用于将所述眨眼平均频率与眨眼平均频率等级阈值进行比较获得第四视觉疲劳值。
例如,视觉疲劳值获取单元502还可以包括视觉疲劳值确定子单元 530,视觉疲劳值确定子单元530用于根据所述第一视觉疲劳值、所述第二视觉疲劳值、所述第三视觉疲劳值和/或所述第四视觉疲劳值确定所述视觉疲劳值。
例如,所述眼跳平均速度等级阈值包括眼跳平均速度轻度疲劳阈值、眼跳平均速度中度疲劳阈值和眼跳平均速度重度疲劳阈值;所述眼跳平均角速度等级阈值包括眼跳平均角速度轻度疲劳阈值、眼跳平均角速度中度疲劳阈值和眼跳平均角速度重度疲劳阈值;所述闭眼持续平均时间等级阈值包括眼跳平均角速度轻度疲劳阈值、眼跳平均角速度中度疲劳阈值和眼跳平均角速度重度疲劳阈值;所述眨眼平均频率等级阈值包括眨眼平均频率轻度疲劳阈值、眨眼平均频率中度疲劳阈值和眨眼平均频率重度疲劳阈值。
例如,第一视觉疲劳值确定子单元526用于当所述眼跳平均速度小于该眼跳平速度轻度疲劳阈值时,则第一视觉疲劳值为第一速度数值,当所述眼跳平均速度大于该眼跳平均速度轻度疲劳阈值且小于该眼跳平均速度中度疲劳阈值时,则第一视觉疲劳值为第二速度数值,当所述眼跳平均速度大于该眼跳平均速度中度疲劳阈值且小于该眼跳平均速度重度疲劳阈值时,则第一视觉疲劳值为第三速度数值,当所述眼跳平均速度大于该眼跳平均速度重度疲劳阈值时,则第一视觉疲劳值为第四速度数值,其中,眼跳平速度轻度疲劳阈值小于眼跳平均速度中度疲劳阈值,该眼跳平均速度中度疲劳阈值小于该眼跳平均速度重度疲劳阈值。
例如,第二视觉疲劳值确定子单元527,用于当所述眼跳平均角速度小于眼跳平均角速度轻度疲劳阈值时,则第二视觉疲劳值为第一角速度数值,当所述眼跳平均角速度大于该眼跳平均角速度轻度疲劳阈值且小于该眼跳平均角速度中度疲劳阈值时,则第二视觉疲劳值为第二角速度数值,当所述眼跳平均角速度大于该眼跳平均角速度中度疲劳阈值且小于该眼跳平均角速度重度疲劳阈值时,则第二视觉疲劳值为第三角速度数值,当所述眼跳平均角速度大于该眼跳平均角速度重度疲劳阈值时,则第二视觉疲劳值为第四角速度数值,其中,该眼跳平角速度轻度疲劳阈值小于该眼跳平均角速度中度疲劳阈值,该眼跳平均角速度中度疲劳阈值小于该眼跳平均角速度重度疲劳阈值。
例如,第三视觉疲劳值确定子单元528用于当所述闭眼持续平均时间小于该闭眼持续平均时间轻度疲劳阈值时,则第三视觉疲劳值为第一时间数值,当所述闭眼持续平均时间大于该闭眼持续平均时间轻度疲劳阈值且小于该闭眼持续平均时间中度疲劳阈值时,则第三视觉疲劳值为第二时间数值,当所述闭眼持续平均时间大于该闭眼持续平均时间中度疲劳阈值且小于该闭眼持续平均时间重度疲劳阈值时,则第三视觉疲劳值为第三时间数值,当所述闭眼持续平均时间大于该闭眼持续平均时间重度疲劳阈值时,则第三视觉疲劳值为第四时间数值,其中,该闭眼持续平均时间轻度疲劳阈值小于该闭眼持续平均时间中度疲劳阈值,该闭眼持续平均时间中度疲劳阈值小于该闭眼持续平均时间重度疲劳阈值。
第四视觉疲劳值确定子单元529,用于当所述眨眼平均频率小于该眨眼平均频率轻度疲劳阈值时,则第四视觉疲劳值为第一频率数值,当所述眨眼平均频率大于该眨眼平均频率轻度疲劳阈值且小于该眨眼平均频率中度疲劳阈值时,则第四视觉疲劳值为第二频率数值,当所述眨眼平均频率大于该眨眼平均频率中度疲劳阈值且小于该眨眼平均频率重度疲劳阈值时,则第四视觉疲劳值为第三频率数值,当所述眨眼平均频率大于该眨眼平均频率重度疲劳阈值时,则第四视觉疲劳值为第四频率数值,其中,该眨眼平均频率轻度疲劳阈值小于该眨眼平均频率中度疲劳阈值,该眨眼平均频率中度疲劳阈值小于该眨眼平均频率重度疲劳阈值。
需要说明的是,这里的第一速度数值、第二速度数值、第三速度数值和第四速度数值可以为人为设定的具有一定规律的第一数列;第一角速度数值、第二角速度数值、第三角速度数值和第四角速度数值可以为人为设定的具有一定规律的第二数列;第一时间数值、第二时间数值、第三时间数值和第四时间数值可以为人为设定的具有一定规律的第三数列;第一频率数值、第二频率数值、第三频率数值和第四频率数值可以为人为设定的具有一定规律的第四数列。
例如,该第一数列、第二数列、第三数列和第四数列可以相同。
例如,根据所述视觉特征计算视觉疲劳值还包括:根据上述第一视觉疲劳值、第二视觉疲劳值、第三视觉疲劳值和/或第四视觉疲劳值确定该视觉疲劳值。
例如,当该第一数列、第二数列、第三数列和第四数列相同时,视觉疲劳值确定子单元530可以通过计算所述第一视觉疲劳值、所述第二视觉疲劳值、所述第三视觉疲劳值和/或所述第四视觉疲劳值的总和作为视觉疲劳值。
在另一示例中,该第一数列、第二数列、第三数列和第四数列不同时,视觉疲劳值确定子单元530可以先对上述第一视觉疲劳值m 1、第二视觉疲劳值m 2、第三视觉疲劳值m 3和/或第四视觉疲劳值m 4进行归一化处理然后再求和得到该视觉疲劳值。
例如,所述疲劳等级阈值包括轻度疲劳阈值、中度疲劳阈值和重度疲劳阈值,视觉疲劳等级判定单元503用于当所述视觉疲劳值大于或等于所述轻度疲劳阈值并小于所述中度疲劳阈值时,判断所述视觉疲劳等级为轻度疲劳等级;当所述视觉疲劳值大于或等于所述中度疲劳阈值并小于所述重度疲劳阈值时,判断所述视觉疲劳等级为中度疲劳等级;当所述视觉疲劳值大于或等于所述重度疲劳阈值时,判断所述视觉疲劳等级为重度疲劳等级。
与前述识别疲劳识别方法的实施例相对应,本公开提供的视觉疲劳装置可降低用户发生近视等问题的风险,有助于保护用户的视力。
对于装置实施例而言,其中各个单元的功能和作用的实现过程具体详见上述方法中对应步骤的实现过程,在此不再赘述。
以上所描述的装置实施例仅仅是示意性的,其中所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,即可以位于一个地方,或者也可以分布到多个网络单元上;上述各单元可以合并为一个单元,也可以进一步拆分成多个子单元。
本公开实施例的装置中的各单元可借助软件的方式实现,或者通过软件和硬件的方式来实现,当然也可以通过硬件实现。基于这样的理解,本公开的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,以软件实现为例,作为一个逻辑意义上的装置,是通过应用该装置的设备所在的处理器将非易失性存储器中对应的计算机程序指令读取到内存中运行形成的。
图7为本公开至少一实施例提供的另一种视觉疲劳识别装置06的示意 框图。如图7所示,该视觉疲劳识别装置06包括处理器210、机器可读存储介质220以及一个或多个计算机程序模块221。
例如,处理器210与机器可读存储介质220通过总线系统230连接。例如,一个或多个计算机程序模块221被存储在机器可读存储介质220中。例如,一个或多个计算机程序模块221包括用于执行本公开任一实施例提供的视觉疲劳识别方法的指令。例如,一个或多个计算机程序模块221中的指令可以由处理器210执行。例如,总线系统230可以是常用的串行、并行通信总线等,本公开的实施例对此不作限制。
例如,该处理器210可以是中央处理单元(CPU)、图像处理器(GPU)或者具有数据处理能力和/或指令执行能力的其它形式的处理单元,可以为通用处理器或专用处理器,并且可以视觉疲劳识别装置06中的其它组件以执行期望的功能。
机器可读存储介质220可以包括一个或多个计算机程序产品,该计算机程序产品可以包括各种形式的计算机可读存储介质,例如易失性存储器和/或非易失性存储器。该易失性存储器例如可以包括随机存取存储器(RAM)和/或高速缓冲存储器(cache)等。该非易失性存储器例如可以包括只读存储器(ROM)、硬盘、闪存等。在计算机可读存储介质上可以存储一个或多个计算机程序指令,处理器210可以运行该程序指令,以实现本公开实施例中(由处理器210实现)的功能以及/或者其它期望的功能,例如视觉疲劳识别方法等。在该计算机可读存储介质中还可以存储各种应用程序和各种数据,例如人脸图像序列以及应用程序使用和/或产生的各种数据等。
需要说明的是,为表示清楚、简洁,本公开实施例并没有给出该视觉疲劳识别装置的全部组成单元。为实现视觉疲劳识别装置的必要功能,本领域技术人员可以根据具体需要提供、设置其他未示出的组成单元,本公开的实施例对此不作限制。
如图8A所示,本公开实施例还提供一种虚拟现实设备07,包括上述视觉疲劳识别装置05或视觉疲劳识别装置06。
本公开实施例还提供另一种虚拟现实设备08,图8B为该虚拟现实设备08的示意性框图。如图8B所示,该虚拟现实设备08包括:机器可读存 储介质102和处理器101,还可以包括非易失性存储介质103、通信接口104和总线105;其中,机器可读存储介质102、处理器101、非易失性存储介质103和通信接口104通过总线105完成相互间的通信。处理器101通过读取并执行机器可读存储介质102中与视觉疲劳识别方法的控制逻辑对应的机器可执行指令,可执行上文描述的视觉疲劳识别方法。
例如,该通信接口104与通信装置(图中未示出)连接。该通信装置可以通过无线通信来与网络和其他设备进行通信,该网络例如为因特网、内部网和/或诸如蜂窝电话网络之类的无线网络、无线局域网(LAN)和/或城域网(MAN)。无线通信可以使用多种通信标准、协议和技术中的任何一种,包括但不局限于全球移动通信系统(GSM)、增强型数据GSM环境(EDGE)、宽带码分多址(W-CDMA)、码分多址(CDMA)、时分多址(TDMA)、蓝牙、Wi-Fi(例如基于IEEE 802.11a、IEEE 802.11b、IEEE 802.11g和/或IEEE 802.11n标准)、基于因特网协议的语音传输(VoIP)、Wi-MAX,用于电子邮件、即时消息传递和/或短消息服务(SMS)的协议,或任何其他合适的通信协议。
本文中提到的机器可读存储介质可以是任何电子、磁性、光学或其它物理存储装置,可以包含或存储信息,如可执行指令、数据,等等。例如,机器可读存储介质可以是:RAM(Radom Access Memory,随机存取存储器)、易失存储器、非易失性存储器、闪存、存储驱动器(如硬盘驱动器)、任何类型的存储盘(如光盘、dvd等),或者类似的存储介质,或者它们的组合。
非易失性介质可以是非易失性存储器、闪存、存储驱动器(如硬盘驱动器)、任何类型的存储盘(如光盘、dvd等),或者类似的非易失性存储介质,或者它们的组合。
当然,上述的VR设备还可以包括其他已有部件,此处不再赘述。
如图8C所示,该虚拟现实设备07/08可以佩戴在人的眼部,从而根据需要实现对用户进行视觉疲劳识别功能。
本公开实施例还提供一种存储介质。例如,该存储介质非暂时性地存储计算机可读指令,当非暂时性计算机可读指令由计算机(包括处理器)执行时可以执行本公开任一实施例提供的视觉疲劳识别方法。
例如,该存储介质可以是一个或多个计算机可读存储介质的任意组合,例如一个计算机可读存储介质包含获取用户眼部图像的计算机可读的程序代码,另一个计算机可读存储介质包含从该眼部图像获取视觉特征的计算机可读的程序代码。例如,当该程序代码由计算机读取时,计算机可以执行该计算机存储介质中存储的程序代码,执行例如本公开任一实施例提供的视觉疲劳识别方法。
例如,存储介质可以包括智能电话的存储卡、平板电脑的存储部件、个人计算机的硬盘、随机存取存储器(RAM)、只读存储器(ROM)、可擦除可编程只读存储器(EPROM)、便携式紧致盘只读存储器(CD-ROM)、闪存、或者上述存储介质的任意组合,也可以为其他适用的存储介质。
以上所述仅是本公开的示范性实施方式,而非用于限制本公开的保护范围,本公开的保护范围由所附的权利要求确定。

Claims (20)

  1. 一种视觉疲劳识别方法,包括:
    获取用户的眼部图像;
    从所述眼部图像获取视觉特征,并根据所述视觉特征计算视觉疲劳值;
    将所述视觉疲劳值与疲劳等级阈值进行比较并根据比较结果判定视觉疲劳等级;
    将所述视觉疲劳等级用于生成对应的提醒信号。
  2. 根据权利要求1所述的方法,还包括:在从所述眼部图像获取视觉特征之前对所述眼部图像进行预处理,其中,
    所述预处理包括:提高所述眼部图像的亮度、提高所述眼部图像的对比度和/或对所述眼部图像进行去噪处理。
  3. 根据权利要求1或2所述的方法,其中,所述视觉特征包括:眼跳平均速度、眼跳平均角速度、闭眼持续平均时间和/或眨眼平均频率。
  4. 根据权利要求3所述的方法,其中,所述从所述眼部图像提取视觉特征包括:
    从连续的各帧所述眼部图像中分别获取瞳孔位置、瞳孔面积和/或眨眼次数;然后,与所述视觉特征相对应地,
    根据第一预设时间段内的各所述瞳孔位置计算所述眼跳平均速度;
    根据第二预设时间段内的各所述瞳孔位置计算所述眼跳平均角速度;
    根据第三预设时间段内的各所述瞳孔面积计算所述闭眼平均持续时间;
    和/或
    根据第四预设时间段内的眨眼次数计算所述眨眼平均频率。
  5. 根据权利要求4所述的方法,其中,与提取所述时间特征相对应地,所述根据所述视觉特征计算视觉疲劳值包括:
    将所述眼跳平均速度与眼跳平均速度等级阈值进行比较获得第一视觉疲劳值,
    将所述眼跳平均角速度与眼跳平均角速度等级阈值进行比较获得第 二视觉疲劳值,
    将所述闭眼平均持续时间与闭眼持续平均时间等级阈值进行比较获得第三视觉疲劳值,
    和/或
    将所述眨眼平均频率与眨眼平均频率等级阈值进行比较获得第四视觉疲劳值。
  6. 根据权利要求5所述的方法,其中,所述眼跳平均速度等级阈值包括眼跳平均速度轻度疲劳阈值、眼跳平均速度中度疲劳阈值和眼跳平均速度重度疲劳阈值;所述眼跳平均角速度等级阈值包括眼跳平均角速度轻度疲劳阈值、眼跳平均角速度中度疲劳阈值和眼跳平均角速度重度疲劳阈值;所述闭眼持续平均时间等级阈值包括眼跳平均角速度轻度疲劳阈值、眼跳平均角速度中度疲劳阈值和眼跳平均角速度重度疲劳阈值;所述眨眼平均频率等级阈值包括眨眼平均频率轻度疲劳阈值、眨眼平均频率中度疲劳阈值和眨眼平均频率重度疲劳阈值;
    将所述眼跳平均速度与眼跳平均速度等级阈值进行比较获得第一视觉疲劳值包括:当所述眼跳平均速度小于所述眼跳平均速度轻度疲劳阈值时,则使得所述第一视觉疲劳值为第一速度数值,当所述眼跳平均速度大于所述眼跳平均速度轻度疲劳阈值且小于所述眼跳平均速度中度疲劳阈值时,则使得所述第一视觉疲劳值为第二速度数值,当所述眼跳平均速度大于所述眼跳平均速度中度疲劳阈值且小于所述眼跳平均速度重度疲劳阈值时,则使得所述第一视觉疲劳值为第三速度数值,当所述眼跳平均速度大于所述眼跳平均速度重度疲劳阈值时,则使得所述第一视觉疲劳值为第四速度数值,其中,所述眼跳平速度轻度疲劳阈值小于所述眼跳平均速度中度疲劳阈值,所述眼跳平均速度中度疲劳阈值小于所述眼跳平均速度重度疲劳阈值;
    将所述眼跳平均角速度与眼跳平均角速度等级阈值进行比较获得第二视觉疲劳值包括:当所述眼跳平均角速度小于所述眼跳平均角速度轻度疲劳阈值时,则使得所述第二视觉疲劳值为第一角速度数值,当所述眼跳平均角速度大于所述眼跳平均角速度轻度疲劳阈值且小于所述眼跳平均角速度中度疲劳阈值时,则使得所述第二视觉疲劳值为第二角速度数 值,当所述眼跳平均角速度大于所述眼跳平均角速度中度疲劳阈值且小于所述眼跳平均角速度重度疲劳阈值时,则使得所述第二视觉疲劳值为第三角速度数值,当所述眼跳平均角速度大于所述眼跳平均角速度重度疲劳阈值时,则使得所述第二视觉疲劳值为第四角速度数值,其中,所述眼跳平角速度轻度疲劳阈值小于所述眼跳平均角速度中度疲劳阈值,所述眼跳平均角速度中度疲劳阈值小于所述眼跳平均角速度重度疲劳阈值;
    将所述闭眼平均持续时间与闭眼持续平均时间等级阈值进行比较获得第三视觉疲劳值包括:当所述闭眼持续平均时间小于闭眼持续平均时间轻度疲劳阈值时,则使得所述第三视觉疲劳值为第一时间数值,当所述闭眼持续平均时间大于所述闭眼持续平均时间轻度疲劳阈值且小于所述闭眼持续平均时间中度疲劳阈值时,则使得所述第三视觉疲劳值为第二时间数值,当所述闭眼持续平均时间大于所述闭眼持续平均时间中度疲劳阈值且小于所述闭眼持续平均时间重度疲劳阈值时,则使得所述第三视觉疲劳值为第三时间数值,当所述闭眼持续平均时间大于所述闭眼持续平均时间重度疲劳阈值时,则使得所述第三视觉疲劳值为第四时间数值,其中,所述闭眼持续平均时间轻度疲劳阈值小于所述闭眼持续平均时间中度疲劳阈值,所述闭眼持续平均时间中度疲劳阈值小于所述闭眼持续平均时间重度疲劳阈值;
    将所述眨眼平均频率与眨眼平均频率等级阈值进行比较获得第四视觉疲劳值包括:当所述眨眼平均频率小于所述眨眼平均频率轻度疲劳阈值时,则使得所述第四视觉疲劳值为第一频率数值,当所述眨眼平均频率大于所述眨眼平均频率轻度疲劳阈值且小于所述眨眼平均频率中度疲劳阈值时,则使得所述第四视觉疲劳值为第二频率数值,当所述眨眼平均频率大于所述眨眼平均频率中度疲劳阈值且小于眨眼平均频率重度疲劳阈值时,则使得所述第四视觉疲劳值为第三频率数值,当所述眨眼平均频率大于所述眨眼平均频率重度疲劳阈值时,则使得所述第四视觉疲劳值为第四频率数值,其中,所述眨眼平均频率轻度疲劳阈值小于所述眨眼平均频率中度疲劳阈值,所述眨眼平均频率中度疲劳阈值小于所述眨眼平均频率重度疲劳阈值。
  7. 根据权利要求5或6所述的方法,其中,根据所述视觉特征计算 视觉疲劳值还包括:
    根据所述第一视觉疲劳值、所述第二视觉疲劳值、所述第三视觉疲劳值和/或所述第四视觉疲劳值确定所述视觉疲劳值。
  8. 根据权利要求7所述的方法,其中,所述疲劳等级阈值包括轻度疲劳阈值、中度疲劳阈值和重度疲劳阈值,将所述视觉疲劳值与疲劳等级阈值进行比较并根据比较结果判定视觉疲劳等级包括:
    当所述视觉疲劳值大于或等于所述轻度疲劳阈值并小于所述中度疲劳阈值时,判断所述视觉疲劳等级为轻度疲劳等级;
    当所述视觉疲劳值大于或等于所述中度疲劳阈值并小于所述重度疲劳阈值时,判断所述视觉疲劳等级为中度疲劳等级;
    当所述视觉疲劳值大于或等于所述重度疲劳阈值时,判断所述视觉疲劳等级为重度疲劳等级。
  9. 根据权利要求1-8任一项所述的方法,还包括根据所述视觉疲劳等级生成所述对应的提醒信号,其中,所述根据所述视觉疲劳等级生成所述对应的提醒信号包括:
    根据所述视觉疲劳等级生成对应颜色的图像闪烁信号和/或对应频率的振动信号,以使虚拟现实设备的屏幕上显示对应颜色的图像且所述图像以设定频率闪烁和/或使虚拟现实设备以预设频率震动。
  10. 根据权利要求1-9任一项所述的方法,用于虚拟现实设备。
  11. 一种视觉疲劳识别装置,包括:
    眼部图像获取单元,用于获取用户的眼部图像;
    视觉疲劳值获取单元,用于从所述眼部图像获取视觉特征,并根据所述视觉特征计算视觉疲劳值;
    视觉疲劳等级判定单元,用于将所述视觉疲劳值与疲劳等级阈值进行比较并根据比较结果判定视觉疲劳等级。
  12. 根据权利要求11所述的装置,还包括:提醒信号生成单元,用于根据所述视觉疲劳等级生成对应的提醒信号。
  13. 根据权利要求11或12所述的装置,还包括图像预处理单元,其中,
    所述图像预处理单元包括亮度提高单元、对比度提高单元和/或滤波 单元;
    所述亮度提高单元用于提高所述眼部图像的亮度;
    所述对比度提高单元用于提高所述眼部图像的对比度;
    所述滤波单元用于对所述眼部图像进行去噪处理。
  14. 根据权利要求11-13任一所述的装置,其中,所述视觉特征包括:眼跳平均速度、眼跳平均角速度、闭眼持续平均时间和/或眨眼平均频率,
    相应地,所述视觉疲劳值获取单元包括:
    视觉特征获取子单元,用于从连续的各帧所述眼部图像中分别获取瞳孔位置、瞳孔面积和/或眨眼次数;
    眼跳平均速度计算子单元,用于根据第一预设时间段内的各所述瞳孔位置计算所述眼跳平均速度;
    眼跳平均角速度计算子单元,用于根据第二预设时间段内的各所述瞳孔位置计算所述眼跳平均角速度;
    闭眼平均持续时间计算子单元,用于根据第三预设时间段内的各所述瞳孔面积计算所述闭眼平均持续时间;
    和/或
    眨眼平均频率计算子单元,用于根据第四预设时间段内的眨眼次数计算所述眨眼平均频率。
  15. 根据权利要求14所述的装置,其中,所述视觉疲劳值获取单元还包括:第一视觉疲劳值确定子单元、第二视觉疲劳值确定子单元、第三视觉疲劳值确定子单元和/或第四视觉疲劳值确定子单元,以及视觉疲劳值确定子单元,
    所述第一视觉疲劳值确定子单元用于将所述眼跳平均速度与眼跳平均速度等级阈值进行比较获得第一视觉疲劳值,
    所述第二视觉疲劳值确定子单元用于将所述眼跳平均角速度与眼跳平均角速度等级阈值进行比较获得第二视觉疲劳值,
    所述第三视觉疲劳值确定子单元用于将所述闭眼平均持续时间与闭眼持续平均时间等级阈值进行比较获得第三视觉疲劳值,
    所述第四视觉疲劳值确定子单元用于将所述眨眼平均频率与眨眼平均频率等级阈值进行比较获得第四视觉疲劳值,
    所述视觉疲劳值确定子单元用于根据所述第一视觉疲劳值、所述第二视觉疲劳值、所述第三视觉疲劳值和/或所述第四视觉疲劳值确定所述视觉疲劳值。
  16. 根据权利要求15所述的装置,其中,所述眼跳平均速度等级阈值包括眼跳平均速度轻度疲劳阈值、眼跳平均速度中度疲劳阈值和眼跳平均速度重度疲劳阈值;所述眼跳平均角速度等级阈值包括眼跳平均角速度轻度疲劳阈值、眼跳平均角速度中度疲劳阈值和眼跳平均角速度重度疲劳阈值;所述闭眼持续平均时间等级阈值包括眼跳平均角速度轻度疲劳阈值、眼跳平均角速度中度疲劳阈值和眼跳平均角速度重度疲劳阈值;所述眨眼平均频率等级阈值包括眨眼平均频率轻度疲劳阈值、眨眼平均频率中度疲劳阈值和眨眼平均频率重度疲劳阈值;
    所述第一视觉疲劳值确定子单元用于当所述眼跳平均速度小于所述眼跳平均速度轻度疲劳阈值时,使得所述第一视觉疲劳值为第一速度数值,当所述眼跳平均速度大于所述眼跳平均速度轻度疲劳阈值且小于所述眼跳平均速度中度疲劳阈值时,使得所述第一视觉疲劳值为第二速度数值,当所述眼跳平均速度大于所述眼跳平均速度中度疲劳阈值且小于所述眼跳平均速度重度疲劳阈值时,使得所述第一视觉疲劳值为第三速度数值,当所述眼跳平均速度大于所述眼跳平均速度重度疲劳阈值时,使得所述第一视觉疲劳值为第四平均速度数值,其中,所述眼跳平速度轻度疲劳阈值小于所述眼跳平均速度中度疲劳阈值,所述眼跳平均速度中度疲劳阈值小于所述眼跳平均速度重度疲劳阈值;
    所述第二视觉疲劳值确定子单元,用于当所述眼跳平均角速度小于眼跳平均角速度轻度疲劳阈值时,使得所述第二视觉疲劳值为第一角速度数值,当所述眼跳平均角速度大于所述眼跳平均角速度轻度疲劳阈值且小于所述眼跳平均角速度中度疲劳阈值时,使得所述第二视觉疲劳值为第二角速度数值,当所述眼跳平均角速度大于所述眼跳平均角速度中度疲劳阈值且小于所述眼跳平均角速度重度疲劳阈值时,使得所述第二视觉疲劳值为第三角速度数值,当所述眼跳平均角速度大于眼跳平均角速度重度疲劳阈值时,使得所述第二视觉疲劳值为第四角速度数值,其中,所述眼跳平角速度轻度疲劳阈值小于所述眼跳平均角速度中度疲劳阈值, 所述眼跳平均角速度中度疲劳阈值小于所述眼跳平均角速度重度疲劳阈值;
    所述第三视觉疲劳值确定子单元,用于当所述闭眼持续平均时间小于所述闭眼持续平均时间轻度疲劳阈值时,使得所述第三视觉疲劳值为第一时间数值,当所述闭眼持续平均时间大于所述闭眼持续平均时间轻度疲劳阈值且小于所述闭眼持续平均时间中度疲劳阈值时,使得所述第三视觉疲劳值为第二时间数值,当所述闭眼持续平均时间大于所述闭眼持续平均时间中度疲劳阈值且小于所述闭眼持续平均时间重度疲劳阈值时,使得所述第三视觉疲劳值为第三时间数值,当所述闭眼持续平均时间大于所述闭眼持续平均时间重度疲劳阈值时,使得所述第三视觉疲劳值为第四时间数值,其中,所述闭眼持续平均时间轻度疲劳阈值小于所述闭眼持续平均时间中度疲劳阈值,所述闭眼持续平均时间中度疲劳阈值小于所述闭眼持续平均时间重度疲劳阈值;
    所述第四视觉疲劳值确定子单元,用于当所述眨眼平均频率小于所述眨眼平均频率轻度疲劳阈值时,使得所述第四视觉疲劳值为第一频率数值,当所述眨眼平均频率大于所述眨眼平均频率轻度疲劳阈值且小于所述眨眼平均频率中度疲劳阈值时,使得所述第四视觉疲劳值为第二频率数值,当所述眨眼平均频率大于所述眨眼平均频率中度疲劳阈值且小于所述眨眼平均频率重度疲劳阈值时,使得所述第四视觉疲劳值为第三频率数值,当所述眨眼平均频率大于所述眨眼平均频率重度疲劳阈值时,使得所述第四视觉疲劳值为第四频率数值,其中,所述眨眼平均频率轻度疲劳阈值小于所述眨眼平均频率中度疲劳阈值,所述眨眼平均频率中度疲劳阈值小于所述眨眼平均频率重度疲劳阈值。
  17. 根据权利要求16所述的装置,其中,所述疲劳等级阈值包括轻度疲劳阈值、中度疲劳阈值和重度疲劳阈值,
    所述视觉疲劳等级判定单元用于当所述视觉疲劳值大于或等于所述轻度疲劳阈值并小于所述中度疲劳阈值时,判断所述视觉疲劳等级为轻度疲劳等级;
    当所述视觉疲劳值大于或等于所述中度疲劳阈值并小于所述重度疲劳阈值时,判断所述视觉疲劳等级为中度疲劳等级;
    当所述视觉疲劳值大于或等于所述重度疲劳阈值时,判断所述视觉疲劳等级为重度疲劳等级。
  18. 一种虚拟现实设备,包括如权利要求12-17任一所述的视觉疲劳识别装置。
  19. 一种虚拟现实设备,包括处理器和机器可读存储介质,其中,
    所述机器可读存储介质存储有可适于被所述处理器执行的机器可执行指令,所述机器可执行指令被所述处理器执行时实施如权利要求1-8任一项所述的视觉疲劳识别方法。
  20. 一种存储介质,非暂时性地存储计算机可读指令,当所述非暂时性计算机可读指令由计算机执行时可以执行如权利要求1-8任一所述的视觉疲劳识别方法。
PCT/CN2019/083902 2018-04-26 2019-04-23 视觉疲劳识别方法、视觉疲劳识别装置、虚拟现实设备和存储介质 WO2019206145A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/612,996 US11132544B2 (en) 2018-04-26 2019-04-23 Visual fatigue recognition method, visual fatigue recognition device, virtual reality apparatus and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810384239.1A CN108596106B (zh) 2018-04-26 2018-04-26 基于vr设备的视觉疲劳识别方法及其装置、vr设备
CN201810384239.1 2018-04-26

Publications (1)

Publication Number Publication Date
WO2019206145A1 true WO2019206145A1 (zh) 2019-10-31

Family

ID=63609587

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/083902 WO2019206145A1 (zh) 2018-04-26 2019-04-23 视觉疲劳识别方法、视觉疲劳识别装置、虚拟现实设备和存储介质

Country Status (3)

Country Link
US (1) US11132544B2 (zh)
CN (1) CN108596106B (zh)
WO (1) WO2019206145A1 (zh)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108596106B (zh) 2018-04-26 2023-12-05 京东方科技集团股份有限公司 基于vr设备的视觉疲劳识别方法及其装置、vr设备
CN110413124A (zh) * 2019-08-01 2019-11-05 贵州电网有限责任公司 一种基于vr视频的人机交互系统及其使用方法
CN112183443A (zh) * 2020-10-14 2021-01-05 歌尔科技有限公司 保护视力的方法、装置及智能眼镜
CN113239841B (zh) * 2021-05-24 2023-03-24 桂林理工大学博文管理学院 基于人脸识别的课堂专注状态检测方法及相关仪器
CN113448439A (zh) * 2021-06-30 2021-09-28 广东小天才科技有限公司 光线发射方法、装置、终端设备及存储介质
CN115993239A (zh) * 2023-03-24 2023-04-21 京东方艺云(苏州)科技有限公司 一种评估方法、装置、电子设备及存储介质
CN116974370B (zh) * 2023-07-18 2024-04-16 深圳市本顿科技有限公司 一种防沉迷儿童学习平板电脑控制方法及系统
CN117653350A (zh) * 2024-02-01 2024-03-08 科弛医疗科技(北京)有限公司 手术机器人和手术机器人防疲劳方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103186609A (zh) * 2011-12-30 2013-07-03 深圳富泰宏精密工业有限公司 使用电子装置时缓解视觉疲劳的系统及方法
CN103903583A (zh) * 2014-03-18 2014-07-02 友达光电股份有限公司 用于视觉疲劳判别的液晶显示装置及其屏幕闪烁方法
CN107562213A (zh) * 2017-10-27 2018-01-09 网易(杭州)网络有限公司 视疲劳状态的检测方法、装置以及头戴式可视设备
CN108596106A (zh) * 2018-04-26 2018-09-28 京东方科技集团股份有限公司 基于vr设备的视觉疲劳识别方法及其装置、vr设备

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
BRPI0712837B8 (pt) * 2006-06-11 2021-06-22 Volvo Tech Corporation método para determinação e análise de uma localização de interesse visual
CN102846324B (zh) * 2012-09-29 2015-01-21 中国人民解放军第四军医大学 具有监测人体疲劳能力的眼镜
CN103677270B (zh) * 2013-12-13 2016-08-17 电子科技大学 一种基于眼动跟踪的人机交互方法
US10564714B2 (en) * 2014-05-09 2020-02-18 Google Llc Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects
CN204065594U (zh) * 2014-07-08 2014-12-31 镇江万新光学眼镜有限公司 一种可调控眼部湿润的眼镜
CN104240446A (zh) * 2014-09-26 2014-12-24 长春工业大学 基于人脸识别的疲劳驾驶预警系统
CN104778396B (zh) * 2015-04-29 2019-01-29 惠州Tcl移动通信有限公司 一种基于环境筛选帧的眼纹识别解锁方法及系统
CN106897725A (zh) * 2015-12-18 2017-06-27 西安中兴新软件有限责任公司 一种判断用户视力疲劳的方法及装置
CN105513280A (zh) * 2016-01-15 2016-04-20 苏州大学 疲劳驾驶检测方法
CN106073805B (zh) * 2016-05-30 2018-10-19 南京大学 一种基于眼动数据的疲劳检测方法和装置
CN106168854A (zh) * 2016-06-30 2016-11-30 北京小米移动软件有限公司 视力保护的提醒方法及装置
CN107341468B (zh) * 2017-06-30 2021-05-04 北京七鑫易维信息技术有限公司 驾驶员状态识别方法、装置、存储介质及处理器
CN107463254A (zh) * 2017-07-24 2017-12-12 中科创达软件科技(深圳)有限公司 一种视力保护方法、装置及终端
CN207207753U (zh) * 2017-08-07 2018-04-10 江苏速度智能科技有限公司 一种自带疲劳驾驶预警功能的低速电动车仪表
CN107645590A (zh) * 2017-08-16 2018-01-30 广东小天才科技有限公司 一种用眼疲劳提醒方法及用户终端

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103186609A (zh) * 2011-12-30 2013-07-03 深圳富泰宏精密工业有限公司 使用电子装置时缓解视觉疲劳的系统及方法
CN103903583A (zh) * 2014-03-18 2014-07-02 友达光电股份有限公司 用于视觉疲劳判别的液晶显示装置及其屏幕闪烁方法
CN107562213A (zh) * 2017-10-27 2018-01-09 网易(杭州)网络有限公司 视疲劳状态的检测方法、装置以及头戴式可视设备
CN108596106A (zh) * 2018-04-26 2018-09-28 京东方科技集团股份有限公司 基于vr设备的视觉疲劳识别方法及其装置、vr设备

Also Published As

Publication number Publication date
US20210042497A1 (en) 2021-02-11
CN108596106B (zh) 2023-12-05
US11132544B2 (en) 2021-09-28
CN108596106A (zh) 2018-09-28

Similar Documents

Publication Publication Date Title
WO2019206145A1 (zh) 视觉疲劳识别方法、视觉疲劳识别装置、虚拟现实设备和存储介质
JP7095030B2 (ja) 仮想現実コンテンツのビデオレンダリング速度の調整および立体画像の処理
KR102000888B1 (ko) 이미지들의 스트림을 디스플레이하기 위한 시스템 및 방법
JP7178403B2 (ja) ロバストなバイオメトリックアプリケーションのための詳細な眼形状モデル
JP2022051873A (ja) 眼瞼形状推定
CN106484116B (zh) 媒体文件的处理方法和装置
US9875547B2 (en) Method and apparatus for adjusting stereoscopic image parallax
WO2016180224A1 (zh) 一种人物图像处理方法及装置
RU2672502C1 (ru) Устройство и способ для формирования изображения роговицы
CN108280418A (zh) 脸部图像的欺骗识别方法及装置
US11385710B2 (en) Geometric parameter measurement method and device thereof, augmented reality device, and storage medium
CN104036169B (zh) 生物认证方法及生物认证装置
CN109993115A (zh) 图像处理方法、装置及可穿戴设备
CN114391117A (zh) 眼睛跟踪延迟增强
CN111985268A (zh) 一种人脸驱动动画的方法和装置
CN106774929B (zh) 一种虚拟现实终端的显示处理方法和虚拟现实终端
AU2018327983A1 (en) Techniques for providing virtual light adjustments to image data
EP4285203A1 (en) Context-aware extended reality systems
JP6221292B2 (ja) 集中度判定プログラム、集中度判定装置、および集中度判定方法
WO2020051781A1 (en) Systems and methods for drowsiness detection
CN111723636B (zh) 利用视动反应的欺骗检测
CN113552947A (zh) 虚拟场景的显示方法、装置和计算机可读存储介质
CN109542230B (zh) 图像处理方法、装置、电子设备及存储介质
CN107092853A (zh) 拍照方法及装置
CN108537552B (zh) 基于透镜的支付方法、装置及系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19793237

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19793237

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 14/05/2021)