WO2021037788A1 - Device for detecting drug influence based on visual data - Google Patents

Device for detecting drug influence based on visual data

Info

Publication number
WO2021037788A1
Authority
WO
WIPO (PCT)
Prior art keywords
eye
metric
scanner device
person
vision
Prior art date
Application number
PCT/EP2020/073610
Other languages
French (fr)
Inventor
Jenny Johansson
Stefanie NAJAFI
Original Assignee
Eyescanner Technology Sweden Ab
Priority date
Filing date
Publication date
Priority claimed from SE2030188A external-priority patent/SE2030188A1/en
Application filed by Eyescanner Technology Sweden Ab filed Critical Eyescanner Technology Sweden Ab
Publication of WO2021037788A1 publication Critical patent/WO2021037788A1/en


Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6898Portable consumer electronic devices, e.g. music players, telephones, tablet computers
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/163Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4845Toxicology, e.g. by detection of alcohol, drug or toxic products
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4863Measuring or inducing nystagmus
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/02Subjective types, i.e. testing apparatus requiring the active assistance of the patient
    • A61B3/08Subjective types, i.e. testing apparatus requiring the active assistance of the patient for testing binocular or stereoscopic vision, e.g. strabismus
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/11Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils
    • A61B3/112Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils for measuring diameter of pupils
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement

Definitions

  • the present disclosure relates to methods, uses, and portable devices for detecting drug influence in a person based on a visual representation of an eye of the person captured by one or more vision-based sensors.
  • BACKGROUND It is sometimes desired to determine whether a person is under the influence of one or more drugs, e.g., in a traffic police control situation.
  • a common way of determining if a person is under the influence of drugs is to manually observe the eyes of the person in order to detect signs of drug influence.
  • Manual testing is, however, time-consuming, and may not be entirely reliable, since the test result is at least partly influenced by the technique and judgement of the person or persons carrying out the manual test.
  • Different types of drugs also give different symptoms, which means that some experience is required by the person carrying out the test in order to reliably detect drug influence.
  • US 2018/0333092 A1 discloses an ocular response testing device for testing eyes of a patient to screen for driving under the influence of drugs or alcohol.
  • US 9,357,918 B1 discloses a system and method for drug screening and monitoring pupil reactivity and voluntary and involuntary eye muscle function. Automating the test procedure may provide more reliable test results obtained in a more efficient manner. There is, however, a continuing need for more robust ways to automatically determine if a person is under the influence of drugs. Also, the equipment necessary for carrying out the automated tests should be kept at a minimum.
  • This object is at least in part obtained by an eye-scanner device for detecting drug influence in a person based on a visual representation of an eye of the person.
  • the device comprises a vision-based sensor, a light source, and a control unit.
  • the vision-based sensor is arranged to capture a visual representation of the eye.
  • the control unit is arranged to determine a pupil size metric, a pupil’s reaction to light metric, a horizontal and/or vertical nystagmus metric, and an eye crossing ability metric.
  • the control unit is arranged to detect drug influence in the person based on the pupil size metric, the pupil’s reaction to light metric, the horizontal and/or vertical nystagmus metric, and the eye crossing ability metric.
  • the disclosed device measures four key metrics on which the drug detection test is based, which provides for a reliable detection.
  • the proposed device does not require contact with the person's face, which is an advantage.
  • the test procedure provides reliable test results obtained in an efficient manner and in a robust way, while keeping the necessary equipment for carrying out the automated tests at a minimum.
  • the device is portable and provides for a convenient drug testing device which can be incorporated in existing smartphones, which is an advantage.
  • the vision-based sensor comprises a camera sensor arranged on a smartphone, tablet, or a mobile computing device. This way, bulky vision-based sensor equipment may be omitted, which provides a more convenient system.
  • the light source is comprised in a smartphone, tablet, or a mobile computing device. This way, additional light source equipment may be omitted, which provides a more convenient system.
  • the control unit is arranged in physical connection to the vision-based sensor. This way, the detection of drug influence in the person can be done directly in the eye-scanner device.
  • the control unit is arranged distanced from the vision-based sensor in an external processing resource. This way, the vision-based sensor can upload captured image data of the eye to the external processing resource, preferably via a wireless link. This allows more processing power to be utilized.
  • drug influence in the person is detected if one or more of the metrics fail to meet a respective metric pass criterion. This provides a quick and efficient way of detection of drug influence in the person.
  • drug influence in the person is detected based on a weighted combination of the metrics. This provides a more advanced way of detection of drug influence in the person.
  • the vision-based sensor is arranged to capture a time sequence of images of the eye and/or a video sequence of the eye.
  • the images and/or video sequence is then subject to image processing in order to determine the four different measurement metrics discussed above.
  • control unit is arranged to gather one or more statistics associated with the metrics, to anonymize the gathered statistics, and to store the anonymized statistics. This enables analysis of drug influence.
  • the data points may also be associated with respective time stamps and/or geographical positions.
  • the geographical positions may be quantized on, e.g., city or area level in order to not impact anonymization.
  • the eye-scanner device comprises a display with a visible marker.
  • the display can then be used to provide instructions to the test subject for, e.g., moving the gaze or directing the gaze at a given location on the screen.
  • the eye-scanner device comprises a distance sensor arranged to determine the distance between the vision-based sensor and the eye of the person. This facilitates obtaining the size and movement measurements of the four tests, discussed above, in terms of absolute values.
  • the distance sensor comprises a camera sensor arranged on a smartphone, tablet, or a mobile computing device. This provides a convenient system.
  • the distance sensor is arranged to be calibrated using a known reference in proximity to the eye. This can improve the accuracy of the determined distance.
  • the distance sensor comprises a plurality of camera sensors.
  • the distance can thus be determined from techniques involving, i.a., distances between camera lenses (i.e. sensors), field-of-view angles, and geometric derivations.
  • the distance sensor comprises a radar, a light detection and ranging (LIDAR) device, and/or any other light-based ranging device, such as depth sensing using infrared projection. This can improve accuracy of the determined distance.
  • the display comprises one or more test indicators arranged to indicate that a capture of a visual representation of the eye is ongoing. This can guide the person under test to keep looking at the device until the capture is complete.
  • the display comprises one or more distance indicators arranged to indicate that the distance between the vision-based sensor and the eye of the person is within a preferred range. This provides clear feedback to the person under test which enables the preferred distance to be maintained.
  • the display comprises one or more capture indicators arranged to indicate that the vision-based sensor is capable of capturing the visual representation of the eye.
  • the capture indicators can thereby guide the person under test to maintain capturing capabilities.
  • the eye-scanner comprises a speaker device arranged to communicate instructions to the person. This way, the person can be guided while not directly looking at the display, which is an advantage.
  • the pupil size metric, the pupil’s reaction to light metric, the horizontal and/or vertical nystagmus metric, and the eye crossing ability metric are, at least partly, determined based on relative measurements.
  • the relative measurements may eliminate the need of determining the absolute distance between the vision-based sensor and the eye of the person, or at least reduce the required accuracy of the distance, depending on the desired accuracy of the final measurement metrics of the four tests.
  • Also disclosed are control units, computer programs, computer readable media, computer program products, and uses associated with the above discussed advantages.
  • Figures 1-2 schematically illustrate example eye-scanner systems
  • Figures 3A, 3B schematically illustrate eye measurement metrics
  • Figure 4 is a flow chart illustrating details of example methods
  • Figure 5 schematically illustrates a processing device
  • Figure 6 schematically illustrates a computer program product.
  • an Eye-scanner which, via the camera of a smartphone or other vision-based sensor device, automatically detects drug-affected persons.
  • the disclosed methods and devices allow for a more efficient and reliable detection of drug influence in a person.
  • the proposed device does not require contact with the person's face, such as the ski goggle-like frame described in US 9,357,918 B1.
  • the Eye-scanner is arranged to, within a few minutes, measure four different measurement values or metrics by eye tracking and pupil observation: pupil size, pupil’s reaction to light, control of horizontal and/or vertical nystagmus, and the ability of the person to cross eyes.
  • the disclosed device measures four key metrics on which the drug detection test is based, which provides for a more reliable detection result compared to the prior art based on measurement of less than four metrics.
  • Figure 1 shows a system 100 for detecting drug influence in a person, which is based on an eye-scanner device 120.
  • the system comprises a vision-based sensor 121 for capturing visual representations such as an image, a time sequence of images, and/or a video sequence of an eye 110 of the person subject to drug influence testing.
  • the system 100 also comprises a light source 122 and a control unit 125 and/or external processing resources 130 for detecting drug influence in a person by image processing of the captured visual representation.
  • the external processing resources may comprise data storage for storing anonymized test results and statistics.
  • Figure 2 shows an embodiment 200 of the proposed system where the vision-based sensor 121, light source 122, and control unit 125 are comprised in a smartphone 210, tablet, or the like.
  • This system is portable and provides for a convenient drug testing device which can be incorporated in existing smartphones, which is an advantage.
  • the light source 122 may be a photography light source normally found on smart phones, which can be used to stimulate the eye by turning the light source on and off. Such light sources may also be arranged with a controllable light intensity.
  • the vision-based sensor 121 is a stereo vision sensor comprising more than one vision sensing element, e.g., two or more camera lenses.
  • the vision-based sensor can then be arranged to determine a distance from the sensor to the pupil or pupils, and thereby estimate pupil size.
  • the pupil can be measured in terms of size 310, and its horizontal 330 and vertical 340 motion can be tracked over a time sequence of images or through a video sequence.
  • the eye-scanner 120 and/or the external processing resource 130 comprise control units 125, 135 configured to perform image processing, and in particular to locate the pupil in a captured image or sequence of images. This hardware will be discussed in more detail below in connection to Figure 5.
  • Figure 3A shows an eye 300 where a pupil size measure 310 is indicated.
  • the pupil size test checks if the pupil size is within the range of 2.0-8.0 mm, and preferably in the range 3.0-6.5 mm. According to some aspects, if the pupil size of the person under test falls outside the normal range, a positive result is generated; otherwise a negative result is generated.
  • the pupil size is used to derive a first weighting value which is based on how far outside the range the pupils of the person under test are.
  • For a pupil size near the centre of the normal range, a relatively low value is output, which value then increases as the pupil size approaches the boundary of the normal range.
  • the value then continues to increase as the pupil size goes beyond the normal range.
  • the ‘normality’ of a pupil size can be quantified more finely, e.g., on a scale from 1 to 100, compared to a binary normal/abnormal quantification.
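The finer-grained 1-100 quantification of pupil-size normality could be realized along the following lines. This is a minimal sketch: the disclosure specifies the 2.0-8.0 mm and 3.0-6.5 mm ranges and the 1-100 scale, but the particular mapping (linear in the deviation from the range centre, saturating at the hard limits) is an illustrative assumption.

```python
def pupil_size_weight(diameter_mm: float,
                      normal_range=(3.0, 6.5),
                      hard_range=(2.0, 8.0)) -> float:
    """Map a pupil diameter to a 1-100 'abnormality' weight.

    Diameters near the centre of the preferred range score low; the
    score grows towards the range boundary and keeps growing beyond it,
    saturating at 100 at the hard limits.  The linear mapping is an
    assumption, not part of the disclosure.
    """
    lo, hi = normal_range
    centre = (lo + hi) / 2.0
    # normalise the deviation so the hard limit maps to 1.0
    span = max(centre - hard_range[0], hard_range[1] - centre)
    deviation = min(abs(diameter_mm - centre) / span, 1.0)
    return round(1 + 99 * deviation, 1)
```

A 4.75 mm pupil (the range centre) would score 1, a 6.5 mm pupil roughly 54, and anything at or beyond the 2.0/8.0 mm hard limits 100.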
  • Measurement metric 2: measurement of the pupil's reaction to light
  • the test measures whether the pupil responds to light.
  • Light of a given intensity is first directed into the eye.
  • the pupil is then monitored for response to the light.
  • Slow or no reaction to light generates a positive result.
  • What constitutes a “slow” reaction can be pre-configured.
  • This metric may also be represented as a second weighting value, e.g., on a scale from 0 to 100.
  • a very fast reaction to light then gives a low value, e.g., below 10, while a slower reaction results in a higher value. Non-existent reactions would give a result on the order of 100.
  • the reaction to light can then be quantified more finely compared to the binary slow/fast.
  • the ambient light is taken into consideration when determining the value of the response metric.
  • a less pronounced reaction can be expected to light of a given intensity if the eye has previously been subjected to strong daylight for an extended period of time, compared to an eye which has not been subject to strong ambient light, e.g., at nighttime.
  • the light source used for testing, i.e., the light of the given intensity, can be modulated in terms of this intensity in dependence on the intensity of the ambient light.
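The slow/fast/non-existent reaction-to-light grading on the 0-100 scale could be sketched as below, assuming the pupil diameter is tracked per video frame from the moment the light source is switched on. The 10 % constriction threshold and the one-second "slow" latency are illustrative placeholders, not values from the disclosure.

```python
def light_reaction_weight(diameters_mm, frame_rate_hz=30.0,
                          min_constriction=0.10, slow_latency_s=1.0):
    """Score the pupil's reaction to a light pulse on a 0-100 scale.

    `diameters_mm` holds the tracked pupil diameter per frame, starting
    when the light is turned on.  A fast, clear constriction scores low;
    a slow or absent reaction scores high.  Thresholds are illustrative.
    """
    baseline = diameters_mm[0]
    for i, d in enumerate(diameters_mm):
        if d <= baseline * (1.0 - min_constriction):
            latency = i / frame_rate_hz
            # linear map: instant reaction -> 0, `slow_latency_s` -> 100
            return min(100.0, round(100.0 * latency / slow_latency_s, 1))
    return 100.0  # no measurable constriction
```

A constriction reached after three frames at 30 fps (0.1 s) would score 10, while an unresponsive pupil scores 100, matching the "non-existent reactions on the order of 100" description above.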
  • Nystagmus is a condition of involuntary eye movement. Due to the involuntary movement of the eye, it has also sometimes been called “dancing eyes”.
  • Figure 3A illustrates vertical 340 and horizontal eye movement 330.
  • the test checks whether involuntary twitching occurs when the person is looking from side to side (horizontal) or up and then down (vertical). Horizontal and/or vertical jerking of the pupil, i.e., if the pupil does not move linearly from end point to end point, generates a positive result. Again, the twitching can be quantized on a finer scale from 0 to 100.
  • the eye-scanner 120 may comprise a display 220 configured to show a marker which the test subject can be instructed to follow with the eye 110.
  • the marker can then move from side to side or up and down, while the pupil motion is monitored for any non-smooth motion indicating a level of nystagmus.
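One simple way to quantize the twitching on the 0-100 scale, assuming per-frame pupil positions tracked while the subject follows the moving marker, is to count frame-to-frame velocity reversals against the dominant direction of travel. This detector is an illustrative assumption; the disclosure only requires that non-linear end-point-to-end-point motion be detected.

```python
def nystagmus_weight(positions_px):
    """Score horizontal (or vertical) pupil motion smoothness, 0-100.

    Jerks show up as frame-to-frame velocity reversals against the
    overall direction of travel; their share of the trajectory is
    mapped to a 0-100 twitching score.
    """
    velocities = [b - a for a, b in zip(positions_px, positions_px[1:])]
    direction = 1.0 if positions_px[-1] >= positions_px[0] else -1.0
    reversals = sum(1 for v in velocities if v * direction < 0)
    return round(100.0 * reversals / max(len(velocities), 1), 1)
```

A monotone sweep scores 0, while a trajectory that repeatedly jerks backwards scores in proportion to how often it does so.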
  • Measurement value 4: checking the ability to cross the eyes.
  • Crossing one’s eyes means to direct left and right eye gaze towards a point located close to the eyes and centered between the eyes, as illustrated in Figure 3B. Eye crossing occurs, e.g., when trying to follow an object with both eyes as the object approaches the nose of the person.
  • the test checks the person's ability to cross his or her eyes when following an object with the eyes. Inability to cross eyes generates a positive outcome.
  • the display 220 can be used to assist in measuring this metric. Similar to the nystagmus metric, a marker can be shown on the display at a location where eye crossing is expected in a person not influenced by drugs. The movement of the pupils can then be monitored in order to discern if the test subject is capable of crossing eyes or not. Again, the relative location of the pupils can be detected based on image processing of the captured visual representation.
  • a preliminary analysis is carried out after each of the four tests to determine if the test was successful, i.e. to investigate if enough data has been collected for the analysis discussed above. If, e.g., the view of the vision-based sensor is obstructed during a test, the test is deemed unsuccessful and re-testing is prompted.
  • a detection of drug influence and potential recommendation for further testing is triggered in case one or more of the four tests are failed, i.e., if one or more measured metrics do not meet respective pre-determined pass criteria.
  • a combination of the above weighting values can be used as a basis for detecting drug influence. For instance, the four weighting values can be added together to represent drug influence likelihood on a scale from 0 to 400.
  • the results are anonymized and stored on a remote server such as the external processing resource 130, so that the statistics generated by the results can be used later.
  • Figures 1 and 2 schematically illustrate example eye-scanner devices 120 for detecting drug influence in a person based on a visual representation of an eye 110 of the person.
  • the device comprises a vision-based sensor 121, a light source 122, and a control unit 125, 135.
  • the vision-based sensor 121 is arranged to capture a visual representation of the eye 110.
  • the control unit 125, 135 is arranged to process the visual representation in order to determine the four metrics discussed above: a pupil size metric 310, a pupil’s reaction to light metric 320, a horizontal and/or vertical nystagmus metric 330, 340, and an eye crossing ability metric 350.
  • the control unit 125, 135 is arranged to detect drug influence in the person based on the pupil size metric 310, the pupil’s reaction to light metric 320, the horizontal and/or vertical nystagmus metric 330, 340, and the eye crossing ability metric 350.
  • the vision-based sensor 121 comprises a camera sensor arranged on a smartphone 210, tablet, or a mobile computing device.
  • existing smart devices such as mobile smartphones and the like can be used to implement the herein proposed techniques.
  • the vision-based sensor 121 may be arranged to capture a time sequence of images of the eye 110, and/or a video sequence of the eye 110. The images and/or video sequence is then subject to image processing in order to determine the four different measurement metrics discussed above.
  • the light source 122 may also be comprised in the smartphone 210, tablet, or mobile computing device.
  • the eye-scanner devices 120 may comprise a distance sensor 126 arranged to determine the distance between the vision-based sensor 121 and the eye 110 of the person. By using the determined distance, the control unit can carry out the analysis on measured values captured in a range of different distances, e.g. between 0 and 2 meters. Preferably, however, the distance is in a preferred range of 10 to 50 centimeters.
  • the distance sensor 126 comprises a camera sensor arranged on a smartphone 210, tablet, or a mobile computing device.
  • existing smart devices such as mobile smartphones and the like can be used to implement the herein proposed techniques.
  • the distance may be determined, e.g., using reference sizes or estimated reference sizes.
  • the distance sensor 126 is arranged to be calibrated using a known reference in proximity to the eye 110 of the person under test before or after the tests.
  • the known reference may, e.g., be a paper with a checkerboard with known dimensions, or something similar.
  • the distance sensor 126 may comprise a plurality of camera sensors, e.g. as in a stereoscopic camera.
  • existing smart devices such as mobile smartphones and the like can be used to implement the herein proposed techniques.
  • the distance can thus be determined from techniques involving, i.a., distances between camera lenses (i.e. sensors), field-of-view angles, and geometric derivations.
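The geometric derivation from a two-lens sensor reduces, in the standard rectified pinhole model, to depth = focal length (in pixels) times the lens baseline, divided by the disparity of the pupil between the two images. A minimal sketch, with example parameter values that are assumptions rather than figures from the disclosure:

```python
def stereo_distance_m(baseline_m: float, focal_px: float,
                      disparity_px: float) -> float:
    """Estimate sensor-to-eye distance from a stereo camera pair.

    Standard rectified pinhole stereo geometry: Z = f * B / d, where
    f is the focal length in pixels, B the baseline between the two
    lenses, and d the pupil's disparity between the two images.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_m * focal_px / disparity_px
```

With, say, a 12 mm baseline, a 1500 px focal length, and a 60 px disparity, the eye would be estimated at 0.30 m, inside the preferred 10-50 cm range mentioned above.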
  • the distance sensor 126 comprises a radar, a light detection and ranging (LIDAR) device, and/or any other light-based ranging device, such as depth sensing using infrared projection.
  • existing smart devices such as mobile smartphones and the like can be used to implement the herein proposed techniques.
  • the distance sensor 126 may comprise an ultra-sonic ranging device.
  • Three- dimensional ultrasound technology for human-machine interaction is an emerging technology which may be an integral part of future smartphones and the like.
  • the size measurements and movement measurements, discussed above, of the four tests may be measured in terms of absolute values. However, alternatively, or in combination, these measurements may be made in relative terms.
  • the pupil size metric 310, the pupil’s reaction to light metric 320, the horizontal and/or vertical nystagmus metric 330, 340, and the eye crossing ability metric 350 are, at least partly, determined based on relative measurements.
  • For instance, the size of the pupil, e.g., its diameter, may be measured relative to the size of the iris, e.g., its diameter. Other normalizations are also possible.
  • the relative measurements may eliminate the need of determining the absolute distance between the vision-based sensor 121 and the eye 110 of the person, or at least reduce the required accuracy of the distance, depending on the desired accuracy of the final measurement metrics of the four tests.
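A pupil-to-iris normalization of this kind could look as follows. Because both diameters are taken in pixels from the same image, the unknown pixels-per-millimetre scale cancels out, which is why the absolute distance is not needed. The conversion back to millimetres via a commonly cited average iris diameter of about 11.7 mm is an illustrative assumption, not part of the disclosure.

```python
def relative_pupil_size(pupil_px: float, iris_px: float,
                        iris_mm: float = 11.7) -> float:
    """Estimate pupil diameter without a distance measurement.

    The pupil/iris pixel ratio is scale-invariant; multiplying by an
    assumed average iris diameter (~11.7 mm) gives an approximate
    absolute pupil diameter in millimetres.
    """
    ratio = pupil_px / iris_px
    return ratio * iris_mm
```

For example, a pupil measured at 40 px against an iris of 117 px yields an estimated 4.0 mm pupil, regardless of how far the eye is from the camera.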
  • the eye-scanner device 120 optionally also comprises a display 220 with a visible marker or the like.
  • the display can then be used to provide instructions to the test subject for, e.g., moving the gaze or directing the gaze at a given location on the screen. For instance, when determining the eye-crossing ability metric, the test person may be directed to look at a marker on the screen, which marker is positioned so as to induce eye crossing.
  • the display may also be configured to show markers moving from right to left and/or up and down in order to stimulate horizontal or vertical pupil movement during a nystagmus test.
  • the display 220 may also display the sequence of images and/or video sequence of the eye 110 in a live feed.
  • the display may comprise one or more test indicators arranged to indicate that a capture of a visual representation of the eye 110 (i.e. a test) is ongoing.
  • the indicators can, for example, be circles displayed on top of the pupils in the live feed. These circles can track the pupils on the display continuously during the capture. The circles may only be present during one of the tests. Alternatively, the circles are present at all times during the capture and change shape or color during one of the tests. This can guide the person under test to keep looking at the device until the capture is complete.
  • the display may comprise one or more distance indicators arranged to indicate that the distance between the vision-based sensor 121 and the eye 110 of the person is within a preferred range. If the distance is larger than preferred, the indicators may change shape or color. If the distance is smaller than preferred, the indicators may change in a different way.
  • the distance indicators may also be circles displayed on top of the pupils. As an example, the circles are green when the distance is in the preferred range, yellow when the distance is too short and red when the distance is too long. The distance indicators can thereby guide the person under test to maintain the distance within the preferred range.
  • the display may comprise one or more capture indicators arranged to indicate that the vision-based sensor 121 is capable of capturing the visual representation of the eye 110, which means, i.a., that the eye is in the field of view of the sensor and that the eye is in focus.
  • the capture indicators can thereby guide the person under test to maintain capturing capabilities.
  • the capture indicators may also be circles displayed on top of the pupils, which appear on the display when the vision-based sensor 121 is capable of capturing the visual representation of the eye 110.
  • the eye-scanner device 120 comprises a speaker device 127 arranged to communicate instructions to the person.
  • the speaker device 127 may be arranged on a smartphone 210, tablet, or a mobile computing device.
  • the instruction could be, e.g., “look into the camera”, “the light source 122 is about to be turned on, keep looking straight ahead”, and “without moving your head, start looking to the left, continue moving your eyes until commanded to stop”. This way, the person can be guided while not directly looking at the display, which is an advantage.
  • the control unit 125 may be arranged in physical connection to the vision-based sensor 121, e.g., as part of the processing circuitry of a smart phone 210. However, the control unit 135 may also be arranged distanced from the vision-based sensor in an external processing resource 130, such as a remote server or cloud-based processing resource. The vision-based sensor then uploads captured image data of the eye 110 to the external processing resource, preferably via a wireless link 140.
  • each metric can be associated with a pass criterion.
  • the pupil’s size metric can be associated with a range of pupil sizes. If the pupil size 310 of the person under test is not within the acceptable range, then the test is failed, otherwise it is passed.
  • the pupil’s reaction to light metric 320, the horizontal and/or vertical nystagmus metric 330, 340, and the eye crossing ability metric 350 may be associated with respective pass criteria. The person is detected to be under likely influence of drugs if one or more of the metrics fail to meet the respective metric pass criterion.
  • the drug detection test can also be more advanced, e.g., comprising a weighted combination of the four metrics. In this case, if a person only barely passes the four different tests, a likely drug influence may still be detected based on a combination of the four test-metrics.
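The weighted combination and the "barely passes everything, still flagged" case could be sketched as below. The unit coefficients reproduce the simple 0-400 sum mentioned above; the 160-point decision threshold is an illustrative placeholder, not a value from the disclosure.

```python
def drug_influence_score(weights, coefficients=None):
    """Combine the four 0-100 test weights into one decision score.

    With unit coefficients this is the plain 0-400 sum; non-unit
    coefficients let individual metrics count more or less.
    """
    coefficients = coefficients or [1.0] * len(weights)
    return sum(c * w for c, w in zip(coefficients, weights))

def detect_drug_influence(weights, threshold=160.0):
    """Flag likely drug influence when the combined score is high,
    even if no single metric fails its own pass criterion outright.
    The threshold is an assumed example value."""
    return drug_influence_score(weights) >= threshold
```

Four weights of 50 each, for instance, would each sit below a typical single-metric fail level, yet their combined score of 200 would still trigger detection.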
  • control unit 125, 135 may optionally be arranged to gather one or more statistics or data points associated with the metrics, to anonymize the gathered statistics, and to store the anonymized statistics. This enables analysis of drug influence.
  • the data points may also be associated with respective time stamps and/or geographical positions. The geographical positions may be quantized on, e.g., city or area level in order to not impact anonymization.
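An anonymized data point with a coarse timestamp and city-level location could be assembled as follows. The field names, the hour-level timestamp granularity, and the salted hash in place of a device identifier are all illustrative assumptions.

```python
import hashlib
import time

def anonymized_record(metrics, city, device_id, salt="eyescanner"):
    """Build an anonymized data point for central storage.

    The record keeps the metric values, an hour-level timestamp, and a
    location quantized to city level; the device identifier is replaced
    by a truncated salted one-way hash so stored records cannot easily
    be traced back to a person.  Field names are illustrative.
    """
    return {
        "metrics": dict(metrics),
        # hour-level timestamp: fine enough for trend analysis,
        # coarse enough not to single out an individual test
        "timestamp_hour": int(time.time() // 3600),
        "location": city,  # city/area level only
        "source": hashlib.sha256((salt + device_id).encode()).hexdigest()[:16],
    }
```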
  • the control unit 125, 135 may optionally be arranged to gather one or more data points associated with the metrics, to associate the gathered statistics with the person under test, and to store the associated statistics. This way, it is possible to gather base values of the metrics of a person under test when he or she is not under the influence of a drug.
  • the control unit may be arranged to compare the new test data with the stored base values. Deviations between the new data and the base values can be an indication that the person is under the influence of a drug.
  • the drug detection test may comprise a combination of comparing the test metrics to nominal values (i.e. predetermined threshold values) and comparing them to the person's base values.
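The comparison of new test data against a person's stored sober base values could be sketched as below. The relative-deviation approach and the 25 % tolerance are assumptions; the disclosure only states that deviations from the base values can indicate drug influence.

```python
def deviation_from_baseline(current, baseline, tolerance=0.25):
    """Compare new test metrics to a person's stored sober base values.

    Each metric is compared in relative terms; a metric deviating more
    than `tolerance` (an assumed 25 % default) from its base value is
    flagged.  Returns the list of flagged metric names.
    """
    flagged = []
    for name, base in baseline.items():
        value = current.get(name)
        if value is None or base == 0:
            continue  # metric missing from this test, or unusable base
        if abs(value - base) / abs(base) > tolerance:
            flagged.append(name)
    return flagged
```

A pupil diameter of 6.0 mm against a stored base of 4.0 mm (a 50 % deviation) would be flagged, while a light-reaction latency within a few percent of its base value would not.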
  • the eye-scanner device 120 discussed above is arranged to execute a method comprising:
  • This representation may comprise one or more images, or a video sequence, or a combination of images and video sequence.
  • the pupil size may be estimated from the one or more images, e.g., as a diameter of the pupil.
  • the location and boundary of the pupil may be detected by means of image processing and/or eye tracking techniques.
  • Nystagmus may be detected by comparing a sequence of captured images or a video sequence of the eye 110 in order to detect, e.g., involuntary twitching.
  • S4 Determine an ability to cross eyes metric 350 based on the visual representation, i.e., an ability to move the gaze direction laterally in a controlled manner.
  • the person may be instructed to attempt to cross eyes or be instructed to follow a marker on a display 220 or be instructed to follow an external object such as a pen which is moved in a way to cause eye crossing.
  • Ability to cross eyes may be determined based on image processing techniques.
  • S5 Determine a measurement of pupil's reaction to light metric, where the light has a given intensity.
  • a light source 320 arranged on or in connection to the vision-based sensor 121 may be used to stimulate a reaction, which reaction can then be monitored based on the representation.
  • the disclosed method then comprises determining S6 a level of drug influence associated with the person based on the determined metrics, i.e., based on the abilities of the eye or pair of eyes.
  • the determining of drug influence may be a yes/no result, or a level on a scale.
  • the determining may also be based on a sequence of tests, where drug influence is determined based on an ability of the person to pass the sequence of tests, e.g., associated with S2-S5 above.
  • the detection is based on four measurements S2, S3, S4, S5, giving a robust detection method that uses all four metrics to detect drug influence in a person.
  • FIG. 5 schematically illustrates, in terms of a number of functional units, the components of a control unit 125, 135 according to an embodiment of the present disclosure.
  • the control unit may be comprised in a mobile device such as a smart phone 210 or in an external processing resource 130.
  • Processing circuitry 510 is provided using any combination of one or more of a suitable central processing unit CPU, multiprocessor, microcontroller, digital signal processor DSP, etc., capable of executing software instructions stored in a computer program product, e.g. in the form of a storage medium 530.
  • the processing circuitry 510 may further be provided as at least one application specific integrated circuit ASIC, or field programmable gate array FPGA.
  • the processing circuitry 510 is configured to cause the control unit 125, 135 to perform a set of operations, or steps, such as the methods discussed in connection to Figure 4.
  • the storage medium 530 may store the set of operations.
  • the processing circuitry 510 may be configured to retrieve the set of operations from the storage medium 530 to cause the control unit 125, 135 to perform the set of operations.
  • the set of operations may be provided as a set of executable instructions.
  • the processing circuitry 510 is thereby arranged to execute methods as herein disclosed.
  • the storage medium 530 may also comprise persistent storage, which, for example, can be any single one or combination of magnetic memory, optical memory, solid state memory, memory card or even remotely mounted memory.
  • the control unit 125, 135 may further comprise an interface 520 for communications with at least one external device 540.
  • the interface 520 may comprise one or more transmitters and receivers, comprising analogue and digital components and a suitable number of ports for wireline or wireless communication.
  • the processing circuitry 510 controls the general operation of the control unit 125, 135 e.g. by sending data and control signals to the interface 520 and the storage medium 530, by receiving data and reports from the interface 520, and by retrieving data and instructions from the storage medium 530.
  • Other components, as well as the related functionality, of the control node are omitted in order not to obscure the concepts presented herein.
  • Figure 6 schematically illustrates a computer program product 600, comprising a set of operations 610 executable by the control unit 125, 135.
  • the set of operations 610 may be loaded into the storage medium 530.
  • the set of operations may correspond to the methods discussed above in connection to Figure 4 or to the different operations discussed above.
  • the computer program product 600 is illustrated as an optical disc, such as a CD (compact disc) or a DVD (digital versatile disc) or a Blu-Ray disc.
  • the computer program product could also be embodied as a memory, such as a random-access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), or an electrically erasable programmable read-only memory (EEPROM) and more particularly as a non-volatile storage medium of a device in an external memory such as a USB (Universal Serial Bus) memory or a Flash memory, such as a compact Flash memory.
  • While the computer program is here schematically shown as a track on the depicted optical disc, the computer program can be stored in any way which is suitable for the computer program product.
  • an eye-scanner application is started, which comprises five subsequent stages: START, which is the first interface that introduces the person under test to the application wherein a test can be initiated; PREAMBLE, which is an interface presenting the stages and instructions, and gives the possibility to terminate; TEST DATA, which is an interface wherein information about the person under test can be inputted; MEASUREMENTS, which starts the measurements and displays a measurement interface; and RESULTS, which is an interface displaying results. Details about the five stages follow below.
  • the start interface is the first interface when the application is launched. It displays that the eye-scanner application has been launched and enables initialization of testing. After the subsequent stages have been cycled through, the application is returned to the start interface.
  • the preamble interface is launched after the initialization of testing.
  • This interface instructs and prepares the person under test for testing, i.e. what will happen (the four tests) and what is required of the person under test.
  • This interface thereby allows the person under test to confirm that a test should be started, which is useful if the initialization of testing is initiated by accident on the start interface.
  • the preamble interface also comprises means for termination, which returns the application to the start interface, and means for confirmation, which is a continuation to the next stage.
  • the test data interface takes information about the person under test as input, such as age and sex.
  • the information can be useful for setting threshold values for the four test-metrics indicating influence of drugs.
  • the information can also relate to drug intake, such as type of drug, quantity, and time of use, which can be useful when determining threshold values of the test metrics in a controlled experiment.
  • the test data interface also comprises means for termination, which returns the application to the start interface, and means for confirmation, which is a continuation to the next stage.
  • the confirmation can only be initiated when a minimum amount of information has been inputted, such as age and sex.
  • the four tests may be carried out in a single sequence, or they may be initiated individually.
  • the camera is turned on and a live feed of the camera is displayed.
  • capture indicators in the form of circles displayed on top of the pupils appear on the live feed, and remain there as long as the camera sensor 121 is capable of capturing the visual representation of the eye 110. This can increase the probability of a successful capture during one of the four tests.
  • the capture indicators can thereby guide the person under test to maintain capturing capabilities.
  • the measurements interface also comprises a distance indicator that indicates that the distance between the camera sensor 121 and the eye 110 of the person under test is within a preferred range. This provides clear feedback to the person under test, which enables the preferred distance to be maintained.
  • the four tests may be carried out automatically or be initiated by the person under test.
  • An information view successively shows information about the ongoing test and, together with markers, guides the person under test what to do.
  • the measurements interface comprises three parts: an information view, which shows information about the current test (out of four) and provides feedback to the person under test; a film view, which shows a live feed of the camera; and means for termination.
  • the information view provides information about the ongoing test. After one of the four tests has successfully been completed, the information view provides new information about a new ongoing test.
  • the information view comprises a headline stating the ongoing test and an instruction to the person under test, e.g. “look into the camera”, “the light source 320 is about to be turned on, keep looking straight ahead”, and “without moving your head, start looking to the left, continue moving your eyes until commanded to stop”.
  • the information view also comprises a capture indicator and a distance indicator.
  • the information view further comprises a counter that counts down the time to the start of a test. The person under test thereby has time to move his or her eyes into position before the test starts.
  • the information view also comprises information about what is currently being processed, e.g. “looking for the pupil”. Some or all of the information on the information view may also be communicated to the person through voice messages (which may be pre-recorded).
  • the film view is a live feed of the camera. This provides feedback to the person under test such that he or she can hold the phone properly.
  • the capture indicators in the form of circles displayed on top of the pupils reinforce this feedback.
  • the means for termination allows the person under test to abort the tests and return to the start interface.
  • the results interface is initiated after the measurement stage is successfully completed.
  • the analysis of the measurement metrics can take place on the processing circuitry of the smart phone or on an external processing resource 130, such as a remote server or cloud-based processing resource. It is also possible that some of the analysis is carried out on the processing circuitry of the smart phone and the rest on the external processing resource.
  • the results interface shows that all tests have been completed and a summary of the results, which comprises age, sex, date, and a test code.
  • a recommendation for further management may be provided on the results interface.
  • recommendation for additional drug tests such as blood or urine tests.
  • the test code is a unique code for a completed cycle of the stages. It can be used for connecting the four tests to the results of additional drug tests such as blood or urine tests.
  • the results interface also comprises means for termination, which returns the application to the start interface.


Abstract

An eye-scanner device (120) for detecting drug influence in a person based on a visual representation of an eye (110) of the person, the device comprising a vision-based sensor (121), a light source (122) and a control unit (125, 135), wherein the vision-based sensor (121) is arranged to capture a visual representation of the eye (110), the control unit (125, 135) is arranged to determine a pupil size metric, a pupil's reaction to light metric, a horizontal and/or vertical nystagmus metric, and an eye crossing ability metric, wherein the control unit (125, 135) is arranged to detect drug influence in the person based on the pupil size metric, the pupil's reaction to light metric, the horizontal and/or vertical nystagmus metric, and the eye crossing ability metric.

Description

TITLE
DEVICE FOR DETECTING DRUG INFLUENCE BASED ON VISUAL DATA
TECHNICAL FIELD
The present disclosure relates to methods, uses, and portable devices for detecting drug influence in a person based on a visual representation of an eye of the person captured by one or more vision-based sensors.
BACKGROUND
It is sometimes desired to determine whether a person is under the influence of one or more drugs, e.g., in a traffic police control situation. A common way of determining if a person is under the influence of drugs is to manually observe the eyes of the person in order to detect signs of drug influence. Manual testing is, however, time consuming, and may not be entirely reliable since the test result is at least partly influenced by the technique and judgement ability of the person or persons carrying out the manual test. Different types of drugs also give different symptoms, which means that some experience is required by the person carrying out the test in order to reliably detect drug influence.
Some work has been made towards automating the procedure in order to increase the efficiency and reliability of drug influence detection;
US 2018/0333092 A1 discloses an ocular response testing device for testing eyes of a patient to screen for driving under the influence of drugs or alcohol.
US 9,357,918 B1 discloses a system and method for drug screening and monitoring pupil reactivity and voluntary and involuntary eye muscle function. Automating the test procedure may provide more reliable test results obtained in a more efficient manner. There is, however, a continuing need for more robust ways to automatically determine if a person is under the influence of drugs. Also, the necessary equipment for carrying out the automated tests should be kept at a minimum.
SUMMARY
It is an object of the present disclosure to provide an improved device for detecting drug influence in a person. This object is at least in part obtained by an eye-scanner device for detecting drug influence in a person based on a visual representation of an eye of the person. The device comprises a vision-based sensor, a light source, and a control unit. The vision-based sensor is arranged to capture a visual representation of the eye. The control unit is arranged to determine a pupil size metric, a pupil’s reaction to light metric, a horizontal and/or vertical nystagmus metric, and an eye crossing ability metric. The control unit is arranged to detect drug influence in the person based on the pupil size metric, the pupil’s reaction to light metric, the horizontal and/or vertical nystagmus metric, and the eye crossing ability metric. The disclosed device measures four key metrics on which the drug detection test is based, which provides for a reliable detection. The proposed device does not require contact with the person's face, which is an advantage. The test procedure provides reliable test results obtained in an efficient manner and in a robust way, while keeping the necessary equipment for carrying out the automated tests at a minimum. The device is portable and provides for a convenient drug testing device which can be incorporated in existing smartphones, which is an advantage.
According to aspects, the vision-based sensor comprises a camera sensor arranged on a smartphone, tablet, or a mobile computing device. This way, bulky vision-based sensor equipment may be omitted, which provides a more convenient system.
According to aspects, the light source is comprised in a smartphone, tablet, or a mobile computing device. This way, additional light source equipment may be omitted, which provides a more convenient system. According to aspects, the control unit is arranged in physical connection to the vision-based sensor. This way, the detection of drug influence in the person can be done directly in the eye-scanner device.
According to aspects, the control unit is arranged distanced from the vision-based sensor in an external processing resource. This way, the vision-based sensor can upload captured image data of the eye to the external processing resource, preferably via a wireless link. This allows for more processing power to be utilized.
According to aspects, drug influence in the person is detected if one or more of the metrics fail to meet a respective metric pass criterion. This provides a quick and efficient way of detection of drug influence in the person.
According to aspects, drug influence in the person is detected based on a weighted combination of the metrics. This provides a more advanced way of detection of drug influence in the person.
According to aspects, the vision-based sensor is arranged to capture a time sequence of images of the eye and/or a video sequence of the eye. The images and/or video sequence is then subject to image processing in order to determine the four different measurement metrics discussed above.
According to aspects, the control unit is arranged to gather one or more statistics associated with the metrics, to anonymize the gathered statistics, and to store the anonymized statistics. This enables analysis of drug influence. The data points may also be associated with respective time stamps and/or geographical positions. The geographical positions may be quantized on, e.g., city or area level in order to not impact anonymization.
According to aspects, the eye-scanner device comprises a display with a visible marker. The display can then be used to provide instructions to the test subject for, e.g., moving the gaze or directing the gaze at a given location on the screen.
According to aspects, the eye-scanner device comprises a distance sensor arranged to determine the distance between the vision-based sensor and the eye of the person. This can facilitate obtaining the size and movement measurements of the four tests, discussed above, in terms of absolute values.
According to aspects, the distance sensor comprises a camera sensor arranged on a smartphone, tablet, or a mobile computing device. This provides a convenient system.
According to aspects, the distance sensor is arranged to be calibrated using a known reference in proximity to the eye. This can improve the accuracy of the determined distance.
According to aspects, the distance sensor comprises a plurality of camera sensors. The distance can thus be determined from techniques involving, inter alia, distances between camera lenses (i.e. sensors), field-of-view angles, and geometric derivations.
According to aspects, the distance sensor comprises a radar, a light detection and ranging (LIDAR) device, and/or any other light-based ranging device, such as depth sensing using infrared projection. This can improve accuracy of the determined distance.
According to aspects, the display comprises one or more test indicators arranged to indicate that a capture of a visual representation of the eye is ongoing. This can guide the person under test to keep looking at the device until the capture is complete.
According to aspects, the display comprises one or more distance indicators arranged to indicate that the distance between the vision-based sensor and the eye of the person is within a preferred range. This provides clear feedback to the person under test which enables the preferred distance to be maintained.
According to aspects, the display comprises one or more capture indicators arranged to indicate that the vision-based sensor is capable of capturing the visual representation of the eye. The capture indicators can thereby guide the person under test to maintain capturing capabilities. According to aspects, the eye-scanner comprises a speaker device arranged to communicate instructions to the person. This way, the person can be guided while not directly looking at the display, which is an advantage.
According to aspects, the pupil size metric, the pupil’s reaction to light metric, the horizontal and/or vertical nystagmus metric, and the eye crossing ability metric are, at least partly, determined based on relative measurements. The relative measurements may eliminate the need of determining the absolute distance between the vision-based sensor and the eye of the person, or at least reduce the required accuracy of the distance, depending on the desired accuracy of the final measurement metrics of the four tests.
There is furthermore disclosed herein control units, computer programs, computer readable media, computer program products, and uses associated with the above discussed advantages.
The methods disclosed herein are associated with the same advantages as discussed above in connection to the eye-scanner device.
Generally, all terms used in the claims are to be interpreted according to their ordinary meaning in the technical field, unless explicitly defined otherwise herein. All references to "a/an/the element, apparatus, component, means, step, etc." are to be interpreted openly as referring to at least one instance of the element, apparatus, component, means, step, etc., unless explicitly stated otherwise. The steps of any method disclosed herein do not have to be performed in the exact order disclosed, unless explicitly stated. Further features of, and advantages with, the present invention will become apparent when studying the appended claims and the following description. The skilled person realizes that different features of the present invention may be combined to create embodiments other than those described in the following, without departing from the scope of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
Aspects of the present disclosure will now be described more fully with reference to the accompanying drawings, where
Figures 1-2 schematically illustrate example eye-scanner systems;
Figures 3A, 3B schematically illustrate eye measurement metrics;
Figure 4 is a flow chart illustrating details of example methods;
Figure 5 schematically illustrates a processing device; and
Figure 6 schematically illustrates a computer program product.
DETAILED DESCRIPTION
The invention will now be described more fully hereinafter with reference to the accompanying drawings, in which certain aspects of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments and aspects set forth herein; rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout the description.
It is to be understood that the present invention is not limited to the embodiments described herein and illustrated in the drawings; rather, the skilled person will recognize that many changes and modifications may be made within the scope of the appended claims.
There is disclosed herein a digital tool referred to as an Eye-scanner, which, via the camera of a smartphone or other vision-based sensor device, automatically detects drug-affected persons. The disclosed methods and devices allow for a more efficient and reliable detection of drug influence in a person. Compared to known devices for detecting drug influence, the proposed device does not require contact with the person's face, such as the ski goggle-like frame described in US 9,357,918 B1. The Eye-scanner is arranged to, within a few minutes, measure four different measurement values or metrics by eye tracking and pupil observation: pupil size, pupil’s reaction to light, control of horizontal and/or vertical nystagmus, and the ability of the person to cross eyes. Thus, the disclosed device measures four key metrics on which the drug detection test is based, which provides for a more reliable detection result compared to the prior art based on measurement of fewer than four metrics.
Figure 1 shows a system 100 for detecting drug influence in a person, which is based on an eye-scanner device 120. The system comprises a vision-based sensor 121 for capturing visual representations such as an image, a time sequence of images, and/or a video sequence of an eye 110 of the person subject to drug influence testing. The system 100 also comprises a light source 122 and a control unit 125 and/or external processing resources 130 for detecting drug influence in a person by image processing of the captured visual representation. The external processing resources may comprise data storage for storing anonymized test results and statistics.
Figure 2 shows an embodiment 200 of the proposed system where the vision-based sensor 121, light source 122, and control unit 125 are comprised in a smartphone 210, tablet, or the like. This system is portable and provides for a convenient drug testing device which can be incorporated in existing smartphones, which is an advantage.
The light source 122 may be a photography light source normally found on smart phones, which can be used to stimulate the eye by turning the light source on and off. Such light sources may also be arranged with a controllable light intensity.
According to some aspects, the vision-based sensor 121 is a stereo vision sensor comprising more than one vision sensing element, e.g., two or more camera lenses. The vision-based sensor can then be arranged to determine a distance from the sensor to the pupil or pupils, and thereby estimate pupil size.
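For illustration, a stereo-vision distance estimate and the back-projection of a pixel measurement to metric pupil size could be sketched as follows; the lens baseline, focal length, and disparity figures below are hypothetical placeholders, not parameters from the disclosure:

```python
# Illustrative sketch: distance from a two-lens (stereo) sensor by
# triangulation, then conversion of a pupil diameter measured in pixels
# into millimetres. All numeric values are hypothetical.

def stereo_distance_m(baseline_m, focal_px, disparity_px):
    """Classic pinhole-stereo relation: depth = baseline * focal / disparity."""
    return baseline_m * focal_px / disparity_px

def pupil_diameter_mm(diameter_px, distance_m, focal_px):
    """Back-project a pixel measurement to metric size at the given depth."""
    return diameter_px * distance_m / focal_px * 1000.0

# e.g., lenses 12 mm apart, 1000 px focal length, 40 px disparity:
d = stereo_distance_m(0.012, 1000, 40)   # 0.3 m
print(pupil_diameter_mm(15, d, 1000))    # 15 px at 0.3 m -> 4.5 mm
```

In practice the focal length and baseline would come from a camera calibration step rather than being hard-coded.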
In general, detecting the location and orientation of a pupil in an eye is known and will therefore not be discussed in more detail herein. With reference to Figures 3A and 3B, the pupil can be measured in terms of size 310, and its horizontal 330 and vertical 340 motion can be tracked over a time sequence of images or through a video sequence.
The eye-scanner 120 and/or the external processing resource 130 comprise control units 125, 135 configured to perform image processing, and in particular to locate the pupil in a captured image or sequence of images. This hardware will be discussed in more detail below in connection to Figure 5.
The four different measurement metrics on which the drug detection test is based will now be described in detail;
Measurement metric 1 - pupil size
Figure 3A shows an eye 300 where a pupil size measure 310 is indicated. The pupil size test checks if the pupil size is within the range of 2.0-8.0 mm, and preferably in the range 3.0-6.5 mm. According to some aspects, if the person under test falls outside the normal range, a positive result is generated, otherwise a negative result is generated.
According to some aspects, the pupil size is used to derive a first weighting value which is based on how far outside the range the pupils of the person under test are. Thus, if the pupil size is well within the normal range then a relatively low value is output, which value then increases as the pupil size comes closer to the boundary of the normal range. The value then continues to increase as the pupil size goes beyond the normal range. This way the ‘normality’ of a pupil size can be quantified more finely, e.g., on a scale from 1 to 100, compared to a binary normal/abnormal quantification.
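A minimal sketch of such a first weighting value follows. The piecewise-linear mapping, the use of the preferred 3.0-6.5 mm range, and the clamping distance are illustrative choices rather than values prescribed by the disclosure:

```python
# Illustrative mapping of pupil size to a 1-100 'abnormality' weight:
# low inside the preferred range, 50 at its boundary, rising to 100 as
# the size goes further beyond it. Parameters are placeholder choices.

def pupil_size_weight(size_mm, low=3.0, high=6.5, max_dev=3.0):
    centre = (low + high) / 2
    half_range = (high - low) / 2
    dev = abs(size_mm - centre)
    if dev <= half_range:
        # inside the preferred range: scale 1..50 by closeness to the boundary
        return 1 + 49 * dev / half_range
    # outside the range: scale 50..100 by how far beyond the boundary,
    # clamped at max_dev millimetres of excess
    excess = min(dev - half_range, max_dev)
    return 50 + 50 * excess / max_dev
```

For example, a pupil in the middle of the range maps to 1, one exactly at the boundary to 50, and one far outside to 100.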
Measurement metric 2 - measurement of pupil's reaction to light
The test measures whether the pupil responds to light. Light of a given intensity is first directed into the eye. The pupil is then monitored for response to the light. Slow or no reaction to light generates a positive result. What constitutes a “slow” reaction can be pre-configured. This metric may also be represented as a second weighting value, e.g., on a scale from 0 to 100. A very fast reaction to light then gives a low value, e.g., below 10, while a slower reaction results in a higher value. Non-existent reactions would give a result on the order of 100. Again, the reaction to light can then be quantified more finely compared to the binary slow/fast. According to aspects, the ambient light is taken into consideration when determining the value of the response metric. For instance, a less pronounced reaction can be expected to light of a given intensity if the eye has previously been subjected to strong daylight for an extended period of time, compared to an eye which has not been subject to strong ambient light, e.g., at nighttime. The light source used for testing, i.e., the light of the given intensity, can be modulated in terms of this intensity in dependence on an intensity of the ambient light.
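One way to express the second weighting value in code is sketched below, assuming an upstream tracker supplies pupil diameters before and after the light stimulus; the 20% "full response" constriction used to normalize the score is an illustrative assumption:

```python
# Illustrative sketch: score the pupil's reaction to light on 0-100.
# 0 means a strong, fast constriction; 100 means no reaction at all.
# The 20% relative constriction for a 'full response' is a placeholder.

def light_reaction_weight(diam_before_mm, diam_after_mm, full_response=0.20):
    if diam_before_mm <= 0:
        raise ValueError("invalid pupil diameter")
    shrink = (diam_before_mm - diam_after_mm) / diam_before_mm
    # clamp the relative constriction to [0, full_response], normalize
    responsiveness = min(max(shrink / full_response, 0.0), 1.0)
    return round(100 * (1 - responsiveness))
```

A real implementation would also time the response, since the disclosure distinguishes slow from fast reactions, not only the amount of constriction.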
Measurement metric 3 - control of horizontal and/or vertical nystagmus
Nystagmus is a condition of involuntary eye movement. Due to the involuntary movement of the eye, it has also sometimes been called "dancing eyes".
Figure 3A illustrates vertical 340 and horizontal eye movement 330. The test checks whether involuntary twitching occurs when the person is looking from side to side (horizontal) or up and then down (vertical). Horizontal and/or vertical jerking of the pupil, i.e., if the pupil does not move linearly from end point to end point, generates a positive result. Again, the twitching can be quantized on a finer scale from 0 to 100.
The eye-scanner 120 may comprise a display 220 configured to show a marker which the test subject can be instructed to follow with the eye 110. The marker can then move from side to side or up and down, while the pupil motion is monitored for any non-smooth motion indicating a level of nystagmus.
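A sketch of how such non-smooth motion could be quantized on a 0-100 scale follows, assuming tracked per-frame horizontal pupil positions; the jerk normalization threshold is a hypothetical placeholder:

```python
# Illustrative sketch: score horizontal jerkiness 0-100 from tracked
# pupil x-positions during a side-to-side sweep. Smooth pursuit gives
# near-constant frame-to-frame velocity; twitching shows up as velocity
# reversals mid-sweep. The 20 px normalization is a placeholder.

def nystagmus_weight(x_positions, max_jerk_px=20):
    velocities = [b - a for a, b in zip(x_positions, x_positions[1:])]
    jerk = 0
    for v0, v1 in zip(velocities, velocities[1:]):
        if v0 * v1 < 0:  # direction reversal between consecutive frames
            jerk = max(jerk, abs(v1 - v0))
    return min(100, round(100 * jerk / max_jerk_px))
```

A steady sweep such as [0, 5, 10, 15, 20] scores 0, while a sequence with a twitch in the middle scores higher.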
Measurement metric 4 - checking the ability to cross the eyes
Crossing one’s eyes means to direct left and right eye gaze towards a point located close to the eyes and centered between the eyes, as illustrated in Figure 3B. Eye crossing occurs, e.g., when trying to follow an object with both eyes as the object approaches the nose of the person.
The test checks the person's ability to cross his or her eyes when following an object with the eyes. Inability to cross eyes generates a positive outcome. The display 220 can be used to assist in measuring this metric. Similar to the nystagmus metric, a marker can be shown on the display at a location where eye crossing is expected in a person not influenced by drugs. The movement of the pupils can then be monitored in order to discern if the test subject is capable of crossing eyes or not. Again, the relative location of the pupils can be detected based on image processing of the captured visual representation.
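A sketch of how the eye crossing ability could be quantified from tracked pupil positions follows; the convergence ratio and its interpretation are illustrative assumptions, not the method prescribed by the disclosure:

```python
# Illustrative sketch: quantify eye crossing from per-frame pupil
# x-coordinates (left_x, right_x) supplied by an upstream detector.
# A convergence ratio well below 1.0 suggests the pupils moved toward
# each other, i.e. the person was able to cross the eyes.

def convergence_ratio(frames):
    """Ratio between the smallest and the initial inter-pupil distance
    over a sequence of (left_x, right_x) pupil positions."""
    distances = [abs(right - left) for left, right in frames]
    return min(distances) / distances[0]

frames = [(100, 200), (110, 190), (125, 175), (130, 170)]
print(convergence_ratio(frames))  # 40/100 = 0.4
```

A threshold on this ratio (e.g., requiring it to fall below some configured value) could then decide whether the eye crossing test is passed.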
When the four tests have been performed, a merging and analysis of the measured values takes place. This provides a recommendation for further management, e.g., additional drug tests such as blood or urine tests.
According to aspects, a preliminary analysis is carried out after each of the four tests to determine if the test was successful, i.e. to investigate if enough data has been collected for the analysis discussed above. If, e.g., the view of the vision-based sensor is obstructed during a test, the test is deemed unsuccessful and a re-testing is prompted.
According to one example, a detection of drug influence and potential recommendation for further testing is triggered in case one or more of the four tests are failed, i.e., if one or more measured metrics do not meet respective pre-determined pass criteria. Alternatively, a combination of the above weighting values can be used as a base for detecting drug influence. For instance, the four weighting values can be added together to represent drug influence likelihood on a scale from 0 to 400.
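A sketch of this combined scoring follows; the per-metric and total decision limits are hypothetical placeholders, not thresholds stated in the disclosure:

```python
# Illustrative sketch: combine the four per-metric weights (each 0-100)
# into a 0-400 drug-influence likelihood, and trigger detection if any
# single metric fails its pass criterion or the total exceeds a limit.
# Both limits below are placeholder values.

def combined_score(pupil_w, light_w, nystagmus_w, crossing_w):
    """Sum of the four per-metric weights, each expected in 0-100."""
    return pupil_w + light_w + nystagmus_w + crossing_w

def influence_detected(weights, per_metric_limit=80, total_limit=200):
    return any(w > per_metric_limit for w in weights) or sum(weights) > total_limit
```

This mirrors the two detection modes described: a per-metric pass criterion and a weighted combination of all four metrics.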
According to some aspects, the results are anonymized and stored on a remote server such as the external processing resource 130. This makes it possible to later make use of the statistics that the results generate.
To summarize, Figures 1 and 2 schematically illustrate example eye-scanner devices 120 for detecting drug influence in a person based on a visual representation of an eye 110 of the person. The device comprises a vision-based sensor 121, a light source 122 and a control unit 125, 135. The vision-based sensor 121 is arranged to capture a visual representation of the eye 110. The control unit 125, 135 is arranged to process the visual representation in order to determine the four metrics discussed above: a pupil size metric 310, a pupil’s reaction to light metric 320, a horizontal and/or vertical nystagmus metric 330, 340, and an eye crossing ability metric 350. The control unit 125, 135 is arranged to detect drug influence in the person based on the pupil size metric 310, the pupil’s reaction to light metric 320, the horizontal and/or vertical nystagmus metric 330, 340, and the eye crossing ability metric 350.
According to some aspects, the vision-based sensor 121 comprises a camera sensor arranged on a smartphone 210, tablet, or a mobile computing device. Thus, existing smart devices such as mobile smartphones and the like can be used to implement the herein proposed techniques. The vision-based sensor 121 may be arranged to capture a time sequence of images of the eye 110, and/or a video sequence of the eye 110. The images and/or video sequence are then subjected to image processing in order to determine the four different measurement metrics discussed above.
The light source 122 may also be comprised in the smartphone 210, tablet, or mobile computing device.
As mentioned, the device 120 does not require contact with the person's face during the measurements. Therefore, the eye-scanner device 120 may comprise a distance sensor 126 arranged to determine the distance between the vision-based sensor 121 and the eye 110 of the person. By using the determined distance, the control unit can carry out the analysis on measured values captured at a range of different distances, e.g., between 0 and 2 meters. Preferably, however, the distance is in the range of 10 to 50 centimeters.
According to some aspects, the distance sensor 126 comprises a camera sensor arranged on a smartphone 210, tablet, or a mobile computing device. Thus, existing smart devices such as mobile smartphones and the like can be used to implement the herein proposed techniques. The distance may be determined, e.g., using reference sizes or estimated reference sizes. In an example embodiment, the distance sensor 126 is arranged to be calibrated using a known reference in proximity to the eye 110 of the person under test before or after the tests. The known reference may, e.g., be a paper with a checkerboard pattern of known dimensions, or something similar.
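Under a simple pinhole-camera model, calibration with a known reference reduces to two proportionalities: the reference first yields the camera's focal length in pixels, which then converts any later observed pixel width into a distance. The sketch below is an assumed illustration; the checkerboard width and distances are made-up numbers.

```python
def calibrate_focal_px(ref_width_m, ref_distance_m, ref_width_px):
    """One-time calibration: a reference of known physical width held
    at a known distance yields the camera's focal length in pixels."""
    return ref_width_px * ref_distance_m / ref_width_m

def distance_from_reference(focal_px, real_width_m, observed_width_px):
    """Pinhole model: distance = focal_px * real_width / pixel_width."""
    return focal_px * real_width_m / observed_width_px

# Calibrate with a 0.2 m wide checkerboard imaged at 0.5 m (400 px wide),
# then range the same board when it later appears 800 px wide.
focal = calibrate_focal_px(0.2, 0.5, 400)
distance = distance_from_reference(focal, 0.2, 800)  # 0.25 m
```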
The distance sensor 126 may comprise a plurality of camera sensors, e.g. as in a stereoscopic camera. Thus, existing smart devices such as mobile smartphones and the like can be used to implement the herein proposed techniques. The distance can thus be determined using techniques involving, i.a., the distance between the camera lenses (i.e. sensors), field of view angles, and geometric derivations.
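For the stereoscopic case, the geometric derivation is the classic triangulation formula Z = f·B/d, where f is the focal length in pixels, B the baseline between the lenses, and d the horizontal disparity of the same point in the two rectified images. A minimal sketch with assumed numbers:

```python
def stereo_distance(focal_px, baseline_m, x_left_px, x_right_px):
    """Two-camera triangulation: depth = focal * baseline / disparity,
    with disparity the horizontal shift of the same point between the
    rectified left and right images."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("expected positive disparity")
    return focal_px * baseline_m / disparity

# Two lenses 2 cm apart, focal length 1000 px, pupil seen at x=150 px
# in the left image and x=100 px in the right image -> 0.4 m away.
depth = stereo_distance(1000, 0.02, 150, 100)
```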
According to some aspects, the distance sensor 126 comprises a radar, a light detection and ranging (LIDAR) device, and/or any other light-based ranging device, such as depth sensing using infrared projection. Thus, existing smart devices such as mobile smartphones and the like can be used to implement the herein proposed techniques.
The distance sensor 126 may comprise an ultra-sonic ranging device. Three-dimensional ultrasound technology for human-machine interaction is an emerging technology which may be an integral part of future smartphones and the like.
The size measurements and movement measurements of the four tests, discussed above, may be measured in terms of absolute values. Alternatively, or in combination, these measurements may be made in relative terms. In other words, the pupil size metric 310, the pupil’s reaction to light metric 320, the horizontal and/or vertical nystagmus metric 330, 340, and the eye crossing ability metric 350 are, at least partly, determined based on relative measurements. For example, the size of the pupil (e.g. its diameter) may be measured relative to the height of the whole eye or the size of the iris (e.g. its diameter). Other normalizations are also possible. The relative measurements may eliminate the need to determine the absolute distance between the vision-based sensor 121 and the eye 110 of the person, or at least reduce the required accuracy of the distance, depending on the desired accuracy of the final measurement metrics of the four tests.

The eye-scanner device 120 optionally also comprises a display 220 with a visible marker or the like. The display can then be used to provide instructions to the test subject for, e.g., moving the gaze or directing the gaze at a given location on the screen. For instance, when determining the eye-crossing ability metric, the test person may be directed to look at a marker on the screen, which marker is positioned so as to induce eye crossing. The display may also be configured to show markers moving from right to left and/or up and down in order to stimulate horizontal or vertical pupil movement during a nystagmus test. The display 220 may also display the sequence of images and/or video sequence of the eye 110 in a live feed.
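The distance invariance of the relative measurements discussed above can be illustrated directly: since pupil and iris diameters scale identically with camera distance, their ratio is unchanged when the phone moves. A minimal sketch (the pixel values are made up):

```python
def relative_pupil_size(pupil_diameter_px, iris_diameter_px):
    """Normalize the pupil diameter by the iris diameter measured in
    the same image; the ratio does not depend on camera distance."""
    return pupil_diameter_px / iris_diameter_px

near = relative_pupil_size(40.0, 110.0)  # close-up frame
far = relative_pupil_size(20.0, 55.0)    # same eye at twice the distance
# near == far: the metric survives without knowing the absolute distance
```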
The display may comprise one or more test indicators arranged to indicate that a capture of a visual representation of the eye 110 (i.e. a test) is ongoing. The indicators can, for example, be circles displayed on top of the pupils in the live feed. These circles can track the pupils on the display continuously during the capture. The circles may only be present during one of the tests. Alternatively, the circles are present at all times during the capture and change shape or color during one of the tests. This can guide the person under test to keep looking at the device until the capture is complete.
The display may comprise one or more distance indicators arranged to indicate that the distance between the vision-based sensor 121 and the eye 110 of the person is within a preferred range. If the distance is larger than preferred, the indicators may change shape or color. If the distance is smaller than preferred, the indicators may change in a different way. The distance indicators may also be circles displayed on top of the pupils. As an example, the circles are green when the distance is in the preferred range, yellow when the distance is too short, and red when the distance is too long. The distance indicators can thereby guide the person under test to maintain the distance within the preferred range.
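The green/yellow/red example above maps directly onto a small lookup. The 10-50 cm bounds reuse the preferred range mentioned earlier; the function name is illustrative:

```python
def distance_indicator_color(distance_m, lo=0.10, hi=0.50):
    """Color of the distance indicator circles: green when the
    camera-to-eye distance is in the preferred range, yellow when
    too short, red when too long."""
    if distance_m < lo:
        return "yellow"
    if distance_m > hi:
        return "red"
    return "green"
```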
The display may comprise one or more capture indicators arranged to indicate that the vision-based sensor 121 is capable of capturing the visual representation of the eye 110, which means, i.a., that the eye is in the field of view of the sensor and that the eye is in focus. The capture indicators can thereby guide the person under test to maintain capturing capabilities. The capture indicators may also be circles displayed on top of the pupils, which appear on the display when the vision-based sensor 121 is capable of capturing the visual representation of the eye 110.
According to aspects, the eye-scanner device 120 comprises a speaker device 127 arranged to communicate instructions to the person. The speaker device 127 may be arranged on a smartphone 210, tablet, or a mobile computing device. The instructions could be, e.g., “look into the camera”, “the light source is about to be turned on, keep looking straight ahead”, and “without moving your head, start looking to the left, continue moving your eyes until commanded to stop”. This way, the person can be guided while not directly looking at the display, which is an advantage.
The control unit 125 may be arranged in physical connection to the vision-based sensor 121, e.g., as part of the processing circuitry of a smart phone 210. However, the control unit 135 may also be arranged at a distance from the vision-based sensor, in an external processing resource 130 such as a remote server or cloud-based processing resource. The vision-based sensor then uploads captured image data of the eye 110 to the external processing resource, preferably via the wireless link 140.
The drug detection test is based on the four metrics discussed above. However, the test procedure can be configured in a number of different ways. For instance, each metric can be associated with a pass criterion. The pupil size metric, for example, can be associated with a range of acceptable pupil sizes; if the pupil size 310 of the person under test is not within this range, the test is failed, otherwise it is passed. Similarly, the pupil’s reaction to light metric 320, the horizontal and/or vertical nystagmus metric 330, 340, and the eye crossing ability metric 350 may be associated with respective pass criteria. The person is detected to be under likely influence of drugs if one or more of the metrics fail to meet the respective metric pass criterion. The drug detection test can also be more advanced, e.g., comprising a weighted combination of the four metrics. In this case, if a person only barely passes the four different tests, a likely drug influence may still be detected based on a combination of the four test-metrics.
The systems and devices disclosed herein can also be used for gathering statistics on, e.g., drug influence. For instance, the control unit 125, 135 may optionally be arranged to gather one or more statistics or data points associated with the metrics, to anonymize the gathered statistics, and to store the anonymized statistics. This enables statistical analysis of drug influence. The data points may also be associated with respective time stamps and/or geographical positions. The geographical positions may be quantized at, e.g., city or area level so as not to compromise the anonymization.
The control unit 125, 135 may optionally be arranged to gather one or more data points associated with the metrics, to associate the gathered statistics with the person under test, and to store the associated statistics. This way, it is possible to gather base values of the metrics of a person under test when he or she is not under the influence of a drug. During a subsequent measurement, the control unit may be arranged to compare the new test data with the stored base values. Deviations between the new data and the base values can be an indication that the person is under the influence of a drug. The drug detection test may comprise a combination of comparing the test metrics to nominal values (i.e. predetermined threshold values) and comparing them to the person's base values.
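A minimal sketch of the baseline comparison, assuming a hypothetical 20% relative-deviation tolerance per metric (the disclosure does not fix a tolerance):

```python
def deviates_from_baseline(new_value, base_value, tolerance=0.20):
    """True when a newly measured metric deviates from the person's
    stored sober base value by more than the relative tolerance."""
    return abs(new_value - base_value) / base_value > tolerance

def baseline_deviations(new_metrics, base_metrics, tolerance=0.20):
    """Per-metric comparison, e.g. over the four stored base values."""
    return [deviates_from_baseline(n, b, tolerance)
            for n, b in zip(new_metrics, base_metrics)]
```

For instance, a metric moving from a stored base value of 4.0 to 6.0 would be flagged as deviating, while a move to 4.1 would not.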
With reference to Figure 4, the eye-scanner device 120 discussed above is arranged to execute a method comprising:
S1: Capturing a visual representation of the eye 110. This representation may comprise one or more images, or a video sequence, or a combination of images and video sequence.
S2: Determining a pupil size metric 310 based on the captured visual representation. The pupil size may be estimated from the one or more images, e.g., as a diameter of the pupil. The location and boundary of the pupil may be detected by means of image processing and/or eye tracking techniques.
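As a toy illustration of step S2, the pupil can be segmented as the darkest region of an eye crop and its diameter recovered from the blob area. Real implementations would use proper pupil segmentation or eye-tracking techniques; the darkness threshold and the synthetic image below are assumptions.

```python
import numpy as np

def pupil_diameter_px(gray, darkness_threshold=50):
    """Estimate the pupil diameter from a grayscale eye crop: count
    pixels darker than the threshold (the pupil is the darkest blob)
    and convert the area to an equivalent circle diameter."""
    area = int((gray < darkness_threshold).sum())
    return 2.0 * np.sqrt(area / np.pi)

# Synthetic eye crop: bright background with a dark disc of radius 10 px.
yy, xx = np.mgrid[0:100, 0:100]
eye = np.full((100, 100), 200, dtype=np.uint8)
eye[(yy - 50) ** 2 + (xx - 50) ** 2 <= 10 ** 2] = 10
```

On this synthetic crop the estimate lands close to the true 20 px diameter.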
S3: Determining a horizontal and/or vertical nystagmus metric 330, 340. Nystagmus may be detected by comparing a sequence of captured images or a video sequence of the eye 110 in order to detect, e.g., involuntary twitching.
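One way to quantify the involuntary twitching of step S3 from a tracked pupil-position series is to count abrupt direction reversals during an otherwise smooth pursuit. This is an illustrative heuristic, not the disclosed algorithm, and the jerk threshold is an assumption:

```python
def nystagmus_twitch_count(positions, jerk_threshold_px=3.0):
    """Count involuntary direction reversals in a per-frame horizontal
    pupil-position series. Smooth pursuit yields few reversals; jerky
    nystagmus produces many."""
    velocities = [b - a for a, b in zip(positions, positions[1:])]
    twitches = 0
    for v0, v1 in zip(velocities, velocities[1:]):
        # A twitch: the velocity flips sign with a jump larger than noise.
        if v0 * v1 < 0 and abs(v1 - v0) >= jerk_threshold_px:
            twitches += 1
    return twitches
```

A steadily drifting gaze yields zero twitches, while an oscillating position series yields one count per reversal.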
S4: Determining an ability to cross eyes metric 350 based on the visual representation, i.e., an ability to converge the gaze directions of the two eyes in a controlled manner. The person may be instructed to attempt to cross eyes, be instructed to follow a marker on a display 220, or be instructed to follow an external object, such as a pen, which is moved in a way that causes eye crossing. The ability to cross eyes may be determined based on image processing techniques.
S5: Determining a pupil's reaction to light metric 320, where the light has a given intensity. A light source 122 arranged on or in connection to the vision-based sensor 121 may be used to stimulate a reaction, which can then be monitored based on the representation.
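The reaction of step S5 can, for example, be summarized as the relative constriction of the pupil after the light is switched on. A minimal sketch over a per-frame diameter series; the frame index and diameters are made-up values:

```python
def light_reflex_amplitude(diameters, light_on_idx):
    """Relative pupil constriction after the light source is switched
    on at frame light_on_idx; a small amplitude indicates a sluggish
    or absent reaction to light."""
    baseline = diameters[light_on_idx]
    minimum = min(diameters[light_on_idx:])
    return (baseline - minimum) / baseline

# A normally reacting pupil constricting from 6 mm to 3 mm -> 0.5.
reactive = light_reflex_amplitude([6.0] * 5 + [6.0, 5.0, 4.0, 3.0, 3.0], 5)
fixed = light_reflex_amplitude([6.0] * 10, 5)  # non-reactive pupil -> 0.0
```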
The disclosed method then comprises determining S6 a level of drug influence associated with the person based on the determined metrics, i.e., based on the abilities of the eye or pair of eyes. The determining of drug influence may be a yes/no result, or a level on a scale. The determining may also be based on a sequence of tests, where drug influence is determined based on an ability of the person to pass the sequence of tests, e.g., associated with S2-S5 above.
The detection is based on the four measurements S2, S3, S4, S5, which together give a robust method able to detect drug influence in a person.
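The whole sequence S2-S6 can be sketched as one pass/fail evaluation over the four determined metrics. The metric names, values, and criteria below are illustrative assumptions only:

```python
def run_drug_test(metrics, criteria):
    """Steps S2-S6 in miniature: check each determined metric against
    its pass criterion and derive a yes/no drug-influence result."""
    failed = [name for name, value in metrics.items()
              if not criteria[name](value)]
    return {"influenced": bool(failed), "failed_tests": failed}

result = run_drug_test(
    metrics={"pupil_size": 7.2, "light_reflex": 0.4,
             "nystagmus_twitches": 1, "eye_crossing": 0.3},
    criteria={"pupil_size": lambda d: 2.0 <= d <= 6.5,
              "light_reflex": lambda a: a >= 0.2,
              "nystagmus_twitches": lambda n: n <= 2,
              "eye_crossing": lambda c: c >= 0.15},
)
# Only the pupil size criterion fails here, which already flags influence.
```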
Figure 5 schematically illustrates, in terms of a number of functional units, the components of a control unit 125, 135 according to an embodiment of the herein disclosed teachings. The control unit may be comprised in a mobile device such as a smart phone 210 or in an external processing resource 130. Processing circuitry 510 is provided using any combination of one or more of a suitable central processing unit CPU, multiprocessor, microcontroller, digital signal processor DSP, etc., capable of executing software instructions stored in a computer program product, e.g. in the form of a storage medium 530. The processing circuitry 510 may further be provided as at least one application specific integrated circuit ASIC, or field programmable gate array FPGA.
Particularly, the processing circuitry 510 is configured to cause the control unit 125, 135 to perform a set of operations, or steps, such as the methods discussed in connection to Figure 4. For example, the storage medium 530 may store the set of operations, and the processing circuitry 510 may be configured to retrieve the set of operations from the storage medium 530 to cause the control unit 125, 135 to perform the set of operations. The set of operations may be provided as a set of executable instructions. Thus, the processing circuitry 510 is thereby arranged to execute methods as herein disclosed.
The storage medium 530 may also comprise persistent storage, which, for example, can be any single one or combination of magnetic memory, optical memory, solid state memory, memory card or even remotely mounted memory.
The control unit 125, 135 may further comprise an interface 520 for communications with at least one external device 540. As such the interface 520 may comprise one or more transmitters and receivers, comprising analogue and digital components and a suitable number of ports for wireline or wireless communication.
The processing circuitry 510 controls the general operation of the control unit 125, 135 e.g. by sending data and control signals to the interface 520 and the storage medium 530, by receiving data and reports from the interface 520, and by retrieving data and instructions from the storage medium 530. Other components, as well as the related functionality, of the control node are omitted in order not to obscure the concepts presented herein.
Figure 6 schematically illustrates a computer program product 600, comprising a set of operations 610 executable by the control unit 125, 135. The set of operations 610 may be loaded into the storage medium 530. The set of operations may correspond to the methods discussed above in connection to Figure 4, or to the different operations discussed above.
In the example of Figure 6, the computer program product 600 is illustrated as an optical disc, such as a CD (compact disc) or a DVD (digital versatile disc) or a Blu-Ray disc. The computer program product could also be embodied as a memory, such as a random-access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), or an electrically erasable programmable read-only memory (EEPROM) and more particularly as a non-volatile storage medium of a device in an external memory such as a USB (Universal Serial Bus) memory or a Flash memory, such as a compact Flash memory. Thus, while the computer program is here schematically shown as a track on the depicted optical disk, the computer program can be stored in any way which is suitable for the computer program product.
Below follows an example scenario demonstrating the eye-scanner device and the disclosed method. The person under test has access to the eye-scanner device, which in this case comprises a smart phone. First, an eye-scanner application is started, which comprises five subsequent stages: START, which is the first interface that introduces the person under test to the application wherein a test can be initiated; PREAMBLE, which is an interface presenting the stages and instructions, and gives the possibility to terminate; TEST DATA, which is an interface wherein information about the person under test can be inputted; MEASUREMENTS, which starts the measurements and displays a measurement interface; and RESULTS, which is an interface displaying results. Details about the five stages follow below.
START
The start interface is the first interface when the application is launched. It displays that the eye-scanner application has been launched and enables initialization of testing. After the subsequent stages have been cycled through, the application is returned to the start interface.
PREAMBLE
The preamble interface is launched after the initialization of testing. This interface instructs and prepares the person under test for testing, i.e. what will happen (the four tests) and what is required of the person under test. This interface thereby allows the person under test to confirm that a test should be started, which is useful if the initialization of testing was initiated by accident on the start interface. The preamble interface also comprises means for termination, which returns the application to the start interface, and means for confirmation, which is a continuation to the next stage.
TEST DATA
The test data interface takes information about the person under test as input, such as age and sex. The information can be useful for setting threshold values for the four test-metrics indicating influence of drugs. The information can also relate to drug intake, such as type of drug, quantity, and time of use, which can be useful when determining threshold values of the test metrics in a controlled experiment.
The test data interface also comprises means for termination, which returns the application to the start interface, and means for confirmation, which is a continuation to the next stage. Optionally, the confirmation can only be initiated when a minimum amount of information has been inputted, such as age and sex.
MEASUREMENTS
The four tests may be carried out in a single sequence, or they may be initiated individually. When the measurements interface is initiated, the camera is turned on and a live feed of the camera is displayed. When the camera sensor 121 is capable of capturing the visual representation of the eye 110, capture indicators in the form of circles displayed on top of the pupils appear on the live feed, and remain there as long as the camera sensor 121 is capable of capturing the visual representation of the eye 110. This can increase the probability of a successful capture during one of the four tests. The capture indicators can thereby guide the person under test to maintain capturing capabilities. The measurements interface also comprises a distance indicator that indicates whether the distance between the camera sensor 121 and the eye 110 of the person under test is within a preferred range. This provides clear feedback to the person under test, which enables the preferred distance to be maintained.
When the camera sensor can capture, and the distance is in the preferred range, the four tests may be carried out automatically or be initiated by the person under test. An information view successively shows information about the ongoing test and, together with markers, guides the person under test on what to do. When the four tests have been successfully completed, the measurements are saved, and the next stage is initiated automatically.
The measurements interface comprises three parts: an information view, which shows information about the current test (out of four) and provides feedback to the person under test; a film view, which shows a live feed of the camera; and means for termination.
The information view provides information about the ongoing test. After one of the four tests has successfully been completed, the information view provides new information about the next ongoing test. The information view comprises a headline stating the ongoing test and an instruction to the person under test, e.g. “look into the camera”, “the light source is about to be turned on, keep looking straight ahead”, and “without moving your head, start looking to the left, continue moving your eyes until commanded to stop”. The information view also comprises a capture indicator and a distance indicator. The information view further comprises a counter that counts down the time to the start of a test. The person under test thereby has time to move his eyes into position before the test starts. The information view also comprises information about what is currently being processed, e.g. “looking for the pupil”. Some or all of the information on the information view may also be communicated to the person through voice messages (which may be pre-recorded).
The film view is a live feed of the camera. This provides feedback to the person under test such that he can hold the phone properly. The capture indicators in the form of circles displayed on top of the pupils reinforce this feedback. The means for termination allows the person under test to abort the tests and return to the start interface. Optionally, it is also possible to pause the tests if they are done in a sequence. It may also be possible to restart the tests, i.e. to return to the beginning of the measurement stage.

RESULTS
The results interface is initiated after the measurement stage is successfully completed. The analysis of the measurement metrics can take place on the processing circuitry of the smart phone or on an external processing resource 130, such as a remote server or cloud-based processing resource. It is also possible that some of the analysis is carried out on the processing circuitry of the smart phone and the rest on the external processing resource. In any case, the results interface shows that all tests have been completed and a summary of the results, which comprises age, sex, date, and a test code.
After the analysis of the measurement metrics has taken place, a recommendation for further management may be provided on the results interface, e.g., a recommendation for additional drug tests such as blood or urine tests.
The test code is a unique code for a completed cycle of the stages. It can be used for connecting the four tests to the results of additional drug tests, such as blood or urine tests. The results interface also comprises means for termination, which returns the application to the start interface.

Claims

1. An eye-scanner device (120) for detecting drug influence in a person based on a visual representation of an eye (110) of the person, the device comprising a vision-based sensor (121 ), a light source (122) and a control unit (125, 135), wherein the vision-based sensor (121 ) is arranged to capture a visual representation of the eye (110), the control unit (125, 135) is arranged to determine a pupil size metric (310), a pupil’s reaction to light metric (320), a horizontal and/or vertical nystagmus metric (330, 340), and an eye crossing ability metric (350), wherein the control unit (125, 135) is arranged to detect drug influence in the person based on the pupil size metric (310), the pupil’s reaction to light metric (320), the horizontal and/or vertical nystagmus metric (330, 340), and the eye crossing ability metric (350).
2. The eye-scanner device (120) according to claim 1 , wherein the vision- based sensor (121 ) comprises a camera sensor arranged on a smartphone (210), tablet, or a mobile computing device.
3. The eye-scanner device (120) according to any previous claim, wherein the light source (320) is comprised in a smartphone (210), tablet, or a mobile computing device.
4. The eye-scanner device (120) according to any previous claim, wherein the control unit (125) is arranged in physical connection to the vision-based sensor (121 ).
5. The eye-scanner device (120) according to any previous claim, wherein the control unit (125) is arranged distanced from the vision-based sensor in an external processing resource (130).
6. The eye-scanner device (120) according to any previous claim, wherein drug influence in the person is detected if one or more of the metrics fail to meet a respective metric pass criterion.
7. The eye-scanner device (120) according to any previous claim, wherein drug influence in the person is detected based on a weighted combination of the metrics.
8. The eye-scanner device (120) according to any previous claim, wherein the vision-based sensor (121 ) is arranged to capture a time sequence of images of the eye (110).
9. The eye-scanner device (120) according to any previous claim, wherein the vision-based sensor (121 ) is arranged to capture a video sequence of the eye (110).
10. The eye-scanner device (120) according to any previous claim, wherein the control unit (125, 135) is arranged to gather one or more statistics associated with the metrics, to anonymize the gathered statistics, and to store the anonymized statistics.
11. The eye-scanner device (120) according to any previous claim, comprising a display (220) with a visible marker.
12. The eye-scanner device (120) according to any previous claim, comprising a distance sensor (126) arranged to determine the distance between the vision-based sensor (121) and the eye (110) of the person.
13. The eye-scanner device (120) according to claim 12, wherein the distance sensor (126) comprises a camera sensor arranged on a smartphone (210), tablet, or a mobile computing device.
14. The eye-scanner device (120) according to any of claims 12-13, wherein the distance sensor (126) is arranged to be calibrated using a known reference in proximity to eye (110).
15. The eye-scanner device (120) according to any of claims 12-14, wherein the distance sensor (126) comprises a plurality of camera sensors.
16. The eye-scanner device (120) according to any of claims 12-15, wherein the distance sensor (126) comprises a radar, a light detection and ranging (LIDAR) device, and/or any other light-based ranging device, such as depth sensing using infrared projection.
17. The eye-scanner device (120) according to any previous claim, wherein the display comprises one or more test indicators arranged to indicate that a capture of a visual representation of the eye (110) is ongoing.
18. The eye-scanner device (120) according to any previous claim, wherein the display comprises one or more distance indicators arranged to indicate that the distance between the vision-based sensor (121) and the eye (110) of the person is within a preferred range.
19. The eye-scanner device (120) according to any previous claim, wherein the display comprises one or more capture indicators arranged to indicate that the vision-based sensor (121) is capable of capturing the visual representation of the eye (110).
20. The eye-scanner device (120) according to any previous claim, comprising a speaker device (127) arranged to communicate instructions to the person.
21. The eye-scanner device (120) according to any previous claim, wherein the pupil size metric (310), the pupil’s reaction to light metric (320), the horizontal and/or vertical nystagmus metric (330, 340), and the eye crossing ability metric (350) are, at least partly, determined based on relative measurements.
PCT/EP2020/073610 2019-08-27 2020-08-24 Device for detecting drug influence based on visual data WO2021037788A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
SE1930273 2019-08-27
SE1930273-6 2019-08-27
SE2030188A SE2030188A1 (en) 2019-08-27 2020-06-08 Device for detecting drug influence based on visual data
SE2030188-3 2020-06-08

Publications (1)

Publication Number Publication Date
WO2021037788A1 true WO2021037788A1 (en) 2021-03-04

Family

ID=72240428


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024091172A1 (en) * 2022-10-28 2024-05-02 Kontigo Care Ab Method and system for self-administrated surveillance of use of addictive stimulus

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9357918B1 (en) 2014-12-18 2016-06-07 Karen Elise Cohen System and method for drug screening and monitoring pupil reactivity and voluntary and involuntary eye muscle function
US20180333092A1 (en) 2015-12-03 2018-11-22 Ophthalight Digital Solutions Inc. Portable ocular response testing device and methods of use
US20190200862A1 (en) * 2013-01-25 2019-07-04 Wesley W.O. Krueger Ocular-performance-based head impact measurement using a faceguard
WO2020081799A1 (en) * 2018-10-17 2020-04-23 Battelle Memorial Institute Roadside impairment sensor
WO2020081804A1 (en) * 2018-10-17 2020-04-23 Battelle Memorial Institute Medical condition sensor




Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20761556; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20761556; Country of ref document: EP; Kind code of ref document: A1)