EP2828794A1 - Method and apparatus for evaluating results of gaze detection - Google Patents
Method and apparatus for evaluating results of gaze detectionInfo
- Publication number
- EP2828794A1 (application EP13710873.4A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- events
- time
- scene
- information
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000000034 method Methods 0.000 title claims abstract description 28
- 238000001514 detection method Methods 0.000 title claims abstract description 23
- 230000004424 eye movement Effects 0.000 claims abstract description 17
- 230000004434 saccadic eye movement Effects 0.000 claims abstract description 13
- 230000008859 change Effects 0.000 claims abstract description 7
- 230000002123 temporal effect Effects 0.000 claims abstract description 7
- 230000000007 visual effect Effects 0.000 claims description 55
- 238000013507 mapping Methods 0.000 claims description 14
- 230000009466 transformation Effects 0.000 claims description 13
- 238000003384 imaging method Methods 0.000 claims description 8
- 230000036962 time dependent Effects 0.000 claims description 7
- 239000000872 buffer Substances 0.000 claims 2
- 238000005452 bending Methods 0.000 claims 1
- 230000009017 pursuit movement Effects 0.000 abstract 2
- 238000011156 evaluation Methods 0.000 description 25
- 210000003128 head Anatomy 0.000 description 7
- 238000012360 testing method Methods 0.000 description 5
- 238000010191 image analysis Methods 0.000 description 3
- 230000001133 acceleration Effects 0.000 description 2
- 210000004087 cornea Anatomy 0.000 description 2
- 230000006870 function Effects 0.000 description 2
- 230000004886 head movement Effects 0.000 description 2
- 230000008447 perception Effects 0.000 description 2
- 210000001747 pupil Anatomy 0.000 description 2
- 238000009877 rendering Methods 0.000 description 2
- 238000011160 research Methods 0.000 description 2
- 230000009897 systematic effect Effects 0.000 description 2
- 238000004458 analytical method Methods 0.000 description 1
- 238000013459 approach Methods 0.000 description 1
- 230000003542 behavioural effect Effects 0.000 description 1
- 230000008901 benefit Effects 0.000 description 1
- 238000012937 correction Methods 0.000 description 1
- 238000013461 design Methods 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 230000018109 developmental process Effects 0.000 description 1
- 230000037406 food intake Effects 0.000 description 1
- 238000005286 illumination Methods 0.000 description 1
- 230000004459 microsaccades Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000004461 rapid eye movement Effects 0.000 description 1
- 230000009467 reduction Effects 0.000 description 1
- 230000011514 reflex Effects 0.000 description 1
- 230000000284 resting effect Effects 0.000 description 1
- 238000012800 visualization Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/193—Preprocessing; Feature extraction
Definitions
- the present invention relates to a method and a device for evaluating the results of a gaze detection, that is to say the results of an eye tracking.
- Eye tracking is known per se and is used, for example, in behavioral research or market research, when it is to be examined at which points of view, when, for how long and/or in which sequence a gaze or line of sight moves between different points of view. This can, for example, give information on where and how a retailer should place the goods on offer in order to be sure of the customers' attention.
- scene images are recorded which reproduce a scene that is visible to a subject — the word "subject" may in the following refer to both a person and an animal — for which purpose, for example, a camera attached to the subject's head, and thus moving with the head, can be used.
- an eye position and thus a viewing direction of the subject is detected, which can be done, for example, with the aid of cameras directed at the subject's eyes, using appropriate image analysis.
- the term point of view — this also applies to the following description of the invention and the embodiments — denotes a point in the scene on which the subject's gaze rests at the time to which the scene image is assigned.
- results of a gaze detection are typically present as information that defines, for each of a multiplicity of successive points in time, a viewing direction detected at that time and a viewpoint thereby identified in a scene image assigned to this point in time.
- a very large amount of information has to be processed, because typically more than twenty scene images, and correspondingly many viewpoints, are captured per second. Regardless of whether the evaluation is done manually, automatically or interactively, it is therefore associated with a very large expenditure of time or computation.
- the present invention has for its object to propose measures with which the effort for the evaluation of results of a gaze detection of the described type can be reduced.
- This method thus serves to evaluate the results of a gaze detection, these results being present or obtained — for example in the manner described above by eye tracking — as information which, for each of a multiplicity of consecutive times, defines a viewing direction detected at that time and a viewpoint thereby identified in a scene image assigned to this point in time, the scene images being time-dependent, so that different scene images are assigned to the different points in time.
- This procedure comprises the following steps:
- mapping the viewpoint identified in the scene image assigned to the respective point in time — that is to say, the viewpoint defined by the results of the gaze detection for that point in time — to a position corresponding to this viewpoint in a reference image.
- the dividing of the period into the intervals can take place, for example, by assigning to each of the times or scene images an attribute which indicates to which of the intervals or gaze events this time — or the time to which this scene image is assigned — belongs. Typically, the intervals will be disjoint.
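The attribute-based division described above can be sketched as follows; a minimal illustration in Python, assuming each sample already carries an event label (the function name and label strings are illustrative, not terms from the patent):

```python
def partition_into_intervals(labels):
    """Group consecutive identical event labels into disjoint
    half-open index intervals [start, end) with their label."""
    intervals = []
    start = 0
    for i in range(1, len(labels) + 1):
        # close the current interval when the label changes or the list ends
        if i == len(labels) or labels[i] != labels[i - 1]:
            intervals.append((start, i, labels[start]))
            start = i
    return intervals
```

Because every sample index falls into exactly one interval, the resulting intervals are disjoint and cover the whole observation period, as the description requires.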
- the reference image may be identical for the different scene images. But it is also possible that the reference image is one of a sequence of reference images, which may, for example, be defined by a reference video. Similar to the scene images, the reference image is then time-dependent.
- viewing events in which information is recorded and viewing events in which no information is recorded are recognized as different types of visual events - namely as visual events with information recording and as visual events without information recording.
- the above-mentioned gaze events of the at least one selected type of gaze event will then be the gaze events with information capture.
- the selection of exactly one of the times, or of a real subset of the times, is then generally effected, in particular, for each of the intervals assigned to a gaze event in which information is acquired.
- the proposal to drastically reduce the effort for evaluation in this way is based on the following three insights.
- an intake of information relevant in the sense of the proposed procedure takes place only during certain types of gaze events, in particular during fixations or follow-up movements, in which the gaze rests more or less on an object that is at rest or moving relative to the subject,
- but not during saccades, which represent a different kind of gaze event and in which the eye movement is too rapid to make an intake of information relevant in this sense practically possible.
- for each of the intervals corresponding to a gaze event of the selected type — e.g. a gaze event with information recording, such as a fixation or a follow-up movement — exactly one of the times, or a real subset of the times, can be selected from this interval, and an evaluation restricted to the scene images assigned to those times is sufficient.
- the number of times selected, and thus of the mappings of viewpoints into the reference image to be performed, is significantly smaller than the number of all times for which information about scene images and viewpoints is available.
- the mapping of the viewpoints to corresponding positions in the reference image is what finally makes a systematic evaluation possible, because the positions determined in this way contain an objective statement as to where the subject's gaze lay, when and for how long — even if the subject is allowed to move freely and the scene images assigned to the different times therefore differ from one another.
- a particular advantage arises from the fact that, with the proposed approach, the number of mappings to be carried out can be kept within very narrow limits.
- the method may also provide for analyzing the gaze behavior of several subjects in succession or simultaneously as described.
- the mapping of the viewpoints into the reference image, which is preferably chosen to be the same for all subjects, then allows a very objective evaluation of the test results with very little effort.
- characteristics by which the different types of gaze events can be distinguished may be, for example, a speed of the eye movement or a magnitude of the angle swept over a certain period of time.
- mentioning the gaze events with information recording and the gaze events without information recording should not imply that precisely these, and only these, two types of gaze events are distinguished. On the contrary, it is possible to identify a larger number of different types of gaze events, which may also be assigned to a larger number of different classes, but each of which may be interpreted, in the sense mentioned, as one of several conceivable gaze events with or without information recording.
- the different gaze events can differ from one another due to different speeds and/or amplitudes of an eye movement. Accordingly, the identification of the various gaze events can take place, for example, by detecting and evaluating a speed and/or amplitude of an eye movement. In addition, further information can be evaluated for this purpose, e.g. about an eye acceleration, or about an image content in an environment of the viewpoints in the scene images, which can be subjected to an image evaluation, or about a head movement detectable with a head tracker. In particular, in expedient embodiments of the method, saccades will be recognized as gaze events in which no information is recorded, while fixations and/or follow-up movements can be recognized as gaze events in which information is acquired.
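The speed-based identification just described can be sketched as a simple velocity-threshold classifier; a minimal illustration in Python/NumPy, assuming gaze samples as angles in degrees and a hypothetical threshold value (the threshold and label names are assumptions for illustration, not values prescribed by the patent):

```python
import numpy as np

def classify_gaze_samples(gaze_deg, timestamps_s, velocity_threshold_deg_s=100.0):
    """Label each gaze sample as 'saccade' (too fast for information
    intake) or 'with_information' (fixation or follow-up movement),
    based on sample-to-sample angular velocity."""
    gaze = np.asarray(gaze_deg, dtype=float)      # shape (N, 2)
    t = np.asarray(timestamps_s, dtype=float)     # shape (N,)
    step = np.linalg.norm(np.diff(gaze, axis=0), axis=1)  # angular step per sample
    velocity = step / np.diff(t)                  # deg/s between samples
    labels = ["saccade" if v > velocity_threshold_deg_s else "with_information"
              for v in velocity]
    labels.append(labels[-1])  # last sample inherits the preceding label
    return labels
```

Real classifiers would additionally smooth the velocity signal and enforce minimum event durations; this sketch only shows the thresholding step the text mentions.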
- the method can be configured such that, for each of the intervals associated with a gaze event of the selected type — for example a gaze event with information recording, such as a fixation or a follow-up movement — a position in the reference image, the "mapping target", is stored together with information about the duration of the respective gaze event. If exactly one of the times from this interval is selected, this position can be chosen as the position onto which the viewpoint defined for the time selected from this interval is mapped. If more than one of the times is selected from this interval, that position may be chosen as one of the positions, or as a — not necessarily arithmetic — mean or median of the positions, onto which the viewpoints defined for the times selected from this interval are mapped.
- the mapping target may be selected, for example, as the position onto which the viewpoint defined for the last time selected from this interval is mapped, or — in particular in the case of automatic mapping — in dependence on a measure of the quality of the respective mapping.
- together with the position, further attributes with information about the gaze event can also be stored, e.g. a start and an end, a sequential number of the gaze event, or a count for multiple occurrences of the same position. All of this information then allows a largely automatable evaluation of the results of the eye tracking.
- An embodiment in which the method serves not only for the evaluation of the gaze detection but also for the gaze detection itself provides that the scene images are taken at the stated times with a camera, the viewing directions being detected by means of an eye tracker that determines the viewpoints in the scene images depending on the information thus obtained about the viewing directions.
- the camera can be worn on the head of the subject whose viewing directions are detected by the eye tracker.
- the eye tracker may, for example, have eye cameras for observing, in each case, one of the subject's two eyes, and provide for an image analysis. The latter can, for example, determine a pupil position relative to a reflection on the cornea of the respective eye.
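The pupil-position-relative-to-corneal-reflection principle can be sketched as follows; a minimal illustration in Python/NumPy, assuming a previously calibrated linear model (the function name, the linear form, and the calibration coefficients are illustrative assumptions, not the patent's prescribed method):

```python
import numpy as np

def gaze_from_pupil_and_glint(pupil_px, glint_px, calib_matrix, calib_offset):
    """Map the pupil-center-minus-corneal-reflection vector (in pixels
    of the eye-camera image) to a gaze angle via a calibrated
    linear model: angle = A @ v + b."""
    v = np.asarray(pupil_px, dtype=float) - np.asarray(glint_px, dtype=float)
    return calib_matrix @ v + calib_offset
```

In practice the calibration matrix and offset would be fitted from samples in which the subject fixates known targets; higher-order polynomial models are also common.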
- the two eye cameras may be held, together with the aforementioned camera for taking the scene images, by a spectacle-type carrier.
- the reference image is an image of the scene visible in the scene images, which corresponds to a fixed perspective and may, for example, be photographically recorded or of sketch-like character, or be an abstract rendering or coding of characteristic features recognizable in the scene images.
- Such an abstract rendering or coding may, for example, be given for each characteristic feature to be considered by a designation or a location definition of that feature.
- for each of the selected times, the scene image associated with this point in time is output together with the viewpoint identified therein, e.g. on a screen. The mapping can then be defined manually, e.g. with a mouse click, or the mapping of the viewpoint into the reference image can be visually verified.
- for this purpose, the reference image is preferably also output and may be displayed, e.g., next to, above or below the current scene image on the screen.
- in the simplest case, the mapping of the viewpoints from the individually selected scene images into the reference image is performed manually. However, it is also possible for the mapping of the viewpoints onto the positions, and thus the determination of said positions, to be carried out automatically.
- the transformation can be determined, for example, under the constraint that it is a homography with the desired imaging properties. It can also be determined as a spatial mapping. In this case, at least some of the features may be located on the same scene object. The last three of the four steps mentioned here are carried out for each of the selected times, i.e. for each of the scene images assigned to these times.
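The homography-constrained transformation can be sketched as follows; a minimal illustration in Python/NumPy, assuming at least four feature correspondences between a scene image and the reference image have already been found (the direct-linear-transform estimation shown here is one standard technique, not necessarily the patent's implementation):

```python
import numpy as np

def estimate_homography(src_pts, dst_pts):
    """Estimate the 3x3 homography H mapping src_pts to dst_pts
    (>= 4 correspondences) via the direct linear transform."""
    rows = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.asarray(rows, dtype=float)
    _, _, Vt = np.linalg.svd(A)          # null vector of A gives H
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def map_viewpoint(H, point):
    """Map a viewpoint from the scene image into the reference image."""
    x, y = point
    p = H @ np.array([x, y, 1.0])        # homogeneous coordinates
    return p[0] / p[2], p[1] / p[2]
```

With more than four correspondences, a robust estimator (e.g. RANSAC over the same linear system) would normally be used to suppress mismatched features.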
- the characteristic features identified by the algorithm in the context of an image analysis may be, for example, corners or lines. Corresponding algorithms for feature detection are known per se, for example from the document US
- the selection of the times can be done both manually and automatically. Thus, e.g., in each case a time can be selected which is defined by the fact that it lies centrally in the respective interval.
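The automatic central-time selection can be sketched as follows; a minimal illustration in Python, assuming intervals in the index form produced earlier and a label marking the events with information recording (both the interval representation and the label string are illustrative assumptions):

```python
def select_central_times(intervals, timestamps):
    """For each interval labeled as a gaze event with information
    recording, pick the index of the sample closest to the
    temporal middle of the interval."""
    selected = []
    for start, end, label in intervals:   # half-open index interval [start, end)
        if label != "with_information":
            continue                      # skip e.g. saccades
        mid = (timestamps[start] + timestamps[end - 1]) / 2.0
        idx = min(range(start, end), key=lambda i: abs(timestamps[i] - mid))
        selected.append(idx)
    return selected
```

The manual alternative described in the text would replace this rule with a mouse click on the displayed timeline.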
- the proposed device is suitable for an advantageously low-effort evaluation of results of a gaze detection, these results being present, readable or producible as information which, for each of a plurality of successive times, defines a viewing direction detected at that time and a viewpoint thereby identified in a scene image assigned to this point in time, the scene images being time-dependent, so that different scene images are generally assigned to the different points in time.
- this device is program-technically set up to carry out the following steps:
- storing a position in a reference image as an image of the viewpoint defined for that point in time, together with information assigned to that position as an attribute about the duration of the gaze event or interval to which this time belongs, possibly together with further attributes with information about this gaze event, e.g. about the beginning and end and/or a sequential number of the gaze event.
- the device can be set up to divide the time period into the intervals by assigning to each time point in each case an attribute which indicates to which of the intervals or viewing events this time belongs.
- the reference picture may be identical or time-dependent again for the different scene images.
- the different types mentioned differ from one another in such a way that, in particular, gaze events with and gaze events without information recording — as explained above — are distinguished from one another.
- the device is therefore set up, in expedient embodiments, to recognize gaze events in which information is recorded and gaze events in which no information is recorded as different types of gaze events. Accordingly, the mentioned gaze events of a selected kind are expediently the gaze events with information recording.
- the device can be set up to identify the different gaze events as a function of different speeds and/or amplitudes of an eye movement.
- the device is in particular configured to recognize saccades as gaze events without information recording and / or fixations and / or subsequent movements as gaze events with information recording.
- Embodiments of the device are possible in which it is set up by programming to automatically map the viewpoints defined for the selected times onto corresponding positions in the reference image and to store the positions thus determined as the images of the viewpoints. For this purpose, it can be set up to automatically perform the following steps in order to map the viewpoints onto the corresponding positions in the reference image:
- the transformation can be a homography. So that the viewpoints are imaged in the desired manner into the reference image, the device can thus be set up to determine the transformation in each case under the constraint that it is a homography with the required imaging properties with respect to the locations of said plurality of characteristic features. It is also conceivable that the device requires a manual input to define the mapping of the viewpoints from the selected scene images, or allows a correction of an automatically proposed mapping. In this case, the device can be set up to store, for each of the selected times, the corresponding position in the reference image together with the attribute assigned to these positions, in each case depending on an input defining the position and linking it to the time or to the scene image assigned to this time. This input can be made, for example, for each of the selected times, by a mouse click in the reference image displayed for it.
- the device can be set up for an input that defines the selected times as such — e.g. by a mouse click on a timeline on which the intervals are visibly displayed — or for an automatic selection of these times by applying a rule defining these times as a function of the intervals. This rule may, e.g., define that the respective times lying in the middle of the intervals are selected.
- the device may also be set up, if a plurality of times from the same interval associated with a gaze event with information recording is selected, to store as the mapping target associated with that interval one of the positions, or an average of the positions, onto which the viewpoints from the scene images assigned to the times selected from this interval are mapped.
- the term mean here should be able to designate not only an arithmetic mean but also, for example, a median.
- It may be provided that the device further comprises an eye tracker for detecting the viewing directions and a camera for taking the scene images; it is then a device both for gaze detection and for evaluating the results of this gaze detection.
- the eye tracker is then preferably provided with a holder for attachment to or on a head of the subject in order to allow a free movement of the subject, so that as realistic information as possible about the subject's viewing habits can be obtained.
- the camera for capturing the scene images is typically part of the eye tracker and can be worn by the same holder. This facilitates calibration of the eye tracker.
- the device can also have a camera for recording the reference image.
- the method described above can be carried out in particular with a device of the type described here.
- the device described can additionally be set up to carry out any desired embodiment or configuration of the method described at the outset.
- FIG. 1 is a schematic representation of a device for capturing and evaluating the results of this gaze detection
- FIG. 2 shows a scene image taken with this device and, to the right of it, a reference image likewise recorded with this device, wherein an arrow illustrates how a viewpoint identified in the scene image is mapped onto a corresponding position in the reference image,
- FIG. 3 is a diagrammatic representation of a time profile of a speed of an eye movement detected with the device over an observation period
- Fig. 4 is a timeline on which this observation period is divided into several intervals.
- the device shown in FIG. 1 is an arrangement which comprises an eye tracker 1 for gaze detection and is additionally set up to evaluate the results of the gaze detection obtained with this eye tracker 1 in a particularly expedient manner with little effort.
- the eye tracker 1 has two eye cameras 2, which are held by a spectacle-like frame 3 and are directed to two eyes of a subject, not shown here, who carries the frame 3.
- the eye tracker 1 has a camera 4, held by the same frame 3, for taking the scene images; it is thus arranged to move with the head of the subject, and a field of view of this camera 4 corresponds approximately to a visual field of the subject.
- the eye tracker comprises a computing unit 5, which analyzes an output of the eye cameras 2 by means of image evaluation methods and thereby determines a current viewing direction of the test person for a plurality of closely spaced successive times.
- the arithmetic unit 5 may be e.g.
- the camera 4 records a scene image assigned to this time at each of the named times. These scene images thus form successive frames of a film recorded with the camera 4.
- the eye tracker 1 now identifies with the arithmetic unit 5, for each of the times, a viewpoint in the scene image assigned to this point in time, the viewpoint — also referred to as "gaze point" or "focal point" — being in each case the point in the scene image at which the subject looks at the given time.
- the device also has another camera 6, with which a reference image is taken.
- the reference image is a photographic image, taken from a defined perspective, of a scene which corresponds to the scene visible to the subject and which is also reproduced by the scene images.
- This scene contains, in the present case, a shelf 7, which may be, for example, a goods shelf in a supermarket, in which several objects 8 — for example different goods — are placed.
- the camera 6 is given by a video camera and the reference image is time-dependent.
- the device has an evaluation unit 9 for evaluating the results obtained with the eye tracker 1.
- These results are information obtained in the manner described, which for each of the stated times defines the viewing direction detected at that time and the viewpoint which was identified in the scene image associated with this time.
- one of the scene images 10 recorded with the camera 4 is shown on the left as an example, while the reference image 11 taken with the camera 6 is shown on the right next to it.
- a sketch-like representation of the scene could also serve as a reference image, or an abstract reproduction in which recognizable characteristic features in the scene images — e.g. the objects 8 placed on the shelf 7 — are simply coded by their designation or location.
- the viewpoint 12 identified in this scene image 10 is also represented in the form of an X.
- In the reference image 11, the positions at which the test person has looked are now to be determined, and these positions are to be stored together with information on when and for how long the test person looked there.
- an arrow illustrates how the viewpoint 12 is mapped onto a position 13 corresponding to this viewpoint 12. So that such a mapping does not have to be performed for each of the scene images in order to obtain the desired information, the procedure is now as described below.
- the evaluation unit 9, which is set up accordingly by programming, detects a temporal change in the viewing direction and/or the viewpoint by comparing the information about the viewing direction and the viewpoint that is present for the successive times.
- FIG. 3 shows a time curve of a speed v of an eye movement of the test person within an observation period, with this observation period extending from t0 to t1.
- saccades 14 are recognized as a type of gaze event in which no information is captured, while follow-up movements 15 and fixations 16 are recognized as two types of gaze events in which information is acquired. For the described method, however, it is not absolutely necessary to distinguish between follow-up movements 15 and fixations 16; these two types of gaze events can also be treated equally in the classification and simply be classified as events of the category "gaze event with information recording".
- the preceding paragraph describes only a relatively simple method for identifying the various viewing events.
- additional measures may be provided, e.g. an evaluation of a head movement detected simultaneously with a head tracker or an IMU, and/or an analysis of image contents at the viewpoints 12 through an evaluation of scene camera data.
- an amplitude of eye movements can also be examined by comparing the viewing directions and/or viewpoints 12 at the different successive times, and the respective gaze event can be identified and classified depending on the detected amplitude.
- the amplitude is a maximum viewing angle difference observed during an eye movement within a certain period of time.
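The amplitude as defined above can be sketched as follows; a minimal illustration in Python/NumPy, assuming gaze samples within one time window are given as angle pairs in degrees (the function name is illustrative):

```python
import numpy as np

def movement_amplitude_deg(gaze_angles_deg):
    """Maximum angular difference between any two gaze samples
    observed within the window (small-angle approximation:
    Euclidean distance in the angle plane)."""
    g = np.asarray(gaze_angles_deg, dtype=float)        # shape (N, 2)
    diff = g[:, None, :] - g[None, :, :]                # all pairwise differences
    return float(np.max(np.linalg.norm(diff, axis=-1)))
```

The all-pairs computation is quadratic in the number of samples, which is acceptable for the short windows over which an amplitude is evaluated.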
- microsaccades, in which an information recording is expected despite large speeds, can thereby be distinguished from actual saccades, which can be treated as gaze events without information recording. It is also conceivable to detect accelerations of the eyes, additionally or instead, for identifying the gaze events.
- the said observation period is divided into disjoint intervals by the evaluation unit 9, so that an interval corresponding to a duration of the respective gaze event is assigned to each of the identified gaze events.
- Fig. 4 shows a timeline with the intervals.
- the intervals which are assigned to a fixation 16 or a following movement 15 are designated there as intervals I1, I2, ... and I6. At least these intervals I1 to I6 each contain a plurality of said times.
- the division of the observation period into the intervals is done by assigning to each of the times an attribute indicating which of the intervals or gaze events this time belongs to.
- a time is selected for each of the intervals I1 to I6, which are associated with a fixation 16 or a follow-up movement 15.
- the selected times are illustrated in Fig. 4 each by an arrow.
- these times can either be selected automatically, by applying a rule which defines the selected times in each case as times lying centrally in the respective interval or in some way optimal for the algorithm used, or manually, by clicking on the desired times on a timeline reproduced for this purpose on a screen 17 of the evaluation unit 9.
- the timeline can be reproduced in the manner shown in FIG. 4, which makes the relevant intervals I1 to I6 visible.
- the evaluation unit 9 is set up such that, for each of the selected points in time, the scene image 10 assigned to this time is output together with the viewpoint 12 identified therein on the screen 17, the reference image 11 being displayed, in the manner shown in FIG. 2, next to, above or below the scene image 10 on the screen 17.
- the viewpoint 12 identified in the scene image 10 assigned to this point in time is now mapped onto the position 13 in the reference image 11 corresponding to this viewpoint 12.
- the position 13 in the reference image 11 which corresponds to the viewpoint 12 assigned to this point in time is stored as an image of the viewpoint 12, together with an attribute assigned to this position 13 containing information about the gaze event or about the interval to which this time belongs.
- the duration of the respective viewing event or interval can be taken from this information.
- other attributes with information about this gaze event can also be stored, e.g. information about the beginning and end of the gaze event, a sequential number of the gaze event, or a count for multiple occurrences of the same position.
- the evaluation unit 9 is then set up to store, for each of the selected times, the corresponding position 13 in the reference image 11 together with the attribute assigned to these positions 13, in each case depending on an input defining the position 13 and linking it to the time or to the scene image 10 assigned to that time.
- the linkage may be effected, e.g., by the input defining the mapping always immediately following the input selecting the time.
- the evaluation unit 9 can also be set up to automatically map the viewpoints 12 defined for the selected times to the positions 13 corresponding to these viewpoints 12 and to store the positions thus automatically determined as images of the viewpoints 12.
- the representation of the scene images 10 and the reference image 11 can then serve to visually check the automatic mapping of the viewpoint 12 into the reference image 11, if the automatically determined position 13 is visibly displayed there.
- the evaluation unit 9 can be set up to identify a plurality of characteristic features, such as e.g. corners or edges, and locations of these features in the reference image 11 by means of a feature detection algorithm, and in addition to carry out the following steps for each of the selected times:
- time here always refers only to times from the finite number of times for which information about viewpoints is available. If a plurality of points in time are selected from an interval, for each of which a mapping of the described type and thus a respective position 13 in the reference image 11 is defined, then one of these positions 13, or a mean or median of these positions 13, can be stored together with the information assigned to this position 13 as an attribute for the duration of the interval. The information determined and stored in this way, with comparatively little effort and therefore relatively quickly, contains everything needed for a further evaluation of the results of the gaze detection and can be statistically evaluated in many different ways, both automatically and manually.
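Reducing the several positions 13 defined within one interval to a single stored position can be sketched, for the median variant mentioned above, as a per-coordinate median. This is a minimal sketch assuming positions are given as (x, y) tuples:

```python
from statistics import median

def interval_position(positions):
    """Reduce the positions 13 defined for the selected times of one
    interval to a single representative position: the per-coordinate
    median, one of the variants described above."""
    xs = [p[0] for p in positions]
    ys = [p[1] for p in positions]
    return median(xs), median(ys)

# three mapped positions within one fixation interval
print(interval_position([(100, 40), (104, 42), (98, 45)]))  # (100, 42)
```

The median is robust against a single mis-mapped viewpoint, which is one reason it may be preferred over the mean here.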
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Ophthalmology & Optometry (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- Eye Examination Apparatus (AREA)
- Image Analysis (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
Description
Claims
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP16199536.0A EP3168783B1 (en) | 2012-03-22 | 2013-03-21 | Method and device for evaluating the results of eye tracking |
EP13710873.4A EP2828794B1 (en) | 2012-03-22 | 2013-03-21 | Method and device for evaluating the results of eye tracking |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP12160878.0A EP2642425A1 (en) | 2012-03-22 | 2012-03-22 | Method and device for evaluating the results of eye tracking |
PCT/EP2013/055953 WO2013139919A1 (en) | 2012-03-22 | 2013-03-21 | Method and apparatus for evaluating results of gaze detection |
EP13710873.4A EP2828794B1 (en) | 2012-03-22 | 2013-03-21 | Method and device for evaluating the results of eye tracking |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP16199536.0A Division EP3168783B1 (en) | 2012-03-22 | 2013-03-21 | Method and device for evaluating the results of eye tracking |
EP16199536.0A Division-Into EP3168783B1 (en) | 2012-03-22 | 2013-03-21 | Method and device for evaluating the results of eye tracking |
Publications (2)
Publication Number | Publication Date |
---|---|
EP2828794A1 true EP2828794A1 (en) | 2015-01-28 |
EP2828794B1 EP2828794B1 (en) | 2017-01-04 |
Family
ID=47902015
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP12160878.0A Withdrawn EP2642425A1 (en) | 2012-03-22 | 2012-03-22 | Method and device for evaluating the results of eye tracking |
EP16199536.0A Active EP3168783B1 (en) | 2012-03-22 | 2013-03-21 | Method and device for evaluating the results of eye tracking |
EP13710873.4A Active EP2828794B1 (en) | 2012-03-22 | 2013-03-21 | Method and device for evaluating the results of eye tracking |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP12160878.0A Withdrawn EP2642425A1 (en) | 2012-03-22 | 2012-03-22 | Method and device for evaluating the results of eye tracking |
EP16199536.0A Active EP3168783B1 (en) | 2012-03-22 | 2013-03-21 | Method and device for evaluating the results of eye tracking |
Country Status (6)
Country | Link |
---|---|
US (1) | US9639745B2 (en) |
EP (3) | EP2642425A1 (en) |
JP (1) | JP6067093B2 (en) |
CN (1) | CN104321785B (en) |
IN (1) | IN2014DN08740A (en) |
WO (1) | WO2013139919A1 (en) |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010083853A1 (en) * | 2009-01-26 | 2010-07-29 | Tobii Technology Ab | Detection of gaze point assisted by optical reference signals |
EP2642425A1 (en) | 2012-03-22 | 2013-09-25 | SensoMotoric Instruments GmbH | Method and device for evaluating the results of eye tracking |
US9256784B1 (en) | 2013-03-11 | 2016-02-09 | Amazon Technologies, Inc. | Eye event detection |
US9466130B2 (en) * | 2014-05-06 | 2016-10-11 | Goodrich Corporation | Systems and methods for enhancing displayed images |
US11586295B2 (en) * | 2015-01-12 | 2023-02-21 | Maximilian Ralph Peter von und zu Liechtenstein | Wink gesture control system |
WO2016146488A1 (en) * | 2015-03-13 | 2016-09-22 | SensoMotoric Instruments Gesellschaft für innovative Sensorik mbH | Method for automatically identifying at least one user of an eye tracking device and eye tracking device |
EP3308188A4 (en) * | 2015-06-09 | 2019-01-23 | Nokia Technologies Oy | Causing performance of an active scan |
EP3332284A1 (en) * | 2015-08-07 | 2018-06-13 | SensoMotoric Instruments Gesellschaft für innovative Sensorik mbH | Method and apparatus for data capture and evaluation of ambient data |
KR101848453B1 (en) * | 2016-08-19 | 2018-04-13 | 서울대학교병원 | Apparatus for taking a picture of a certain portion of eyeball using headmount display |
EP3316075B1 (en) * | 2016-10-26 | 2021-04-07 | Harman Becker Automotive Systems GmbH | Combined eye and gesture tracking |
KR102495234B1 (en) | 2017-09-06 | 2023-03-07 | 삼성전자주식회사 | Electronic apparatus, method for controlling thereof and the computer readable recording medium |
US10656706B2 (en) * | 2017-12-04 | 2020-05-19 | International Business Machines Corporation | Modifying a computer-based interaction based on eye gaze |
JP2019200481A (en) * | 2018-05-14 | 2019-11-21 | 株式会社デンソーテン | Terminal device and collection method |
KR101987229B1 (en) * | 2018-12-10 | 2019-06-10 | 세종대학교산학협력단 | Method and apparatus for analyzing saliency-based visual stimulus and gaze data |
JP6963157B2 (en) * | 2019-02-09 | 2021-11-05 | 株式会社ガゾウ | Visual inspection training equipment and programs |
US11112865B1 (en) * | 2019-02-13 | 2021-09-07 | Facebook Technologies, Llc | Systems and methods for using a display as an illumination source for eye tracking |
CN113762907B (en) * | 2020-10-13 | 2024-09-24 | 北京沃东天骏信息技术有限公司 | Method and device for auditing objects |
KR102354822B1 (en) * | 2021-03-25 | 2022-01-24 | 박준 | Eyeball Movement Test System Using AI Big Data |
CN117137426B (en) * | 2023-10-26 | 2024-02-13 | 中国科学院自动化研究所 | Visual field damage evaluation training method and system based on micro-glance feature monitoring |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6381339B1 (en) * | 1997-01-15 | 2002-04-30 | Winton Emery Brown | Image system evaluation method and apparatus using eye motion tracking |
US6106119A (en) * | 1998-10-16 | 2000-08-22 | The Board Of Trustees Of The Leland Stanford Junior University | Method for presenting high level interpretations of eye tracking data correlated to saved display images |
US6711293B1 (en) | 1999-03-08 | 2004-03-23 | The University Of British Columbia | Method and apparatus for identifying scale invariant features in an image and use of same for locating an object in an image |
GB0229625D0 (en) * | 2002-12-19 | 2003-01-22 | British Telecomm | Searching images |
JP4120008B2 (en) * | 2003-02-25 | 2008-07-16 | 独立行政法人科学技術振興機構 | Motor function assist device |
JP3751608B2 (en) * | 2003-06-18 | 2006-03-01 | 株式会社東芝 | Information processing device |
JP4742695B2 (en) * | 2005-06-27 | 2011-08-10 | トヨタ自動車株式会社 | Gaze recognition apparatus and gaze recognition method |
US9606621B2 (en) * | 2006-07-28 | 2017-03-28 | Philips Lighting Holding B.V. | Gaze interaction for information display of gazed items |
US7682025B2 (en) * | 2007-02-04 | 2010-03-23 | Miralex Systems Incorporated | Gaze tracking using multiple images |
US7556377B2 (en) * | 2007-09-28 | 2009-07-07 | International Business Machines Corporation | System and method of detecting eye fixations using adaptive thresholds |
CN101382940B (en) * | 2008-10-23 | 2012-01-04 | 浙江大学 | Web page image individuation search method based on eyeball tracking |
WO2010083853A1 (en) * | 2009-01-26 | 2010-07-29 | Tobii Technology Ab | Detection of gaze point assisted by optical reference signals |
WO2010118292A1 (en) * | 2009-04-09 | 2010-10-14 | Dynavox Systems, Llc | Calibration free, motion tolerant eye-gaze direction detector with contextually aware computer interaction and communication methods |
CN101943982B (en) * | 2009-07-10 | 2012-12-12 | 北京大学 | Method for manipulating image based on tracked eye movements |
GB0915136D0 (en) * | 2009-08-28 | 2009-10-07 | Cambridge Entpr Ltd | Visual perimeter measurement system and method |
AU2011253982B9 (en) * | 2011-12-12 | 2015-07-16 | Canon Kabushiki Kaisha | Method, system and apparatus for determining a subject and a distractor in an image |
EP2642425A1 (en) | 2012-03-22 | 2013-09-25 | SensoMotoric Instruments GmbH | Method and device for evaluating the results of eye tracking |
-
2012
- 2012-03-22 EP EP12160878.0A patent/EP2642425A1/en not_active Withdrawn
-
2013
- 2013-03-21 US US14/387,026 patent/US9639745B2/en active Active
- 2013-03-21 CN CN201380026908.3A patent/CN104321785B/en active Active
- 2013-03-21 EP EP16199536.0A patent/EP3168783B1/en active Active
- 2013-03-21 WO PCT/EP2013/055953 patent/WO2013139919A1/en active Application Filing
- 2013-03-21 EP EP13710873.4A patent/EP2828794B1/en active Active
- 2013-03-21 JP JP2015500928A patent/JP6067093B2/en not_active Expired - Fee Related
- 2013-03-21 IN IN8740DEN2014 patent/IN2014DN08740A/en unknown
Non-Patent Citations (1)
Title |
---|
See references of WO2013139919A1 * |
Also Published As
Publication number | Publication date |
---|---|
CN104321785A (en) | 2015-01-28 |
JP6067093B2 (en) | 2017-01-25 |
US9639745B2 (en) | 2017-05-02 |
US20150063635A1 (en) | 2015-03-05 |
EP3168783B1 (en) | 2020-08-19 |
JP2015514251A (en) | 2015-05-18 |
EP2828794B1 (en) | 2017-01-04 |
CN104321785B (en) | 2018-01-09 |
IN2014DN08740A (en) | 2015-05-22 |
EP3168783A1 (en) | 2017-05-17 |
WO2013139919A1 (en) | 2013-09-26 |
EP2642425A1 (en) | 2013-09-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2828794B1 (en) | Method and device for evaluating the results of eye tracking | |
EP2157903B1 (en) | Method for perception measurement | |
EP3332284A1 (en) | Method and apparatus for data capture and evaluation of ambient data | |
EP2782493B1 (en) | Method and device improving visual performance | |
WO2017153355A1 (en) | Method and device for carrying out eye gaze mapping | |
DE102016201531A1 (en) | Method and device for detecting fatigue of a driver | |
DE102008040803A1 (en) | Method for the quantitative representation of the blood flow | |
DE102014216511A1 (en) | Create chapter structures for video data with images from a surgical microscope object area | |
DE102011011931A1 (en) | Method for evaluating a plurality of time-shifted images, device for evaluating images, monitoring system | |
EP3873322B1 (en) | Location-based quantification of impairment due to halo and scattered light | |
WO2017220667A1 (en) | Method and device for modifying the affective visual information in the field of vision of an user | |
DE102017217872A1 (en) | Improved view of a motor vehicle environment | |
EP1300108B1 (en) | Method for obtaining, evaluating and analyzing sequences of vision | |
DE102014009699A1 (en) | Method for operating a display device and system with a display device | |
DE102007001738B4 (en) | Method and computer program product for eye tracking | |
EP3399427B1 (en) | Method and system for the quantitative measurement of mental stress of an individual user | |
AT506572B9 (en) | METHOD FOR MEASURING VISUAL ATTENTION IN THE VIEW OF STATIC AND DYNAMIC VISUAL SCENES | |
DE102023203957B3 (en) | Methods for stereo image processing and display and overall system | |
DE102013105638B4 (en) | Increase of perceptual image resolution | |
DE102009000376A1 (en) | Method for quantitative representation of blood flow in tissue- or vein region of patients, involves generating difference between successive frames, where generated difference is represented | |
DE102008040802A1 (en) | Method for quantitative representation of blood flow in tissue or vein region of patient, involves representing parameter characteristic of flow and another parameter characteristic of vein position in superimposed manner for image regions | |
AT412443B (en) | Test person look sequence analyzing method, involves recording eye and view video, determining pupil coordinate for each frame of eye video, and measuring centroid corresponding to pupil center of dark area | |
DE102023104370A1 (en) | Method for detecting and evaluating pupil movements of a living being, mobile device, computer program product and computer-readable medium | |
DE102016006768A1 (en) | A method of operating a display system and display system | |
DE102015011926A1 (en) | Method for operating a camera system in a motor vehicle and motor vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20140922 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: HOFFMANN, JAN Inventor name: WILLIAMS, DENIS |
|
DAX | Request for extension of the european patent (deleted) | ||
17Q | First examination report despatched |
Effective date: 20150710 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
INTG | Intention to grant announced |
Effective date: 20160713 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D Free format text: NOT ENGLISH |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 859903 Country of ref document: AT Kind code of ref document: T Effective date: 20170115 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D Free format text: LANGUAGE OF EP DOCUMENT: GERMAN |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 502013005972 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 5 |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG4D Ref country code: NL Ref legal event code: MP Effective date: 20170104 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170404 Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170405 Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170504 Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170504 Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170404 Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R082 Ref document number: 502013005972 Country of ref document: DE Representative=s name: BARDEHLE PAGENBERG PARTNERSCHAFT MBB PATENTANW, DE |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 502013005972 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 |
|
26N | No opposition filed |
Effective date: 20171005 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: MM4A |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20170321 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 6 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20170331 Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20170331 Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20170321 |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20170331 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20170331 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MM01 Ref document number: 859903 Country of ref document: AT Kind code of ref document: T Effective date: 20180321 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: FR Payment date: 20190213 Year of fee payment: 7 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R082 Ref document number: 502013005972 Country of ref document: DE Representative=s name: BARDEHLE PAGENBERG PARTNERSCHAFT MBB PATENTANW, DE Ref country code: DE Ref legal event code: R081 Ref document number: 502013005972 Country of ref document: DE Owner name: APPLE INC., CUPERTINO, US Free format text: FORMER OWNER: SENSOMOTORIC INSTRUMENTS GESELLSCHAFT FUER INNOVATIVE SENSORIK MBH, 14513 TELTOW, DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20130321 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 Ref country code: AT Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180321 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20170104 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: FR Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20200331 |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: 732E Free format text: REGISTERED BETWEEN 20210708 AND 20210714 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R079 Ref document number: 502013005972 Country of ref document: DE Free format text: PREVIOUS MAIN CLASS: G06K0009000000 Ipc: G06V0010000000 |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230525 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20231229 Year of fee payment: 12 Ref country code: GB Payment date: 20240108 Year of fee payment: 12 |