WO2005010739A1 - System and method for controlling the display of an image - Google Patents

System and method for controlling the display of an image

Info

Publication number
WO2005010739A1
WO2005010739A1 (PCT/IB2004/051271)
Authority
WO
WIPO (PCT)
Prior art keywords
image
observer
display device
head
display
Prior art date
Application number
PCT/IB2004/051271
Other languages
French (fr)
Inventor
Hanns-Ingo Maack
Original Assignee
Philips Intellectual Property & Standards Gmbh
Koninklijke Philips Electronics N. V.
Priority date
Filing date
Publication date
Application filed by Philips Intellectual Property & Standards Gmbh, Koninklijke Philips Electronics N. V.
Publication of WO2005010739A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G06F3/013 Eye tracking input arrangements

Definitions

  • the invention relates to a method of controlling the display of an image on a display device. It furthermore relates to a system which is suitable for carrying out such a method.
  • a display device such as for example on a projection wall or on a computer monitor
  • the user can input control commands for example via the keyboard or mouse, by means of which control commands he can adapt the display of the image as desired.
  • the method according to the invention is used to control the display of an image on a display device.
  • the display device may be for example the projection screen of a slide projector or beamer.
  • the display device is preferably a screen or monitor on which a digitized image is displayed by an image generation unit (graphics driver).
  • the display of the image on the display device is changed as a function of a controlling movement of the head and/or the eyes of an observer such that a natural optical effect associated with said movement is amplified for the observer.
  • the change in the image display takes place solely in reaction to controlling movements of the head, that is to say the eye movement relative to the head has no effect.
  • "natural optical effects" are all changes in the perception or optical representation of the image on the retina of the observer which result from the movement of the head and/or the eyes. For example, a rotation of the eyes (without any movement of the head) thus essentially results in a shift of the representation on the retina.
  • a change in the distance between head and display device on the other hand leads to a magnification or reduction in size of the representation on the retina of the observer.
  • the display of the image on the display device is now changed in the event of a head and/or eye movement of the observer such that the changes in the representation on the retina of the observer which are naturally brought about on account of the movement are amplified by the changed display of the image on the display device.
  • the amplification may in principle be "positive" or "negative", where a positive change (or amplification in the narrower sense) tends to act in the same direction as the natural optical effect while the negative amplification tends to act in the opposite direction.
  • the observer in this case need only make those head movements which he would make in any case in order to look at a static image in a different manner.
  • the movement of the head may in particular be detected by observing its position, with the position of the head in turn being described by typically one to six degrees of freedom (coordinates), depending on the desired differentiation of the control.
  • the optionally detected movement of the eyes is typically composed of an overall movement which takes place together with the head and of a rotation of the eyes relative to the head.
  • the natural optical effect which is associated with a movement of the head and/or the eyes when looking at a static image may in particular consist in a change in the magnification, a shift, a rotation, a distortion and/or a change in the brightness of the representation on the retina of an observer. Accordingly, in the method the amplification of the natural optical effect takes place in particular by one or more corresponding operations during displaying of the image on the display device. According to one preferred refinement of the method, not all movements of the head and/or the eyes are "controlling", that is to say triggers for a change in the display of the image on the display device. Rather, said movements are preferably controlling only if an additional condition that can be influenced by the observer is satisfied.
  • the observer can thus decide independently whether his movements change the display of the image or whether the current display is to be retained despite the movement.
  • said additional condition is satisfied a) if the change in position of the head and/or the eyes resulting from a movement of the head and/or the eyes exceeds a predefined threshold value and/or b) if the movement speed of the head and/or the eyes exceeds a predefined threshold value. That is to say that small (a) and/or slow (b) movements lie below the threshold of the method and thus do not lead to a change in the image display.
  • the observer can therefore decide, by deliberately controlling the severity of his movements, whether or not a change in the image display is to take place.
  • said additional condition is satisfied if the head and/or the eyes of the observer leave a predefined spatial reference region (typically a relatively small volume) during the movement.
  • all movements of the head and/or the eyes which remain within the reference region do not have any effect on the display of the image. Only if the observer deliberately leaves the reference region on account of a relatively great head movement does he thus initiate control of the image display.
  • the introduction of such a reference region has the advantage that small movements by the observer do not lead to undesirable image changes, so that the image display as a whole is kept steady.
  • the spatial position and the shape of the reference region can preferably be defined individually by the observer.
  • the reference region may also be subject to a slow drift, that is to say it follows the position of the head and/or the eyes in a delayed manner. If, therefore, the observer does not make any further movements for a certain time after a movement, the reference region "automatically" positions itself around his new head position.
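The reference-region behavior described above (movements inside the region leave the display unchanged; the region slowly drifts after the head) can be sketched as follows. The class name, the spherical region shape (rather than the cube mentioned later in the text), and the drift constant are illustrative assumptions, not taken from the patent:

```python
import math

class ReferenceRegion:
    """Sketch of the reference region: head positions inside the region
    leave the display unchanged, and the region slowly drifts toward the
    current head position ("delayed" following)."""

    def __init__(self, center, radius=0.05, drift=0.02):
        self.center = list(center)  # (x, y, z) in meters
        self.radius = radius        # region half-size in meters
        self.drift = drift          # fraction of the offset absorbed per update

    def update(self, head):
        # Offset of the head from the current region center.
        off = [h - c for h, c in zip(head, self.center)]
        dist = math.sqrt(sum(o * o for o in off))
        # Delayed drift: the region follows the head slowly, so after a
        # period without movement it re-centers on the new head position.
        self.center = [c + self.drift * o for c, o in zip(self.center, off)]
        # The movement is "controlling" only once the head leaves the region.
        return dist > self.radius
```

Small trembling movements then return `False` (no image change), while a deliberate, larger movement returns `True` and initiates control.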
  • a change in the image display is triggered by a controlling movement of the head and/or the eyes.
  • Quantitative control parameters for said change may in this case be the speed and/or a higher time differentiation of the movement. That is to say that the change in the image display is for example greater the higher the movement speed.
  • the display of the image on the display device therefore depends (exclusively or among other things) in functional terms on the spatial position of the head and/or the eyes of the observer.
  • the movement of the head and/or the eyes in this case leads to a control of the image display, since it is necessarily associated with a change in position.
  • the position of the head and/or the eyes may in this connection be described by one to six degrees of freedom, depending on the desired differentiation. It is preferably described by three degrees of freedom, that is to say the spatial coordinates of a predefined point of the head (for example the center point between the eyes), without the inclination of the head relative to the vertical being taken into account.
  • the magnification with which the image is displayed on the display device varies between a predefined minimum and a predefined maximum magnification when the position of the head and/or the eyes of the observer varies between a predefined maximum and a predefined minimum distance from the display device. That is to say that the distance of the observer from the display device influences the magnification of the image display (zoom); however, limits are defined by the minimum and the maximum magnification, and the interval of magnifications lying therebetween is mapped on a movement margin of the observer.
  • the basic parameters may be selected such that the entire desired or useful zoom range of the image display is carried out within a movement margin that is convenient for the observer.
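The mapping of the observer's movement margin onto a bounded magnification interval can be sketched as below. The linear form of the mapping and all default values are assumptions for illustration; the patent's own formulas use a power law of the distance, and only the clamping to a predefined minimum and maximum magnification is taken from the text (the 25 to 35 cm and 70 to 100 cm distances and the factor 4 appear later in the description):

```python
def zoom_from_distance(z, z_min=0.30, z_max=0.90, zoom_min=0.25, zoom_max=4.0):
    """Illustrative sketch (not the patent's exact formula): map the
    observation distance z linearly onto the magnification interval and
    clamp it, so the zoom always stays between the predefined minimum
    and maximum magnification."""
    z = min(max(z, z_min), z_max)                # clamp to the movement margin
    t = (z - z_min) / (z_max - z_min)            # 0 at z_min, 1 at z_max
    return zoom_max + t * (zoom_min - zoom_max)  # close head -> large zoom
</n```

Because the distance is clamped first, the entire useful zoom range is traversed within a movement margin that is convenient for the observer, and moving beyond the margin has no further effect.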
  • a proportionality according to ctg(α) = [tg(α)]⁻¹ ∝ f(z) is observed.
  • a certain image detail (for example a vertebral body on an X-ray image)
  • a monotonously increasing function with positive values f(z) > 0 for z > 0.
  • there may be (at least approximately) a polynomial proportionality according to ctg(α) = [tg(α)]⁻¹ ∝ z^(p+1). The parameter p > 0 is in this case the adjustable (positive) amplification of the natural optical magnification.
  • limit values should be defined for the amplification of the natural optical effects which can be controlled by the method, these limit values being defined as a function of the resolutions of the image and of the display device. For example, the reduction in size of the image can be terminated when it in its full extent fits on the display area in a format-filling manner. Likewise, the magnification of the image can be terminated when it, on account of the limited resolution of the image on the display device, does not provide any further information gain for the observer.
  • a digitized image i.e. an image described by individual image points
  • a magnification that is precisely adapted to the resolution of the display device when the head and/or the eyes of the observer are at a predefined reference distance from the display device.
  • Adaptation to the resolution in this case means that each image pixel is represented by precisely one pixel of the display device.
  • the image display may also be designed such that the image fits on the display device in a format-filling manner when the observer is at a predefined reference distance.
  • the control of the display of the image on the display device is based not only on the movement of the head and/or the eyes but also additionally on inputs by the observer made by means of hand movements, eyelid movements, voice or the like. By means of such additional inputs, the observer can determine for example whether his head and/or eye movements should or should not have a controlling function. If a series of successive images is displayed on the display device, a new image is preferably displayed with the display parameters of the previous image, i.e. a set amplification is retained.
  • the invention furthermore relates to a system for controlling the display of an image on a display device, which system comprises the following components: a) a measurement device for detecting a movement of the head and/or the eyes of an observer; b) an image generation unit which is coupled to the measurement device and is designed to generate a display of the image on a display device, wherein the display is changed as a function of a movement of the head and/or the eyes of an observer that is detected by the measurement device, such that a natural optical effect associated with the movement is amplified for the observer.
  • a method of the type described above can be carried out using said system.
  • the measurement device may be any device which allows the desired detection of a head or eye movement.
  • this detection should take place in a contactless manner in order that it bothers the observer as little as possible.
  • a measurement using ultrasound echoes or light barriers is for example possible.
  • the measurement device preferably comprises at least one camera for recording the head of an observer. Using suitable image analysis software, the head position and/or eye position can then be deduced from the recordings taken by the camera. If two cameras are used, use may be made of stereoscopic methods which are known for this purpose.
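As a hedged illustration of how two cameras beside the monitor could yield the observation distance, a standard pinhole/disparity triangulation is sketched below. This is one of the "stereoscopic methods which are known for this purpose"; it assumes rectified cameras, and all names and parameters are assumptions, not from the patent:

```python
def head_depth_from_stereo(u_left, u_right, focal_px, baseline_m):
    """Standard disparity-based triangulation: u_left and u_right are the
    horizontal pixel coordinates of the same facial feature in the two
    (rectified) camera images, focal_px the focal length in pixels, and
    baseline_m the camera separation in meters."""
    disparity = u_left - u_right  # pixel offset between the two views
    if disparity <= 0:
        raise ValueError("expected a positive disparity")
    return focal_px * baseline_m / disparity  # observation distance z in meters
```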
  • the image generation unit of the system is preferably designed to display the image on a monitor. The image is in this case typically in digitized form, so that the image generation unit may use a microprocessor or the like to perform its processing tasks.
  • Fig. 1 schematically shows a system according to the invention for controlling the display of an image on a display device.
  • Fig. 2 shows the display of the image on the display device of Fig. 1 when the observer is located relatively close to the display device.
  • Fig. 3 shows the display of the image on the display device of Fig. 1 when the observer is located further away from the display device.
  • Fig. 4 shows a schematic diagram of the geometric relations for calculating zoom factors.
  • Fig. 5 shows a schematic diagram of the geometric relations for calculating an image section shift.
  • Fig. 6 shows the dependence of the image section shift on the deviation of the head position from the reference position.
  • Fig. 1 shows a system according to the invention for controlling the display of an image 10 on a display device 9 which may for example be a flat screen.
  • the display device 9 is actuated in a known manner by the graphics card 8 of a data processing device 4.
  • an x,y,z coordinate system which by definition lies with its origin in the area (e.g. in the center) of the display device 9, the x and y axes running respectively horizontally and vertically in the plane of the display area and the z axis pointing perpendicular thereto in the direction of an observer 1.
  • the distance of the head 2 and/or the eyes of the observer 1 from the display device 9 is thus described by the z coordinate.
  • the "natural magnification" brought about on the retina of an observer by such a change in distance may be expressed for example by the ratio of the tangent values of the angles α1, α2, which in turn is proportional to the inverse ratio of the distances z1, z2.
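This relation can be checked numerically. The small function below is a sketch of the geometry only (a detail of height h viewed head-on at two distances); the variable names are assumptions:

```python
import math

def natural_magnification(h, z1, z2):
    """Numerical check of the relation above: for a detail of height h,
    the ratio tg(a1)/tg(a2) of the viewing-angle tangents at distances
    z1 and z2 equals the inverse distance ratio z2/z1."""
    a1 = math.atan(h / z1)  # viewing angle at distance z1
    a2 = math.atan(h / z2)  # viewing angle at distance z2
    return math.tan(a1) / math.tan(a2)
```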
  • the system comprises at least one camera.
  • it comprises two digital cameras 3 (e.g. webcams) which are fitted at known positions in the vicinity of the monitor 9 and are directed at the observer 1.
  • the recordings taken by the cameras 3 are passed to the data processing device 4 and processed there in a software module 5.
  • the position (x_h, y_h, z) of the head 2 and/or the eyes is determined from the image information by following the eye movement and/or segmenting the face.
  • the position signals thus obtained are subjected to low-pass filtering in order to eliminate artefacts such as small trembling movements for example. Face recognition could optionally further be performed or attempted by means of cameras in order for example to automatically select for various users individual display parameters from a stored list.
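The low-pass filtering step can be sketched, for example, as a simple exponential moving average per coordinate; the one-pole form and the smoothing factor are illustrative assumptions, not the patent's specified filter:

```python
class PositionFilter:
    """One-pole low-pass filter (exponential moving average) as a sketch
    of the artefact suppression described above, e.g. against small
    trembling movements of the head."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha  # 0 < alpha <= 1; smaller = stronger smoothing
        self.state = None

    def update(self, position):
        if self.state is None:
            self.state = list(position)  # initialize with the first sample
        else:
            # Move each coordinate a fraction alpha toward the new sample.
            self.state = [s + self.alpha * (p - s)
                          for s, p in zip(self.state, position)]
        return tuple(self.state)
```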
  • a digital zoom factor zoom_d(z) is then calculated from the determined distance z of the head 2 from the display device 9.
  • the center point coordinates pxc(x_h, z) and pyc(y_h, z), based on the image 10, of an image section that is to be displayed in a format-filling manner on the display device 9 are calculated. Details regarding the calculations in the modules 6 and 7 will be discussed later in connection with Figs.
  • the hardware module 8 controls the monitor 9 such that the image 10 is displayed in accordance with the parameters calculated in the modules 6 and 7.
  • the image 10 is displayed on the display device 9 in a magnified manner when the observer 1 moves closer to the display device 9 (Fig. 2).
  • a minimum distance z_min from the display device 9 at which a maximum magnification zoom_d_max of the image 10 is achieved. This may typically be 1:4, that is to say one image pixel is represented by four pixels of the monitor 9.
  • the minimum distance z_min is preferably selected such that it corresponds to the shortest observation distance based on the strength of vision of the observer 1 (typically about 25 to 35 cm).
  • Fig. 3 shows the display of the image 10 when the observer 1 is further away from the display device 9 than the reference position z_0.
  • the image 10 is in this case shown smaller in size, with a limit zoom_d_fit of the reduction in size preferably being reached at a maximum distance z_max which the user can still comfortably reach and which is typically about 70 to 100 cm. This limit is typically defined by the entire image fitting precisely on the area of the monitor 9.
  • zoom_t(z) = zoom_n(z) · zoom_d(z) (2)
  • the natural zoom factor zoom_n(z) based on the reference distance z_0 corresponds to the inverse ratio of the observation distances, i.e. zoom_n(z) = z_0/z.
  • zoom_d0 is a reference factor that is to be predefined, which describes the digital zoom at the reference distance z_0.
  • the parameter p is an amplification exponent which in principle can be freely selected.
  • p < 0: The digital zoom factor zoom_d increases as the distance of the observer increases, in the sense of an anti-natural zoom or a "negative amplification" of the natural zoom; the further the observer moves away from the display device, the larger the image is displayed on the latter.
  • p = -1: A detail of the image on the display device 9 is seen by the observer, regardless of the distance z of the latter, always at the same viewing angle, that is to say the natural zoom is precisely compensated. This case corresponds to the prior art in accordance with WO 02/093483 A1.
  • p < -1: A detail of the image is seen by the observer at an increasingly large viewing angle as the distance z of the observer from the display device 9 increases.
  • p = 0: In accordance with the conventional prior art, there is no change in the digital zoom as a function of the distance of the user.
  • zoom_t(z) = zoom_d0 · (z_0/z)^(p+1)
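The zoom chain of equation (2) with a power-law digital zoom can be sketched as follows; the function name and default parameter values are illustrative assumptions:

```python
def zoom_total(z, z0=0.5, zoom_d0=1.0, p=1.0):
    """Sketch of the zoom chain: the natural zoom z0/z multiplied by a
    power-law digital zoom zoom_d0 * (z0/z)**p gives
    zoom_t(z) = zoom_d0 * (z0/z)**(p + 1). p > 0 amplifies the natural
    magnification, p = 0 leaves it unchanged, and p = -1 compensates it
    exactly (the prior-art behavior)."""
    zoom_n = z0 / z                   # natural zoom (inverse distance ratio)
    zoom_d = zoom_d0 * (z0 / z) ** p  # digital zoom applied on the display
    return zoom_n * zoom_d            # total zoom zoom_t(z)
```

For example, with p = 1 a halving of the distance quadruples the total zoom, while with p = -1 the total zoom is constant regardless of the distance.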
  • monitor_size_x is the width of the monitor 9
  • image_size_x is the width of the image 10 in the x direction
  • monitor_size_y is the height of the monitor 9
  • image_size_y is the height of the image 10 in the y direction
  • the reference factor zoom_d0 used in equations (3a) and (4) may in particular be equal to the value zoom_d_fit defined above, so that at the reference distance z_0 the entire image is displayed on the monitor 9 in a precisely format-filling manner.
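The format-filling reference factor can be sketched as the smaller of the two monitor-to-image size ratios, so that the whole image just fits in both dimensions. The function name mirrors the variables defined above; treating all sizes as pixel counts is an assumption:

```python
def zoom_d_fit(monitor_size_x, monitor_size_y, image_size_x, image_size_y):
    """Sketch of the format-filling limit: the digital zoom at which the
    entire image just fits on the monitor in both dimensions."""
    return min(monitor_size_x / image_size_x,
               monitor_size_y / image_size_y)
```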
  • the parameters that are to be predefined for the above calculations may be adapted to the respective observer 1.
  • the limits z_min and z_max of the observation distance and also the reference distance z_0 can be predefined individually.
  • the maximum digital zoom factor zoom_d_max should be around 4, since further zooming does not provide any additional information. However, in special cases in which the system is conceived for example as an aid for vision-impaired people, much higher amplification factors may also be used.
  • a mathematical description for a possible control of the image shift (“pan") will be derived.
  • Fig. 5 shows the image 10 that is to be displayed and the definition of a few important variables.
  • the (digital) image is in this case rectangular with a width XSIZE and a height YSIZE, which may be measured for example in number of pixels or rows/columns.
  • the origin of an x*,y* coordinate system (this is based on the image 10, unlike the spatially fixed x,y,z coordinate system of Fig. 1 which was based on the monitor 9).
  • an image section S can be displayed on the display device 9, the size of which section depends on the current digital zoom factor zoom d (z) or the observation distance z.
  • the maximum horizontal and vertical head movements which are intended to lead to a control are defined by the variables x_h_max and y_h_max. Where appropriate, different limits may be defined for the head movements to the left and to the right and up and down.
  • the center point (pxc, pyc) of the image section S displayed on the display device 9 will be defined as a function of the horizontal/vertical head position x_h, y_h as described below.
  • the position or shift of the image section S is advantageously limited by the requirement that the entire section S should still lie within the area of the image 10, since a shift that goes beyond the edge of the image 10 would bring zones without any image information onto the monitor 9.
  • pxc(x_h, z) = Cx · (x_h/x_h_max)^qx · pxc_max(z) (9)
    pyc(y_h, z) = Cy · (y_h/y_h_max)^qy · pyc_max(z)
  • Cx, Cy, qx and qy are freely definable factors which can be used to define a desired shift behavior.
  • a "positive amplification" of the natural image shift (which takes place on the retina) in the event of a head movement by the observer is of particular interest, i.e. the case Cx, Cy > 0 and qx, qy > 0.
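A hedged sketch of the section-shift ("pan") control: the center of the displayed section follows the head position via the freely definable factors Cx, Cy and exponents qx, qy mentioned above, and is clamped so that the section never leaves the image. The exact functional form, the signed power law, and all default values are assumptions:

```python
def pan_center(x_h, y_h, pxc_max, pyc_max,
               x_h_max=0.3, y_h_max=0.2, Cx=1.0, Cy=1.0, qx=1.0, qy=1.0):
    """Sketch of the pan control: map the head offset (x_h, y_h) to the
    section center (pxc, pyc), scaled by Cx, Cy and shaped by qx, qy,
    and clamped to +/- the maximum admissible shift."""
    def one_axis(h, h_max, c_factor, q, c_max):
        sign = 1.0 if h >= 0 else -1.0
        c = c_factor * sign * (abs(h) / h_max) ** q * c_max  # odd power law
        return min(max(c, -c_max), c_max)  # keep the section inside the image
    return (one_axis(x_h, x_h_max, Cx, qx, pxc_max),
            one_axis(y_h, y_h_max, Cy, qy, pyc_max))
```

With Cx, Cy > 0 and qx, qy > 0 this realizes the "positive amplification" of the natural image shift; exponents q > 1 would make small head movements gentler and large ones more pronounced.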
  • a control method based on the above formulae is highly robust, that is to say the desired control tends to be achieved even if the required variables (in particular the head coordinates x_h, y_h, z) can be determined only relatively imprecisely.
  • the digital zoom factor zoom_d and the choice of the section S for the image display are selected as a function of the head position of the observer 1.
  • a "position-to-speed relation" could also be implemented.
  • a reference region or tolerance region, for example a cube having an edge length of 10 cm, may be defined around a preferred reference position (x_h0, y_h0, z_0).

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The invention relates to a system and a method for controlling the display of an image (10) on a monitor (9). The position of the head (2) and/or the eyes of an observer (1) is recorded by means of cameras (3). If the head moves toward the display device (9), the image (10) is displayed there in a digitally magnified manner, and vice versa. In the event of a side movement of the head (2), there is a corresponding shift of the image detail shown on the display device (9). The natural optical effects occurring for the observer (1) in the event of a movement of the head are in this way additionally amplified by the image control.

Description

System and method for controlling the display of an image
The invention relates to a method of controlling the display of an image on a display device. It furthermore relates to a system which is suitable for carrying out such a method. When an image is displayed on a display device, such as for example a projection wall or a computer monitor, it is desirable that the observer can influence the displayed image section or the positioning of the image on the display device and also the magnification of the image. In the case of computer monitors, it is known in this respect that the user can input control commands for example via the keyboard or mouse, by means of which control commands he can adapt the display of the image as desired.
Furthermore, it is known from WO 02/093483 A1 to measure the position of an observer relative to a monitor by means of cameras. The display of an image on the monitor is then adapted as a function of the determined position of the observer such that it appears the same to the observer from any position. That is to say for example that the natural reduction in size of the image in the eye of the observer which is associated with the increasing distance of the observer from the display device is compensated by a corresponding increase in the digital magnification during displaying of the image on the monitor.
Against this background, it is an object of the present invention to provide means for the simplified control of the display of an image on a display device. This object is achieved by a method having the features of claim 1 and by a system having the features of claim 12. Advantageous embodiments are contained in the dependent claims. The method according to the invention is used to control the display of an image on a display device. The display device may be for example the projection screen of a slide projector or beamer. The display device is preferably a screen or monitor on which a digitized image is displayed by an image generation unit (graphics driver). According to the method, the display of the image on the display device is changed as a function of a controlling movement of the head and/or the eyes of an observer such that a natural optical effect associated with said movement is amplified for the observer. It is particularly preferred if the change in the image display takes place solely in reaction to controlling movements of the head, that is to say the eye movement relative to the head has no effect. In this connection, "natural optical effects" are all changes in the perception or optical representation of the image on the retina of the observer which result from the movement of the head and/or the eyes. For example, a rotation of the eyes (without any movement of the head) thus essentially results in a shift of the representation on the retina. A change in the distance between head and display device on the other hand leads to a magnification or reduction in size of the representation on the retina of the observer. 
According to the invention, the display of the image on the display device is now changed in the event of a head and/or eye movement of the observer such that the changes in the representation on the retina of the observer which are naturally brought about on account of the movement are amplified by the changed display of the image on the display device. The amplification may in principle be "positive" or "negative", where a positive change (or amplification in the narrower sense) tends to act in the same direction as the natural optical effect while the negative amplification tends to act in the opposite direction. It is important that with the amplification the representation on the retina of the observer is changed in the event of a (controlling) movement of the head and/or the eyes, that is to say does not stay the same, unlike in WO 02/093483 A1. On account of its greater importance, positive amplification will be considered in the text which follows, without the invention being restricted thereto. If the observer moves for example toward the display device in order to look more closely at details of the image, according to the method (in the case of a positive amplification) the image is additionally displayed on the display device with a higher magnification. If the user moves, for example, his head toward a certain edge zone of the display device, this edge zone automatically moves closer to the center of the display device. As a result, in this way a particularly simple and intuitively plausible control of the display of the image on the display device is achieved. For a desired change in the display, the observer in this case need only make those head movements which he would make in any case in order to look at a static image in a different manner.
The movement of the head may in particular be detected by observing its position, with the position of the head in turn being described by typically one to six degrees of freedom (coordinates), depending on the desired differentiation of the control. The optionally detected movement of the eyes is typically composed of an overall movement which takes place together with the head and of a rotation of the eyes relative to the head. The natural optical effect which is associated with a movement of the head and/or the eyes when looking at a static image may in particular consist in a change in the magnification, a shift, a rotation, a distortion and/or a change in the brightness of the representation on the retina of an observer. Accordingly, in the method the amplification of the natural optical effect takes place in particular by one or more corresponding operations during displaying of the image on the display device. According to one preferred refinement of the method, not all movements of the head and/or the eyes are "controlling", that is to say triggers for a change in the display of the image on the display device. Rather, said movements are preferably controlling only if an additional condition that can be influenced by the observer is satisfied. The observer can thus decide independently whether his movements change the display of the image or whether the current display is to be retained despite the movement. According to a first embodiment of the above-described variant of the method, said additional condition is satisfied a) if the change in position of the head and/or the eyes resulting from a movement of the head and/or the eyes exceeds a predefined threshold value and/or b) if the movement speed of the head and/or the eyes exceeds a predefined threshold value. That is to say that small (a) and/or slow (b) movements lie below the threshold of the method and thus do not lead to a change in the image display.
The observer can therefore decide, by deliberately controlling the severity of his movements, whether or not a change in the image display is to take place. According to a second embodiment of the method variant, said additional condition is satisfied if the head and/or the eyes of the observer leave a predefined spatial reference region (typically a relatively small volume) during the movement. In other words, all movements of the head and/or the eyes which remain within the reference region do not have any effect on the display of the image. Only if the observer deliberately leaves the reference region on account of a relatively great head movement does he thus initiate control of the image display. The introduction of such a reference region has the advantage that small movements by the observer do not lead to undesirable image changes, so that the image display as a whole is kept steady. The spatial position and the shape of the reference region can preferably be defined individually by the observer. The reference region may also be subject to a slow drift, that is to say it follows the position of the head and/or the eyes in a delayed manner. If, therefore, the observer does not make any further movements for a certain time after a movement, the reference region "automatically" positions itself around his new head position. In the method described, a change in the image display is triggered by a controlling movement of the head and/or the eyes. Quantitative control parameters for said change may in this case be the speed and/or a higher time differentiation of the movement. That is to say that the change in the image display is for example greater the higher the movement speed. According to one preferred embodiment of the method, the display of the image on the display device therefore depends (exclusively or among other things) in functional terms on the spatial position of the head and/or the eyes of the observer. 
The movement of the head and/or the eyes in this case leads to a control of the image display, since it is necessarily associated with a change in position. The position of the head and/or the eyes may in this connection be described by one to six degrees of freedom, depending on the desired differentiation. It is preferably described by three degrees of freedom, that is to say the spatial coordinates of a predefined point of the head (for example the center point between the eyes), without the inclination of the head relative to the vertical being taken into account. In another advantageous embodiment of the method, the magnification with which the image is displayed on the display device varies between a predefined minimum and a predefined maximum magnification when the position of the head and/or the eyes of the observer varies between a predefined maximum and a predefined minimum distance from the display device. That is to say that the distance of the observer from the display device influences the magnification of the image display (zoom); however, limits are defined by the minimum and the maximum magnification, and the interval of magnifications lying therebetween is mapped on a movement margin of the observer. In particular, the basic parameters may be selected such that the entire desired or useful zoom range of the image display is carried out within a movement margin that is convenient for the observer. If the magnification of the image display is controlled by the distance of the observer from the display device, according to one preferred embodiment of the method a proportionality according to ctg(α) = [tg(α)]^-1 ∝ z · f(z) is observed. Here, α is the viewing angle at which an observer sees a certain image detail (for example a vertebral body on an X-ray image) at a distance z from the display device, and f is a monotonically increasing function with positive values f(z) > 0 for z > 0. 
In particular, there may be (at least approximately) a polynomial proportionality according to
ctg(α) = [tg(α)]^-1 ∝ z^(1+p), i.e. f(z) = z^p.
The parameter p > 0 is in this case the adjustable (positive) amplification of the natural optical magnification. On account of technical limitations of the display device and also a limited information content of the image, that is to say on account of the limited "resolutions" of the display device and of the image, there are generally limits for any type of change in the image display, and these limits should not be exceeded. Advantageously, therefore, limit values should be defined for the amplification of the natural optical effects which can be controlled by the method, these limit values being defined as a function of the resolutions of the image and of the display device. For example, the reduction in size of the image can be terminated when it in its full extent fits on the display area in a format-filling manner. Likewise, the magnification of the image can be terminated when it, on account of the limited resolution of the image on the display device, does not provide any further information gain for the observer. Within the context of matching the image display to the capability of the display device, a digitized image (i.e. an image described by individual image points) is preferably displayed with a magnification that is precisely adapted to the resolution of the display device when the head and/or the eyes of the observer are at a predefined reference distance from the display device. Adaptation to the resolution in this case means that each image pixel is represented by precisely one pixel of the display device. Alternatively, the image display may also be designed such that the image fits on the display device in a format-filling manner when the observer is at a predefined reference distance. 
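The resolution-based limits just described amount to clamping the digital zoom between the format-filling factor and a maximum useful factor. A minimal sketch, with pixel sizes and the maximum factor chosen as example values (the factor 4 echoes the 1:4 example given later in the text):

```python
def clamp_zoom(zoom_d, monitor_size, image_size, zoom_d_max=4.0):
    """Keep the digital zoom between the format-filling factor (shrinking
    further wastes display area) and a maximum useful factor (zooming
    further adds no information). Sizes are (width, height) in pixels."""
    zoom_d_fit = min(monitor_size[0] / image_size[0],
                     monitor_size[1] / image_size[1])
    return max(zoom_d_fit, min(zoom_d, zoom_d_max))
```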
According to one development of the method, the control of the display of the image on the display device is based not only on the movement of the head and/or the eyes but also additionally on inputs by the observer made by means of hand movements, eyelid movements, voice or the like. By means of such additional inputs, the observer can determine for example whether his head and/or eye movements should or should not have a controlling function. If a series of successive images is displayed on the display device, a new image is preferably displayed with the display parameters of the previous image, i.e. a set amplification is retained. The invention furthermore relates to a system for controlling the display of an image on a display device, which system comprises the following components: a) a measurement device for detecting a movement of the head and/or the eyes of an observer; b) an image generation unit which is coupled to the measurement device and is designed to generate a display of the image on a display device, wherein the display is changed as a function of a movement of the head and/or the eyes of an observer that is detected by the measurement device, such that a natural optical effect associated with the movement is amplified for the observer. A method of the type described above can be carried out using said system.
For a further explanation regarding the details, advantages and variant embodiments of the system, reference should therefore be made in particular to the above description. The measurement device may be any device which allows the desired detection of a head or eye movement. Advantageously, this detection should take place in a contactless manner in order that it bothers the observer as little as possible. A measurement using ultrasound echoes or light barriers is for example possible. The measurement device preferably comprises at least one camera for recording the head of an observer. Using suitable image analysis software, the head position and/or eye position can then be deduced from the recordings taken by the camera. If two cameras are used, use may be made of stereoscopic methods which are known for this purpose. The image generation unit of the system is preferably designed to display the image on a monitor. The image is in this case typically in digitized form, so that the image generation unit may use a microprocessor or the like to perform its processing tasks.
The invention will be further described with reference to examples of embodiments shown in the drawings to which, however, the invention is not restricted. Fig. 1 schematically shows a system according to the invention for controlling the display of an image on a display device. Fig. 2 shows the display of the image on the display device of Fig. 1 when the observer is located relatively close to the display device. Fig. 3 shows the display of the image on the display device of Fig. 1 when the observer is located further away from the display device. Fig. 4 shows a schematic diagram of the geometric relations for calculating zoom factors. Fig. 5 shows a schematic diagram of the geometric relations for calculating an image section shift. Fig. 6 shows the dependence of the image section shift on the deviation of the head position from the reference position.
Fig. 1 shows a system according to the invention for controlling the display of an image 10 on a display device 9 which may for example be a flat screen. The display device 9 is actuated in a known manner by the graphics card 8 of a data processing device 4. In order to describe the geometric relations, use is made of an x,y,z coordinate system which by definition lies with its origin in the area (e.g. in the center) of the display device 9, the x and y axes running respectively horizontally and vertically in the plane of the display area and the z axis pointing perpendicular thereto in the direction of an observer 1. The distance of the head 2 and/or the eyes of the observer 1 from the display device 9 is thus described by the z coordinate. If the observer 1 would like to look more closely at an image displayed statically on the display device 9 or is interested in part-sections of the image, he "instinctively" moves his head 2 closer to the display device 9 or the image section of interest. In accordance with the schematic diagram of Fig. 4, a move closer to the display device 9 leads to an image detail of size H, which initially from the distance z1 was seen at an angle α1, then being seen from the shorter distance z2 at a larger angle α2. The "natural magnification" brought about on the retina of an observer by such a change in distance may be expressed for example by the ratio of the tangent values of the angles α1, α2, which in turn is proportional to the inverse ratio of the distances z1, z2. Based on a reference distance z2 = z0, which typically corresponds to the observation distance that is most comfortable for the observer (about 40 to 60 cm), a natural zoom factor may thus be defined as zoomn(z) = z0 / z (1) Control of the image display in the case of the system of Fig. 
1 is based on the principle that the natural change in the image on the retina of the observer that is associated with a movement of the head 2 is aided by a change in the image display on the display device 9 that acts in the manner of a servo mechanism in the same direction. For this purpose, the system comprises at least one camera. In the example shown it comprises two digital cameras 3 (e.g. webcams) which are fitted at known positions in the vicinity of the monitor 9 and are directed at the observer 1. The recordings taken by the cameras 3 are passed to the data processing device 4 and processed there in a software module 5. The position (xh, yh, z) of the head 2 and/or the eyes is determined from the image information by following the eye movement and/or segmenting the face. Preferably, the position signals thus obtained are subjected to low-pass filtering in order to eliminate artefacts such as small trembling movements for example. Face recognition could optionally also be performed by means of the cameras in order, for example, to automatically select individual display parameters for various users from a stored list. In a subsequent module 6, a digital zoom factor zoomd(z) is then calculated from the determined distance z of the head 2 from the display device 9. In a further module 7, the center point coordinates pxc(xh,z) and pyc(yh,z), based on the image 10, of an image section that is to be displayed in a format-filling manner on the display device 9 are calculated. Details regarding the calculations in the modules 6 and 7 will be discussed later in connection with Figs. 4 to 6. Finally, the hardware module 8, as already mentioned, controls the monitor 9 such that the image 10 is displayed in accordance with the parameters calculated in the modules 6 and 7. 
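The low-pass filtering mentioned for software module 5 is not specified in the text; one simple possibility is an exponential smoothing of the position signal, sketched here with an assumed smoothing factor:

```python
class PositionFilter:
    """Exponential low-pass filter for the tracked head position, a
    stand-in for the unspecified filtering in software module 5."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha   # 0 < alpha <= 1; smaller = stronger smoothing
        self.state = None

    def update(self, pos):
        """Feed a new (x, y, z) measurement, return the filtered position."""
        if self.state is None:
            self.state = tuple(pos)
        else:
            self.state = tuple(s + self.alpha * (p - s)
                               for s, p in zip(self.state, pos))
        return self.state
```

Such a filter suppresses small trembling movements while still following deliberate head motion with only a short delay.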
According to the characteristic schematic diagram of the system, the image 10 is displayed on the display device 9 in a magnified manner when the observer 1 moves closer to the display device 9 (Fig. 2). In this case, there is preferably defined a minimum distance zmin from the display device 9 at which a maximum magnification zoomd_max of the image 10 is achieved. This may typically be 1:4, that is to say one image pixel is represented by four pixels of the monitor 9. Furthermore, the minimum distance zmin is preferably selected such that it corresponds to the shortest observation distance based on the strength of vision of the observer 1 (typically about 25 to 35 cm). Fig. 3 shows the display of the image 10 when the observer 1 is further away from the display device 9 with respect to the reference position z0. The image 10 is in this case shown smaller in size, with a limit zoomd_fit of the reduction in size preferably being reached at a maximum distance zmax which the user can still comfortably reach and which is typically about 70 to 100 cm. This limit is typically defined by the entire image fitting precisely on the area of the monitor 9. In the event of a further reduction in size of the image, the display area of the monitor is "thrown away" without additional image contents becoming visible to the observer, and this is generally not sensible. The situation may be different, however, in the case of X-ray images, in which for example large shaded areas with low contrast appear better in reduced-size images. Herein below, a mathematical dependence of the digital image magnification zoomd on the distance z of the observer 1 from the display device 9 and the shift in the image 10 on the display area will be explained by way of example in more detail with the aid of Figs. 4 to 6. First, the control of the magnification will be considered with reference to Fig. 4. 
The total zoom factor zoomt(z) resulting for an observer 1 at a distance z is described by the product of the natural zoom factor zoomn(z) and the digital zoom factor zoomd(z): zoomt(z) = zoomn(z) · zoomd(z) (2) As shown in equation (1), the natural zoom factor zoomn(z) based on the reference distance z0 corresponds to the inverse ratio of the observation distances. The digital zoom zoomd(z) is then defined as a function of the distance z as follows: zoomd(z) = zoomd_0 · (z0 / z)^p (3a) Here, zoomd_0 is a reference factor that is to be predefined, which describes the digital zoom at the reference distance z0. The parameter p is an amplification exponent which in principle can be freely selected. Depending on the choice of p, different display control behaviors are produced:
p < 0: The digital zoom factor zoomd increases as the distance of the observer increases in the sense of an anti-natural zoom or a "negative amplification" of the natural zoom; the further the observer moves away from the display device, the larger the image is displayed on the latter.
p = -1: A detail of the image on the display device 9 is seen by the observer, regardless of the distance z of the latter, always at the same viewing angle, that is to say the natural zoom is precisely compensated. This case corresponds to the prior art in accordance with WO 02/093483 A1.
p < -1: A detail of the image is seen by the observer at an increasingly large viewing angle as the distance z of the observer from the display device 9 increases.
p = 0: In accordance with the conventional prior art, there is no change in the digital zoom as a function of the distance of the user.
p > 0: The digital zoom factor zoomd decreases as the observation distance z increases. This corresponds to a positive amplification of the natural zoom and a behavior in the sense of a servo function. 
p = 1: The digital zoom factor zoomd behaves the same as the natural zoom factor zoomn, so that the total zoom zoomt resulting for the observer is twice as great as normal.
p > 1: Even greater positive amplification of the natural zoom than in the case of p = 1.
When calculating the digital zoom factor in accordance with equation (3a), the limit condition that zoomd(z) < zoomd_max (3b) is advantageously observed, where zoomd_max as mentioned is a predefined maximum digital zoom factor which is achieved at the minimum observation distance zmin. By combining equations (1), (2) and (3a), the total zoom factor for the observer is: zoomt(z) = zoomd_0 · (z0 / z)^(p+1) (4) The digital zoom factor zoomd_fit, at which the entire image fits precisely on the monitor 9, may be calculated in accordance with the following formula: zoomd_fit = MIN[monitor_size_x / image_size_x; monitor_size_y / image_size_y] (5) Here, monitor_size_x is the width of the monitor 9 and image_size_x is the width of the image 10 in the x direction, and monitor_size_y is the height of the monitor 9 and image_size_y is the height of the image 10 in the y direction (e.g. in each case measured in pixels). The reference factor zoomd_0 used in equations (3a) and (4) may in particular be equal to the value zoomd_fit defined above, so that at the reference distance z0 the entire image is displayed on the monitor 9 in a precisely format-filling manner. Alternatively, the reference factor zoomd_0 may also be determined as the "true size" of the image as zoomd_0 = 1, so that at the reference distance z0 each image pixel corresponds precisely to one pixel of the monitor 9. The parameters that are to be predefined for the above calculations may be adapted to the respective observer 1. For instance, by way of example the limits zmin and zmax of the observation distance and also the reference distance z0 can be predefined individually. 
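Equations (1) to (5), together with the exponent determination of equation (6) given below, can be sketched as follows; the default distances (in metres) and factors are example values, not prescribed by the text:

```python
import math

def zoom_n(z, z0=0.5):
    """Natural zoom factor, equation (1): zoom_n(z) = z0 / z."""
    return z0 / z

def zoom_d(z, z0=0.5, zoom_d0=1.0, p=1.0, zoom_d_max=4.0):
    """Digital zoom factor, equation (3a), with the limit condition (3b)."""
    return min(zoom_d0 * (z0 / z) ** p, zoom_d_max)

def zoom_t(z, z0=0.5, zoom_d0=1.0, p=1.0):
    """Total zoom factor, equation (4) (clamping of (3b) not applied)."""
    return zoom_d0 * (z0 / z) ** (p + 1)

def zoom_d_fit(monitor_size, image_size):
    """Equation (5): largest zoom at which the whole image fits the monitor.
    Sizes are (width, height) in pixels."""
    return min(monitor_size[0] / image_size[0],
               monitor_size[1] / image_size[1])

def amplification_exponent(zoom_d_max, zoom_d0, z0, z_min):
    """Equation (6): the p for which zoom_d(z_min) equals zoom_d_max."""
    return math.log(zoom_d_max / zoom_d0) / math.log(z0 / z_min)
```

With p = 1 and zoomd_0 = 1, halving the observation distance doubles the digital zoom and quadruples the total zoom, illustrating the servo-like positive amplification.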
The average for z0 is about 30 cm (somewhat less for younger people and somewhat greater for older people), and for short-sighted people without any vision aid the shortest distance may be as low as zmin = 10 cm. The maximum digital zoom factor zoomd_max should be around 4, since further zooming does not provide any additional information. However, in special cases in which the system is conceived for example as an aid for vision-impaired people, much higher amplification factors may also be used. For the determination of an optimal amplification exponent p, the preferred maximum digital zoom factor zoomd_max at the distance zmin and the preferred reference factor zoomd_0 at the reference distance z0 are to be predefined. Placing these variables in equation (3a) (for z = zmin) and solving for p then gives the equation p = ln(zoomd_max / zoomd_0) / ln(z0 / zmin) (6) Herein below, a mathematical description for a possible control of the image shift ("pan") will be derived. In this respect, Fig. 5 shows the image 10 that is to be displayed and the definition of a few important variables. The (digital) image is in this case rectangular with a width XSIZE and a height YSIZE, which may be measured for example in number of pixels or rows/columns. In the center point of the image 10 there is by definition the origin of an x*,y* coordinate system (this is based on the image 10, unlike the spatially fixed x,y,z coordinate system of Fig. 1 which was based on the monitor 9). Of the image 10, in each case only an image section S can be displayed on the display device 9, the size of which section depends on the current digital zoom factor zoomd(z) or the observation distance z. The width of this displayable section S is px(z) and the height is py(z); the center point of the section S has in the x*,y* system the coordinates (pxc, pyc). Furthermore, as shown in Fig. 
1, the current position of the head 2 of an observer 1 in relation to the x,y,z system is described by the coordinates (xh, yh, z), the preferred reference position of the head being given by (xh0 = 0, yh0 = 0, z = z0). Furthermore, the maximum horizontal and vertical head movements which are intended to lead to a control are defined by the variables xh_max and yh_max. Where appropriate, different limits may be defined for the head movements to the left and to the right and up and down. Under these conditions, the center point (pxc, pyc) of the image section S displayed on the display device 9 will be defined as a function of the horizontal/vertical head position xh, yh as described below. First, for the case where the current zoom factor is zoomd(z) < zoomd_fit (i.e. the entire image 10 fits on the display area 9), pxc = 0 and pyc = 0 are selected. The section S then corresponds to the entire image 10 and is always displayed in a centered manner on the monitor. If the current zoom factor is zoomd(z) > zoomd_fit, that is to say only a section S of the image 10 fits on the monitor, the width px(z) of the section S is calculated in accordance with the formula px(z) = XSIZE · zoomd_fit / zoomd(z). (7) As the observation distance z decreases, the width px(z) likewise decreases, that is to say the nearer the observer 1 is to the display device 9, the smaller the image section S that can be displayed in a format-filling manner on the display area. The position or shift of the image section S is advantageously limited by the requirement that the entire section S should still lie within the area of the image 10, since a shift that goes beyond the edge of the image 10 would bring zones without any image information onto the monitor 9. 
Based on this requirement, the maximum values pxcmax and pycmax for the center point coordinates pxc and pyc of the section S are calculated in accordance with the formula: pxcmax(z) = (XSIZE - px(z)) / 2 (8) pycmax(z) = (YSIZE - py(z)) / 2 The position of the center point coordinates of the section S is thus defined as a function of the head position (xh, yh, z) as follows: pxc(z) = Cx · (xh / xh_max)^qx, with |pxc(z)| < pxcmax(z) (9) pyc(z) = Cy · (yh / yh_max)^qy, with |pyc(z)| < pycmax(z). Here, Cx, Cy, qx and qy are freely definable factors which can be used to define a desired shift behavior. In the present case, a "positive amplification" of the natural image shift (which takes place on the retina) in the event of a head movement by the observer is of particular interest, i.e. the case Cx, Cy > 0 and qx, qy > 0. Fig. 6 shows a graph of the center point coordinate pxc in the x direction as a function of the deviation xh of the head position from the position of rest, based on the constants Cx = XSIZE/2, Cy = YSIZE/2 and qx = qy = 1. A control method based on the above formulae is highly robust, that is to say the desired control tends to be achieved even if the required variables (in particular the head coordinates xh, yh, z) can be determined only relatively imprecisely. In the example of embodiment described, the digital zoom factor zoomd and the choice of the section S for the image display are selected as a function of the head position of the observer 1. In addition or as an alternative to this "position-to-position relation", a "position-to-speed relation" could also be implemented. In this case, a reference region or tolerance region, for example a cube having an edge length of 10 cm, may be defined around a preferred reference position (xh0, yh0, z0). 
By definition, given a head position within the reference region, no change in the image display will take place, whereas a movement of the head out of the reference region would lead to a change in the image display as follows:
- A movement toward the display device 9 makes the zoom greater, with the digital zoom factor increasing monotonously with the difference |z - z0| over time.
- A movement away from the display device makes the zoom smaller, with the digital zoom factor decreasing monotonously with the difference |z - z0| over time.
- A movement in the direction of positive values of xh brings about a shift in the image which depends monotonously on the difference |xh - xh0|. The same applies in respect of the movement in the direction of negative xh values and with respect to the y direction.
With only small relative head movements, it is thus possible for any desired combinations of the zoom and shift parameters to be set. When a new satisfactory parameter combination has been achieved, the observer need only return to the reference region and the image display will again be frozen in that state. According to other developments of the invention, a blink of the eyes or preferably a double blink could be detected by a measurement device and used as a triggering event in a manner comparable to the pressing of mouse buttons. Furthermore, the procedure described may of course also be used for zooming and shifting in three-dimensional images.
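The pan control of equations (7) to (9), including the clamping of the section to the image edge, can be sketched as follows. The head-movement limits (in metres) are assumed values, and Cx = XSIZE/2, Cy = YSIZE/2 with qx = qy = 1 follow the Fig. 6 example, which is only one possible choice:

```python
def section_center(xh, yh, zoom_d_z, zoom_d_fit, XSIZE, YSIZE,
                   xh_max=0.3, yh_max=0.2, qx=1.0, qy=1.0):
    """Center (pxc, pyc) of the displayed section S per equations (7)-(9).

    zoom_d_z is the current digital zoom, zoom_d_fit the format-filling
    zoom. Fractional exponents qx, qy would need a sign-preserving power;
    qx = qy = 1 is used here as in the Fig. 6 example.
    """
    if zoom_d_z <= zoom_d_fit:             # whole image fits: keep centered
        return 0.0, 0.0
    px = XSIZE * zoom_d_fit / zoom_d_z     # eq. (7): displayable width
    py = YSIZE * zoom_d_fit / zoom_d_z
    pxc_max = (XSIZE - px) / 2             # eq. (8): keep S inside image
    pyc_max = (YSIZE - py) / 2
    pxc = (XSIZE / 2) * (xh / xh_max) ** qx    # eq. (9), Cx = XSIZE/2
    pyc = (YSIZE / 2) * (yh / yh_max) ** qy    # eq. (9), Cy = YSIZE/2
    pxc = max(-pxc_max, min(pxc, pxc_max))     # clamp |pxc| to pxc_max
    pyc = max(-pyc_max, min(pyc, pyc_max))
    return pxc, pyc
```

A head offset at the movement limit thus pans the section to the image edge but never beyond it, so no empty zones appear on the monitor.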

CLAIMS:
1. A method of controlling the display of an image ( 10) on a display device (9), wherein the display is changed as a function of a controlling movement of the head (2) and/or the eyes of an observer (1) such that a natural optical effect associated with the movement is amplified for the observer.
2. A method as claimed in claim 1, characterized in that the amplification of the natural optical effect is produced by a change in the magnification, a shift, a rotation, a distortion and/or a change in the brightness of the image (10) on the display device (9).
3. A method as claimed in claim 1 , characterized in that a movement of the head
(2) and/or the eyes of the observer (1) is controlling only if an additional condition that can be influenced by the observer is satisfied.
4. A method as claimed in claim 3, wherein the additional condition is satisfied if the change in position resulting from a movement of the head (2) and/or the eyes, and/or the movement speed, exceeds a predefined threshold value.
5. A method as claimed in claim 3, wherein the additional condition is satisfied if the head (2) and/or the eyes leave a predefined reference region during the movement.
6. A method as claimed in claim 1, characterized in that the display of the image (10) depends in functional terms on the spatial position of the head (2) and/or the eyes of the observer (1).
7. A method as claimed in claim 1, characterized in that the magnification of the image (10) is varied between a minimum and a maximum magnification when the position of the head (2) and/or the eyes of the observer (1) varies between a maximum (zmax) and a minimum (zmin) distance from the display device (9).
8. A method as claimed in claim 1, characterized in that the relation ctg(α) ∝ z · f(z) exists between the viewing angle α at which an observer (1) sees a certain image detail (H) and the distance z of the observer from the display device (9), where f is a monotonic function with f(z) > 0 for z > 0.
9. A method as claimed in claim 1, characterized in that the magnification and/or shift in the image (10) during displaying thereof on the display device (9) is limited by limit values which depend on the resolutions of the image (10) and of the display device (9).
10. A method as claimed in claim 1, characterized in that the image (10) is digitized and is displayed in a magnification in which each image pixel is represented by precisely one pixel of the display device (9) or in which the image (10) is displayed in a format-filling manner on the display device (9) if the head (2) and/or the eyes of the observer (1) are located at a predefined reference distance (z0) from the display device (9).
11. A method as claimed in claim 1, characterized in that the control of the display is additionally based on inputs by the observer made by means of hand movements, eye movements, eyelid movements and/or voice.
12. A system for controlling the display of an image (10) on a display device (9), comprising a) a measurement device (3) for detecting a movement of the head (2) and/or the eyes of an observer (1); b) an image generation unit (4) which is coupled to the measurement device and is designed to generate a display of the image (10) on a display device (9), wherein the display is changed as a function of a controlling movement of the head (2) and/or the eyes of an observer (1) such that a natural optical effect associated with the movement is amplified for the observer.
13. A system as claimed in claim 12, characterized in that the measurement device comprises at least one camera (3) for recording the head (2) of an observer (1).
14. A system as claimed in claim 12, characterized in that the image generation unit (4) is designed to display the image (10) on a monitor (9).
15. A system as claimed in claim 12, which is designed to carry out a method as claimed in at least one of claims 1 to 11.
PCT/IB2004/051271 2003-07-29 2004-07-21 System and method for controlling the display of an image WO2005010739A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP03102337.7 2003-07-29
EP03102337 2003-07-29

Publications (1)

Publication Number Publication Date
WO2005010739A1 true WO2005010739A1 (en) 2005-02-03

Family

ID=34089709

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2004/051271 WO2005010739A1 (en) 2003-07-29 2004-07-21 System and method for controlling the display of an image

Country Status (1)

Country Link
WO (1) WO2005010739A1 (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007138394A1 (en) 2006-05-31 2007-12-06 Sony Ericsson Mobile Communications Ab Display based on eye information
WO2007138510A1 (en) * 2006-05-31 2007-12-06 Koninklijke Philips Electronics N.V. Controlling a viewing parameter
EP2065795A1 (en) * 2007-11-30 2009-06-03 Koninklijke KPN N.V. Auto zoom display system and method
EP2116919A1 (en) * 2008-05-09 2009-11-11 MBDA UK Limited display of 3-dimensional objects
US7835498B2 (en) 2005-02-18 2010-11-16 Koninklijke Philips Electronics N. V. Automatic control of a medical device
WO2011003303A1 (en) * 2009-07-10 2011-01-13 Peking University Image manipulation based on tracked eye movement
CN102404584A (en) * 2010-09-13 2012-04-04 腾讯科技(成都)有限公司 Method and device for adjusting scene left camera and scene right camera, three dimensional (3D) glasses and client side
WO2013036236A1 (en) 2011-09-08 2013-03-14 Intel Corporation Interactive screen viewing
WO2013190420A1 (en) 2012-06-19 2013-12-27 Koninklijke Philips N.V. Medical imaging display arrangement
EP2194445A3 (en) * 2008-12-05 2014-03-05 Sony Mobile Communications Japan, Inc. Terminal apparatus, display control method, and display control program
JP2018147455A (en) * 2017-03-07 2018-09-20 シャープ株式会社 Display device, television receiver, display control method, display control program, control apparatus, control method, control program, and recording medium
WO2018186007A1 (en) * 2017-04-05 2018-10-11 シャープ株式会社 Display device, television receiver, display control method, display control program, and recording medium
EP3836539A1 (en) * 2007-10-10 2021-06-16 Gerard Dirk Smits Image projector with reflected light tracking
US11137497B2 (en) 2014-08-11 2021-10-05 Gerard Dirk Smits Three-dimensional triangulation and time-of-flight based tracking systems and methods
US11372320B2 (en) 2020-02-27 2022-06-28 Gerard Dirk Smits High resolution scanning of remote objects with fast sweeping laser beams and signal recovery by twitchy pixel array
US11709236B2 (en) 2016-12-27 2023-07-25 Samsung Semiconductor, Inc. Systems and methods for machine perception
US11714170B2 (en) 2015-12-18 2023-08-01 Samsung Semiconuctor, Inc. Real time position sensing of objects
US12025807B2 (en) 2018-04-13 2024-07-02 Gerard Dirk Smits System and method for 3-D projection and enhancements for interactivity

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5900869A (en) * 1994-07-06 1999-05-04 Minolta Co., Ltd. Information processor system allowing multi-user editing
WO1999035633A2 (en) * 1998-01-06 1999-07-15 The Video Mouse Group Human motion following computer mouse and game controller
US20010040572A1 (en) * 1998-12-01 2001-11-15 Gary R. Bradski Computer vision control variable transformation
US20020001397A1 (en) * 1997-10-30 2002-01-03 Takatoshi Ishikawa Screen image observing device and method
WO2002037412A1 (en) * 2000-10-31 2002-05-10 Malvern Scientific Solutions Limited Method and apparatus for monitoring a target

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7835498B2 (en) 2005-02-18 2010-11-16 Koninklijke Philips Electronics N. V. Automatic control of a medical device
US8085902B2 (en) 2005-02-18 2011-12-27 Koninklijke Philips Electronics N V Automatic control of a medical device
WO2007138510A1 (en) * 2006-05-31 2007-12-06 Koninklijke Philips Electronics N.V. Controlling a viewing parameter
EP2021899A1 (en) * 2006-05-31 2009-02-11 Sony Ericsson Mobile Communications AB Display based on eye information
WO2007138394A1 (en) 2006-05-31 2007-12-06 Sony Ericsson Mobile Communications Ab Display based on eye information
EP3836539A1 (en) * 2007-10-10 2021-06-16 Gerard Dirk Smits Image projector with reflected light tracking
EP2065795A1 (en) * 2007-11-30 2009-06-03 Koninklijke KPN N.V. Auto zoom display system and method
EP2116919A1 (en) * 2008-05-09 2009-11-11 MBDA UK Limited Display of 3-dimensional objects
EP2194445A3 (en) * 2008-12-05 2014-03-05 Sony Mobile Communications Japan, Inc. Terminal apparatus, display control method, and display control program
US9081414B2 (en) 2009-07-10 2015-07-14 Peking University Image manipulation based on tracked eye movement
US8564533B2 (en) 2009-07-10 2013-10-22 Peking University Image manipulation based on tracked eye movement
WO2011003303A1 (en) * 2009-07-10 2011-01-13 Peking University Image manipulation based on tracked eye movement
US8797261B2 (en) 2009-07-10 2014-08-05 Peking University Image manipulation based on tracked eye movement
CN102404584A (en) * 2010-09-13 2012-04-04 腾讯科技(成都)有限公司 Method and device for adjusting scene left camera and scene right camera, three dimensional (3D) glasses and client side
CN102404584B (en) * 2010-09-13 2014-05-07 腾讯科技(成都)有限公司 Method and device for adjusting scene left camera and scene right camera, three dimensional (3D) glasses and client side
EP2754028A4 (en) * 2011-09-08 2015-08-12 Intel Corp Interactive screen viewing
US9361718B2 (en) 2011-09-08 2016-06-07 Intel Corporation Interactive screen viewing
WO2013036236A1 (en) 2011-09-08 2013-03-14 Intel Corporation Interactive screen viewing
WO2013190420A1 (en) 2012-06-19 2013-12-27 Koninklijke Philips N.V. Medical imaging display arrangement
US11137497B2 (en) 2014-08-11 2021-10-05 Gerard Dirk Smits Three-dimensional triangulation and time-of-flight based tracking systems and methods
US11714170B2 (en) 2015-12-18 2023-08-01 Samsung Semiconductor, Inc. Real time position sensing of objects
US11709236B2 (en) 2016-12-27 2023-07-25 Samsung Semiconductor, Inc. Systems and methods for machine perception
JP2018147455A (en) * 2017-03-07 2018-09-20 シャープ株式会社 Display device, television receiver, display control method, display control program, control apparatus, control method, control program, and recording medium
WO2018186007A1 (en) * 2017-04-05 2018-10-11 シャープ株式会社 Display device, television receiver, display control method, display control program, and recording medium
JP2018180684A (en) * 2017-04-05 2018-11-15 シャープ株式会社 Display device, television receiver, display control method, display control program and recording medium
US12025807B2 (en) 2018-04-13 2024-07-02 Gerard Dirk Smits System and method for 3-D projection and enhancements for interactivity
US11372320B2 (en) 2020-02-27 2022-06-28 Gerard Dirk Smits High resolution scanning of remote objects with fast sweeping laser beams and signal recovery by twitchy pixel array
US11829059B2 (en) 2020-02-27 2023-11-28 Gerard Dirk Smits High resolution scanning of remote objects with fast sweeping laser beams and signal recovery by twitchy pixel array

Similar Documents

Publication Publication Date Title
WO2005010739A1 (en) System and method for controlling the display of an image
US10182720B2 (en) System and method for interacting with and analyzing media on a display using eye gaze tracking
JP3565707B2 (en) Observer tracking autostereoscopic display device, image tracking system, and image tracking method
EP3608755B1 (en) Electronic apparatus operated by head movement and operation method thereof
CN107632709B (en) Display system and method
US6717728B2 (en) System and method for visualization of stereo and multi aspect images
US6456262B1 (en) Microdisplay with eye gaze detection
CN107515474B (en) Automatic stereo display method and device and stereo display equipment
KR20110028760A (en) Automatic mirror adjustment system and method thereof in vehicle
Toet Gaze directed displays as an enabling technology for attention aware systems
CN110300994A (en) Image processing apparatus, image processing method and picture system
WO2018086399A1 (en) Image rendering method and apparatus, and vr device
JP5016959B2 (en) Visibility determination device
CN111309141B (en) Screen estimation
Badgujar et al. Driver gaze tracking and eyes off the road detection
US20210037225A1 (en) Method of modifying an image on a computational device
CN113568595A (en) ToF camera-based display assembly control method, device, equipment and medium
US6130672A (en) Image processing apparatus
US20200285308A1 (en) A method of modifying an image on a computational device
CN115202475A (en) Display method, display device, electronic equipment and computer-readable storage medium
CN111857461B (en) Image display method and device, electronic equipment and readable storage medium
CN113132642A (en) Image display method and device and electronic equipment
JP2014102668A (en) Display device
JP3325323B2 (en) Display device
JP2005071041A (en) Device and system for detecting visual object of driver

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: The EPO has been informed by WIPO that EP was designated in this application
122 Ep: PCT application non-entry in European phase