WO2000075860A1 - Electronic writing/display apparatus and method of operation - Google Patents


Info

Publication number
WO2000075860A1
Authority
WO
WIPO (PCT)
Prior art keywords
picture elements
location
active point
reference system
active
Prior art date
Application number
PCT/EP1999/003921
Other languages
English (en)
Inventor
Roberto Battiti
Alessandro Garofalo
Original Assignee
Soffix, S.R.L.
Priority date
Filing date
Publication date
Application filed by Soffix, S.R.L. filed Critical Soffix, S.R.L.
Priority to PCT/EP1999/003921
Publication of WO2000075860A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0428 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by sensing at the edges of the touch surface the interruption of optical paths, e.g. an illumination plane, parallel to the touch surface which may be virtual

Definitions

  • the present invention relates to electronic writing/display apparatus.
  • Different modes of interacting and communicating are converging toward integrated systems, with the provision of apparatus for the presentation of information (monitor and display screens, overhead projectors, computer controlled video projectors, loudspeakers) and for the interaction between the participants and the system (keyboard and mouse interfaces, video cameras, laser and infra-red pens, surface-contact detectors, microphones, etc.).
  • the transmission and management of information exchanged between the various subsystems of the integrated system is often handled by a computer, connected to the different apparatuses and capable of controlling them.
  • US-A-5 274 362 describes a method of determining horizontal X and vertical Y coordinates in an electronic blackboard system having two electronically conductive surfaces, at least one of which is flexible enough to permit contact between the first and the second surface, with electrodes arranged on the two surfaces at right angles to each other. Contact between the first and the second surface is determined by applying currents and measuring voltage differences at selected electrodes on the two surfaces. In addition, a separate self-calibration mode is activated if no contact is determined by the usual mechanism of the system.
  • US-A-4 803 564 describes an electro-mechanical system to read information written on an existing writing board that may be attached to a wall. The system is comprised of a photoelectric conversion device mounted on a driving unit moving horizontally. The driving unit moves on a guide rail secured to angle irons fixed to the information writing board.
  • US-A-5 305 114 describes apparatus which reads a writing on a writing sheet fitted on a blackboard framework and which electronically copies the writing onto a recording medium.
  • the apparatus includes a foldable framework, the writing sheet, a stand to support the framework and a copying device.
  • US-A-4 858 021 describes a photoelectric device supported by a horizontal guide rail which is in turn supported by the blackboard. A drive motor and roller drive the device by direct contact of the roller with the blackboard surface.
  • US-A-3 761 620 represents an example of a graphical input device, where a two-dimensional matrix of semiconductors is arranged in an ordered array as a flat light emitting and light sensing device. A penlight is used to activate the light sensing semiconductors to achieve a graphical input.
  • EP-A-0 372 467 describes an electrostatic blackboard with an image display function for displaying a visible toner image, including a copying apparatus for transferring the visible image on the blackboard to a large-size paper sheet.
  • WO-A-97/06963 describes a writing board with a mechanism to erase what has been written by writing instruments using an ink. The erasing mechanism operates an erase switch to convey the recording medium and actuate a pump. The cleaning liquid in a storage container is circulated through the cleaner via tubes attached to the surface of the recording medium being conveyed.
  • a first drawback is related to the "bothersome" presence of technology.
  • the interaction tools tend to divert the user's concentration from the purposes of use (interaction, information exchange, knowledge enrichment) toward the technological means.
  • in some cases a training phase is required; in other cases a third person (a so-called director) is required to manage the proper acquisition and presentation of multimedia information.
  • Installation and calibration may also represent a source of concern. Quite frequently, installation and calibration by qualified technical personnel is required before use of the system by final users is made possible. This is the case when large and complex systems are installed, including boards to be mounted with high accuracy, high-precision lenses, etc. In other cases, systems based on traditional computer vision techniques must be calibrated through the precise knowledge of the relative positions and orientations of the visual sensor and display units.
  • Fragility of the devices involved and the necessity of a controlled environment sometimes need to be taken into account.
  • the devices involved are quite sensitive (for example, some optical pens have a high probability of being damaged if dropped by accident).
  • the use of some electronic blackboards is made difficult because of the constraints on the physical environment, for example they need a large entrance door, or a large space for their installation.
  • the present invention aims at solving the problems outlined in the foregoing by means of a system having the features set forth in the annexed claims.
  • the invention also relates to the respective method of operation.
  • the system of the invention is comprised of two optical sensors, a projector and a computer connected to the projector and to the optical sensors.
  • the surface in question may be simply represented e.g. by a screen such as a traditional projection screen or simply a wall surface already available at the location where the system of the invention is used.
  • the tracer element may simply be e.g. a stick, a pen (including a light pen) already available to the user of the system, or even just the finger of the user himself or herself.
  • the optical sensors are arranged in order to monitor that part of the surface (in the following referred to as the active surface) on which the image is projected by the projector.
  • the computer is preferably provided with hardware/software modules for self-calibration, acquisition and processing of the images from the optical sensors, and for controlling the projector.
  • interaction with the system occurs through the movement of the operator's hand.
  • a pointed extremity, for example a finger of the hand, representing an image distinguishable over the background of the active surface, touches or approaches that surface.
  • the coordinates Px and Py of the contact point are calculated in a continuous manner by the system comprised of the optical sensors and the computer, and thus made available to be used as input data for the different applications installed on the computer.
  • the coordinates of the active writing point can be stored into a random-access memory (RAM) of the computer.
  • some specific active zones of the active surface can be associated with commands to be executed when these zones are touched: in this case the finger of the hand can be used as a substitute for a peripheral such as a mouse used for selection in a computer-controlled system.
  • the invention thus substantially mitigates the disadvantages outlined in the foregoing.
  • the intrusive presence of technology is dispensed with, whilst interaction of the user with the system becomes more natural.
  • Specific technical competence and training are no longer required: the user simply activates the system and initiates interaction therewith with the capability of exclusively focusing onto the training/information purposes of the session and not onto the technical means.
  • the installation phase is simplified (placement of the projector and the sensors associated therewith as well as connection to the computer, without the need of precise measurements) and calibration is completely automated.
  • the apparatus of the invention is not overly sensitive and can be used in quite diverse environments.
  • a surface (not necessarily a flat one)
  • the sensors that monitor that surface in order to detect the location of the tracer member.
  • the flexibility of use of the invention is high.
  • the same apparatus can be used on a desk, with the active surface being a part of the desk surface or an electronic drawing board, through a traditional white board, through a foldable panel for the projection or, simply, through a piece of smooth but not necessarily flat wall surface.
  • use with a display unit is possible, to obtain a way of operation similar to touch-screen operation based on electromagnetic principles. The portability is enhanced by the low weight of the single components.
  • - figure 1 schematically depicts interaction between the user and a system according to the invention
  • - figure 2 diagrammatically shows the matrix of pixels in a projector for use in the system of the invention
  • - figure 3 including four sections designated 3a to 3d, respectively, shows definition of certain reference systems for use in the invention
  • - figure 4 illustrates location of the active writing point at the active surface in the system of the invention
  • - figure 7 shows determination of the active writing point in the system of the invention
  • - figure 8 schematically shows one way of eliminating interference between motion of the active writing point and motion of the projected image
  • - figure 9 shows an alternative approach for interaction between the user and the system of the invention
  • - figure 10 shows a further alternative approach for interaction between the user and the system
  • Apparatus for electronic writing/display (e.g. a so-called “electronic blackboard”) according to the invention is designated 1 overall in figure 1.
  • Apparatus 1 is intended for interaction with an operator H capable of generating graphical information (i.e. writing and/or drawing symbols/images) on a surface 7 by means of a pointed member F such as e.g. a pen, a light-pen, a stick or simply a finger.
  • Surface 7 is shown herein as being a flat or substantially flat rectangular surface but may be notionally of any shape enabling writing or drawing and display thereon.
  • Apparatus 1 further includes a projector 6 for projecting graphical information onto surface 7 under the control of a processing unit 8 such as a computer.
  • Optical sensors 3 and 5 are arranged in order to "frame" surface 7 and generate respective signals indicative of the position of pointed member F (i.e. the tracer member) as explained in greater detail in the following.
  • sensors 3 and 5 are preferably located at the upper corners of surface 7, still preferably by resorting to pods, brackets or arms 2 and 4 ensuring that sensors 3 and 5 at least slightly protrude towards operator H thus being positioned at a certain distance from surface 7.
  • the image is projected by the optics of the system onto a sensor matrix of dots (picture elements or pixels) composed of elementary surface portions characterized by a single level of intensity (one for each color in the case of color-sensitive sensors).
  • the captioned sensor matrix can be conveniently comprised of a CCD element of the kind currently used in solid-state cameras.
  • the sensed matrix of intensity values is then transmitted in the form of a respective signal to computer 8.
  • projector 6 is controlled by computer 8 in order to define a matrix of intensity values, one for each pixel of the image. These values are then transformed into the projected light beams by the electronic and optical sections of the projector.
  • a simplified illustration of the operation of projector 6 is provided in figure 2. All of the foregoing corresponds to current technology, not requiring detailed description herein.
  • Three reference systems, illustrated in figures 3b to 3d, respectively, are contemplated for use within the framework of the invention.
  • the first system (L - figure 3b) is associated to the pixel matrix of the left optical sensor 3.
  • the second system (R - figure 3d) to the matrix of the right optical sensor 5.
  • the third system (P - figure 3c) is associated to the projection matrix of projector 6.
  • a single pixel in the cited matrices is therefore characterized by two coordinates in the corresponding reference system.
  • a physical point in the three-dimensional space, in the working space visible by both optical sensors 3, 5 and such that it can be illuminated by the light beam originating from projector 6, is therefore associated to three points in the three reference systems described in the foregoing. The physical point will thus be projected by the optical system of the left optical sensor 3 onto a point with coordinates (Lx, Ly) in the reference system L.
  • the same point will have two coordinates (Rx, Ry) in the reference system R.
  • the same point will be identified by two coordinates (Px, Py) in the system P of projector 6.
  • Px and Py are the coordinates of the pixel of the matrix to which a maximum intensity value is to be associated if a light beam originating from the lens of the projector 6 and passing through the given physical point is needed.
  • Basic principles of the invention In the following it will be assumed that the physical point is visible by both optical sensors 3, 5 and that it can be reached by a light beam originated from projector 6 without any obstacles therebetween. This point will be referred to in the following as the active point AP.
  • Each active point in the working space of the invention has associated therewith six numerical values: Rx, Ry, Lx, Ly, Px, Py.
  • the values are calculated - in a known manner - starting from the signals generated by sensors 3 and 5 and stored in a memory area such as a RAM of computer 8.
  • the associations between active points and the values Rx, Ry, Lx, Ly, Px, Py are also subject to small inaccuracies due to the finite precision of the processing steps executed (in a digital manner) in the optical sensors 3 and 5 and/or in computer 8. For example, round-off errors are generated if the six coordinates are represented with integer values (therefore eliminating the digits after the decimal point). Standard techniques are however available for dealing with the inaccuracies/errors thus introduced.
  • the exact intersection of two lines in the three-dimensional space is substituted with the point where the distance between the two lines is at a minimum value (actually the minimum-distance criterion determines two points, one on each line, and then one calculates the point at the middle of the segment connecting the two previous points as the point of approximated intersection).
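The minimum-distance construction described above can be sketched as follows; this is an illustrative pure-Python fragment (the parametric line representation and helper names are mine, not from the patent):

```python
def approx_intersection(a1, d1, a2, d2):
    """Approximate intersection of two 3-D lines, each given by an
    origin point (e.g. a sensor position) and a direction vector.
    For skew lines the exact intersection does not exist, so the
    closest point on each line is found and the middle of the
    connecting segment is returned, as described in the text."""
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    w0 = [a1[i] - a2[i] for i in range(3)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b              # zero only for parallel lines
    t = (b * e - c * d) / denom        # parameter of closest point on line 1
    s = (a * e - b * d) / denom        # parameter of closest point on line 2
    p1 = [a1[i] + t * d1[i] for i in range(3)]
    p2 = [a2[i] + s * d2[i] for i in range(3)]
    return [(p1[i] + p2[i]) / 2.0 for i in range(3)]
```

For two skew lines, for instance one through the origin along x and one through (0, 1, 1) along y, the returned point lies halfway along their common perpendicular.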
  • the invention is based on the determination of the event given by contact between the active point (i.e. the writing point) and the active surface 7 and how the coordinates Px and Py in the pixel matrix of the projector 6 are calculated after the coordinates (Lx, Ly) and (Rx, Ry) of the two images of the writing point are acquired by the two optical sensors 3 and 5.
  • the existence of certain mathematical relationships i.e. functions
  • the system of the invention determines the presence or absence of contact (or close proximity, see below) between the active writing point AP and the active surface 7.
  • That information can thus be made explicit through the definition of certain mathematical functions and through the construction of approximations of these functions starting from a set of examples, i.e. from a set of associations between input and output values, obtained during an initial self-calibration phase.
  • a simplified presentation of the foregoing may be based on the reduction to a simplified bi-dimensional model where the active point of interest is given by a black and pointwise object on a white background and, for the sake of clarity, the horizontal plane passing through the working space will be considered, also assuming that the optical sensors 3 and 5 are uni-dimensional, such that a pixel position is determined only by the coordinate Lx (for the left-hand sensor) and Rx (for the right-hand sensor).
  • the situation is schematically illustrated in figure 4. Once the coordinate Lx is fixed, the possible positions of the active point will be given by a straight line originating from the left-hand optical sensor with a given direction. The specific direction depends on Lx and on the optical system.
  • the reference systems L and R are arranged in such a way that a movement of the active point toward the right will imply an increase of the values Lx and Rx. If, in addition to the coordinate Lx, the coordinate Rx associated with the same active point is also known, the physical location of the point can be determined from the intersection of the two straight lines (consider for example point P1 in figure 4, obtained by the intersection between the continuous straight line originating from L and one of the dashed lines originating from R).
  • CONTACT is implemented by a hardware/software module in computer 8 which, starting from the value of Lx, calculates the value Rx that the point would have if it were in contact with the surface.
  • Given an active point, computer 8 obtains the coordinates Lx and Rx of the point starting from the signals generated by optical sensors 3 and 5.
  • it is thus possible to define a threshold value (defined as the contact threshold) and to determine contact on the basis of a comparison of the difference Delta (between the measured value Rx and the value calculated by CONTACT) with this threshold.
  • the value of the threshold can be determined when the invention is constructed, depending on the precision of the optical sensors (number of pixels) , and depending on the request of the user. It can also be made selectively adjustable in order to render a condition of proximity or close proximity equivalent to contact proper (see also below) .
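A minimal sketch of this contact test in the bi-dimensional model follows. Here CONTACT is realized as a simple piecewise-linear interpolation of (Lx, Rx) pairs measured with the tracer touching the surface; this realization and the threshold value are illustrative assumptions, not prescribed by the patent:

```python
def make_contact_test(calibration, threshold=2.0):
    """calibration: list of (Lx, Rx) pairs recorded while the tracer
    touches the surface.  Returns a predicate is_contact(Lx, Rx)."""
    pts = sorted(calibration)

    def contact(lx):
        # piecewise-linear interpolation of the calibrated Lx -> Rx curve
        for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
            if x0 <= lx <= x1:
                return y0 + (y1 - y0) * (lx - x0) / (x1 - x0)
        return pts[0][1] if lx < pts[0][0] else pts[-1][1]

    def is_contact(lx, rx):
        delta = abs(rx - contact(lx))   # the difference Delta of the text
        return delta < threshold        # contact when Delta is below threshold
    return is_contact
```

Raising the threshold makes a condition of close proximity equivalent to contact proper, as the text notes.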
  • the method just described can be easily generalized to the multidimensional case of the application, therefore considering also the coordinates along the y axes of the pixel matrices of the optical sensors.
  • the functions of interest that calculate the coordinates in the reference system of an optical sensor starting from the coordinates of the other sensor and from the requirement that the active point should be at contact with the surface, are the following:
  • CONTACTLx, CONTACTLy, CONTACTRx, CONTACTRy; the terminal part of the name is a mnemonic label whose meaning is as follows: CONTACTLx calculates the coordinate Lx (hence the label) starting from the coordinates of the right-hand sensor, CONTACTLy calculates the coordinate Ly starting from the coordinates of the right-hand sensor, etc.
  • To determine the presence or absence of contact (or close proximity), the system considers all four optical coordinates (Lx, Ly, Rx, Ry) associated with the active point, calculates one of the functions defined in the foregoing, and compares the difference between the value obtained by the function and the value of the corresponding coordinate with the value of the contact threshold.
  • for example, when the function CONTACTLx is used, the difference calculated by the system is Lx - CONTACTLx(Rx, Ry).
  • one of the possible uses of the system of the invention is to realize a so-called virtual pen, i.e. the user moves a tracer member (e.g. a finger of the hand or a stick with a color different from the background) thus defining an active writing point and a projector projects a light beam in correspondence to the position subsequently identified by the tracer member.
  • the notation indicates which coordinate is calculated (Px or Py) and which are the input coordinates (those of the left optical sensor or those of the right optical sensor) . Let us assume that the system has stored in memory a list of coordinates
  • a machine-learning system can use the above examples to generalize the association between inputs and outputs, in this way producing an approximation of the desired function.
  • the above functions may be realized through a mechanism of machine learning from examples based on the use of neural nets.
  • these functions are constructed in an automated way during a preliminary setup phase .
  • projector 6 projects one after another, under control from computer 8, a series of images comprised e.g. of a single contrasting point against the background of surface 7, e.g. a single black point on a white background.
  • the positions of the black point in the different images are obtained by varying the coordinates Px and Py of the pixel matrix of projector 6 so that they assume values corresponding to a set of points that covers the matrix in a uniform way.
  • if the pixel matrix of the projector (figure 5b) is comprised of 512 x 512 pixels, with values Px and Py ranging from 0 to 511, the values considered for the couples (Px, Py) could be (0,0), (0,16), (0,32), (16, 0), etc.
  • Each image is acquired by the optical sensors 3 and 5 and transmitted to computer 8, which, starting from the pixel matrices acquired by both optical sensors, calculates the coordinates (Lx, Ly) and (Rx, Ry) of the pixel with the lowest intensity in the pixel matrix associated with the left and right sensors, corresponding to the image of the point projected (see figures 5a and 5c).
  • Computer 8 stores the series of data (Px, Py, Lx, Ly, Rx, Ry) e.g. in a RAM area provided therein. After all the images have been projected, these data are used by a hardware/software component to construct the required functions.
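The self-calibration loop just described can be sketched as follows; project_dot and locate_darkest_pixel stand in for the projector and sensor drivers and are assumptions of this sketch, as is the sampling step of 16 over a 512-pixel matrix:

```python
def self_calibrate(project_dot, locate_darkest_pixel, size=512, step=16):
    """Project a single dark dot at grid positions covering the projector
    matrix, locate its image in each sensor, and record the six
    coordinates (Px, Py, Lx, Ly, Rx, Ry) for later function fitting."""
    samples = []
    for px in range(0, size, step):
        for py in range(0, size, step):
            project_dot(px, py)                       # black dot on white
            lx, ly = locate_darkest_pixel("left")     # image in sensor 3
            rx, ry = locate_darkest_pixel("right")    # image in sensor 5
            samples.append((px, py, lx, ly, rx, ry))  # stored e.g. in RAM
    return samples
```

With the default values the grid contains 32 x 32 = 1024 sample points covering the matrix uniformly.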
  • the required functions can be calculated with traditional fitting techniques based on polynomials with a degree sufficiently high in order to ensure that non-linearities in the system are compensated.
  • such a method is known, e.g. from W.H. Press, B.P. Flannery, S.A. Teukolsky and W.T. Vetterling, Numerical Recipes in C, Cambridge University Press, 1988.
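As an illustration of the polynomial-fitting alternative, one of the required functions, e.g. Px as a function of (Lx, Ly), can be approximated by a bivariate polynomial fitted by least squares to the self-calibration samples. Degree 2 is an illustrative choice here; the patent only requires the degree to be high enough to compensate non-linearities:

```python
import numpy as np

def fit_poly2(samples_in, samples_out):
    """Least-squares fit of a bivariate degree-2 polynomial mapping
    (Lx, Ly) sensor coordinates to one projector coordinate."""
    X = np.array(samples_in, dtype=float)
    lx, ly = X[:, 0], X[:, 1]
    # design matrix with all monomials up to total degree 2
    A = np.stack([np.ones_like(lx), lx, ly, lx * ly, lx**2, ly**2], axis=1)
    coef, *_ = np.linalg.lstsq(A, np.array(samples_out, float), rcond=None)

    def predict(lx_, ly_):
        return coef @ [1.0, lx_, ly_, lx_ * ly_, lx_**2, ly_**2]
    return predict
```

On exact data the fit reproduces the underlying mapping; on real calibration data the least-squares solution averages out the small digitization errors mentioned above.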
  • the required functions are approximated through flexible representations, also known as "networks of functions" or "neural networks". These representations are constructed starting from a set of examples by machine learning mechanisms. Some examples of these networks of functions are illustrated in figures 6a and 6b. It is known that neural nets can be used to realize computational architectures that compute the solution to a problem (in the instant case the problem is that of determining the above described functions) starting from a set of input-output associations that have been stored into memory and used for an automated training phase.
  • Training is executed by minimizing the error function with respect to its parameters ("weights") x: when smaller values are obtained, the obtained output values and the target values tend to become similar. It is also important to ensure that the total number of parameters is limited, so that the network can generalize in an appropriate way to new cases not considered during training.
  • the one-step-secant method OSS is a variation of what is called the one-step (memory-less) Broyden-Fletcher-Goldfarb-Shanno method.
  • the one-step method requires only vectors computed from gradients g of the function f.
  • the new search direction is d_{c+1} = -g_{c+1} + A_c s_c + B_c y_c, where s_c is the last step taken in the weight space and y_c = g_{c+1} - g_c is the corresponding change of the gradient;
  • the scalar coefficients are A_c = -(1 + (y_c^T y_c)/(s_c^T y_c)) (s_c^T g_{c+1})/(s_c^T y_c) + (y_c^T g_{c+1})/(s_c^T y_c) and B_c = (s_c^T g_{c+1})/(s_c^T y_c).
  • the search direction is the negative gradient at the beginning of learning and it is restarted to -g_c every N steps (N being the number of weights in the network).
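The one-step secant direction update can be sketched as follows, using the standard memory-less BFGS formulation; the variable names and the degenerate-step guard are mine:

```python
import numpy as np

def oss_direction(g_new, s, y):
    """One-step secant search direction: given the new gradient g_new,
    the last step s = x_new - x_old and the gradient change
    y = g_new - g_old, return d = -g_new + A*s + B*y with
    B = (s.g_new)/(s.y) and A = -(1 + (y.y)/(s.y)) * B + (y.g_new)/(s.y)."""
    sy = s @ y
    if abs(sy) < 1e-12:                 # degenerate step: restart with -g
        return -g_new
    B = (s @ g_new) / sy
    A = -(1.0 + (y @ y) / sy) * B + (y @ g_new) / sy
    return -g_new + A * s + B * y
```

When s = y (locally identity curvature) the update reduces to plain steepest descent, a useful sanity check on the formula.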
  • the one-step secant algorithm can thus be described in the form of the program excerpt reproduced in the table below. TABLE 1 - The one-step secant algorithm
  • E'_d is the directional derivative of E along d.
  • sigma is a constant equal to 0.5, used to multiply the directional derivative.
  • MAX_TRIALS is equal to 10.
  • L_incr is a constant equal to 1.1.
  • L_decr is a constant equal to 0.5.
  • the learning rate is decreased by L_decr after each unsuccessful trial.
  • Quadratic interpolation does not waste computation: in fact, after the first trial one has exactly the information that is needed to fit a parabola: the values of E_0 and E'_0 at the initial point and the value of E_eps at the trial point.
  • the parabola P(x) fitted to these values is: P(x) = E_0 + E'_0 x + ((E_eps - E_0 - E'_0 eps) / eps^2) x^2, where eps is the trial step.
  • the x_min that minimizes the parabola is used when it is less than eps.
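The parabola-fitting step can be sketched as follows; the function name and the rejection of non-positive curvature are illustrative details of this sketch:

```python
def parabola_min(e0, de0, eps, e_eps):
    """Fit P(x) = e0 + de0*x + c*x^2 to the error value e0 and
    directional derivative de0 at the start of the line search and
    the error value e_eps at trial step eps.  Return the step
    minimizing the parabola, or None when no acceptable interior
    minimum exists (non-positive curvature, or minimum beyond eps)."""
    c = (e_eps - e0 - de0 * eps) / (eps * eps)   # quadratic coefficient
    if c <= 0:
        return None                  # parabola opens downward: no minimum
    x_min = -de0 / (2.0 * c)         # vertex of the parabola
    return x_min if x_min < eps else None
```

For example, with E(x) = (x - 1)^2 sampled at x = 0 (value 1, derivative -2) and x = 4 (value 9), the fitted parabola is exact and its minimum at x = 1 is recovered in one step.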
  • computer 8 determines the values of the internal parameters of the neural network, parameters that are going to remain fixed for the subsequent use of the system: see for example the back-propagation technique of D.E. Rumelhart, G.E. Hinton and R.J. Williams, "Learning internal representations by error propagation", in D.E. Rumelhart and J.L. McClelland (Eds.), Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1: Foundations, MIT Press, 1986, or the one-step secant technique described in the foregoing.
  • After the preliminary self-calibration phase is completed, the system enters a mode of operation corresponding to proper identification and tracking of the active writing point AP.
  • the active point AP may be represented by different entities depending on the different applications.
  • the interaction is mediated through the moving finger of the operator, and the optical sensors 3, 5 are thus capable of distinguishing the intensity (or the color - in the case of color-sensitive optical sensors) of the finger against the background of surface 7.
  • the optical sensors are preferably located in the upper zone of the active surface, supported by pods, brackets or arms 2, 4 protruding towards the operator whilst computer 8 can identify, for each optical sensor, the point of lowest intensity (below a suitable threshold) that is in the highest position in the pixel matrix: see figure 7. From the images of the left and right sensors 3 and 5, the system derives therefore the four coordinates (Lx, Ly, Rx, Ry) corresponding to the active point.
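The identification of the active point in one sensor image can be sketched as a top-down scan returning the first, i.e. highest, pixel whose intensity falls below a darkness threshold; the threshold value and the row-major image layout are assumptions of this sketch:

```python
def find_active_point(image, threshold=80):
    """image: list of rows of intensity values, with row 0 at the top of
    the pixel matrix.  Returns the (x, y) coordinates of the highest
    pixel darker than the threshold, or None when no such pixel exists."""
    for y, row in enumerate(image):          # scan from the top down
        for x, value in enumerate(row):
            if value < threshold:            # first dark pixel found
                return (x, y)
    return None
```

Running the same scan on both sensor images yields the four coordinates (Lx, Ly, Rx, Ry) used by the contact test.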
  • identification of active point AP may be made simpler and more reliable by the use of suitable indicators, e.g. by using, as the tracer member F, an object, such as a pen of a color which is not present in other parts of the working space.
  • a factor to be considered and dispensed with is the possible interference between the motion of the writing point and the motion of the image projected onto active surface 7.
  • the use of the system to realize a virtual pen can be considered: the user traces with a tracer element (such as a pen or a pointed extremity or a finger) a trajectory onto active surface 7 and, at the same time (with a delay which is not perceived by the user), projector 6 projects the image of this trajectory onto the same active surface.
  • Active point AP is identified (through the determination of the low-intensity point with the highest y coordinate or of the moving point with the highest coordinates) whilst the optical sensors may not distinguish between images of a physical object (i.e. the writing point) and images projected by projector 6 (the projected light beam impinging onto active surface 7).
  • the system may thus be possibly misled in the identification of the active point, by considering as the active point one of the points of the writing projected.
  • the projected points can be of low intensity (e.g. writing of black color) in the upper portion of the image and in motion. For the sake of example let us consider in particular the projection of the last point just "written" by the user. Under the circumstances, the system may become unstable because of the following mechanism.
  • the active point AP identified by the system is the last point written on active surface 7 by the light beam projected by projector 6 with coordinates Px1 and Px2. Because of the possible inaccuracies in determination of the point, it is possible that the point is determined with certain errors, i.e. epsilon 1 and epsilon 2, leading to (Px1 + epsilon 1) and (Px2 + epsilon 2).
  • computer 8 causes projector 6 to project a minimum intensity beam with the given coordinates (assuming that writing is with a black color) . Therefore, the mechanism causes a new black point to be written on active surface 7.
  • the tracking module for determining active point AP will then detect a displacement (from Px1 and Px2 to Px1 + epsilon 1 and Px2 + epsilon 2), causing a new black point to be written, and so on.
  • the user can observe that the trajectory of writing becomes uncontrolled: the light of projector 6 writes additional points without being prompted to do so by the user .
  • interference between the writing point and the image written by projector 6 may be dispensed with by decoupling the written image from the image used to determine active point AP.
  • computer 8 maintains in its memory two images, denoted Image1 and Image2.
  • Image1 contains the trace of the writing trajectory, while Image2 contains a pixel matrix with uniform intensity values.
  • Two synchronization signals determine the projection by projector 6 of the two images in two different time frames.
  • Image1 is projected most of the time, while Image2 is projected only while the image is being captured by optical sensors 3 and 5. Given the uniformity of Image2, the brevity of its permanence on active surface 7 and the characteristics of temporal integration of the human visual system, the user will consciously see only Image1.
  • the images captured by the optical sensors 3 and 5 during projection of Image2 will not contain any sign of the projected writing: therefore, the determination of active point AP will not be misled by any interference.
  • the permanence time of Image2 depends on the acquisition time of the sensors. In any case, optical sensors available commercially and having acquisition times lower than a hundredth of a second were found to be thoroughly satisfactory for use within the system of the invention.
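The decoupling scheme can be sketched as a simple timing loop: the writing trace is shown most of the time, and the uniform image replaces it only during the brief sensor exposure. project() and capture() stand in for the projector and sensor drivers and are assumptions of this sketch:

```python
def capture_without_interference(project, capture, image1, image2):
    """Expose the sensors while the uniform image2 is on the surface, so
    that no projected writing can be mistaken for the active point, then
    restore the written trace image1 immediately afterwards."""
    project(image2)           # uniform frame: the writing disappears
    left, right = capture()   # expose both optical sensors during image2
    project(image1)           # restore the writing trace at once
    return left, right
```

Because image2 stays on the surface only for the sensor acquisition time (well under a hundredth of a second with the sensors mentioned above), temporal integration in the human visual system hides the swap from the user.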
  • An alternative solution eliminates interference by distinguishing the projected writing from active point AP, for example by projecting the writing with a color and/or an intensity level different from the color and/or intensity level of tracer member F, i.e. the pen.
  • the display means can comprise a liquid-crystal panel, a display panel based on a different technology, or a traditional computer monitor (i.e. a cathode-ray tube) of appropriate size.
  • computer 8 transmits the pixel matrix (an intensity level for each possible value of the coordinates Px and Py) to the display unit instead of to projector 6.
  • Optical sensors such as sensors 3 and 5 can be replaced by other means for determining the position of a tracer member, for example by detecting pressure from a pen's tip, light from a light-emitting pen, or electrical changes as in "touch-screen" systems.
  • calibration of the system can be executed by projecting an image consisting of a number of calibration points (for example black points on a light background) and by clicking with the pen at the positions where the different points are projected.
  • calibration can also be executed by projecting the same calibration pattern and observing an image of the calibration pattern and of the display border.
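The click-based calibration just described collects pairs of sensed pen coordinates and known projected coordinates, from which a mapping between the two reference systems can be fitted. The sketch below is hypothetical: the patent mentions machine-learning and polynomial-fitting techniques, and here a simple affine least-squares fit (one possible polynomial of degree one) stands in for them; the point values are invented.

```python
# Hypothetical sketch of the click-based calibration step: the projector
# shows a few known points (Qx, Qy); the user clicks each one, and the
# sensed pen coordinates (Px, Py) are recorded. An affine map from
# sensed to projector coordinates is then fitted by least squares.

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for c in range(i, 4):
                M[r][c] -= f * M[i][c]
    x = [0.0] * 3
    for i in (2, 1, 0):
        x[i] = (M[i][3] - sum(M[i][c] * x[c] for c in range(i + 1, 3))) / M[i][i]
    return x

def fit_affine(sensed, projected):
    """Least-squares affine map (px, py) -> (qx, qy) via normal equations."""
    rows = [(px, py, 1.0) for px, py in sensed]
    AtA = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    coeffs = []
    for k in (0, 1):  # one independent fit per output coordinate
        Atb = [sum(r[i] * q[k] for r, q in zip(rows, projected)) for i in range(3)]
        coeffs.append(solve3(AtA, Atb))
    return coeffs

# Synthetic check: sensed coords related to projector coords by a known map.
proj = [(0, 0), (100, 0), (0, 80), (100, 80)]
sens = [(2 * qx + 5, 1.5 * qy - 3) for qx, qy in proj]
cx, cy = fit_affine(sens, proj)
px, py = sens[3]
qx = cx[0] * px + cx[1] * py + cx[2]
qy = cy[0] * px + cy[1] * py + cy[2]
assert abs(qx - 100) < 1e-6 and abs(qy - 80) < 1e-6
```

An affine fit handles translation, scaling and shear between the two reference systems; the higher-order polynomial or neural-network fits mentioned elsewhere in the text would additionally absorb lens and keystone distortion.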
  • the optical sensing means can frame a display positioned on a table, at any position chosen by the user.
  • use of the apparatus together with a writing tablet is also possible, where the image is projected onto the tablet, as illustrated in figure 11.
  • the position-detecting sensors may be included in the tablet (whereby no optical sensors are needed).
  • the image projected onto the tablet after automated calibration of the system is used as a feedback signal for the user, avoiding the need to look at a separate display (such as a computer monitor).
  • Some tablets allow a sheet of paper to be used on their surface in order to provide visual feedback to the writing person.
  • the invention offers the possibility of providing feedback for operations such as moving objects or "cut and paste" operations, or of projecting multimedia content, which cannot be achieved with a static sheet of paper.
  • the writing tablet can contain dynamic sensing capability, including pressure, tilt and height of the pen.
  • the pen can have multiple buttons and be either corded or cordless. Calibration can be effected as described previously, in a manner that avoids the use of visual sensors. In this last case, the position, in the reference system of the writing tablet, of the reference points projected by the projector is obtained by clicking on the tablet with a suitable pen, acquiring and storing the positions and proceeding as described, by machine-learning techniques or by polynomial-fitting techniques.
  • more than two sensors can be used, to increase the spatial resolution in the case of sensors with a limited number of pixels.
  • the sensors can be mounted on moving supports, so that their orientation and position can be controlled through motors driven by computer 8. In this way, the same sensor can vary its position and/or rotation angles with respect to active surface 7.
  • the sensors can be equipped with optical systems such that the focus and the magnification can be controlled by computer 8.
  • a single optical sensor 3 can be placed on the ground facing upwards, so that the acquired image is a strip corresponding to an area close to the active surface. If the ceiling of the room is white, the presence of an object with lower intensity makes it possible to determine that the active point (for example a hand or a pointed extremity) is touching active surface 7. Only one coordinate is available, so the mechanism is suitable for interacting with a one-dimensional projected strip, like a one-dimensional toolbar placed horizontally.
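The single upward-facing sensor variant reduces to finding a dark run in a one-line image against the bright ceiling. A minimal sketch of that detection, with an assumed intensity threshold and invented pixel values:

```python
# Hypothetical sketch of the single upward-facing sensor variant: with a
# white ceiling, a dark run in the one-line image marks where an object
# (hand or pointed extremity) crosses the plane above the active surface.

THRESHOLD = 128  # intensity below this counts as "occluded" (assumed value)

def active_column(strip):
    """Return the centre column of the occluded run, or None if the
    strip is uniformly bright (nothing touching the surface)."""
    dark = [i for i, v in enumerate(strip) if v < THRESHOLD]
    if not dark:
        return None
    return sum(dark) / len(dark)  # the single available 1-D coordinate

empty = [250] * 12
touch = [250] * 12
touch[4:7] = [30, 20, 35]        # a finger occludes columns 4..6

assert active_column(empty) is None
assert active_column(touch) == 5.0
```

The returned column index is the one coordinate available in this configuration, which is why the text restricts it to one-dimensional widgets such as a horizontal toolbar.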
  • the connections between the different components of the system (computer 8, projector 6, optical sensors 3, 5) can be realized with wireless technologies, e.g. through infrared or radio signals.
  • traditional image processing and enhancement techniques can be applied to the images captured by the optical sensors, to reduce the effect of statistical noise in the sensors, to improve the distribution of intensity values of the image, to identify edges, to identify moving parts, etc.
  • the accuracy achieved during the calibration phase can be increased by projecting patterns different from simple black points, for example black circles of a certain radius.
  • the system can then determine the position of the centers of these patterns, captured by the optical sensors, with interpolation techniques, thereby obtaining a degree of positional accuracy better than the dimension of a single pixel.
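One common way to locate the center of such a pattern with sub-pixel accuracy is an intensity-weighted centroid; the sketch below illustrates this (the patent only says "interpolation techniques", so the centroid method and the tiny test image are assumptions of this example).

```python
# Assumed sketch of sub-pixel localisation of a projected calibration
# pattern: the centre of a dark circle on a light background is taken as
# the intensity-weighted centroid of the "darkness" (255 - intensity).

def subpixel_center(img):
    """Centroid of (255 - intensity), i.e. of the dark pattern."""
    total = sx = sy = 0.0
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            w = 255 - v
            total += w
            sx += w * x
            sy += w * y
    return sx / total, sy / total

# A dark disc whose true centre (3.5, 2.5) lies between pixel centres.
img = [[255] * 8 for _ in range(6)]
for y in (2, 3):
    for x in (3, 4):
        img[y][x] = 0
cx, cy = subpixel_center(img)
assert abs(cx - 3.5) < 1e-9 and abs(cy - 2.5) < 1e-9
```

The recovered center (3.5, 2.5) falls between pixel centers, which is exactly the better-than-one-pixel accuracy the bullet describes.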
  • Function approximation techniques other than neural networks can be used for constructing the functions defined for location of the active point. For example, lookup tables or polynomials interpolating between the data points obtained during projection of the calibration patterns can be used.
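A lookup-table alternative of the kind just mentioned can be sketched in one dimension: calibration samples are stored sorted by sensor coordinate, and intermediate positions are linearly interpolated. The sample values below are invented for illustration.

```python
# Minimal lookup-table alternative to the neural-network mapping
# (illustrative only): calibration samples (sensor_coord, screen_coord)
# are stored and intermediate positions are linearly interpolated.
import bisect

def make_lookup(samples):
    """samples: sorted (sensor, screen) pairs from the calibration phase."""
    xs = [s for s, _ in samples]
    ys = [t for _, t in samples]
    def lookup(x):
        if x <= xs[0]:          # clamp below the calibrated range
            return ys[0]
        if x >= xs[-1]:         # clamp above the calibrated range
            return ys[-1]
        i = bisect.bisect_right(xs, x)
        f = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
        return ys[i - 1] + f * (ys[i] - ys[i - 1])
    return lookup

# Calibration data from a hypothetical, slightly non-linear sensor.
samples = [(0, 0.0), (10, 95.0), (20, 210.0), (30, 300.0)]
to_screen = make_lookup(samples)
assert to_screen(10) == 95.0    # exact at a calibration point
assert to_screen(15) == 152.5   # midway between 95 and 210
```

In two dimensions the same idea becomes bilinear interpolation over a grid of calibration points; either form avoids the training phase a neural network would need, at the cost of storing the full table.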

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

This device, which displays graphic information on a surface (7), the information being written by means of a tracer member (F) defining an active point, comprises a projector (6), optical sensors (3, 5) for detecting the location of the active point on said surface (7), and a data processing unit (8) controlling operation of the projector (6) so that it produces picture elements at respective points of said surface (7) corresponding, in a one-to-one manner, to the locations of the tracer member (F) as detected by the optical sensors (3, 5). The processing unit (8) is designed to allow self-calibration of the system, avoiding the need for human measurement, adjustment, or the installation of bulky and inflexible equipment. Writing and interaction with the system take place in a natural way, possibly with just a finger as the tracer member (F), requiring neither extensive training nor technical skill. Visual feedback is provided to the user (H) by the light beam projected by the projector (6).
PCT/EP1999/003921 1999-06-08 1999-06-08 Dispositif d'ecriture/affichage electronique et technique d'exploitation WO2000075860A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP1999/003921 WO2000075860A1 (fr) 1999-06-08 1999-06-08 Dispositif d'ecriture/affichage electronique et technique d'exploitation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP1999/003921 WO2000075860A1 (fr) 1999-06-08 1999-06-08 Dispositif d'ecriture/affichage electronique et technique d'exploitation

Publications (1)

Publication Number Publication Date
WO2000075860A1 true WO2000075860A1 (fr) 2000-12-14

Family

ID=8167324

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP1999/003921 WO2000075860A1 (fr) 1999-06-08 1999-06-08 Dispositif d'ecriture/affichage electronique et technique d'exploitation

Country Status (1)

Country Link
WO (1) WO2000075860A1 (fr)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0571702A2 (fr) * 1992-05-26 1993-12-01 Takenaka Corporation Dispositif de pointage avec la main et ordinateur mural
EP0622722A2 (fr) * 1993-04-30 1994-11-02 Rank Xerox Limited Système de copie interactive
EP0626636A2 (fr) * 1993-03-16 1994-11-30 Hitachi, Ltd. Interface utilisateur pour un système de traitement d'information
US5528263A (en) * 1994-06-15 1996-06-18 Daniel M. Platzker Interactive projected video image display system
EP0866419A2 (fr) * 1997-03-21 1998-09-23 Takenaka Corporation Dispositif de pointage utilisant l'image de la main


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2472922A (en) * 2009-08-19 2011-02-23 Compurants Ltd A combined table and computer-controlled projector unit
GB2472922B (en) * 2009-08-19 2013-09-25 Compurants Ltd A combined table and computer-controlled projector unit
US20160316186A1 (en) * 2015-04-21 2016-10-27 Dell Products L.P. Continuous Calibration of an Information Handling System Projected User Interface
US10139854B2 (en) 2015-04-21 2018-11-27 Dell Products L.P. Dynamic display resolution management for an immersed information handling system environment
US11106314B2 (en) * 2015-04-21 2021-08-31 Dell Products L.P. Continuous calibration of an information handling system projected user interface
US11243640B2 (en) 2015-04-21 2022-02-08 Dell Products L.P. Information handling system modular capacitive mat with extension coupling devices

Similar Documents

Publication Publication Date Title
EP1611503B1 (fr) Systeme tactile a alignement automatique et procede correspondant
US6292171B1 (en) Method and apparatus for calibrating a computer-generated projected image
JP4822643B2 (ja) 無線ポインタの光学トラッキングを備えるコンピュータ・プレゼンテーション・システムおよび方法
US8120596B2 (en) Tiled touch system
US5764217A (en) Schematic guided control of the view point of a graphics processing and display system
US6554431B1 (en) Method and apparatus for image projection, and apparatus controlling image projection
US6704000B2 (en) Method for remote computer operation via a wireless optical device
KR101016136B1 (ko) 프로젝터 어레이 정렬 방법 및 시스템
US7372456B2 (en) Method and apparatus for calibrating an interactive touch system
EP1739528B1 (fr) Méthode pour un système tactile à caméra
EP2333639B1 (fr) Système tactile basé sur une caméra et procédé
US20140118255A1 (en) Graphical user interface adjusting to a change of user's disposition
EP1550940A2 (fr) Suivi de pointeur sur plusieurs sous régions d'entrée de coordonnées se chevauchant, définissant une région généralement contigue.
JPH11513483A (ja) 位置及び方位を決定する方法及び装置
US20100188355A1 (en) Apparatus and method for detecting an object pointed by a user
CN105302381B (zh) 红外线触摸屏精准度调整方法及装置
WO2000075860A1 (fr) Dispositif d'ecriture/affichage electronique et technique d'exploitation
Kim et al. Real‐Time Human Shadow Removal in a Front Projection System
JPH05153532A (ja) 光学的コンピユータ入力システムを幾何学的に補正する方法及び装置
Sánchez Salazar Chavarría et al. Interactive 3D touch and gesture capable holographic light field display with automatic registration between user and content
CN212781988U (zh) 一种多面拼接投影的高精度触摸交互系统
WO1998047406A2 (fr) Table de travail interactive
KR20100120902A (ko) 터치 디스플레이 시스템
CN116797666A (zh) 一种投影显示场景的激光无线鼠标交互控制方法、系统、装置及介质
Karahan et al. A New 3D Line of Gaze Estimation Method with Simple Marked Targets and Glasses

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): JP US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

121 Ep: the epo has been informed by wipo that ep was designated in this application
WA Withdrawal of international application