GB2377607A - Analysing and displaying motion of hand held instrument - Google Patents

Analysing and displaying motion of hand held instrument

Info

Publication number
GB2377607A
GB2377607A
Authority
GB
United Kingdom
Prior art keywords
substrate
signal
hand
image
computer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0210261A
Other versions
GB0210261D0 (en)
Inventor
Djordje Brujic
Iain Ainsworth
Veselin Vracar
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SURREY ADVANCED CONTROL Ltd
Original Assignee
SURREY ADVANCED CONTROL Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SURREY ADVANCED CONTROL Ltd filed Critical SURREY ADVANCED CONTROL Ltd
Publication of GB0210261D0 publication Critical patent/GB0210261D0/en
Publication of GB2377607A publication Critical patent/GB2377607A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected

Abstract

The motion of a hand held instrument such as a stylus or pen (<B>10</B>), or a laser pointer (<B>10a</B>), in relation to a screen (<B>25</B>) is monitored by a signal sent from the instrument to a computer (<B>22</B>). The signal may be received by an IR camera (<B>21</B>). The motion of the instrument may be recorded and displayed upon a visual display unit (<B>23</B>). The instrument may leave an image on the surface, e.g. in ink, or the drawn image may be projected from a projector (<B>20</B>) under control of the computer. The instrument may be pressure sensitive so that a motion signal is sent only when the instrument is in contact with the surface. Perspective adjustments may be performed in both motion analysis and image projection (see <B>figs. 7 and 8</B>).

Description

A HUMAN-COMPUTER INTERFACE WITH A VIRTUAL TOUCH SENSITIVE SCREEN

The present invention relates to a method and apparatus for enabling the motion of a hand-held instrument such as a pen, or a projected point such as that generated by a laser pointer, to be recorded and the data read by a computer system, the computer system being capable of providing, via a visual display unit, an image representing the movements of the hand-held instrument. More particularly, the invention relates to a method and apparatus for translating handwritten or drawn data provided on any substrate into a computer-generated image on a visual display unit.
The flood of information that reaches us day by day, i.e. e-mail, news and the web, requires everyone to become familiar with information and communication technology and to be able to use it intuitively. The demands and requirements of human beings, with their differing abilities, have to be taken increasingly into account, especially for those who are not familiar with new technology.
In recent years the computer has no longer been used only within the office environment.
Now we travel with computers and use them in various situations, but today's Graphical User Interfaces (GUIs) are not designed for such a dynamic environment.
Information and communication technology, especially in the area of human-computer interaction, therefore needs to be re-engineered. Ease of use is one of the most important goals to be achieved in the future of digital computing.
We focus on the manner of interaction between human beings and the information space, and amongst human beings through the information space. Consequently this invention relates to computer interfaces.
The history of the development of user interfaces saw a breakthrough when the mouse, icons and windows were introduced in place of the command line. This formed the basis of the GUI, where icons that represent objects are displayed on a computer screen. The GUI is the major technology by which the computer currently presents its services to the user.
The idea of the mouse is that relative movement of the mouse in 2-D space is mapped into relative movement of its cursor on the screen. So, by moving the mouse and looking at the screen, a user can bring the cursor into a desired position and, by using the button(s) provided on the mouse, can select different actions. This abstraction works, but people are often more comfortable with physical three-dimensional objects than they are with the abstractions of the computer world, and direct control of the computer is desired.
Natural and more intuitive control is obtained in touch sensitive screen systems, where the command input area is coincident with the screen. Here, the absolute position of the pointer or a finger in relation to the screen determines the position where the action is to be taken. In this case, a brief touch is recognised as a command. The negative side of touch sensitive screen technology is that a special, more expensive screen is needed; such screens soon become dirty with repeated use and handling, and their performance deteriorates.
Similar functionality is achieved with tablets, but the drawbacks are their complexity and their fixed size.
New metaphors for improving the GUI are the subject of this patent application.
It is well known to translate instructions or data provided via a hand-held instrument into computer readable form and also to provide an image of the data, or a response to the instruction, on a visual display unit. Examples of such technologies include the use of a stylus in a palmtop computer to select menu items displayed on a touch sensitive screen, and the digitisation of handwritten text provided via an electronic notepad.
Electronic pens are also known which can record handwritten information into memory banks contained within the pen, the information being downloadable from the pen into a computer as a separate action.
Many of these prior art technologies utilise dedicated hardware built into or around the screen with which the hand-held instrument interacts. Such systems are consequently dedicated to a particular type of display screen, and information provided thereon cannot easily be transferred from one display screen to another. Furthermore, once installed, these systems cannot easily be adapted to a screen of a different size. The sensitive screens used for these purposes incorporate specialised coatings and hardware with certain resistive, capacitive and/or surface wave properties which can often disturb the transmission of light from a display screen. Thus such display screens are often practical only for personal use.
The present invention aims to provide a method and apparatus whereby a hand-held instrument can be used to present handwritten or hand-drawn information on any substrate and simultaneously record that information in a computer for simultaneous or later presentation via a visual display unit associated with the computer.
In accordance with the present invention there is provided an apparatus for the computerised recordal and display of instructions or data provided through the interaction of a hand-held instrument with a substrate, the apparatus comprising: a hand-held instrument comprising a contact surface for contacting the substrate and a signal generator, the signal generator operable to generate a signal when the contact surface contacts the substrate; apparatus for reading the signal generated by the signal generator; and means for translating the read signal into computer readable form, whereby said signal can be translated by a computer into instructions and/or an image as required.
In some embodiments, the hand-held instrument may comprise a contact surface for contacting the substrate and two signal generators operable to generate two signals. Desirably the two signals are substantially coaxial. Preferably at least one is not visible to the human eye. Optionally a visible signal may be used to generate a light spot or target point to facilitate remote operation of the apparatus. For example, such embodiments may utilise laser pointer technology with two axially aligned laser pointers, one generating the signal for reading by the reading apparatus, the other for marking a point on the substrate to which a command is directed.
The substrate may comprise any surface which may be used for display purposes.
For example, a blackboard, a whiteboard, a wall, projector screen (even if it is not stiff), or a digital screen. Other possible substrates will no doubt occur to the skilled addressee.
The hand-held instrument is conveniently provided in the form of a pen or stylus.
Optionally the contact surface includes a marker such as ink, chalk, pencil lead or paint.
Also optionally, the hand-held instrument may comprise a holder for releasably holding a writing or colouring instrument such as a pen, stick of chalk, paintbrush, pencil or other such implement. In some embodiments, the hand-held instrument may incorporate a pressure sensitive area to which pressure is applied when the contact surface meets the substrate. Such a pressure sensitive area may be related to the signal generator such that application of the pressure to the pressure sensitive area causes generation of the signal.
The pressure sensor and signal generator may be a single component. In alternative embodiments, the signal generator may be responsive to the application of pressure to a surface of the hand-held instrument other than the contact surface such as pressure applied by a user's hand in holding the hand-held instrument.
Preferably the signal is generated in the form of an electromagnetic wave. A convenient electromagnetic wave form is infra-red or near infra-red (NIR). In embodiments comprising the second signal generator, the signal may be in the form of a laser which reflects off the substrate.
The apparatus for reading the signal may conveniently be provided in the form of a camera. It will be understood that there exist cameras which detect not only visible light but also other electromagnetic waves just outside the visible spectrum, and such cameras and signals should be considered part of the present invention.
A computer receiving the read signal from the reading apparatus may be configured to incorporate in its memory predetermined, relative geometric information defining an active area of the substrate on which information or instructions are provided. Software algorithms may also optionally be incorporated into the computer to correct any errors in the read signal which may result from the reading apparatus being positioned to have a perspective, rather than normal, view of the signal generator.
Optionally, an image may be projected from a computer onto the substrate. The computer is configured to relate positions on the projected image to positions on the same image displayed on the computer's original display unit. In these embodiments, the hand-held instrument may, for example, be used to select menu items in the image projected onto the substrate, this instruction being relayed via the reading apparatus to the computer, which accordingly responds to the instruction and displays the required menu item.
Optionally, whether or not an image is projected onto the substrate, an area outside the active area may be defined from which pre-defined commands may be selected. The computer is provided with a software algorithm which defines these fixed areas and recognises the selection of a command outside of the active area. Optionally, one such command may be an erase command, whereby the computer is instructed to re-display any movement of the hand-held implement on the computer screen in the same colour as the screen background, thereby erasing data seen on the screen. A separate signal-generating eraser may be used to give this instruction.
In one example, a standard digital camera may be connected to a computer and positioned arbitrarily, but so that an approximately rectangular surface on the planar substrate (intended to be used as the active area) is seen by the camera. The application program is started and frames are then captured by the camera and analysed. Positions of the hand-held instrument (visible only when the instrument touches the surface) are calculated in the camera coordinate system. Depending on the duration of a signal from the hand-held instrument (measured as a number of consecutive frames receiving the signal), which will typically correspond to the duration of time for which the hand-held instrument is in contact with the substrate, instructional commands can be identified and distinguished from actual written or drawn information. In some configurations, if a signal of short duration is detected outside of the active screen area defined during calibration, this information may be used to generate a variety of pre-defined commands associated with regions of a command area located around the perimeter of the active screen area, e.g. in one embodiment to change the colour of a virtual pen or to select an eraser mode, in other embodiments to achieve functionality similar to the "double click" or "right click" of a mouse. Normally in the latter embodiments, a signal of short duration within the pre-defined command area is understood as a mouse "click" command.
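The duration-based distinction described above can be sketched as follows. The frame threshold and function names are illustrative assumptions, not values given in the patent (at about 25 frames per second, 5 frames corresponds to roughly 0.2 s of contact):

```python
# Hypothetical sketch of the duration-based signal classification described
# above. CLICK_MAX_FRAMES is an assumed threshold, not taken from the patent.

CLICK_MAX_FRAMES = 5

def classify_touch(frames_with_signal, inside_active_area):
    """Classify a contact event from its duration in consecutive camera frames."""
    if not inside_active_area:
        # A short touch in the border command region selects a pre-defined
        # command (e.g. pen colour change or eraser mode).
        return "command" if frames_with_signal <= CLICK_MAX_FRAMES else "ignored"
    # Inside the active area a short touch acts like a mouse "click";
    # a sustained signal is treated as written or drawn information.
    return "click" if frames_with_signal <= CLICK_MAX_FRAMES else "stroke"
```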
The images obtained from the camera are constantly processed, preferably at a speed of about 25 frames per second. Pixels corresponding to the electromagnetic signal are searched for (the use of a filter to eliminate other wavelengths of electromagnetic radiation may improve the speed of the processing). The position of the transmitter, if active, is computed in the camera coordinate system. The image is obtained through perspective projection, and to recover the transmitter's position in the corresponding window an inverse perspective transformation must be performed. Corrections for camera imperfections may also be made. In embodiments comprising a projector it is crucial to achieve sufficient accuracy so that the calculated position corresponds to a position on the projected image, as only then are the commands meaningful.
In order to achieve this functionality the system has to be calibrated. In embodiments where there is no projection of the VDU image, calibration is done by selecting a 'Calibration' option in the menu and by touching the four corners of the intended approximately rectangular area with the hand-held instrument. If the intended substrate has visible borders, e.g. if a whiteboard is used, the corners may be computed using image processing techniques. In embodiments comprising a projector for projecting the VDU screen image onto the active surface, the corners of the projected image have to be either touched or calculated from a projected calibration pattern using image processing techniques. After calibration, the correspondence between the active area and a pre-defined rectangular window is established and the system is ready to be used.
For the purpose of exemplification some embodiments of the invention will now be further described with reference to the following Figures, in which:
Figure 1 shows three embodiments of a first form of hand-held instrument suitable for use in accordance with the apparatus and method of the invention;
Figure 1a shows an embodiment of a second form of hand-held instrument comprising a second signal generator;
Figure 2 shows an embodiment of apparatus according to the invention demonstrating the recording and projection of written text or drawings and the projection of an instructions menu onto a projected screen by a projector (wherein the hand-held instrument may be of any form as described herein); and
Figure 3 shows an embodiment of the invention demonstrating the recording of written text or drawing from the drawing surface and its display on the computer screen.
Figure 4 illustrates an embodiment of the invention where the substrate is the visual display unit of a computer system.
Figure 5 illustrates an embodiment of the invention where pre-defined command regions are provided outside of the active area of the substrate.
Figure 6 illustrates an embodiment of the invention where pre-defined command regions are provided outside of the projected image (wherein the hand-held instrument may be of any form as described herein).
Figure 7 illustrates an algorithm used for correcting the perspective of an image in accordance with the invention.
Figure 8 further illustrates an algorithm used for correcting the perspective of an image in accordance with the invention.

Three embodiments of the hand-held instrument 10 are presented in Figure 1.
First, pointer A comprises a stylus 2, which is carried in a holder 8 having a touching surface 5a for contacting a substrate. At the opposite end of the holder 8 are a power supply 7 and an electrical contact 5 with a pressure sensor. An electrically conductive wire connects the sensor 5 to a signal generator 6, in this case a near infra-red (NIR) emitter. The NIR emitter is powered by the power supply 7 and electric contact 5 through conductive wire 3. Electrical contact is only effected when pressure is applied to contacting surface 5a, that is when the device is used to write on or touch the substrate.
Second, sleeve B comprises a writing implement 1, for example a felt pen, which is carried in a holder 8 having a removable top 4 for receiving the pen 1, which has a marking surface 5a for contacting a substrate. At the opposite end of the holder 8 are a power supply 7 and an electrical contact 5 with a pressure sensor. An electrically conductive wire connects the sensor 5 to a signal generator 6, in this case a near infra-red (NIR) emitter. The NIR emitter is powered by the power supply 7 and electric contact 5 through conductive wire 3. Electrical contact is only effected when pressure is applied to contacting surface 5a, that is when the device is used to write on or touch the substrate.
Third, eraser C comprises an erasing surface 5a with a holder 8. At the opposite end of the holder 8 are a power supply 7 and an electrical contact 5 with a pressure sensor. An electrically conductive wire connects the sensor 5 to a signal generator 6, in this case a near infra-red (NIR) emitter. The NIR emitter is powered by the power supply 7 and electric contact 5 through conductive wire 3. Electrical contact is only effected when pressure is applied to contacting surface 5a, that is when the device is used to erase on or touch the substrate.
It is to be understood that whilst the pressure sensor is shown positioned at the opposite end to the contacting surface in the embodiment shown in Figure 1, an alternative pressure device may conveniently be provided nearer the contacting surface; for example, the device may be applied in the form of a collar positioned around the nib 5a of the pen 1.
In Figure 1a, the hand-held instrument, in the form of a laser pointer, comprises red 2 and infra-red 1 laser diode emitters, which are carried in a holder 4 having a touch sensitive switch 9 for contacting a substrate. At the opposite end of the holder 4 are a power supply 5 and electrical contacts 6. An electrically conductive wire connects the switches 6 to the laser diode emitters 1 and 2. The laser diode emitters 1 and 2 are powered by the power supply 5 and electrical contacts 6 through conductive wires, forming a laser beam 7 and laser point 8 through lens 3. Electrical contact is effected only when pressure is applied to the touch sensitive switch 9, i.e. when the device is used to write on or touch the substrate, or when the switches 6 are pressed by a user's finger.
Figure 2 illustrates one embodiment of the invention. The apparatus comprises a hand-held implement 10 similar in design to that described in relation to Figure 1 or Figure 1a, and an NIR detecting camera 21 which is electrically connected to a computer 22 having a visual display unit 23 associated therewith. As shown in the Figure, a plain board 25 is used as a substrate. Four points A, B, C and D define an active area on the substrate 25 which can be viewed by the camera 21 through a visible light rejecting filter 29. This image is analysed in real time by the computer 22 and appropriate action is taken and displayed on the visual display unit 23. The computer 22 is programmed with an algorithm incorporating the coordinates of points A, B, C and D to define the active image area for display on the visual display unit 23. In the arrangement shown, a computer-generated image is displayed on the visual display unit 23 as an image 24b. Image 24b is projected using video projector 20 onto the active area of the substrate 25. A user (not shown) annotates the image 24a with amendment 26a. As the contact surface 5a of the hand-held instrument 10 is pressed against the substrate 25, the NIR emitter generates a signal which is picked up by the video camera 21 and relayed to the computer 22. The algorithm used to define the image may incorporate parameters for adjusting the perspective of the image using parameters relating to the angular position of the video camera with respect to the substrate 25. Signals recorded by the camera 21 are translated into computer readable form within the computer 22 and are added to image 24b at 26b. At the top of the image 24a can be seen a menu bar 28. A user (not shown) touches a menu item 27a with the hand-held instrument 10. Again the signal generated by the hand-held instrument is detected by the video camera 21 and relayed to the computer 22. Using the positional algorithm, the position of the hand-held instrument 10 is located on the original image 24b on the visual display unit 23, where it is recognised as menu selection 27b. This is read by the computer in much the same way as an instruction received from a mouse or the like, and the instruction will subsequently be carried out by the computer 22 and a new image provided at 24a and 24b.
Figure 3 shows an apparatus similar to that described in relation to Figure 2, comprising hand-held implement 10 or 37, substrate 35, camera 31, and computer 32 with visual display unit 33. The signal generated by the hand-held instrument 10 or 37 is detected by the video camera 31, processed by the computer 32 and displayed on the computer monitor screen 33. Using the positional algorithm, the position of the hand-held instrument 10 or 37 is located on the original image 34b on the visual display unit 33. A copy of the new image 34a comprising drawing 36a is provided at the computer monitor screen as image 34b comprising drawing 36b.
As can be seen from Figure 4, the computer monitor screen 45 may itself be used as the substrate. The computer screen 45 is viewed by camera 41, which detects a signal from a pointer device 10 used to amend the image 44b or select instructions from a menu 44a displayed on the screen 45. Thus, a non touch sensitive screen may be treated as a touch sensitive screen where data and commands can be input by direct contact with the on-screen image displayed by computer 42.
In Figure 5, the substrate is again a whiteboard 55. The active area 54a in which the drawing and writing is applied is defined by the four reference points A, B, C and D. As the pen 10 draws or the eraser 57 erases on the substrate, the action is captured by camera 51 and drawing 56a is displayed or erased as image 56b in the active area 54b of the computer screen. Outside of the active area is a border, areas of which, P, Q, R, S and T, have been pre-defined to correspond to certain computer commands. This command region is not displayed on the computer screen; however, the computer is programmed to recognise a signal from the pen 10 or eraser 57 given in the pre-defined areas of the command region as the pre-defined commands, and executes these commands in the computer 52. It is to be understood that the command region does not necessarily need to be marked on the substrate, though this may be convenient for the user.
Figure 6 shows an apparatus similar to that described in relation to Figure 2, comprising hand-held instrument 10, substrate 65, camera 61, projector 60 with infra-red filter 69, and computer 62 with visual display unit 63. In addition to the functionality of the embodiment described in relation to Figure 2, the embodiment of Figure 6 has border regions outside of the active area pre-defined to correspond to certain computer commands. This command region is not displayed; however, the computer is programmed to recognise a signal from the pen 10 given in the pre-defined areas of the command regions as the pre-defined commands, and executes these commands in the computer 62. These commands could simulate the left and right mouse buttons and give the user complete control of the application program in the standard mouse manner.
Where the hand-held instrument is of a form similar to that shown in Figure 1a, the second signal (i.e. the red diode emission) may be used to point at and select commands from the command regions or from menus displayed on the substrate.
There follows a discussion of algorithms which may be used in the perspective correction of an image seen by a signal detector in accordance with the invention.
Inverse Perspective Transformation Algorithm

It is well known that the image of a rectangular object as seen by a camera can have perspective distortion due to the relative position of the camera with respect to the object. The result is that a rectangular object can be seen as a non-regular shape (the opposite sides may not be parallel, their lengths may differ and the internal angles may not be 90 degrees).
The rectangular area is pre-defined and the coordinates of the corner points A, B, C, D (Fig. 7) are stored. The planar quadrilateral writing area on the substrate is defined by touching its four corners A2, B2, C2 and D2 with the pointer. These points are seen by the camera and their positions in the camera coordinate system are stored. The correspondence between A, B, C, D and A2, B2, C2, D2 is established as follows:
1. The bottom left point as seen by the camera becomes A2 and corresponds to the point A, which is the bottom left corner of the rectangular window.
2. The bottom right point as seen by the camera becomes B2 and corresponds to the point B which is the bottom right corner of the rectangular window.
3. The top right point as seen by the camera becomes C2 and corresponds to the point C which is the top right corner of the rectangular window.
4. The top left point as seen by the camera becomes D2 and corresponds to the point D, which is the top left corner of the rectangular window.
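The four-step correspondence above amounts to sorting the detected calibration points by position. A minimal sketch, assuming a coordinate system with y increasing upwards (for image coordinates with y increasing downwards, the top and bottom roles swap):

```python
def order_corners(points):
    """Return the four points ordered (A2, B2, C2, D2), i.e. bottom-left,
    bottom-right, top-right, top-left, matching steps 1-4 above."""
    pts = sorted(points, key=lambda p: p[1])       # two lowest-y points first
    bottom = sorted(pts[:2], key=lambda p: p[0])   # order left to right
    top = sorted(pts[2:], key=lambda p: p[0])
    return bottom[0], bottom[1], top[1], top[0]    # A2, B2, C2, D2
```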
Through the process of calibration the positions of the points A2, B2, C2 and D2 in the camera coordinate system are established. The problem is to find a mapping such that any position P2 of the marker, as seen by the camera, is interpreted as accurately as possible and the corresponding screen coordinates of the point P are computed.
Firstly, the position P2 of the pointer is computed as the weighted centre of mass of all the pixels corresponding to the NIR signal identified in the image.
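For illustration, the weighted centre of mass might be computed as below. The intensity threshold is an assumed value, and the frame is represented as a plain 2-D list rather than any particular camera API:

```python
NIR_THRESHOLD = 200  # assumed cut-off separating the emitter from background

def pointer_position(frame):
    """Weighted centre of mass of the bright (NIR) pixels in a greyscale frame."""
    sx = sy = total = 0.0
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value >= NIR_THRESHOLD:
                sx += x * value
                sy += y * value
                total += value
    if total == 0:
        return None  # transmitter not active in this frame
    return (sx / total, sy / total)
```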
Secondly, the inverse perspective transformation has to be applied.
One algorithm which may be applied is a barycentric transformation. As is well known, three points A, B and C in space define a plane. For a point P in the triangle whose vertices are the points A, B and C it is always possible to write the barycentric combination

P = uA + vB + wC, where u + v + w = 1.

The barycentric coordinates of P with respect to A, B, C are computed as:

u = area(P, B, C) / area(A, B, C)
v = area(P, A, C) / area(A, B, C)
w = area(P, A, B) / area(A, B, C)

Due to their connection with barycentric combinations, barycentric coordinates are affinely invariant. Let P have barycentric coordinates u, v, w with respect to A, B, C, and let all four points be mapped to another set of four points A', B', C', P' by an affine map. Then P' has the same barycentric coordinates u, v, w with respect to A', B', C'. In other words, barycentric coordinates may be used to define linear interpolation. The actual location or shape of the triangle is totally irrelevant to the definition of the linear interpolation. As our shape is rectangular it was divided into two triangles, ABD and BCD, and the transformation was applied.
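The triangle-to-triangle mapping can be sketched as below. Signed areas are used, with the coordinates written in cyclic order (equivalent to the unsigned-area ratios above); all names are illustrative:

```python
def area2(p, q, r):
    """Twice the signed area of triangle p, q, r (the factor cancels in ratios)."""
    return (q[0] - p[0]) * (r[1] - p[1]) - (r[0] - p[0]) * (q[1] - p[1])

def barycentric_map(p2, src, dst):
    """Map point p2 from triangle src = (A2, B2, C2) to triangle dst = (A, B, C)."""
    a2, b2, c2 = src
    total = area2(a2, b2, c2)
    u = area2(p2, b2, c2) / total   # weight of A
    v = area2(p2, c2, a2) / total   # weight of B (cyclic order keeps the sign)
    w = area2(p2, a2, b2) / total   # weight of C; u + v + w == 1
    (ax, ay), (bx, by), (cx, cy) = dst
    return (u * ax + v * bx + w * cx, u * ay + v * by + w * cy)
```

A point on the substrate quadrilateral would first be assigned to triangle A2B2D2 or B2C2D2, then mapped with the corresponding screen triangle ABD or BCD as `dst`.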
This algorithm provides only limited accuracy in perspective correction, resulting in visually noticeable errors towards the centre of a rectangular image.
Accuracy may be improved by calculating the intersection of the diagonals on both images and applying the barycentric mapping on four triangles: ABE, BCE, CDE and ADE (where E is the common apex of the four triangles).
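The apex E can be found with a standard line-intersection computation, sketched here with illustrative names:

```python
def line_intersection(p1, p2, p3, p4):
    """Intersection of the infinite line through p1, p2 with that through p3, p4.

    Returns None for parallel lines (whose intersection lies at infinity).
    """
    d = (p1[0] - p2[0]) * (p3[1] - p4[1]) - (p1[1] - p2[1]) * (p3[0] - p4[0])
    if d == 0:
        return None
    a = p1[0] * p2[1] - p1[1] * p2[0]
    b = p3[0] * p4[1] - p3[1] * p4[0]
    return ((a * (p3[0] - p4[0]) - (p1[0] - p2[0]) * b) / d,
            (a * (p3[1] - p4[1]) - (p1[1] - p2[1]) * b) / d)

# E is the intersection of the diagonals AC and BD, e.g.:
# e = line_intersection(a, c, b, d)
```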
The error in mapping point E2 onto point E is zero, as the point is calculated as the intersection of the diagonals.
A preferred and more accurate algorithm is detailed below; this idea is used to obtain the desired accuracy. It is well known that parallel lines intersect at the same point at infinity. This situation is illustrated below.
The vanishing point V1 is calculated as the intersection of the lines interpolating A2D2 and B2C2. The line interpolating points V1 and E2 intersects C2D2 at point H2 and the line A2B2 at the point F2. This line corresponds to the line going through points F and H, which are easy to calculate as H is in the middle of CD and F is in the middle of AB. The same is true for the second vanishing point V2 and the points G2 and K2. In this way we have calculated, without introducing any error, new polygons: A2, F2, E2, K2;
F2, B2, G2, E2; G2, E2, C2, H2; and K2, E2, H2, D2; and the corresponding rectangles A, F, E, K; F, B, G, E; G, E, C, H; and K, E, H, D. Each of these rectangles may be recursively subdivided in the same way, as the problem is the same, and after just a few subdivisions (6 or 7 steps will suffice) the triangles are very small and still no error in mapping has been introduced. When the size of the largest triangle is close to several pixels we stop the subdivision.
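A sketch of the recursive subdivision, under the assumption that no pair of opposite sides in the camera image is exactly parallel (otherwise a vanishing point lies at infinity and would need a special case). Corner tuples are listed bottom-left, bottom-right, top-right, top-left, and all names are illustrative:

```python
def intersect(p1, p2, p3, p4):
    # Intersection of the line through p1, p2 with the line through p3, p4.
    d = (p1[0] - p2[0]) * (p3[1] - p4[1]) - (p1[1] - p2[1]) * (p3[0] - p4[0])
    a = p1[0] * p2[1] - p1[1] * p2[0]
    b = p3[0] * p4[1] - p3[1] * p4[0]
    return ((a * (p3[0] - p4[0]) - (p1[0] - p2[0]) * b) / d,
            (a * (p3[1] - p4[1]) - (p1[1] - p2[1]) * b) / d)

def mid(p, q):
    return ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2)

def subdivide(quad, rect, depth, out):
    """Recursively pair camera-space quadrilaterals with screen rectangles.

    quad = (A2, B2, C2, D2) as seen by the camera; rect = (A, B, C, D) on
    screen. At depth 0 the pair is emitted for storage in memory.
    """
    if depth == 0:
        out.append((quad, rect))
        return
    a2, b2, c2, d2 = quad
    a, b, c, d = rect
    e2 = intersect(a2, c2, b2, d2)   # intersection of the diagonals
    v1 = intersect(a2, d2, b2, c2)   # vanishing point of sides AD and BC
    v2 = intersect(a2, b2, d2, c2)   # vanishing point of sides AB and DC
    f2 = intersect(v1, e2, a2, b2)   # image of F, the midpoint of AB
    h2 = intersect(v1, e2, d2, c2)   # image of H, the midpoint of DC
    k2 = intersect(v2, e2, a2, d2)   # image of K, the midpoint of AD
    g2 = intersect(v2, e2, b2, c2)   # image of G, the midpoint of BC
    e, f, h = mid(a, c), mid(a, b), mid(d, c)
    k, g = mid(a, d), mid(b, c)
    subdivide((a2, f2, e2, k2), (a, f, e, k), depth - 1, out)
    subdivide((f2, b2, g2, e2), (f, b, g, e), depth - 1, out)
    subdivide((e2, g2, c2, h2), (e, g, c, h), depth - 1, out)
    subdivide((k2, e2, h2, d2), (k, e, h, d), depth - 1, out)
```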
This subdivision process is done within the calibration procedure, and all the points generated, together with the correspondence information, are stored in the computer's memory.
After the calibration every image acquired is processed as follows:
A. The image acquired by the camera is analysed and, as a result of the analysis, the position P2 of the tip of the pointer in the camera coordinate system is obtained.
B. Starting with the two largest triangles, which define the whole active area, the triangle containing the centre of mass of the pointer's image is selected. This triangle contains two smaller triangles, defined in the subdivision process explained earlier. Again the triangle containing the centre of mass of the pointer's image is selected, and this process is repeated until the smallest triangle containing the centre of mass of the pointer is found.
Then the barycentric transformation between the triangle containing P2 and its corresponding triangle is performed, and the corresponding point P is calculated.
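Steps A and B can be sketched as a containment test (all barycentric coordinates non-negative) followed by the mapping through the stored correspondence. The helper names are assumptions, and the flat list below stands in for the stored triangle hierarchy for brevity:

```python
def tri_area(a, b, c):
    # Signed area of triangle (a, b, c).
    return 0.5 * ((b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1]))

def barycentric(p, a, b, c):
    total = tri_area(a, b, c)
    return (tri_area(p, b, c) / total,
            tri_area(p, c, a) / total,
            tri_area(p, a, b) / total)

def contains(p, tri, eps=1e-9):
    # p lies inside (or on) tri iff all barycentric coordinates are non-negative.
    return all(coord >= -eps for coord in barycentric(p, *tri))

def map_pointer(p2, pairs):
    # pairs: (camera_triangle, screen_triangle) correspondences produced by
    # the calibration subdivision. In the patent this is a hierarchy descended
    # from the two largest triangles; here a flat scan is used for brevity.
    for cam_tri, scr_tri in pairs:
        if contains(p2, cam_tri):
            u, v, w = barycentric(p2, *cam_tri)
            (ax, ay), (bx, by), (cx, cy) = scr_tri
            return (u * ax + v * bx + w * cx, u * ay + v * by + w * cy)
    return None  # pointer outside the calibrated active area
```

Descending the hierarchy instead of scanning a flat list only changes which subset of triangles is tested; the containment test and the final barycentric mapping are identical.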
This algorithm is not restricted to a single planar rectangular surface. An object containing planar angled surfaces, or even a curved object, can be defined as a plurality of rectangular surfaces as seen by the camera, provided that the corners of each rectangular surface are registered as explained above. The algorithm can be modified to recognise on which surface the pen is writing, and the mapping is then performed as previously described.
In summary, the present invention achieves, inter alia, the following:

1. Functionality normally associated only with touch-sensitive screen technologies is made available on a projected screen image of the computer VDU display, through image modifications projected on any approximately planar surface including, but not limited to, walls, black boards and white boards. In one specific embodiment, the invention may be used, in effect, to transform any standard computer screen into a touch-sensitive screen. In this way, all Windows (or similar) applications may be controlled by touching the projected image, including, but not limited to, programs for drawing and for presentations.
2. Software applications may be written in accordance with the invention so that some specifically defined functionality of the virtual touch-sensitive screen described above is enabled using an active pointer on any quadrilateral surface area, for example a
notebook, a drawing table, a black board or a white board. The use of a projected image is not required in these embodiments. An example of such an application is that the movement of a hand-held instrument is recognised as a drawing/erasing command when touching the designated quadrilateral area of the substrate, while short touches of other designated areas outside the quadrilateral surface may be recognised as simple, commonly used commands (e.g. colour selection, eraser selection, new drawing, delete, save work, print and so on).

Claims (37)

CLAIMS

  1. An apparatus for the computerised recordal and display of instructions or data provided through the interaction of a hand-held instrument with a substrate, the apparatus comprising: a hand-held instrument comprising a contact surface for contacting the substrate and a signal generator, the signal generator operable to generate a signal when the contact surface contacts the substrate; apparatus for reading the signal generated by the signal generator; and means for translating the read signal into computer readable form, whereby said signal can be translated by a computer into instructions and/or an image as required.
  2. An apparatus for the computerised recordal and display of instructions or data provided through the interaction of a hand-held instrument with a substrate, the apparatus comprising: a hand-held instrument comprising a signal generator, means for activating the signal generator and means for providing a mark on the substrate, the signal generator operable to generate a signal when activated; apparatus for reading the signal generated by the signal generator; and means for translating the read signal into computer readable form, whereby said signal can be translated by a computer into instructions and/or an image as required.
  3. An apparatus as claimed in claim 2 wherein the mark is in the form of ink, chalk or other substance applied directly to the substrate.
  4. An apparatus as claimed in claim 2 wherein the mark is in the form of a projection or reflection of visible light directed at the substrate.
  5. An apparatus as claimed in any preceding claim wherein the substrate comprises a re-usable writing board, for example, but not limited to, a black board or a white board.
  6. An apparatus as claimed in any of claims 1 to 5 wherein the substrate comprises a projector screen.
  7. An apparatus as claimed in any preceding claim wherein the hand-held instrument is provided in the form of a pointing device and/or a writing, drawing or colouring implement.
  8. An apparatus as claimed in claim 7 wherein the hand-held instrument is in the form of a writing, drawing or colouring implement, the writing, drawing or colouring component of the instrument being removably encased in a holder.
  9. An apparatus as claimed in any preceding claim wherein the hand-held instrument incorporates a pressure sensor to which pressure is applied when a contact surface of the instrument interacts with the substrate, the pressure sensor being related to the signal generator such that application of pressure to the pressure sensor causes generation of the signal.
  10. An apparatus as claimed in claim 9 wherein the pressure sensor and signal generator are provided as a single component.
  11. An apparatus as claimed in any of claims 1 to 8 wherein the hand-held instrument incorporates a pressure sensor to which pressure is applied when a user grips the hand-held instrument.
  12. An apparatus as claimed in any preceding claim wherein the signal generator emits a signal in the form of an electromagnetic wave.
  13. An apparatus as claimed in claim 12 wherein the electromagnetic wave is in the infra-red or near infra-red region of the electromagnetic spectrum.
  14. An apparatus as claimed in claim 9 wherein the electromagnetic wave is laser light.
  15. An apparatus as claimed in any preceding claim wherein the apparatus for reading the signal is a camera.
  16. An apparatus as claimed in claim 15 wherein the camera is a video camera or a web-cam.
  17. An apparatus as claimed in claim 15 or 16 wherein the camera incorporates a filter to filter out electromagnetic interference from wavelengths other than those emitted by the signal generator.
  18. An apparatus as claimed in any preceding claim wherein the means for translating the read signal is incorporated in the computer.
  19. An apparatus as claimed in any of claims 1 to 18 wherein the means for translating the read signal is incorporated within the apparatus for reading the signal.
  20. An apparatus as claimed in any preceding claim further comprising a computer, the computer having associated therewith a Visual Display Unit (VDU).
  21. An apparatus as claimed in claim 20 wherein the computer is programmed with an algorithm for the translation of data provided by the apparatus for receiving the read signal into a facsimile of an image applied to the substrate by the hand-held instrument, the facsimile image being displayable by the computer on the VDU.
  22. An apparatus as claimed in claim 21 wherein the algorithm has as parameters a selection of reference points defining an active area on a substrate within which the image is applied by the hand-held instrument.
  23. An apparatus as claimed in claim 21 or 22 wherein the algorithm has as parameters positional data collected by freely positioning the apparatus for receiving the signal a plurality of predefined distances in various directions from the centre point of the active area of the substrate in a plane substantially parallel to the surface of the substrate.
  24. An apparatus as claimed in any of claims 20 to 23 further comprising means for projecting a computer generated image onto the substrate within the active area, thereby permitting the computer generated image to be annotated by annotation of the projected image using the hand-held instrument.
  25. An apparatus as claimed in claim 24 wherein the computer generated image incorporates an active menu from which items can be selected by touching the desired item on the projected image with the hand-held instrument.
  26. An apparatus substantially as described herein and with reference to Figures 1 to 3.
  27. An apparatus as claimed in any preceding claim wherein a pre-defined command region is defined on the substrate whereby a signal located in the command region is translated as a pre-defined command to the computer.
  28. An apparatus as claimed in any preceding claim wherein the hand-held instrument comprises a second signal generator generating a visible signal, such as a light spot, which can be used to point remotely to selected areas on a substrate.
  29. An apparatus as claimed in claim 25 wherein the first and second signals are substantially coaxial.
  30. An apparatus as claimed in claim 26 wherein the two signal generators comprise laser pointers.
  31. A hand-held instrument suitable for use in the apparatus according to any preceding claim.
  32. A method for computerised recordal and display of instructions or data provided through the interaction of a hand-held instrument with a substrate, comprising: generating an electromagnetic signal in response to interaction of the hand-held instrument with the substrate; interacting the hand-held instrument with the substrate at a plurality of pre-selected points defining an active area of the substrate; detecting the electromagnetic signal and recording the position of the pre-selected points in computer readable form; associating the pre-selected points with a display on a VDU screen and producing a facsimile of the active area; interacting the hand-held instrument within the active area of the substrate to produce a desired image; detecting the electromagnetic signal and recording the image in computer readable form; and associating the position of the desired image with the pre-selected points and producing a facsimile of the desired image on the VDU.
  33. A method as claimed in claim 32 further comprising applying an algorithm to recorded data to adjust for perspective differences between the real image on the substrate and a distorted image seen by the recording means due to non-linear positioning of the recording means in relation to the substrate.
  34. A method as claimed in claim 33 wherein the algorithm involves applying a barycentric transformation to triangles having sides defined by interpolated lines passing through any two of four reference points defining the active area on the substrate.
  35. A method as claimed in claim 34 wherein the algorithm involves: a) interpolating lines passing through each pair combination of four reference points defining an active area of the substrate to define a plurality of triangles; b) interpolating lines passing through any vertex of a triangle defined in step a) which is positioned outside the active area and any vertex of a triangle defined in step a) which is positioned in the active area, thereby defining a plurality of quadrilaterals within the active area; c) re-applying recursively step a) to each quadrilateral defined in step b), taking the corners of the quadrilaterals as the four reference points, until each quadrilateral is of a pre-selected maximum size; d) applying a barycentric transformation to all triangles within the active area defined in steps a) to c).
  36. A method as claimed in claim 35 further comprising locating the position of a signal emitted within the active area, including the steps of: e) identifying the smallest triangle from step c) within which the signal is located; f) computing the barycentric coordinates of the signal with respect to the vertices of the triangle identified in step e); g) computing the position of the signal using the barycentric coordinates computed in step f) and with reference to a corresponding perfect triangle in the actual image.
  37. A method as claimed in any of claims 32 to 36 wherein the substrate comprises a non-planar surface and is defined by a plurality of substantially planar rectangular surfaces each having four reference points, and the method includes the step of identifying which of the plurality of planar surfaces is active at a given time.
GB0210261A 2001-07-10 2002-05-07 Analysing and displaying motion of hand held instrument Withdrawn GB2377607A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0116805A GB0116805D0 (en) 2001-07-10 2001-07-10 A human-computer interface with a virtual touch sensitive screen

Publications (2)

Publication Number Publication Date
GB0210261D0 GB0210261D0 (en) 2002-06-12
GB2377607A true GB2377607A (en) 2003-01-15

Family

ID=9918217

Family Applications (2)

Application Number Title Priority Date Filing Date
GB0116805A Ceased GB0116805D0 (en) 2001-07-10 2001-07-10 A human-computer interface with a virtual touch sensitive screen
GB0210261A Withdrawn GB2377607A (en) 2001-07-10 2002-05-07 Analysing and displaying motion of hand held instrument

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GB0116805A Ceased GB0116805D0 (en) 2001-07-10 2001-07-10 A human-computer interface with a virtual touch sensitive screen

Country Status (1)

Country Link
GB (2) GB0116805D0 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2486336A1 (en) * 1980-05-23 1982-01-08 Option Sa Photo-stylus system for writing on TV screen - has microprocessor controlled TV sweep generator cooperating with stylus producing coordinate pulses which are stored in memory
GB2149542A (en) * 1983-10-25 1985-06-12 Agb Research Plc A data recording system
EP0171747A2 (en) * 1984-08-14 1986-02-19 Metaphor Computer Systems Cordless intelligent mouse
US4754268A (en) * 1984-10-13 1988-06-28 Mitsuboshi Belting Ltd. Wireless mouse apparatus
US4797665A (en) * 1985-02-06 1989-01-10 Alps Electric Co., Ltd. X-Y position input device
EP0507269A2 (en) * 1991-04-01 1992-10-07 YASHIMA ELECTRIC CO., Ltd. of ISHIHARA NOGAMI Writing device for storing handwriting

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1452902A1 (en) * 2003-02-28 2004-09-01 Hewlett-Packard Development Company, L.P. Visible pointer tracking with separately detectable pointer tracking signal
US9104270B2 (en) 2006-05-22 2015-08-11 Thomson Licensing Video system having a touch screen
DE102009007571A1 (en) * 2009-02-04 2010-08-12 Hermann Dopfer Whiteboard-presentation platform for visualization of test and object in e.g. class room, has socket body made from wood, movable overhead-carriage, and high-quality coated, white furniture plate provided as whiteboard-writing surface
DE102009007571B4 (en) * 2009-02-04 2012-03-08 Hermann Dopfer Whiteboard presentation platform with camera tripod
EP2317761A3 (en) * 2009-10-29 2013-03-06 Hitachi Consumer Electronics Co. Ltd. Presentation system and display device for use in the presentation system
EP2328020A1 (en) * 2009-11-26 2011-06-01 Samsung Electronics Co., Ltd. Presentation recording apparatus and method
US8408710B2 (en) 2009-11-26 2013-04-02 Samsung Electronics Co., Ltd. Presentation recording apparatus and method
EP2383609A1 (en) * 2010-04-30 2011-11-02 Samsung Electronics Co., Ltd. Interactive display apparatus and operating method thereof
FR2995093A1 (en) * 2012-09-05 2014-03-07 Gregory Vincent Optical pointer for pointing system for pursuing projection surface zone e.g. wall, has laser diode and infra-red diode emitting laser and infra-red light beams arranged such that light beams are directed jointly to projection surface zone
EP3410277A4 (en) * 2016-01-25 2019-07-24 Hiroyuki Ikeda Image projection device
US11513637B2 (en) 2016-01-25 2022-11-29 Hiroyuki Ikeda Image projection device
US11928291B2 (en) 2016-01-25 2024-03-12 Hiroyuki Ikeda Image projection device
CN105786224A (en) * 2016-03-29 2016-07-20 电子科技大学 Universal laser pointer and computer operation method

Also Published As

Publication number Publication date
GB0116805D0 (en) 2001-08-29
GB0210261D0 (en) 2002-06-12

Similar Documents

Publication Publication Date Title
US7015894B2 (en) Information input and output system, method, storage medium, and carrier wave
US20010030668A1 (en) Method and system for interacting with a display
RU2669717C2 (en) Handbook input / output system, digital ink sheet, information intake system and sheet supporting information input
US6429856B1 (en) Coordinate position inputting/detecting device, a method for inputting/detecting the coordinate position, and a display board system
US6421042B1 (en) Coordinate position inputting/detecting device, a method for inputting/detecting the coordinate position, and a display board system
US9128537B2 (en) Bimanual interactions on digital paper using a pen and a spatially-aware mobile projector
JP4627781B2 (en) Coordinate input / detection device and electronic blackboard system
US20140210799A1 (en) Interactive Display System and Method
CN108431729A (en) To increase the three dimensional object tracking of display area
WO2000006395A1 (en) Electronic blackboard system
US6466197B1 (en) Method and apparatus for driving pointing device of computer system
JP2004062658A (en) Coordinate input device, control method for the same, and program
KR101176104B1 (en) Interactive electric board system and management method using optical sensing pen
TW550512B (en) System and method for acquiring a graphical representation of one or more intersection points between a straight line and a surface
CN113672099A (en) Electronic equipment and interaction method thereof
GB2377607A (en) Analysing and displaying motion of hand held instrument
JP2001067183A (en) Coordinate input/detection device and electronic blackboard system
JP4266076B2 (en) Electronic blackboard system
JP2003330612A (en) Information input/output system, program and storage medium
JP2002268809A (en) Information input-output system, information controlling method, program and recording medium
US20210031014A1 (en) Touch and Virtual Application for Art, Designs and Tattoos
US20040075641A1 (en) Input device and methods of use within a computing system
CN104281301A (en) Input method and electronic equipment
US20060072009A1 (en) Flexible interaction-based computer interfacing using visible artifacts
JP4603183B2 (en) Information input / output system, display control method, storage medium, and program

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)