GB2481606A - Fine movement of an object displayed on an interactive surface having a display resolution greater than its detection resolution - Google Patents


Info

Publication number
GB2481606A
Authority
GB
United Kingdom
Prior art keywords
displayed
detected
pixel
input
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1010941.1A
Other versions
GB201010941D0 (en)
GB2481606B (en)
Inventor
Andrew Oakley
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Promethean Ltd
Original Assignee
Promethean Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Promethean Ltd filed Critical Promethean Ltd
Priority to GB1010941.1A priority Critical patent/GB2481606B/en
Publication of GB201010941D0 publication Critical patent/GB201010941D0/en
Priority to EP11171864.9A priority patent/EP2402847A3/en
Priority to US13/172,082 priority patent/US9367228B2/en
Publication of GB2481606A publication Critical patent/GB2481606A/en
Application granted granted Critical
Publication of GB2481606B publication Critical patent/GB2481606B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Abstract

A method of controlling the positioning of an object displayed on an interactive display surface (e.g. an interactive whiteboard), in which the interactive display surface's display resolution is greater than its detection resolution. The method includes determining selection of a displayed object on the interactive display surface, detecting movement of an input at the interactive display surface associated with the selected displayed object, and, in an object transformation mode, moving at least one displayed pixel of the selected displayed object by a distance which is less than the distance of the detected movement of the detected input. Fine movement of a displayed object may thus be achieved. The mode of movement (coarse or fine) may be determined based on the size of the touch contact, the position of the interaction relative to the selected object, or its speed of movement.

Description

FINE OBJECT POSITIONING
BACKGROUND TO THE INVENTION:
Field of the Invention:
The present invention relates to the control of the transformation of the display of a graphical object displayed on the surface of an interactive display, which surface is adapted to detect inputs at the surface thereof to control the graphical object.
Description of the Related Art:
Several technologies exist that adapt a display screen or display surface associated with a computer to be interactive.
An interactive display screen or surface is adapted to allow detection of inputs at the surface thereof, and is often referred to simply as an interactive surface. The detected inputs at the interactive surface can be used to manipulate objects displayed on the surface, annotate the display surface, and generally control software applications running on the computer in a similar way to a conventional computer mouse.
Example known interactive surfaces are adapted to detect touch inputs on the surface, or detect the presence of specially adapted input devices, such as pen-type devices.
In general, interactive surfaces adapted for use with pens advantageously have a greater input detection resolution than touch input surfaces.
However, in some scenarios a touch sensitive interactive surface may be a preferable implementation. Touch sensitive interactive surfaces tend to be used for providing solutions in public kiosk implementations, where special pens are not available or not desirable. The relative cost of touch and pen implementations is such that cost factors may dictate a touch solution for some implementations. Further, in some applications, a touch solution may have particular advantages.
An example of such an application is the use of an electronic whiteboard system incorporating an interactive surface in an educational environment for young children, where having to control a pen to use the whiteboard on a vertical surface can be problematic for users. A touch surface may also be more suited to an environment in which multiple users provide inputs at the interactive surface at the same time.
Thus sometimes a trade-off must be made in terms of pen or touch positioning accuracy (which may be best provided by a pen-type interactive surface) against cost, speed of response, or capability to cope with many simultaneous touches (which may be best provided by a touch-type interactive surface). Often all of these factors are interrelated.
This may lead to a situation where in a given implementation the image pixel resolution of the displayed image is greater than the resolving resolution of the touch sensing technology, i.e. the display resolution is greater than the detection resolution.
For the purposes of basic control and annotation, this disparity between resolutions is rarely a problem. However, for situations where objects must be moved and located with precision, a problem arises. An example of such a scenario is the use of a touch sensitive interactive whiteboard in an educational environment. Consider an example implementation in a mathematics application where it is required for a user to control the position of an on-screen virtual protractor to make a precise measurement. Without pixel-accurate movement control, which requires the detection resolution to be compatible with the display resolution, this may not be possible.
It is an object of the invention to provide an improved arrangement to address one or more of the above-stated problems, and provide for finer control of the transformation of a displayed object in a system where the image pixel resolution of a displayed image is greater than the resolving resolution of the input sensing technology.
SUMMARY OF THE INVENTION:
In accordance with the invention there is provided a method of controlling the positioning of an object displayed on an interactive display surface, in which interactive display surface the display resolution is greater than the detection resolution, the method comprising: determining selection of a displayed object on the interactive display surface; detecting movement of an input at the interactive display surface associated with the selected displayed object; and in an object transformation mode, moving at least one displayed pixel of the selected displayed object by a distance which is less than the distance of the detected movement of the detected input.
The step of moving at least one displayed pixel of the selected displayed object may comprise moving the at least one displayed pixel from an initial display pixel position to an adjacent display pixel position responsive to movement of the detected input from an initial detection position to an adjacent detection position.
The at least one displayed pixel of the selected displayed object may be moved from an initial display pixel position to a further display pixel position which is most closely located to a reduced scale distance of a distance of the detected input from its initial detected position to its further detected position.
The method may further comprise: in a further object transformation mode, moving at least one displayed pixel of the selected displayed object by a distance corresponding to a distance of the detected movement of the detected input; and selecting between the object transformation mode and the further object transformation mode.
The step of moving at least one displayed pixel of the selected displayed object may comprise moving the at least one displayed pixel from an initial display pixel position to a further display pixel position in response to movement of the detected input from an initial detection position to a further detection position, the further display pixel position being the display pixel position which is most closely located to the position of the further detection position.
In the further object transformation mode each input movement detected between detection positions may cause at least one pixel of the displayed object to move by a plurality of pixel positions, and wherein in the object transformation mode each input movement detected between detection positions causes at least one pixel of the displayed object to move by a single pixel position.
In the further object transformation mode the at least one displayed pixel may be movable by a distance which corresponds to the distance by which the detected input is moved.
In the object transformation mode the at least one displayed pixel may be movable by a distance which is a reduced scale distance of the distance by which the detected input is moved.
The method may further comprise selecting between the modes in dependence on a characteristic of the detected input.
The characteristic of the detected input may be one or more of: the contact surface area of the detected input; the number of contact points associated with the detected input; the position of a contact point of the detected input relative to the selected object; or the speed of movement of the detected input.
A computer program may be adapted to perform any stated method.
A computer program product may be adapted to store computer code which, when run on a computer, performs any stated method.
The invention also provides an interactive display system including an interactive display surface for detecting one or more inputs at said surface, and for controlling the display of an object on the interactive display surface in dependence on detected movement of an input at the surface whilst the input is selecting the object, the system comprising: a displayed object transformation controller, for transforming the display of at least one displayed pixel of the selected object, in dependence upon a detected movement of the input at the interactive surface, the displayed object transformation controller being adapted, in a first mode of operation, to move at least one displayed pixel of the displayed object by a distance which is less than the detected distance moved by the input means.
The displayed object transformation controller may be adapted to move the at least one displayed pixel from an initial display pixel position to an adjacent display pixel position responsive to movement of the detected input from an initial detection position to an adjacent detection position.
The displayed object transformation controller may be adapted to move the at least one displayed pixel from an initial display pixel position to a further display pixel position which most closely matches the position of a reduced scale distance of a distance of the detected input from its initial position, in a direction determined by the detected input direction.
The displayed object transformation controller may be configured, in a further object transformation mode, to move at least one displayed pixel of the selected displayed object by a distance corresponding to a distance of the detected movement of the detected input; and is further configured to select between the object transformation mode and the further object transformation mode.
The displayed object transformation controller may be further configured to move the at least one displayed pixel to a displayed pixel position responsive to movement of the detected input from an initial detection position to a further detection position, the further display pixel position being the display pixel position which most closely matches the position of the further detection position.
The displayed object transformation controller may be further configured, in the further object transformation mode, such that at least one pixel of the selected displayed object is moved from a starting position to a finishing position which most closely matches the detected finishing position of the detected input.
The displayed object transformation controller may be further configured such that in the further object transformation mode each input movement detected between detection positions causes at least one pixel of the displayed object to move by a plurality of pixel positions, and wherein in the object transformation mode each input movement detected between detection positions causes at least one pixel of the displayed object to move by a single pixel position.
The invention provides in one aspect a method in an interactive display system, in which inputs are detected at an interactive surface for controlling the display of an object displayed on the interactive surface, in which system the display resolution is greater than the detection resolution, wherein in a first mode of operation a detected movement of an input from one detection position to an adjacent detection position in a particular direction results in the movement of a displayed pixel of the object from one display position to an adjacent display position in a direction determined by the direction of the detected movement of the input.
In this way, the displayed pixel may move by an actual distance which is a scaled-down version of the actual distance moved by the detected input. The movement is preferably in the same direction.
A detection position may comprise a position at which an input can be detected on the interactive surface. A display position may comprise a position at which a pixel of an image can be displayed.
In a second mode of operation a detected movement of an input from one detection position to an adjacent detection position in a particular direction may result in the movement of a displayed pixel of the object from one display position to a non-adjacent display position in a direction determined by the direction of movement of the input. Thus, the detected movement of an input from an initial detection position to an adjacent detection position in a particular direction may result in the movement of a displayed pixel of the object from an initial display position to a display position which is a plurality of display positions away from the initial display position.
In this way, the displayed pixel may move by an actual distance and direction which corresponds to the actual distance and direction moved by the detected input.
The invention further provides in this aspect an interactive display system, adapted to detect inputs at an interactive surface for controlling the display of an object displayed on the interactive surface, in which system the display resolution is greater than the detection resolution, the system being configured such that in a first mode of operation a detected movement of an input from one detection position to an adjacent detection position in a particular direction results in the movement of a displayed pixel of the object from one display position to an adjacent display position in a direction determined by the direction of the detected movement of the input.
The system may be further configured such that in a second mode of operation a detected movement of an input from one detection position to an adjacent detection position may result in the movement of a displayed pixel of the object from one display position to a non-adjacent display position.
BRIEF DESCRIPTION OF THE FIGURES:
The invention will now be described by way of example with reference to the accompanying figures, in which: Figure 1 illustrates the main elements of a typical known example interactive display system; Figures 2(a) and 2(b) illustrate a prior art problem addressed by the invention; Figure 3 illustrates a preferred technique for implementation of the invention; and Figure 4 illustrates the architecture of a computer system incorporating the invention.
DESCRIPTION OF THE PREFERRED EMBODIMENTS:
The invention is now described by way of reference to various examples, embodiments, and advantageous applications.
One skilled in the art will appreciate that the invention is not limited to the details of any described example, embodiment or detail. In particular the invention is described with reference to an exemplary interactive display system. One skilled in the art will appreciate that the principles of the invention are not limited to such a described system.
A typical example of a system incorporating an interactive surface is an electronic whiteboard system. An electronic whiteboard system typically is adapted to sense the position of a pointing device or pointer relative to a work surface (the display surface) of the whiteboard, the work surface being an interactive surface. When an image is displayed on the work surface of the whiteboard, and its position calibrated, the pointer can be used in the same way as a computer mouse to manipulate objects on the display by moving a pointer over the surface of the whiteboard.
A typical application of an interactive whiteboard system is in a teaching environment. The use of interactive whiteboard systems improves teaching productivity and also improves student comprehension. Such whiteboards also allow use to be made of good quality digital teaching materials, and allow data to be manipulated and presented using audio visual technologies.
A typical construction of an electronic whiteboard system comprises an interactive display forming the electronic whiteboard, a projector for projecting images onto the display, and a computer system in communication with the electronic whiteboard for generating the images for projection, running software applications associated with such images, and for processing data received from the display associated with pointer activity, such as the location of the pointer on the display surface. In this way the computer system can control the generation of images to take into account the detected movement of the pointer on the interactive surface.
In known systems the pointer can be provided as a specially adapted pen, to interact with an interactive surface which is provided with appropriate input detection technology.
The pen and the interactive surface may interact by electromagnetic means. Other techniques for detecting a pointer position on an interactive surface are known, such as the use of camera technology. The present invention is not limited in its applicability to any particular input detection technology incorporated in an interactive surface. However as set out in the background section above particular problems manifest themselves in a touch input detection technology, and therefore a main embodiment of the invention is described in the context of an electronic whiteboard including a touch sensitive interactive surface.
With reference to Figure 1, an exemplary interactive display system 100 comprises: a whiteboard assembly arrangement generally designated by reference numeral 106, and including an interactive surface 102; a projector 108, and a computer system 114. The projector 108 is attached to a fixed arm or boom 110, which extends in a direction perpendicular from the surface of the whiteboard 106. One end of the boom supports the projector 108 in a position in front of the display surface 102, and the other end of the boom 110 is fixed to the whiteboard 106 or near the whiteboard, such as a frame associated with the whiteboard 106, or a wall on which the whiteboard 106 is mounted.
Although the example interactive display system is illustratively a projection system, the invention is not limited to the use of interactive surfaces in a projection system. The interactive surface may, for example, be configured as an emissive surface.
The computer 114 controls the interactive display system. A computer display 116 is associated with the computer 114. The computer 114 is additionally provided with a keyboard input device 118 and a mouse input device 120. The computer 114 is connected to the whiteboard 106 by communication line 122 to receive input data from the display surface 102, and is connected to the projector 108 by a communication link 112 in order to provide display images to the projector 108. Although in Figure 1 these are shown as wired connections, they also may be implemented as wireless connections.
Shown in Figure 1 is a pointing device 104, which is used to provide inputs at the interactive display surface 102.
As noted above, in the main embodiment described herein, the interactive surface is a touch sensitive surface, and the pointing device 104 is illustrated as a finger/hand. The invention is not limited in its applicability to any particular touch detection technology.
As is known in the art, the computer 114 controls the interactive display system to project images via the projector 108 onto the interactive surface 102, which can be referred to as an interactive display surface. The position of the pointing device 104 is detected by the interactive display surface 102, data is returned to the computer 114, and location information is determined by the computer 114. The pointing device 104 operates in the same way as a mouse to control the displayed images.
A problem which the present invention addresses can be understood with respect to Figures 2 (a) and 2 (b).
Figure 2(a) illustrates an array of pixels generally designated by reference numeral 202. As can be seen, there is a regular, fixed spacing between each pixel in the horizontal and vertical directions. The number of pixels per unit area defines the display resolution. As illustrated in Figure 2(a), the spacing of pixels in both the vertical and horizontal directions (the x and y axes) is a dimension x. In alternative arrangements, the spacing in the vertical direction may be different to that of the horizontal direction. In general, the smaller the pixel spacing (i.e. the smaller the value of x), the greater the display resolution.
In this description, the term display resolution is used to refer to the resolution of displayed images, being the number of display pixels per unit area. This may include a fixed spacing in both the vertical and horizontal directions as shown, or a first fixed spacing in the horizontal direction and a second fixed spacing in the vertical direction.
Figure 2(b) illustrates an array of detection points generally designated by reference numeral 210. As can be seen, there is also a regular, fixed spacing between each detection point in the horizontal and vertical directions. The number of detection points per unit area defines the detection resolution. As illustrated in Figure 2(b), the spacing of detection points in both the vertical and horizontal directions (the x and y axes) is a dimension y. In alternative arrangements, the spacing in the vertical direction may be different to that of the horizontal direction. In general, the larger the detection point spacing (i.e. the larger the value of y), the lower the detection resolution.
In this description, the term detection resolution is used to refer to the resolution by which inputs can be detected at the interactive surface, being the number of detection points per unit area on the interactive surface.
This may refer to a fixed spacing in both the vertical and horizontal directions as shown, or a first fixed spacing in the horizontal direction and a second fixed spacing in the vertical direction.
Although not drawn to scale, the arrays of Figures 2(a) and 2(b) are drawn to illustrate a difference in resolution therebetween. It can be seen that the spacing x of the pixels in Figure 2(a) is smaller than the spacing y of the detection points in Figure 2(b), such that there are more display pixels per unit area than detection points. Thus the display resolution is greater than the detection resolution. This gives rise to a problem, in that the display resolution being greater than the detection resolution gives rise to difficulty in mapping the detected position of an input to a specific display pixel. For example, if an input is detected at detection position 208 of the array 210 of Figure 2(b), there are a number of potential display pixels with which the position could be associated. Therefore the detected input position cannot be precisely mapped to a display pixel.
If the detection resolution were the same as the display resolution, then each detected input could be accurately mapped to a display pixel.
The invention addresses this problem by providing a method for controlling the transformation, under the control of a moving input at the interactive surface, of the graphical representation of a graphical object displayed on the interactive display surface, such that responsive to a movement of a detected input for a selected object, the display of the displayed object is transformed. The transformation is movement of at least one display pixel of the displayed object by a distance corresponding to at least one display pixel position, the actual distance moved being less than the actual distance by which the interactive surface detects the input to move.
The transformation of the displayed object is a movement.
The movement may be a movement of the entire displayed object, or a movement of one or more display pixels of the displayed object. The movement may be to move a displayed object to a new location; to rotate a displayed object; to resize a displayed object; or in 3-dimensions to adjust the depth of the displayed object. All these movements require movement of at least one display pixel of a displayed object, but not necessarily all display pixels of the displayed object.
Thus, the detected actual distance of movement of the input is scaled, and the display of the object associated with the detected input transformed by the actual scaled distance.
As such, the ratio of detected actual input distance to transformed actual object distance is 1:n, where n is less than one. For every detected unit of distance moved by the input, the displayed object is therefore transformed by a lesser scaled distance. A detected unit of distance may be the distance between adjacent detection points in Figure 2(b); i.e. the input detection resolution.
In a preferred embodiment, the scaling may be set to translate or map the input detection resolution to the display resolution, such that responsive to detection of movement from one detection point to an adjacent detection point, an object (or a point of an object) is moved from one displayed pixel to an adjacent displayed pixel. Thus if the display resolution is twice the detection resolution (i.e. the detection points are separated by a distance y which is twice the distance by which the display pixels are separated, such that y=2x), then the ratio of detected actual input distance to transformed actual object distance is 1:0.5.
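By way of illustration, the following is a minimal sketch (in Python) of this scaled fine movement, assuming input positions are reported in detection-grid units and the object position is tracked in display pixels; the function name, parameter names and default values are illustrative assumptions, not taken from the patent.

def fine_move(obj_pos_px, prev_input, new_input,
              detect_pitch_px=2.0, scale=0.5):
    """Move the object by a scaled-down version of the detected movement.

    detect_pitch_px: spacing of detection points measured in display
        pixels (2.0 corresponds to the y = 2x example above).
    scale: the n in the 1:n ratio of input distance to object distance,
        with n < 1; 0.5 gives the 1:0.5 mapping, so one detection step
        moves the object exactly one display pixel.
    """
    dx = (new_input[0] - prev_input[0]) * detect_pitch_px * scale
    dy = (new_input[1] - prev_input[1]) * detect_pitch_px * scale
    # Track the object position as floats; snap to the nearest display
    # pixel only when rendering, e.g. (round(x), round(y)).
    return (obj_pos_px[0] + dx, obj_pos_px[1] + dy)

# One detection step to the right moves the object one display pixel:
assert fine_move((100.0, 100.0), (40, 50), (41, 50)) == (101.0, 100.0)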
The implementation of any such scaling may be selective.
The scaling may be provided as a first mode of operation. In a second mode of operation, a different scaling may be applied to the detected distance by which an input is detected to move, to transform the display of a displayed object.
Alternatively, in a second or further mode of operation, no scaling may be applied to the detected distance by which an input is detected to move, to transform the display of a displayed object.
In a preferred implementation, in the first mode of operation the transformation of the display of the object may be appropriately scaled to be less than the distance by which the associated detected input is moved; and in a second mode of operation the transformation of the display of the object may correspond to the distance by which the detected input is moved.
The direction of movement of the display of the object is determined by the direction of movement of the input.
Preferably the movement of the display of the object is in the same direction as the detected movement of the input.
In a first preferred mode of operation the display of the object is preferably transformed by a distance of one display pixel in response to a detected movement of the input which corresponds to a distance of more than one display pixel spacing. In the preferred embodiment, this mode of operation is enabled only when the display resolution is greater than the detection resolution. The display of the object is preferably transformed by a distance of one display pixel in response to a detected movement of the input which corresponds to one detection spacing. That is, the display of the object is preferably transformed by a distance of x in response to a detected movement of the input of a distance y.
This first preferred mode of operation can be considered a first displayed object transformation mode, involving moving the displayed position of at least one display pixel of the selected displayed object by one or more units of actual distance which is less than the actual distance of the detected movement of the detected input. As noted above, the displayed position of a selected object may be moved by one display pixel in response to detection of the movement of the pointing device by one detection position. As the display resolution is greater than the detection resolution, this results in the actual distance moved by at least one display pixel of the displayed object being less than the actual distance moved by the pointing device.
Alternatively, in this displayed object transformation mode at least one display pixel of the selected displayed object may be moved from a starting position to a finishing position which most closely matches a reduced scale version of a distance the detected input moves from its starting point, in the detected direction.
In a preferred second mode of operation the object is preferably movable by a distance of multiple display pixels, in response to a detected movement of the input which corresponds to the same distance of multiple display pixels spacing.
This second mode of operation can be considered a second displayed object transformation mode, involving moving the displayed position of at least one pixel of the selected displayed object by more than one unit of pixel spacing for each unit of detection resolution spacing of the actual detected movement of the detected input. Thus, for example, if the display resolution is n times the detection resolution, for each detected movement of the pointing device between adjacent detection points, the displayed object is moved by n pixels.
Alternatively, in the second displayed object transformation mode at least one pixel of the selected displayed object is moved from a starting position to a finishing position which most closely matches the actual detected finishing position of the detected input.
In general, if in an embodiment the ratio of the display resolution to the detection resolution is n:m, in the first displayed object transformation mode each m units of input movement detected may cause the displayed object to move by m pixels (one pixel per detection unit), and in the second displayed object transformation mode each m units of input movement detected may cause the displayed object to move by n pixels. For example, with a ratio of 3:2, two detection steps move the object two display pixels in the first mode, but three display pixels in the second mode.
In a preferred implementation, the system is adapted to switch between the preferred first and second modes of operation. Such switching may be in accordance with an automatic arrangement, or by manual selection.
In a preferred arrangement, an automated switching is provided. Examples of techniques for providing for the switching between modes are set out further hereinbelow.
With reference to Figure 3, there is illustrated an exemplary process which may be implemented into computer systems associated with an interactive display, in accordance with a preferred embodiment of the invention.
In a step 302 input data detected at the interactive surface is received. In a step 304, a determination is made as to the mode of operation to be entered. In general, a "fine" mode of operation may be entered, or a "coarse" mode of operation may be entered. In the fine mode of operation, the transformation of a display pixel of a displayed object, responsive to detection of movement of an input associated with that object, is controlled to be achieved in a fine, or precise, way. This corresponds to the first mode of operation described above, where the detected movement is scaled. In the coarse mode of operation the transformation of a display pixel of a displayed object, responsive to detection of movement associated with an input selecting that object, is achieved in a coarse, or imprecise, way. This corresponds to the second mode of operation described above, where the detected movement is not scaled, but mapped.
In dependence on the mode of operation determined in step 304, the fine mode of operation may be entered in step 306.
In step 306, the detected movement of the input is scaled, and the scaled movement is then mapped to the closest display pixel to the determined location. In the coarse mode of operation, in step 308, the detected movement of the input is mapped to the closest display pixel, without any scaling.

The automated switching preferably provides for selecting between the first and second modes of operation, in step 304, in dependence on a characteristic of the detected input.
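The following is a hedged sketch of this two-branch flow (steps 302 to 308); the select_mode argument stands in for the mode-selection test, and all names and constant values are illustrative assumptions rather than taken from the patent.

FINE_SCALE = 0.5          # illustrative 1:0.5 fine-mode scaling
DETECT_PITCH_PX = 2.0     # detection-point spacing, in display pixels

def handle_input(obj_pos_px, prev_pt, new_pt, select_mode):
    # Step 302: input data received (coordinates in detection-grid units).
    dx = (new_pt[0] - prev_pt[0]) * DETECT_PITCH_PX
    dy = (new_pt[1] - prev_pt[1]) * DETECT_PITCH_PX
    # Step 304: determine the mode from a characteristic of the input.
    if select_mode(new_pt) == "fine":
        # Step 306: scale the detected movement, then map to the
        # closest display pixel.
        x = obj_pos_px[0] + dx * FINE_SCALE
        y = obj_pos_px[1] + dy * FINE_SCALE
    else:
        # Step 308: map the unscaled movement to the closest display pixel.
        x = obj_pos_px[0] + dx
        y = obj_pos_px[1] + dy
    return (round(x), round(y))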
Example techniques for determining the selection are described below.
In one arrangement, the characteristic of the detected input is the contact surface area of the detected input. Such an arrangement may rely upon the detected contact surface area being indicative of how accurately the position of the detected input is to be resolved. For example, a selection technique may implement the first mode of operation if the width of the detected input is determined to be wider than the space between two pixels.
This technique can be considered a generalisation of an example described below, where a distinction is made between a single finger and multiple fingers or a flat hand.
In this arrangement, where the invention is implemented in an electronic whiteboard system adapted to detect touch inputs, the contact surface area of the detected input can be used to indicate which type of input is being used. A threshold value may be set, and if the detected contact area is determined to be less than the threshold, the input may be assumed to require fine positioning and the first mode of operation is enabled. If the detected contact area is determined to be greater than the threshold, the input may be assumed to require coarse positioning, and the second mode of operation is enabled.
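A minimal sketch of such a threshold test follows; the threshold value and its units are illustrative assumptions only, since the patent does not specify them.

AREA_THRESHOLD_MM2 = 150.0  # assumed boundary between fingertip and flat hand

def select_mode_by_area(contact_area_mm2):
    # A small contact area suggests a deliberate, precise input.
    return "fine" if contact_area_mm2 < AREA_THRESHOLD_MM2 else "coarse"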
In another example utilising this arrangement, the switching may be determined in dependence upon distinguishing between types of touch. For example the touch input detection technology is adapted to assess the area of touch, using techniques well-known in the art, to distinguish between a flat hand and one finger making contact with the interactive surface within the object to be moved.
If a flat hand is detected, then the selected object is determined to be moved in a coarse manner about the surface corresponding to the raw resolution of the sensing technology.
As such, each incremental step of touch resolution results in the object being moved to the nearest representative pixel of the display technology. For example, if the touch sensor resolution is half the display resolution, each step of touch movement causes the object to jump two display pixels at a time.
If a finger is detected, then the selected object is determined to be moved in a fine manner about the surface corresponding to a higher resolution than the resolution of the sensing technology. The sensor will detect the smaller surface area of contact from the finger in comparison to the flat hand, and switch to a fine positioning mode of operation.
The user will then need to move their finger by a distance further than they want the object to move, in order to achieve the desired movement of the object. Thus a positioning scaling factor is introduced. In an example, the user is required to move their finger two units of actual distance to move the object one unit of actual distance. Thus a 1:2 scaling is provided: for every two detected units of distance moved by the finger, the object is moved by one display pixel. To make the system even easier to control, the scaling could be made 1:3 or 1:4, requiring 3 or 4 position sensing increments to move the object one display pixel.
This example could be extended to include the possibility that the scaling ratio is dictated by the number of fingers: e.g. two fingers gives a scaling ratio of 1:2; three fingers gives a scaling ratio of 1:3.
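A one-line sketch of this finger-count rule might look as follows, where the mapping of finger count to ratio is the 1:2 / 1:3 scheme suggested above and the function name and single-finger default are assumptions.

def scale_for_fingers(num_fingers):
    # 1:k scaling: the object moves one unit for every k units of
    # detected finger movement (0.5 for two fingers, 1/3 for three;
    # one finger is assumed here to give unscaled 1:1 movement).
    return 1.0 / max(num_fingers, 1)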
In another arrangement, the characteristic of the detected input applied in step 304 is the number of contact points associated with the detected input.
An example in accordance with this arrangement is the selection of the mode of operation in accordance with the number of fingers detected for the object selection. The detection of one finger may select the first mode of operation, causing fine positioning to occur. The detection of two fingers may select the second mode of operation, causing coarse positioning to occur.
In yet another arrangement the characteristic of the detected input applied in step 304 is the position of a contact point of the detected input relative to the selected object.
An example in accordance with this mode of operation is that if the detection of the input is in the centre of the object, the second mode of operation is enabled and coarse positioning takes place. If the detection of the input is approximately half way to the edge of the object, the first mode of operation is enabled and fine positioning takes place.
For example, where the displayed object is a protractor, touching the object in the middle allows the protractor to be generally moved around the display. Touching the protractor at the edge allows the protractor object to be precisely moved to, e.g., measure an angle.
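A minimal sketch of such a position test follows, modelling the object as a circle (appropriate for the protractor example); the half-radius boundary and all names are illustrative assumptions.

import math

def select_mode_by_position(touch_xy, centre_xy, radius):
    # Distance of the touch from the object's centre, in display pixels.
    d = math.hypot(touch_xy[0] - centre_xy[0], touch_xy[1] - centre_xy[1])
    # Inner half of the object -> coarse movement; outer half -> fine.
    return "coarse" if d < 0.5 * radius else "fine"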
A yet further example of a switching control mechanism is detection of the speed at which the detected input is moving.
If the detected input is moving slowly (below a predetermined speed threshold), then the first mode of operation is enabled, and the detected input movement is scaled. If the detected input is moving quickly (above a predetermined speed threshold), then the second mode of operation is enabled, and the detected input movement is not scaled.
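A minimal sketch of such a speed test follows; the threshold value and its units are illustrative assumptions.

SPEED_THRESHOLD = 50.0  # detection units per second (assumed value)

def select_mode_by_speed(distance_units, dt_seconds):
    # Guard against a zero time delta between successive samples.
    speed = distance_units / dt_seconds if dt_seconds > 0 else float("inf")
    return "fine" if speed < SPEED_THRESHOLD else "coarse"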
Overall, the selection technique comprises: determining selection of a displayed object on the interactive display surface; detecting movement of an input at the interactive display surface associated with the selected displayed object; and determining a displayed object transformation mode in dependence on a characteristic of the detected input.
In an embodiment, the system may be adapted such that when a displayed object is selected by the pointer, a display window "pops up" on the display, within which the pointing device can be positioned and moved, to control the movement of the displayed object in a fine positioning mode. Thus the object is first selected, and then the movement to be applied to the object is entered in the pop-up box. The direction of the movement is made in dependence upon the direction of movement of the pointer within the pop-up box, and the appropriate scaling is applied to the detected movement to move the displayed object.
With reference to Figure 4, there are illustrated the main functional elements of a computer system for controlling an interactive surface in accordance with embodiments of the invention. The computer system 400 comprises a processor 404, a memory 408, a graphics driver 406, and an interactive whiteboard driver 402. The processor 404, memory 408, graphics driver 406 and interactive whiteboard driver 402 are interconnected by a communications bus 410. Data from the interactive surface, representing inputs detected at its surface, is delivered to the interactive whiteboard driver 402, and processed by the processor 404. The processor 404 uses the memory 408 to store data as appropriate in order to achieve the necessary processing. The data processed by the processor 404 results in updated data being delivered to the graphics driver 406, including updated positioning information for displayed objects. The functionality of the present invention is implemented and controlled by the processor 404.
The methods described herein may be implemented in computer software running on a computer system. The invention may therefore be embodied as computer program code executed under the control of a processor of a computer system. The computer program code may be stored on a computer program product. A computer program product may be included in a computer memory, a portable disk or portable storage memory, or hard disk memory.
The invention is described herein in the context of its application to a computer system forming part of an interactive display system. It will be understood by one skilled in the art that the principles of the invention, and embodiments described herein, are not limited to any specific interactive display system. The principles of the invention and its embodiments may be implemented in any interactive display system. The invention and its embodiments are not limited to the use of a pointer or pointing device in combination with an interactive display system, and the invention and its embodiments equally apply to arrangements in which a touch-sensitive surface arrangement is provided for the interactive display, or any other type of interactive surface is provided, such as one utilising camera technology.
Although the invention is described herein by way of reference to a main embodiment comprising an electronic whiteboard in an educational environment, one skilled in the art will appreciate that the invention is limited neither to an electronic whiteboard nor to educational applications. The invention is applicable to any interactive surface, such as for example an interactive surface provided as integral to a computing device, such as a so-called slate device, or interactive surfaces configured as table-top devices. It will also be understood by one skilled in the art that the invention and its embodiments are not limited to application in an educational environment. One skilled in the art will envisage other possible applications.
The invention has been described herein by way of reference to particular examples and exemplary embodiments.
One skilled in the art will appreciate that the invention is not limited to the details of the specific examples and exemplary embodiments set forth. Numerous other embodiments may be envisaged without departing from the scope of the invention, which is defined by the appended claims.

Claims (19)

1. A method of controlling the positioning of an object displayed on an interactive display surface, in which interactive display surface the display resolution is greater than the detection resolution, the method comprising: determining selection of a displayed object on the interactive display surface; detecting movement of an input at the interactive display surface associated with the selected displayed object; and in an object transformation mode, moving at least one displayed pixel of the selected displayed object by a distance which is less than the distance of the detected movement of the detected input.
2. The method of claim 1 wherein the step of moving at least one displayed pixel of the selected displayed object comprises moving the at least one displayed pixel from an initial display pixel position to an adjacent display pixel position responsive to movement of the detected input from an initial detection position to an adjacent detection position.
3. The method of claim 1 or claim 2 wherein the at least one displayed pixel of the selected displayed object is moved from an initial display pixel position to a further display pixel position which is most closely located to a reduced scale distance of a distance of the detected input from its initial detected position to its further detected position.
4. The method of any one of claims 1 to 3 further comprising: in a further object transformation mode, moving at least one displayed pixel of the selected displayed object by a distance corresponding to a distance of the detected movement of the detected input; and selecting between the object transformation mode and the further object transformation mode.
5. The method of claim 4 wherein the step of moving at least one displayed pixel of the selected displayed object comprises moving the at least one displayed pixel from an initial display pixel position to a further display pixel position in response to movement of the detected input from an initial detection position to a further detection position, the further display pixel position being the display pixel position which is most closely located to the position of the further detection position.
6. The method of claim 4 or claim 5 wherein in the further object transformation mode each input movement detected between detection positions causes at least one pixel of the displayed object to move by a plurality of pixel positions, and wherein in the object transformation mode each input movement detected between detection positions causes at least one pixel of the displayed object to move by a single pixel position.
7. The method of any one of claims 4 to 6 wherein in the further object transformation mode the at least one displayed pixel is movable by a distance which corresponds to the distance by which the detected input is moved.
8. The method of any preceding claim wherein in the object transformation mode the at least one displayed pixel is movable by a distance which is a reduced scale distance of the distance by which the detected input is moved.
9. The method of any one of claims 4 to 8 further comprising selecting between the modes in dependence on a characteristic of the detected input.
10. The method of claim 9 wherein the characteristic of the detected input is one or more of: the contact surface area of the detected input; the number of contact points associated with the detected input; the position of a contact point of the detected input relative to the selected object; or the speed of movement of the detected input.
11. A computer program adapted to perform the method of any one of claims 1 to 10.
12. A computer program product adapted to store computer code which, when run on a computer, performs the method of any one of claims 1 to 10.
13. An interactive display system including an interactive display surface for detecting one or more inputs at said surface, and for controlling the display of an object on the interactive display surface in dependence on detected movement of an input at the surface whilst the input is selecting the object, the system comprising: a displayed object transformation controller, for transforming the display of at least one displayed pixel of the selected object, in dependence upon a detected movement of the input at the interactive surface, the displayed object transformation controller being adapted, in a first mode of operation, to move at least one displayed pixel of the displayed object by a distance which is less than the detected distance moved by the input means.
14. The system of claim 13 wherein the displayed object transformation controller is adapted to move the at least one displayed pixel from an initial display pixel position to an adjacent display pixel position responsive to movement of the detected input from an initial detection position to an adjacent detection position.
15. The system of claim 13 or claim 14 wherein the displayed object transformation controller is adapted to move the at least one displayed pixel from an initial display pixel position to a further display pixel position which most closely matches the position of a reduced scale distance of a distance of the detected input from its initial position, in a direction determined by the detected input direction.
16. The system of any one of claims 13 to 15 wherein the displayed object transformation controller is configured, in a further object transformation mode, to move at least one displayed pixel of the selected displayed object by a distance corresponding to a distance of the detected movement of the detected input; and is further configured to select between the object transformation mode and the further object transformation mode.
17. The system of claim 16 wherein the displayed object transformation controller is further configured to move the at least one displayed pixel to a displayed pixel position responsive to movement of the detected input from an initial detection position to a further detection position, the further display pixel position being the display pixel position which most closely matches the position of the further detection position.
18. The system of claim 16 or claim 17 wherein the displayed object transformation controller is further configured, in the further object transformation mode, such that at least one pixel of the selected displayed object is moved from a starting position to a finishing position which most closely matches the detected finishing position of the detected input.
19. The system of any one of claims 16 to 18 wherein the displayed object transformation controller is further configured such that in the further object transformation mode each input movement detected between detection positions causes at least one pixel of the displayed object to move by a plurality of pixel positions, and wherein in the object transformation mode each input movement detected between detection positions causes at least one pixel of the displayed object to move by a single pixel position.

INTELLECTUAL PROPERTY OFFICE
Application No: GB1010941.1; Examiner: Dr Russell Maurice; Claims searched: 1-19; Date of search: 25 October 2010
Patents Act 1977: Search Report under Section 17
Documents considered to be relevant:
X (relevant to claims 1-19): US 2004/196267 A1 (FUJITSU LTD), see e.g. paragraphs 3, 7, 26 & 35
A (technological background): WO 2009/158685 A1 (MICROSOFT CORP), see e.g. the Abstract
Field of search: GB, EP, WO & US patent documents classified in IPC G06F; databases consulted: WPI, EPODOC, XPIPCOM
International Classification: G06F 0003/048 (valid from 01/01/2006)
GB1010941.1A 2010-06-29 2010-06-29 Fine object positioning Expired - Fee Related GB2481606B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
GB1010941.1A GB2481606B (en) 2010-06-29 2010-06-29 Fine object positioning
EP11171864.9A EP2402847A3 (en) 2010-06-29 2011-06-29 Fine object positioning
US13/172,082 US9367228B2 (en) 2010-06-29 2011-06-29 Fine object positioning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1010941.1A GB2481606B (en) 2010-06-29 2010-06-29 Fine object positioning

Publications (3)

Publication Number Publication Date
GB201010941D0 GB201010941D0 (en) 2010-08-11
GB2481606A true GB2481606A (en) 2012-01-04
GB2481606B GB2481606B (en) 2017-02-01

Family

ID=42583166

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1010941.1A Expired - Fee Related GB2481606B (en) 2010-06-29 2010-06-29 Fine object positioning

Country Status (3)

Country Link
US (1) US9367228B2 (en)
EP (1) EP2402847A3 (en)
GB (1) GB2481606B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9195349B2 (en) * 2011-10-20 2015-11-24 Nokia Technologies Oy Apparatus, method and computer program using a proximity detector
CN104346061B (en) * 2013-08-07 2018-08-31 联想(北京)有限公司 Control method and electronic equipment
EP2957987A1 (en) * 2014-06-19 2015-12-23 Nokia Technologies OY A non-depth multiple implement input and a depth multiple implement input
US20190056857A1 (en) * 2017-08-18 2019-02-21 Microsoft Technology Licensing, Llc Resizing an active region of a user interface
US11237699B2 (en) 2017-08-18 2022-02-01 Microsoft Technology Licensing, Llc Proximal menu generation
US11301124B2 (en) 2017-08-18 2022-04-12 Microsoft Technology Licensing, Llc User interface modification using preview panel
JP2021153219A (en) * 2020-03-24 2021-09-30 セイコーエプソン株式会社 Method for controlling display unit, information processing apparatus, and display system

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040196267A1 (en) * 2003-04-02 2004-10-07 Fujitsu Limited Information processing apparatus operating in touch panel mode and pointing device mode
WO2009158685A2 (en) * 2008-06-27 2009-12-30 Microsoft Corporation Virtual touchpad

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2686440B1 (en) * 1992-01-17 1994-04-01 Sextant Avionique DEVICE FOR MULTIMODE MANAGEMENT OF A CURSOR ON THE SCREEN OF A DISPLAY DEVICE.
US6014121A (en) * 1995-12-28 2000-01-11 Canon Kabushiki Kaisha Display panel and apparatus capable of resolution conversion
DE69627286D1 (en) * 1995-12-28 2003-05-15 Canon Kk Color display panel and device with improved sub-pixel arrangement
US6340994B1 (en) * 1998-08-12 2002-01-22 Pixonics, Llc System and method for using temporal gamma and reverse super-resolution to process images for use in digital display systems
US20020063807A1 (en) * 1999-04-19 2002-05-30 Neal Margulis Method for Performing Image Transforms in a Digital Display System
US7728821B2 (en) * 2004-08-06 2010-06-01 Touchtable, Inc. Touch detecting interactive display
US20060066590A1 (en) * 2004-09-29 2006-03-30 Masanori Ozawa Input device
KR100678945B1 (en) * 2004-12-03 2007-02-07 삼성전자주식회사 Apparatus and method for processing input information of touchpad
JP4086035B2 (en) * 2004-12-09 2008-05-14 セイコーエプソン株式会社 Automatic image correction circuit
US7577925B2 (en) * 2005-04-08 2009-08-18 Microsoft Corporation Processing for distinguishing pen gestures and dynamic self-calibration of pen-based computing systems
US7640518B2 (en) * 2006-06-14 2009-12-29 Mitsubishi Electric Research Laboratories, Inc. Method and system for switching between absolute and relative pointing with direct input devices
US8269725B2 (en) * 2006-06-28 2012-09-18 Microsoft Corporation Input simulation system for touch based devices
WO2008085874A2 (en) * 2007-01-05 2008-07-17 Marvell World Trade Ltd. Methods and systems for improving low-resolution video
CN101626735A (en) * 2007-03-07 2010-01-13 皇家飞利浦电子股份有限公司 Positioning device for positioning an object on a surface
JP2009077362A (en) * 2007-08-24 2009-04-09 Sony Corp Image processing device, dynamic image reproduction device, and processing method and program for them
TW200928905A (en) * 2007-12-26 2009-07-01 E Lead Electronic Co Ltd A method for controlling touch pad cursor
JP2009276819A (en) * 2008-05-12 2009-11-26 Fujitsu Ltd Method for controlling pointing device, pointing device and computer program
US8098262B2 (en) * 2008-09-05 2012-01-17 Apple Inc. Arbitrary fractional pixel movement
US8451236B2 (en) * 2008-12-22 2013-05-28 Hewlett-Packard Development Company L.P. Touch-sensitive display screen with absolute and relative input modes
US20100245268A1 (en) * 2009-03-30 2010-09-30 Stg Interactive S.A. User-friendly process for interacting with informational content on touchscreen devices
US20110205169A1 (en) * 2010-02-24 2011-08-25 Primax Electronics Ltd. Multi-touch input apparatus and its interface method using hybrid resolution based touch data
JP2011177203A (en) * 2010-02-26 2011-09-15 Nintendo Co Ltd Object controlling program and object controlling apparatus
JP5488082B2 (en) * 2010-03-17 2014-05-14 セイコーエプソン株式会社 Information recognition system and control method thereof
JP5510185B2 (en) * 2010-08-20 2014-06-04 ソニー株式会社 Information processing apparatus, program, and display control method

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040196267A1 (en) * 2003-04-02 2004-10-07 Fujitsu Limited Information processing apparatus operating in touch panel mode and pointing device mode
WO2009158685A2 (en) * 2008-06-27 2009-12-30 Microsoft Corporation Virtual touchpad

Also Published As

Publication number Publication date
EP2402847A2 (en) 2012-01-04
GB201010941D0 (en) 2010-08-11
US20120001945A1 (en) 2012-01-05
GB2481606B (en) 2017-02-01
US9367228B2 (en) 2016-06-14
EP2402847A3 (en) 2017-09-06

Similar Documents

Publication Publication Date Title
US9996176B2 (en) Multi-touch uses, gestures, and implementation
US9207801B2 (en) Force sensing input device and method for determining force information
CN102520839B (en) Notification group touch gesture dismissal techniques
US9367228B2 (en) Fine object positioning
US8004503B2 (en) Auto-calibration of a touch screen
US10627990B2 (en) Map information display device, map information display method, and map information display program
US8446389B2 (en) Techniques for creating a virtual touchscreen
US8525776B2 (en) Techniques for controlling operation of a device with a virtual touchscreen
US20130215018A1 (en) Touch position locating method, text selecting method, device, and electronic equipment
US20110227947A1 (en) Multi-Touch User Interface Interaction
US9128548B2 (en) Selective reporting of touch data
JP6004716B2 (en) Information processing apparatus, control method therefor, and computer program
US20100328236A1 (en) Method for Controlling a Computer System and Related Computer System
US20080082940A1 (en) Methods, systems, and computer program products for controlling presentation of a resource based on position or movement of a selector and presentable content
US11182067B2 (en) Interactive display overlay systems and related methods
CN103324306A (en) Touch screen computer mouse simulation system and method
US8954638B2 (en) Selective reporting of touch data
JP5256755B2 (en) Information processing method and information processing apparatus
US20200057549A1 (en) Analysis device equipped with touch panel device, method for display control thereof, and program
GB2485221A (en) Selection method in dependence on a line traced between contact points
JP2015200977A (en) Information processing unit, computer program and recording medium
JP2010231319A (en) Information processor, operation control program and operation control method

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)

Free format text: REGISTERED BETWEEN 20130919 AND 20130925

PCNP Patent ceased through non-payment of renewal fee

Effective date: 20210629