AU758514B2 - Control device and method of controlling an object - Google Patents

Publication number
AU758514B2
AU758514B2
Authority
AU
Australia
Prior art keywords
control device
image
images
recording means
control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU43033/99A
Other versions
AU4303399A (en)
Inventor
Petter Ericson
Christer Fahraeus
Ola Hugosson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anoto Group AB
Original Assignee
C Technologies AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from SE9801535A external-priority patent/SE511855C2/en
Priority claimed from SE9803456A external-priority patent/SE512182C2/en
Application filed by C Technologies AB filed Critical C Technologies AB
Publication of AU4303399A publication Critical patent/AU4303399A/en
Application granted granted Critical
Publication of AU758514B2 publication Critical patent/AU758514B2/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • G06F3/0317Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545Pens or stylus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means

Description

WO 99/60469 PCT/SE99/00719

CONTROL DEVICE AND METHOD OF CONTROLLING AN OBJECT

Field of the Invention

The present invention relates to a control device having image-recording means which are adapted to be moved, preferably manually, for controlling an object as a function of the movement of the image-recording means.
The invention also relates to a method of controlling an object.
Background of the Invention

Today, personal computers are usually equipped with a control device, a so-called computer mouse, which is used for positioning a cursor on the computer screen. The positioning is carried out by the user passing the mouse over a surface, the hand movement indicating how the cursor should be positioned. The mouse generates positioning signals indicating how the mouse has been moved and thus how the cursor should be moved. Presently, the most common type of mouse has a ball on its underside, which turns as a result of friction against the surface when the mouse is passed over the same and which in this connection drives position sensors which in turn generate the positioning signals. Normally, the mouse can also be used for providing instructions to the computer by the intermediary of one or more buttons on which the user clicks.
Optical computer mice are also known. JP 09190277 shows an optical mouse having one CCD line sensor for the X-axis and one CCD line sensor for the Y-axis. Data recorded by means of the CCD line sensors at a certain time is compared with data recorded at a subsequent time, whereby the movement of the mouse in the X direction and in the Y direction can be determined.
A mouse is thus used for controlling a virtual object. However, there are other control devices whose structure is similar to that of a mouse, but which are used for controlling physical objects instead.
Furthermore, there are control devices for controlling objects in two dimensions, i.e. in a plane, or in three dimensions, i.e. in a space.
WO 98/11528 describes a control device which provides a computer with three-dimensional information. The device is based on three accelerometers which are placed in mutually perpendicular directions and which are capable of measuring acceleration or inclination in one to three directions. The device can, for example, be placed on the user's head or it can be hand-held.
A computer mouse for inputting three-dimensional information to a computer is described in US Patent 5,506,605. This computer mouse is hand-held and is intended to be held freely in space. Furthermore, it can comprise sensors for measuring various physical properties which are subsequently interpreted by suitable electronic means, converted into digital format, and input to the computer. The position of the mouse in space is determined by position sensors, which may be based on light, acceleration, gyroscopes, etc. In the embodiment described, use is made of an ultrasonic sensor and a magnetic sensor. On the basis of the input, the computer can subsequently generate tactile feedback in the form of vibrations which, for example, provide the user with information concerning the location of the mouse in relation to its desired location.
Summary of the Invention

It is an object of the present invention to provide an improved control device and an improved method of controlling an object which are suited for both two-dimensional and three-dimensional control of physical as well as virtual objects.
This object is achieved by control devices according to claims 1 and 23 and by a method according to claim 24.
Preferred embodiments are stated in the subclaims.
Thus, according to a first aspect, the invention relates to a control device having image-recording means which are adapted to be moved by a user, preferably manually, for controlling an object, which may be physical or virtual, as a function of the movement of the image-recording means. The image-recording means are adapted to record a plurality of images with partially overlapping contents when they are being moved, the partially overlapping contents enabling the determination of how the image-recording means have been moved.
The invention is thus based on the idea of using images for determining how a unit is being moved. This technology can be used for two-dimensional as well as three-dimensional control. It is advantageous because it requires few sensors and no moving parts. The entire movement information is contained in the overlapping contents of the images. Because the device records images of the surroundings, an "absolute" position indication is obtained, making it possible to detect when the image-recording means are in a specific position, which, for example, is not possible when using control devices based on measuring acceleration. In addition to movement, turning can also be detected and used for controlling an object.
In one embodiment, the control device is designed for controlling an object in a plane. In this case, the overlapping images enable the determination of not only the movement of the image-recording means but also their turning in the plane, which, for example, is not possible when using a traditional mouse with a ball. Accordingly, the control device is advantageously adapted to control the angular position of the object in the plane. When the device is designed for control in a plane, the image-recording means are advantageously provided with a light-sensitive sensor means having a two-dimensional sensor surface, a so-called area sensor, for recording the images. In this context, a two-dimensional sensor surface refers to the fact that the sensor surface must be capable of imaging a surface with a matrix of pixels. CCD sensors and CMOS sensors are examples of suitable sensors. A single sensor is thus sufficient for providing control in a plane.
In an alternative embodiment, the device is designed for controlling an object in a space. In this case, too, the control device is advantageously adapted to control the angular position of the object, in which connection the control can take place about three axes. In an economical embodiment, it may be sufficient for the device to have two light-sensitive sensors each having a two-dimensional sensor surface for recording said images in two different directions.
However, for more precise control in space, it is preferable for the image-recording means to comprise three sensors for recording the images in three, preferably perpendicular, directions. This enables the determination of the translation along three mutually perpendicular axes as well as of the rotation about these axes by means of relatively simple calculations.
Suitably, the control device has image-processing means for providing control signals for controlling the object. The image-processing means may be located in the same physical casing as the image-recording means, the output signals from this physical casing thus constituting the control signals for controlling the object which is to be controlled. However, the image-processing means may also be located in another physical casing, for example in a computer whose cursor constitutes the object which is to be controlled, or in a computer which in turn controls, or forms part of, a physical object which is controlled by means of the control device, the output signals from the image-processing means constituting the control signals for controlling the object. In this context, it should be noted that the control signals outputted from the image-processing means may require further processing before they can be used for direct control of the object. The image-processing means are advantageously implemented with the aid of a processor and software, but can also be implemented completely with the aid of hardware.
The image-processing means are suitably adapted to determine the relative positions of the images with the aid of the partially overlapping contents for providing said control signals. If the control device is used for control in three dimensions, this is suitably carried out in parallel with respect to all the sensors. The distance and direction of the movement, and thus the current position, can be determined on the basis of the relative positions of the images.
Advantageously, the control device has a calibration mode, in which the image-recording means are moved in a way that enables the image-processing means to relate the relative positions of the images to an actual movement of the image-recording means. As an alternative, the control device could be provided with a distance meter measuring the distance to the surfaces being imaged with the aid of the sensors, but that would, of course, be more expensive.
The image-processing means are suitably adapted to generate said control signals on the basis of at least one movement vector obtained from the relative positions of the images.
Additionally, or alternatively, the image-processing means may be adapted to generate said control signals on the basis of at least one turning indication obtained from the relative positions of the images. The control signals can thus be used for controlling the turning of an object as well as its movement, which is an advantage compared to traditional mechanical computer mice.
In the case of a control device for three-dimensional control, the image-processing means can combine information from all the sensors with respect to the relative positions of the images in order to generate one movement vector and one turning vector. In this way, the position of the image-recording means can be unambiguously determined. In other words, the control device can carry out a digitisation of the movement performed by a hand when it moves the image-recording means in order to enable a computer to control an object on the basis of this movement.
In one embodiment, the image-processing means may be adapted to generate said control signals on the basis of the speed at which the image-recording means have been moved, the speed being determined from the relative positions of the images and the image-recording frequency.
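As a sketch of this speed determination: the displacement between two consecutive images, divided by the time between recordings (the inverse of the image-recording frequency), gives the speed. The pixels-per-millimetre scale factor below is an illustrative assumption; in practice it would come from the calibration mode mentioned above.

```python
def speed_from_offsets(dx_pixels, dy_pixels, frame_rate_hz, pixels_per_mm):
    """Estimate the speed of the image-recording means from the relative
    positions of two consecutive images.

    dx_pixels, dy_pixels: displacement between the two images, in pixels,
    frame_rate_hz: the image-recording frequency (e.g. a >= 50 Hz strobe),
    pixels_per_mm: assumed scale factor relating pixels to real distance.
    Returns the speed in mm per second.
    """
    distance_mm = (dx_pixels ** 2 + dy_pixels ** 2) ** 0.5 / pixels_per_mm
    return distance_mm * frame_rate_hz
```

For instance, a (3, 4)-pixel offset at 50 images per second and 10 pixels/mm corresponds to 25 mm/s.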
Suitably, the receiver of the control signals should know that the control signals are control signals so that it will know how the signals are to be subsequently processed. Consequently, the image-processing means are preferably adapted to output said control signals in such a way that a receiver can identify the control signals as being intended for controlling an object. This can, for example, be effected by the use of a predetermined protocol.
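The patent leaves the protocol unspecified; the sketch below merely illustrates the idea of tagging each packet with an identifier so that the receiver can recognise it as a control signal. The packet layout, field order, and function names are all assumptions.

```python
import struct

# Assumed wire format: a one-byte type identifier, followed by the
# movement vector (dx, dy) and the measurement of turning, encoded as
# little-endian 32-bit floats.
CONTROL_SIGNAL = 0x01

def encode_control_signal(dx, dy, turn):
    """Pack a movement vector and a turning measurement into a packet
    that a receiver can identify as a control signal."""
    return struct.pack("<Bfff", CONTROL_SIGNAL, dx, dy, turn)

def decode_control_signal(packet):
    """Check the type identifier and unpack the control data."""
    kind, dx, dy, turn = struct.unpack("<Bfff", packet)
    if kind != CONTROL_SIGNAL:
        raise ValueError("not a control signal")
    return dx, dy, turn
```

A receiver that also handles other packet types (clicks, calibration data) would dispatch on the identifier byte before unpacking.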
An advantage of using an image-based control device is that it becomes possible to determine when the image-recording means are in a predetermined position, since this position can be defined by means of one or several images. For example, it is possible to detect when the image-recording means have returned to their original position. For this purpose, the image-processing means are adapted to store at least one reference image and to compare images recorded subsequently with this image in order to generate a signal in the case of an essentially complete overlap. For instance, the user can define a certain position as a reference position by clicking on the control device in this position.
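A minimal sketch of such a reference comparison might look as follows; the pixel-equality test and the 95% threshold are illustrative simplifications, since in practice the overlap matching described later would be used:

```python
def matches_reference(ref, img, threshold=0.95):
    """Return True when img essentially completely overlaps the stored
    reference image ref, signalling that the image-recording means are
    back in the predefined reference position.

    ref, img: equally sized grey scale images as lists of rows.
    threshold: assumed fraction of identical pixels required.
    """
    h, w = len(ref), len(ref[0])
    same = sum(ref[y][x] == img[y][x] for y in range(h) for x in range(w))
    return same / (h * w) >= threshold
```

The signal generated on a match could, for example, reset the cursor to a home position.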
If the image-recording means and the image-processing means are located in different physical casings, the image-recording means may advantageously comprise a transmitter for wireless transmission of images from the image-recording means to the image-processing means.
Moreover, especially if the image-recording and the image-processing means are located in the same physical casing, it may be an advantage if the image-processing means comprise a transmitter for wireless outputting of the control signals, for example to a computer whose cursor is to be controlled. In both cases, the control device is very easy to use since no flex is required for the information transfer. For example, a user can have a personal image-recording means or control device and use it with different computers or receivers of the control signals. The transmitter can be an IR transmitter, a radio transmitter, which, for example, uses the so-called Bluetooth standard, or some other transmitter which is suitable for wireless information transfer between two units located fairly close to each other.
In a preferred embodiment, the control device is a computer mouse, i.e. a device which can be connected to a computer and be used for positioning a cursor in one, two, or several dimensions.
The control device can be used in a first absolute mode or in a second relative mode. In the absolute mode, the movement of the controlled object is proportional to the movement of the image-recording means. In other words, the object moves in a way that corresponds to the movement of the image-recording means, regardless of where these are located. In the relative mode, however, the control device is configured so that the speed or acceleration of the controlled object increases when the distance increases between the image-recording means and a predefined origin of coordinates. In this way, it becomes possible to achieve faster movement of the object by holding the image-recording means farther away from the predefined origin, while, at the same time, precision control can be achieved by holding the image-recording means closer to the origin.
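The two modes can be contrasted in a short sketch; the mode names come from the text, but the linear gain law for the relative mode is an illustrative assumption:

```python
def control_output(offset, distance_from_origin, mode, gain=1.0):
    """Map a measured offset of the image-recording means to an object
    movement in the two modes described:

    - "absolute": object movement proportional to device movement,
      regardless of where the device is located;
    - "relative": movement scaled up with the distance between the device
      and a predefined origin of coordinates, so holding the device far
      from the origin moves the object faster, while positions close to
      the origin give precision control.

    The (1 + distance) scaling is an assumed, simple gain law.
    """
    if mode == "absolute":
        return offset * gain
    elif mode == "relative":
        return offset * gain * (1.0 + distance_from_origin)
    raise ValueError("mode must be 'absolute' or 'relative'")
```

The same offset thus produces a larger object movement in the relative mode the farther the device is from the origin.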
According to a second aspect of the invention, it relates to a control device having image-recording means which are adapted to be turned, preferably manually, for controlling an object as a function of the turning of the image-recording means. The control device is adapted to record a plurality of images with partially overlapping contents when the image-recording means are being turned, the partially overlapping contents of the images enabling the determination of how the image-recording means have been turned.
This control device is thus based on the same idea as the control device described above, but instead of controlling the object as a function of the movement of the image-recording means, it is controlled as a function of their turning. This control device may, for example, be a trackball. The embodiments discussed above are to a large extent also applicable to the turning control device, and the same advantages are obtained.
According to a third aspect of the invention, it relates to a method of controlling an object, comprising the steps of moving a control device; recording, with the aid of the control device, a plurality of images with overlapping contents during the movement of the control device; and determining the movement of the control device with the aid of the contents of the overlapping images. The same advantages are obtained as those described with respect to the above-mentioned devices.
Brief Description of the Drawings

The present invention will be described in more detail below by way of exemplifying embodiments with reference to the accompanying drawings, in which

Fig. 1 schematically shows an embodiment of a control device according to the invention;

Fig. 2 is a block diagram of the electronic circuitry part of an embodiment of the control device according to the invention;

Fig. 3 schematically shows a second embodiment of a control device according to the invention;

Fig. 4 is a flowchart illustrating the operation of a control device for two-dimensional control;

Fig. 5 schematically shows an "open box" in which the control device in Fig. 3 can be used;

Fig. 6 schematically shows a movement of the control device according to the invention from a point (x, y, z) to a point (x+δx, y+δy, z+δz) in an orthonormal coordinate system with the axes ex, ey and ez;

Fig. 7 schematically shows which translation scalars are outputted from the respective sensors when the control device is being moved (the index shows which sensor is generating the respective scalars); and

Fig. 8 schematically shows how the control device is intended to be moved in the calibration mode.
Description of Preferred Embodiments

The control device according to the invention can be implemented in embodiments of essentially two main types.
A first embodiment of the control device according to the invention will be described below, which embodiment is intended to be used as a two-dimensional mouse. Next, a second embodiment of the control device will be described, which embodiment is intended to be used as a three-dimensional mouse. Finally, the operation of the two-dimensional and the three-dimensional mouse will be described. In both embodiments described, the image-recording means and the image-processing means are located in the same physical casing, from which control signals are outputted. As mentioned above, the image-processing means can also be located in a separate physical casing. It is very simple for the skilled person to carry out this modification.
Design of the Control Device

In the first embodiment of the control device, shown in Fig. 1, the device comprises a casing 1 having approximately the same shape as a conventional highlighter pen. One short side of the casing has a window 2, by the intermediary of which images are read into the device. The window 2 is somewhat recessed in the casing in order not to wear against the underlying surface.
The casing 1 essentially contains an optics part 3, an electronic circuitry part 4, and a power supply part. The optics part 3 comprises a light-emitting diode 6, a lens system 7, and image-recording means in the form of a light-sensitive sensor 8, which constitutes the interface with the electronic circuitry part 4.
The task of the LED 6 is to illuminate a surface which is currently located under the window in the case where the control device is held directly against a surface or very close thereto. A diffuser 9 is mounted in front of the LED 6 for diffusing the light.
The lens system 7 has the task of projecting an image of the surface located under the window 2 onto the light-sensitive sensor 8 as accurately as possible.
In this example, the light-sensitive sensor 8 comprises a two-dimensional, square CCD unit (CCD, charge-coupled device) with a built-in A/D converter. Such sensors are commercially available. The sensor 8 is mounted at a small angle to the window 2 and on its own circuit board 11.
The power supply to the control device is obtained from a battery 12, which is mounted in a separate compartment 13 in the casing.
The block diagram in Fig. 2 schematically shows the electronic circuitry part 4. This is located on a circuit board and comprises a processor 20, which by the intermediary of a bus 21 is connected to a ROM 22, in which the programs of the processor are stored, to a read/write memory 23, which constitutes the working memory of the processor and in which the images from the sensor are stored, to a control logic unit 24, as well as to the sensor and the LED 6. The processor 20, the bus 21, the memories 22 and 23, the control logic unit 24, as well as the associated software together constitute image-processing means.
The control logic unit 24 is in turn connected to a number of peripheral units, comprising a radio transceiver 26 for transferring information to/from an external computer, buttons 27, by means of which the user can control the image-recording means and which can also be used as the clicking buttons of a traditional mouse, as well as an indicator 29, e.g. a LED, indicating when the mouse is ready to be used. Control signals to the memories, the sensor, and the peripheral units are generated in the control logic unit 24. The control logic also handles generation and prioritisation of interrupts to the processor. The buttons 27, the radio transceiver 26, and the LED 6 are accessed by the processor writing and reading in a register in the control logic unit 24. The buttons 27 generate interrupts to the processor 20 when they are activated.
Fig. 3 shows a second embodiment of the control device according to the invention. Like the first embodiment, this embodiment comprises a pen-shaped casing 31.
Besides the window 32 on one short side of the casing, the device has two additional windows 32' and 32". Each of the windows 32, 32', 32" is somewhat recessed in the casing so that it will not wear or scratch should the control device impinge upon a surface when it is in use, or when it is in the idle position.
As in the above case, the casing 31 essentially contains an optics part 33, an electronic circuitry part 34, and a power supply part. The optics part 33 comprises a lens package (not shown) with three lens systems and a set of sensors (not shown) with three light-sensitive sensors which constitute the interface to the electronic circuitry part 34 for the windows 32, 32' and 32" respectively. There is no light-emitting diode in this embodiment. The control device is intended to be held at a distance from the surfaces which are being imaged and, consequently, in most cases, ambient light is sufficient to permit images to be recorded.
The lens systems have the task of projecting images of the surfaces at which the windows 32, 32', 32" are directed onto the light-sensitive sensors as accurately as possible.
As in the above embodiment, the light-sensitive sensors comprise two-dimensional, square CCD units with built-in A/D converters. Each sensor is mounted on its own circuit board.
In this embodiment, too, the power supply to the control device is obtained from a battery, which is mounted in a separate compartment in the casing.
In this second embodiment, the design of the electronic circuitry part is essentially the same as that described above with respect to the first embodiment. The electronic circuitry part is shared by all three sensors.
Application of the Device as a Two-dimensional Mouse

The device according to the first embodiment can be used as a mouse for inputting movement information, by means of which a cursor can be controlled on a computer screen.
The user directs the window 2 of the control device at a patterned surface, e.g. a mouse pad. He presses one of the buttons 27 to activate the image-recording means, whereupon the processor 20 commands the LED 6 to begin generating strobe pulses at a predetermined frequency, suitably at least 50 Hz. Subsequently, the user passes the control device over the surface in the same way as if it were a traditional mouse, whereupon images with partially overlapping contents are recorded by the sensor 8 and stored in the read/write memory 23. The images are stored as images, i.e. with the aid of a plurality of pixels, each having a grey scale value in a range from white to black.
The flowchart in Fig. 4 shows the operation of the two-dimensional mouse in more detail. In step 400, a starting image is recorded. In step 401, the next image is recorded. The contents of this image partially overlap the contents of the previous image.
As soon as an image has been recorded in step 401, the process begins of determining how it overlaps the previous image, step 402, i.e. in which relative position the best match is obtained between the contents of the images. This determination is carried out by translating the images vertically and horizontally relative to each other, and by rotating the images relative to each other.
For this purpose, every possible overlap position between the images is examined, at the pixel level, and an overlap measurement is determined as follows:
1) For each overlapping pixel position, the grey scale values of the two relevant pixels are added up, provided that neither of them is white. Such a pixel position, in which neither pixel is white, is designated a plus position.
2) The grey scale sums for all the plus positions are added up.
3) The neighbours of each pixel position are examined. If an overlapping pixel position is not a neighbour of a plus position and consists of one pixel which is white and one pixel which is not white, the grey scale value of the non-white pixel is subtracted, possibly multiplied by a constant, from the sum in point 2).
4) The overlap position providing the highest overlap measurement as stated above is selected.
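The four steps above can be sketched in code. This is a minimal illustration under two stated assumptions: 0 encodes white with larger grey scale values for darker pixels, and only translations are searched, whereas the text also rotates the images relative to each other.

```python
WHITE = 0  # assumption: 0 encodes white, larger values are darker ink

def overlap_measure(a, b, penalty=1.0):
    """Steps 1-3: score one candidate overlap between two equally sized
    grey scale patches a and b (lists of rows)."""
    h, w = len(a), len(a[0])
    # Step 1: a "plus position" is a position where neither pixel is white.
    plus = [[a[y][x] != WHITE and b[y][x] != WHITE for x in range(w)]
            for y in range(h)]
    # Step 2: add up the grey scale sums for all plus positions.
    score = sum(a[y][x] + b[y][x]
                for y in range(h) for x in range(w) if plus[y][x])

    def near_plus(y, x):
        return any(plus[y + dy][x + dx]
                   for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                   if 0 <= y + dy < h and 0 <= x + dx < w)

    # Step 3: positions pairing a white pixel with a non-white one that
    # are not neighbours of any plus position subtract the non-white value.
    for y in range(h):
        for x in range(w):
            if (a[y][x] == WHITE) != (b[y][x] == WHITE) and not near_plus(y, x):
                score -= penalty * max(a[y][x], b[y][x])
    return score

def crop(img, dy, dx):
    """The part of img that stays inside the frame after a (dy, dx) shift."""
    h, w = len(img), len(img[0])
    return [row[max(dx, 0): w + min(dx, 0)]
            for row in img[max(dy, 0): h + min(dy, 0)]]

def best_offset(prev, curr, max_shift=4):
    """Step 4: examine every candidate translation and select the one with
    the highest overlap measurement.  Returns (dy, dx) such that the
    contents of curr are those of prev shifted down by dy and right by dx."""
    best_score, best_pos = None, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            s = overlap_measure(crop(prev, -dy, -dx), crop(curr, dy, dx))
            if best_score is None or s > best_score:
                best_score, best_pos = s, (dy, dx)
    return best_pos
```

The exhaustive search is quadratic in the shift range; the referenced Swedish and U.S. applications describe alternative matching strategies.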
Our Swedish patent application No. 9704924-1 and the corresponding U.S. application No. 024 641 describe an alternative way of matching the images in order to find the best overlap position. The content of these applications is herewith incorporated by reference.
As soon as the best overlap position between the current image and the previous image has been determined, the previous image is discarded, whereupon the current image becomes the previous image in relation to the next image recorded.
By determining the relative position of the two images, a movement vector is obtained, which indicates how far and in which direction the image-recording means have been moved between the recording of the two images. If the mouse has also been turned between the two images, a measurement of this turning is also obtained. Subsequently, a control signal, which includes the movement vector and the measurement of turning, is transmitted, step 403, by the radio transceiver 26 to the computer for which the control device is operating as a mouse. The computer uses the movement vector and the measurement of turning for positioning the cursor on its screen.
Subsequently, the flow returns to step 401. In order to increase the speed, the steps can be partly carried out in parallel, e.g. by starting the recording of the next image while the current image is being put together with the previous image.
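The record-match-transmit cycle of steps 401-403 can be outlined as a short loop. This is a sketch: `mouse_loop`, `transmitter` and `match` are hypothetical placeholders, and the dictionary layout of the control signal is an invented illustration.

```python
def mouse_loop(images, transmitter, match):
    """Sketch of the record/match/transmit loop (Fig. 4).

    `images` yields grey scale images as they are recorded (step 401),
    `match(prev, curr)` returns (dx, dy, turn) from the best overlap
    position (step 402), and `transmitter(signal)` sends the control
    signal to the computer (step 403).
    """
    images = iter(images)
    prev = next(images)                   # record the first image
    for curr in images:                   # step 401: record the current image
        dx, dy, turn = match(prev, curr)  # step 402: determine best overlap
        transmitter({"move": (dx, dy), "turn": turn})  # step 403: transmit
        prev = curr                       # current image becomes previous
```

As the text notes, a real implementation could overlap step 401 for the next image with step 402 for the current one.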
When the mouse is activated, the buttons 27 can be used as clicking buttons for inputting instructions to the computer.
Application of the Device as a Three-dimensional Mouse
The device according to the second embodiment can be used as a mouse for inputting movement information, by means of which a cursor can be controlled in three dimensions on a computer screen, i.e. in a space.
As described above, the three-dimensional mouse comprises three sensors 32, 32', 32" having two-dimensional, light-sensitive sensor surfaces. The main axes of the sensors are orientated along the x-, y- and z-axes in an orthogonal coordinate system and have a two-dimensional spatial resolution of n x n pixels and a time resolution of m images per second. Each lens system provides a field of vision with an angle of vision of v radians for the associated sensor surface.
When the device is in use, the mouse movements are carried out in an "open box" 50 according to Fig. which is defined by at least two side walls 51 and 52 which are orientated at right angles in relation to each other, and a floor 53. It is also possible that the mouse can be held freely in space, but this requires more complicated calculation algorithms than the ones that will be described below.
When the device is in use, the above method of determining the relative position of images is used for each sensor. Accordingly, the operation in this case can also be described by means of the flowchart in Fig. 4, but instead of recording individual images, a set of images consisting of three images is recorded simultaneously. One movement vector and one turning vector are thus generated with the aid of the images recorded by each light-sensitive sensor, which vectors describe the movement carried out by the mouse between the recording of two consecutive images. These vectors are then included in a control signal which is transmitted to the object which is to be controlled by means of the mouse.
Furthermore, to enable the successful use of the mouse, it is necessary that the light conditions be such that the light-sensitive sensors are capable of recording images of sufficiently high quality to permit their processing as described above.
In order further to facilitate the reader's understanding of how the movement of the mouse can control the object, a description of the calculations carried out to determine the movement of the mouse will now be provided by way of example, with reference to Fig. 6. In the calculations below it is assumed that the image-matching algorithm is of a simple type which for each sensor merely calculates the translation in two mutually perpendicular directions between two images.

Suppose that the mouse is located in the position (x, y, z) and that it has a rotation which can be described by means of the orthonormal rotation matrix R. The x-axis of the mouse thus points in the direction R·e_x, the y-axis points in the direction R·e_y, and the z-axis points in the direction R·e_z. Also suppose that between the recording of two images, the mouse carries out a translational motion and/or a rotational motion according to:

(x, y, z) → (x + δx, y + δy, z + δz)
R → R·δR

In the local coordinate system of the mouse, the translation vectors can be defined as shown in Fig. 7. The first sensor records movements in the x and y directions, the second sensor records movements in the y and z directions, and the third sensor records movements in the x and z directions. Consequently, for any triplet of consecutive images, the translation scalars (x1, y1, y2, z2, x3, z3) describe the detected movement of the mouse. The translation scalars consist of the outputs from the image-matching algorithm for each sensor.
In order to calculate the rotation of the mouse, the effect of a rotation upon the translation scalars is calculated. Suppose that the mouse is rotated through an angle α which is sufficiently small that sin α ≈ α.
For the sake of clarity, it is also assumed that the rotation takes place about the z-axis by α_z radians. This rotation results in the scalars

y2 = -(n/v)·α_z and x3 = (n/v)·α_z,

where n is the number of pixels along one side of the sensor and v is the angle of vision of the sensor surface, expressed in radians. Thus, the following applies to all the axes:

x1 = (n/v)·α_y
y1 = -(n/v)·α_x
y2 = -(n/v)·α_z
z2 = (n/v)·α_y
x3 = (n/v)·α_z
z3 = -(n/v)·α_x

By knowing the values of the translation scalars, which are output signals from the image-matching algorithm, the number of pixels n of the sensor surface along one side and the angle of vision v of the sensor, it is thus possible to calculate a rotation vector (α_x, α_y, α_z) for the rotation of the mouse about the x-, y- and z-axes.
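For small angles, recovering the rotation vector from the six translation scalars amounts to solving an overdetermined linear system (six equations, three unknowns). The sketch below is an illustration: only the y2 relation is given an explicit sign in the text, so the remaining sign conventions of the 6x3 matrix are assumptions.

```python
import numpy as np

def rotation_from_scalars(scalars, n, v):
    """Recover (alpha_x, alpha_y, alpha_z) from the six translation scalars
    (x1, y1, y2, z2, x3, z3) for a pure small-angle rotation.

    n is the number of pixels along one side of a sensor, v its angle of
    vision in radians; the entry signs are an assumed convention.
    """
    k = n / v
    A = np.array([
        [ 0,  k,  0],   # x1 responds to rotation about the y-axis
        [-k,  0,  0],   # y1 responds to rotation about the x-axis
        [ 0,  0, -k],   # y2 responds to rotation about the z-axis
        [ 0,  k,  0],   # z2 responds to rotation about the y-axis
        [ 0,  0,  k],   # x3 responds to rotation about the z-axis
        [-k,  0,  0],   # z3 responds to rotation about the x-axis
    ])
    # Overdetermined but consistent: least-squares solution.
    alpha, *_ = np.linalg.lstsq(A, np.asarray(scalars, float), rcond=None)
    return alpha
```

Each rotation axis is observed by two sensors, so the redundancy also gives a consistency check on the image matching.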
Furthermore, in order to calculate the translational motion one must know the functional distance from each sensor to the ambient geometry. The functional distance is a constant which relates the output from the image-matching algorithm to the translational motion. The functional distance is determined by means of a calibration which will be described below. In the special case where the mouse moves inside an "open box" 50 as described above, the functional distance corresponds to the geometrical distance from the middle of the mouse to the respective walls 51, 52 and 53 of the box.

For the sake of clarity, a translation of a distance δx along the x-axis is looked at. Accordingly, the effect of the translation with respect to the scalars x1 and x3 will be respectively

x1 = n·δx / (2·d1·tan(v/2)) and x3 = n·δx / (2·d3·tan(v/2)).

Here, d1 and d3 are the functional distances from the mouse to the projected surfaces with respect to (x1, y1) and (x3, z3). The following is obtained if this is generalised to all the axes:

x1 = n·δx / (2·d1·tan(v/2))
y1 = n·δy / (2·d1·tan(v/2))
y2 = n·δy / (2·d2·tan(v/2))
z2 = n·δz / (2·d2·tan(v/2))
x3 = n·δx / (2·d3·tan(v/2))
z3 = n·δz / (2·d3·tan(v/2))

By knowing the values of the translation scalars, which are obtained as output signals from the image-matching algorithm, the number of pixels n of the sensor surface along one side, the field of vision v of the sensor and the functional distances d1, d2 and d3 to the projected surfaces, it is thus possible to calculate a translation vector (δx, δy, δz) for the translation of the mouse along the x-, y- and z-axes.
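The translation recovery is the same kind of overdetermined linear solve, scaled per sensor by the functional distances. A sketch, assuming the relation s = n·δ / (2·d_i·tan(v/2)) for each scalar s of sensor i (function name and argument layout are illustrative):

```python
import numpy as np

def translation_from_scalars(scalars, n, v, d):
    """Recover (dx, dy, dz) from the scalars (x1, y1, y2, z2, x3, z3)
    for a pure translation, given functional distances d = (d1, d2, d3)."""
    d1, d2, d3 = d

    def g(di):  # pixels of image shift per unit of mouse translation
        return n / (2 * di * np.tan(v / 2))

    T = np.array([
        [g(d1), 0,     0    ],   # x1 <- dx  (sensor 1, distance d1)
        [0,     g(d1), 0    ],   # y1 <- dy
        [0,     g(d2), 0    ],   # y2 <- dy  (sensor 2, distance d2)
        [0,     0,     g(d2)],   # z2 <- dz
        [g(d3), 0,     0    ],   # x3 <- dx  (sensor 3, distance d3)
        [0,     0,     g(d3)],   # z3 <- dz
    ])
    delta, *_ = np.linalg.lstsq(T, np.asarray(scalars, float), rcond=None)
    return delta
```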
To sum up, the translation scalars (x1, y1, y2, z2, x3, z3) which are obtained in the image matching thus depend on the rotation as well as the translation of the mouse. Knowing these and the other parameters described above, a translation vector (δx, δy, δz) and a rotation vector (α_x, α_y, α_z) can be obtained by solving the following system of equations, which is solvable:

x1 = n·δx / (2·d1·tan(v/2)) + (n/v)·α_y
y1 = n·δy / (2·d1·tan(v/2)) - (n/v)·α_x
y2 = n·δy / (2·d2·tan(v/2)) - (n/v)·α_z
z2 = n·δz / (2·d2·tan(v/2)) + (n/v)·α_y
x3 = n·δx / (2·d3·tan(v/2)) + (n/v)·α_z
z3 = n·δz / (2·d3·tan(v/2)) - (n/v)·α_x

These vectors are then included in a control signal which is transmitted to the object controlled by means of the mouse, which signal indicates the new position of the object.

Calibration, i.e. the calculation of the functional distances d1, d2 and d3, can be carried out by moving the mouse along the edges of the open box. The mouse is moved along the x-, y- and z-axes according to a sequence A-B-C shown in Fig. 8. Each movement gives rise to two equations, which together give the following system of equations:

d1 = 0.5·n / (2·x1·tan(v/2))    d1 = 0.5·n / (2·y1·tan(v/2))    d2 = 0.5·n / (2·z2·tan(v/2))
d3 = 0.5·n / (2·x3·tan(v/2))    d2 = 0.5·n / (2·y2·tan(v/2))    d3 = 0.5·n / (2·z3·tan(v/2))

This overdefined system of equations contains all the information required to calculate the values of the functional distances d1, d2 and d3.

Furthermore, by means of the mouse according to this embodiment, the user can choose at a certain time to store the images which the sensors are currently recording in a memory. Subsequently, each set of recorded images is compared to the stored set of images and, when a complete overlap exists, a signal is generated to the user. This enables precision control of an object since the user can find his way back to the exact position at which the mouse was located on a previous occasion.
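Combining the translation and rotation parts gives a 6x6 linear system that can be solved directly for both vectors at once. The sketch below assumes the same illustrative sign conventions in the rotation columns as above; the translation columns follow the n·δ / (2·d·tan(v/2)) relation.

```python
import numpy as np

def solve_motion(scalars, n, v, d):
    """Solve the combined 6x6 system M·(dx,dy,dz,ax,ay,az)^T = scalars,
    where M stacks the translation part (functional distances d1, d2, d3)
    and the rotation part side by side. Signs are an assumed convention."""
    d1, d2, d3 = d
    k = n / v
    g = lambda di: n / (2 * di * np.tan(v / 2))
    M = np.array([
        #  dx      dy      dz      ax   ay   az
        [g(d1),  0,      0,       0,   k,   0],   # x1
        [0,      g(d1),  0,      -k,   0,   0],   # y1
        [0,      g(d2),  0,       0,   0,  -k],   # y2
        [0,      0,      g(d2),   0,   k,   0],   # z2
        [g(d3),  0,      0,       0,   0,   k],   # x3
        [0,      0,      g(d3),  -k,   0,   0],   # z3
    ])
    sol = np.linalg.solve(M, np.asarray(scalars, float))
    return sol[:3], sol[3:]   # translation vector, rotation vector
```

With these sign choices M is non-singular for any positive d1, d2, d3, which matches the text's claim that the system is solvable.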
Naturally, the same principles can be used in the case of two-dimensional control of an object.
In another application of the mouse, only the rotational motion is detected. In this case, no calibration is required and it is sufficient to solve the equation presented above in connection with the discussion concerning rotation. In this application, the mouse can, for example, be mounted on a helmet or the like which is worn by the user and which, for example, is used in various types of virtual reality applications.
The reference to any prior art in this specification is not, and should not be taken as, an acknowledgment or any form of suggestion that that prior art forms part of the common general knowledge in Australia.
Throughout this specification and the claims which follow, unless the context requires otherwise, the word "comprise", and variations such as "comprises" and "comprising", will be understood to imply the inclusion of a stated integer or step or group of integers or steps but not the exclusion of any other integer or step or group of integers or steps.
The reference numerals in the following claims do not in any way limit the scope of the respective claims.

Claims (27)

1. A control device having image-recording means which are adapted to be moved, preferably manually, for controlling an object as a function of the movement of the image-recording means; c h a r a c t e r i s e d in that the control device is adapted to record a plurality of images with partially overlapping contents when the image-recording means are being moved, the partially overlapping contents of the images enabling the determination of how the image-recording means have been moved.
2. A control device according to claim 1, wherein the control device is adapted to control said object in a plane.
3. A control device according to claim 2, wherein the control device is adapted to control the angular position of said object in said plane.
4. A control device according to claim 2 or 3, further comprising a light-sensitive sensor means (8) having a two-dimensional sensor surface for recording the images.
5. A control device according to claim 1, wherein the control device is designed for controlling said object in a space.
6. A control device according to claim 5, wherein the control device is adapted to control the angular position of said object in said space.
7. A control device according to claim 5 or 6, further comprising at least two light-sensitive sensor means having a two-dimensional sensor surface for recording said images in two different directions.
8. A control device according to claim 5 or 6, further comprising three light-sensitive sensor means having a two-dimensional sensor surface for recording said images in three linearly independent directions.
9. A control device according to any one of claims 1-8, further comprising image-processing means (20-24) for providing control signals for controlling said object.
10. A control device according to claim 9, wherein the image-processing means (20-24) are adapted to determine the relative positions of the images with the aid of the partially overlapping contents for providing said control signals.
11. A control device according to any one of claims 5-8, further comprising image-processing means (20-24) which are adapted to determine the relative positions of the images with the aid of the partially overlapping contents simultaneously with respect to all the light-sensitive sensor means for providing said control signals.
12. A control device according to claim 11, the control device furthermore having a calibration mode, in which the image-recording means are moved in a way that enables the image-processing means (20-24) to relate the relative positions of the images to an actual movement of the image-recording means.
13. A control device according to any one of claims 10-12, wherein the image-processing means (20-24) are adapted to generate said control signals on the basis of at least one movement vector obtained from the relative positions of the images.
14. A control device according to any one of claims 10-13, wherein the image-processing means (20-24) are adapted to generate said control signals on the basis of at least one turning indication obtained from the relative positions of the images.
15. A control device according to any one of claims 10-14, wherein the image-processing means (20-24) are adapted to generate said control signals on the basis of the speed at which the image-recording means have been moved, the speed being determined from the relative positions of the images.
16. A control device according to claims 9-15, wherein the image-processing means (20-24) are adapted to output said control signals in a way that enables a receiver to identify the control signals as being intended for controlling an object.
17. A control device according to any one of claims 9-16, wherein the image-processing means (20-24) are furthermore adapted to store at least one reference image and to compare images recorded subsequently with that image in order to generate a signal in the case of an essentially complete overlap.
18. A control device according to any one of claims 9-17, wherein the image-processing means (20-24) comprise a transmitter (26) for wireless outputting of the control signals.
19. A control device according to any one of claims 9-18, wherein the image-recording means comprise a transmitter (26) for wireless transmission of images to the image-processing means (20-24).
20. A control device according to any one of the preceding claims, wherein the control device is a mouse.
21. A control device according to any one of the preceding claims, wherein the device has a first operating mode in which the control device is adapted to control said object in a way that enables its movement to be proportional to the movement of the image-recording means.
22. A control device according to any one of the preceding claims, wherein the device has a second operating mode in which the control device is adapted to control said object so that the speed of its movement is proportional to the distance between the image-recording means and a predefined origin.
23. A control device having image-recording means which are adapted to be turned, preferably manually, for controlling an object as a function of the turning of the image-recording means; c h a r a c t e r i s e d in that the control device is adapted to record a plurality of images with partially overlapping contents when the image-recording means are being turned, the partially overlapping contents of the images enabling the determination of how the image-recording means have been turned.
24. A method of controlling an object, comprising the steps of moving a control device; recording, with the aid of the control device, a plurality of images with overlapping contents during the movement of the control device, and determining the movement of the control device with the aid of the contents of the overlapping images.
25. A method of controlling an object according to claim 24, further comprising the step of determining the relative position of the images with the aid of the partially overlapping contents for providing control signals for controlling the object.
26. A control device having image-recording means which are adapted to be moved for controlling an object as a function of the movement of the image-recording means substantially as hereinbefore described with reference to the accompanying drawings.
27. A method of controlling an object substantially as hereinbefore described with reference to the accompanying drawings. DATED this 20th day of December, 2002 C TECHNOLOGIES AB By its Patent Attorneys DAVIES COLLISON CAVE
AU43033/99A 1998-04-30 1999-04-30 Control device and method of controlling an object Ceased AU758514B2 (en)

Applications Claiming Priority (9)

Application Number Priority Date Filing Date Title
SE9801535 1998-04-30
SE9801535A SE511855C2 (en) 1998-04-30 1998-04-30 Handwritten character recording device for characters, symbols, graphs, calligraphy
US9132398P 1998-06-30 1998-06-30
US60/091323 1998-06-30
SE9803456 1998-10-09
SE9803456A SE512182C2 (en) 1998-04-30 1998-10-09 Hand held input unit such as input pen for personal computer
US10581698P 1998-10-27 1998-10-27
US60/105816 1998-10-27
PCT/SE1999/000719 WO1999060469A1 (en) 1998-04-30 1999-04-30 Control device and method of controlling an object

Publications (2)

Publication Number Publication Date
AU4303399A AU4303399A (en) 1999-12-06
AU758514B2 true AU758514B2 (en) 2003-03-20

Family

ID=27484810

Family Applications (1)

Application Number Title Priority Date Filing Date
AU43033/99A Ceased AU758514B2 (en) 1998-04-30 1999-04-30 Control device and method of controlling an object

Country Status (9)

Country Link
EP (1) EP1073946A1 (en)
JP (1) JP2002516429A (en)
KR (1) KR20010052283A (en)
CN (1) CN1303494A (en)
AU (1) AU758514B2 (en)
BR (1) BR9910572A (en)
CA (1) CA2331075A1 (en)
IL (1) IL139103A0 (en)
WO (1) WO1999060469A1 (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7054487B2 (en) 2000-02-18 2006-05-30 Anoto Ip Lic Handelsbolag Controlling and electronic device
US6839453B1 (en) 2000-05-16 2005-01-04 The Upper Deck Company, Llc Method and apparatus for authenticating unique items such as sports memorabilia
KR100408518B1 (en) * 2001-04-12 2003-12-06 삼성전자주식회사 Pen input device and Measuring method of coordinate
EP2093650B1 (en) 2002-11-20 2013-05-15 Koninklijke Philips Electronics N.V. User interface system based on pointing device
SE0303370D0 (en) 2003-12-16 2003-12-16 Anoto Ab Method, apparatus, computer program and storage medium for recording a movement of a user unit
US7136054B2 (en) 2004-01-06 2006-11-14 Microsoft Corporation Camera-pen-tip mapping and calibration
US7263224B2 (en) * 2004-01-16 2007-08-28 Microsoft Corporation Strokes localization by m-array decoding and fast image matching
KR100675830B1 (en) * 2004-03-11 2007-01-29 주식회사 애트랩 Image sensor, optic pointing device and motion value calculation method of it
US7536051B2 (en) * 2005-02-17 2009-05-19 Microsoft Corporation Digital pen calibration by local linearization
JP2009020718A (en) * 2007-07-12 2009-01-29 Nec Commun Syst Ltd Radio input device and equipment operation system
US8054512B2 (en) 2007-07-30 2011-11-08 Palo Alto Research Center Incorporated System and method for maintaining paper and electronic calendars
CN101859205A (en) * 2009-04-08 2010-10-13 鸿富锦精密工业(深圳)有限公司 Hand input device and hand input system
CN105387802A (en) * 2015-10-13 2016-03-09 东莞市微大软件科技有限公司 Method for controlling movement of worktable of automatic image measuring instrument

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0112415A1 (en) * 1982-12-22 1984-07-04 International Business Machines Corporation A method and apparatus for continuously updating a display of the coordinates of a light pen
CN1122925A (en) * 1994-11-07 1996-05-22 颜艮山 Instant look mouse scanner

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0112415A1 (en) * 1982-12-22 1984-07-04 International Business Machines Corporation A method and apparatus for continuously updating a display of the coordinates of a light pen
CN1122925A (en) * 1994-11-07 1996-05-22 颜艮山 Instant look mouse scanner

Also Published As

Publication number Publication date
CA2331075A1 (en) 1999-11-25
AU4303399A (en) 1999-12-06
KR20010052283A (en) 2001-06-25
BR9910572A (en) 2001-01-16
WO1999060469A1 (en) 1999-11-25
IL139103A0 (en) 2001-11-25
EP1073946A1 (en) 2001-02-07
CN1303494A (en) 2001-07-11
JP2002516429A (en) 2002-06-04

Similar Documents

Publication Publication Date Title
US6198485B1 (en) Method and apparatus for three-dimensional input entry
AU758514B2 (en) Control device and method of controlling an object
US7817134B2 (en) Pointing device
US6906699B1 (en) Input unit, method for using the same and input system
WO2013035554A1 (en) Method for detecting motion of input body and input device using same
US20020085097A1 (en) Computer vision-based wireless pointing system
US7006079B2 (en) Information input system
US20080117168A1 (en) Method and apparatus for controlling application using motion of image pickup unit
KR20010052282A (en) Input unit, method for using the same and input system
WO2004042548A1 (en) Movement detection device
JP2003148919A (en) Device and method for detecting three-dimensional relative movement
US7825898B2 (en) Inertial sensing input apparatus
JP2006190212A (en) Three-dimensional position input device
US7199791B2 (en) Pen mouse
GB2345538A (en) Optical tracker
EP1073945B1 (en) Device and method for recording hand-written information
JP4292927B2 (en) Pen-type data input device and program
JP2006268854A (en) Method and system for determining position of handheld object based on acceleration of handheld object
MXPA00010533A (en) Control device and method of controlling an object
KR100446610B1 (en) Pen-type computer input device
SE512182C2 (en) Hand held input unit such as input pen for personal computer
JP2005073095A (en) White board eraser
SE511855C2 (en) Handwritten character recording device for characters, symbols, graphs, calligraphy
SE513940C2 (en) Unit and input system with mouse function and input function and ways to use the unit
MXPA00010548A (en) Device and method for recording hand-written information

Legal Events

Date Code Title Description
FGA Letters patent sealed or granted (standard patent)