AU4303399A - Control device and method of controlling an object - Google Patents

Control device and method of controlling an object

Info

Publication number
AU4303399A
AU4303399A
Authority
AU
Australia
Prior art keywords
control device
image
images
control
recording means
Prior art date
Legal status
Granted
Application number
AU43033/99A
Other versions
AU758514B2 (en)
Inventor
Petter Ericson
Christer Fahraeus
Ola Hugosson
Current Assignee
Anoto Group AB
Original Assignee
C Technologies AB
Priority date
Filing date
Publication date
Priority claimed from SE9801535A external-priority patent/SE511855C2/en
Priority claimed from SE9803456A external-priority patent/SE512182C2/en
Application filed by C Technologies AB filed Critical C Technologies AB
Publication of AU4303399A
Application granted
Publication of AU758514B2
Anticipated expiration
Legal status: Ceased


Classifications

    • G Physics
    • G06 Computing; calculating or counting
    • G06F Electric digital data processing
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G06F3/0317 Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; accessories therefor
    • G06F3/0346 Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F3/0354 Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545 Pens or stylus
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • Image Input (AREA)

Description

WO 99/60469 PCT/SE99/00719

CONTROL DEVICE AND METHOD OF CONTROLLING AN OBJECT

Field of the Invention

The present invention relates to a control device having image-recording means which are adapted to be moved, preferably manually, for controlling an object as a function of the movement of the image-recording means. The invention also relates to a method of controlling an object.

Background of the Invention

Today, personal computers are usually equipped with a control device, a so-called computer mouse, which is used for positioning a cursor on the computer screen. The positioning is carried out by the user passing the mouse over a surface, the hand movement indicating how the mouse should be positioned. The mouse generates positioning signals indicating how the mouse has been moved and thus how the cursor should be moved. Presently, the most common type of mouse has a ball on its underside, which turns as a result of friction against the surface when the mouse is passed over the same and which in this connection drives position sensors which in turn generate the positioning signals. Normally, the mouse can also be used for providing instructions to the computer by the intermediary of one or more buttons on which the user clicks.

Optical computer mouses are also known. JP 09190277 shows an optical mouse having one CCD line sensor for the X-axis and one CCD line sensor for the Y-axis. Data recorded by means of the CCD line sensors at a certain time is compared with data recorded at a subsequent time, whereby the movement of the mouse in the X direction and in the Y direction can be determined.

A mouse is thus used for controlling a virtual object. However, there are other control devices whose structure is similar to that of a mouse, but which are used for controlling physical objects instead. Furthermore, there are control devices for controlling objects in two dimensions, i.e. in a plane, or in three dimensions, i.e. in a space.

WO 98/11528 describes a control device which provides a computer with three-dimensional information. The device is based on three accelerometers which are placed in mutually perpendicular directions and which are capable of measuring acceleration or inclination in one to three directions. The device can, for example, be placed on the user's head or it can be hand-held.

A computer mouse for inputting three-dimensional information to a computer is described in US Patent 5,506,605. This computer mouse is hand-held and is intended to be held freely in space. Furthermore, it can comprise sensors for measuring various physical properties which are subsequently interpreted by suitable electronic means, converted into digital format, and input to the computer. The position of the mouse in space is determined by position sensors, which may be based on light, acceleration, gyroscopes, etc. In the embodiment described, use is made of an ultrasonic sensor and a magnetic sensor. On the basis of the input, the computer can subsequently generate tactile feedback in the form of vibrations which, for example, provide the user with information concerning the location of the mouse in relation to its desired location.

Summary of the Invention

It is an object of the present invention to provide an improved control device and an improved method of controlling an object which are suited for both two-dimensional and three-dimensional control of physical as well as virtual objects. This object is achieved by control devices according to claims 1 and 23 and by a method according to claim 24. Preferred embodiments are stated in the subclaims.
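The line-sensor comparison in the background above (JP 09190277) amounts to finding, for each axis, the one-dimensional shift that best aligns two successive scans. As a purely illustrative sketch (the function name, the sum-of-absolute-differences criterion, and the search window are assumptions, not taken from any cited document):

```python
def estimate_shift(prev, curr, max_shift=3):
    """Estimate the 1-D displacement between two line-sensor scans.

    Tries every candidate shift and returns the one minimizing the mean
    absolute difference over the overlapping samples.
    """
    best_shift, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        # Overlapping index range when comparing prev[i] with curr[i + s].
        lo = max(0, -s)
        hi = min(len(prev), len(curr) - s)
        if hi - lo < 1:
            continue
        err = sum(abs(prev[i] - curr[i + s]) for i in range(lo, hi)) / (hi - lo)
        if err < best_err:
            best_err, best_shift = err, s
    return best_shift
```

Applying this once per axis yields the X and Y displacements between two sampling instants; repeating it at the sampling rate tracks the mouse movement.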
Thus, according to a first aspect, the invention relates to a control device having image-recording means which are adapted to be moved by a user, preferably manually, for controlling an object, which may be physical or virtual, as a function of the movement of the image-recording means. The image-recording means are adapted to record a plurality of images with partially overlapping contents when they are being moved, the partially overlapping contents enabling the determination of how the image-recording means have been moved.

The invention is thus based on the idea of using images for determining how a unit is being moved. This technology can be used for two-dimensional as well as three-dimensional control. It is advantageous because it requires few sensors and no moving parts. The entire movement information is contained in the overlapping contents of the images. Because the device records images of the surroundings, an "absolute" position indication is obtained, making it possible to detect when the image-recording means are in a specific position, which, for example, is not possible when using control devices based on measuring acceleration. In addition to movement, turning can also be detected and used for controlling an object.

In one embodiment, the control device is designed for controlling an object in a plane. In this case, the overlapping images enable the determination of not only the movement of the image-recording means but also their turning in the plane, which, for example, is not possible when using a traditional mouse with a ball. Accordingly, the control device is advantageously adapted to control the angular position of the object in the plane. When the device is designed for control in a plane, the image-recording means are advantageously provided with a light-sensitive sensor means having a two-dimensional sensor surface, a so-called area sensor, for recording the images. In this context, a two-dimensional sensor surface refers to the fact that the sensor surface must be capable of imaging a surface with a matrix of pixels. CCD sensors and CMOS sensors are examples of suitable sensors. A single sensor is thus sufficient for providing control in a plane.

In an alternative embodiment, the device is designed for controlling an object in a space. In this case, too, the control device is advantageously adapted to control the angular position of the object, in which connection the control can take place about three axes. In an economical embodiment, it may be sufficient for the device to have two light-sensitive sensors, each having a two-dimensional sensor surface, for recording said images in two different directions. However, for more precise control in space, it is preferable for the image-recording means to comprise three sensors for recording the images in three, preferably perpendicular, directions. This enables the determination of the translation along three mutually perpendicular axes as well as of the rotation about these axes by means of relatively simple calculations.

Suitably, the control device has image-processing means for providing control signals for controlling the object. The image-processing means may be located in the same physical casing as the image-recording means, the output signals from this physical casing thus constituting the control signals for controlling the object which is to be controlled. However, the image-processing means may also be located in another physical casing, for example in a computer whose cursor constitutes the object which is to be controlled, or in a computer which in turn controls, or forms part of, a physical object which is controlled by means of the control device, the output signals from the image-processing means constituting the control signals for controlling the object. In this context, it should be noted that the control signals outputted from the image-processing means may require further processing before they can be used for direct control of the object. The image-processing means are advantageously implemented with the aid of a processor and software, but can also be implemented completely with the aid of hardware.

The image-processing means are suitably adapted to determine the relative positions of the images with the aid of the partially overlapping contents for providing said control signals. If the control device is used for control in three dimensions, this is suitably carried out in parallel with respect to all the sensors. The distance and direction of the movement, and thus the current position, can be determined on the basis of the relative positions of the images.

Advantageously, the control device has a calibration mode, in which the image-recording means are moved in a way that enables the image-processing means to relate the relative positions of the images to an actual movement of the image-recording means. As an alternative, the control device could be provided with a distance meter measuring the distance to the surfaces being imaged with the aid of the sensors, but that would, of course, be more expensive.

The image-processing means are suitably adapted to generate said control signals on the basis of at least one movement vector obtained from the relative positions of the images. Additionally, or alternatively, the image-processing means may be adapted to generate said control signals on the basis of at least one turning indication obtained from the relative positions of the images. The control signals can thus be used for controlling the turning of an object as well as its movement, which is an advantage compared to traditional mechanical computer mouses.
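The relationship described above between per-image movement vectors, turning indications, and the resulting position can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes each image match yields a device-frame translation (dx, dy) and a rotation dtheta, and that the speed follows from the displacement and the image-recording frequency.

```python
import math

def integrate_motion(matches, frame_rate_hz=50.0):
    """Accumulate per-frame match results into a pose and speed estimate.

    Each match is (dx, dy, dtheta): the translation (in pixels) and the
    rotation (in radians) that best align one image with the next.
    """
    x = y = heading = 0.0
    speeds = []
    for dx, dy, dtheta in matches:
        # Rotate the device-frame displacement into the fixed frame.
        fx = dx * math.cos(heading) - dy * math.sin(heading)
        fy = dx * math.sin(heading) + dy * math.cos(heading)
        x, y = x + fx, y + fy
        heading += dtheta
        # Speed follows from the displacement and the image-recording
        # frequency, as noted in the text.
        speeds.append(math.hypot(dx, dy) * frame_rate_hz)
    return (x, y), heading, speeds
```

Converting pixel displacements to an actual distance is exactly what the calibration mode above is for; the sketch leaves positions in pixel units.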
In the case of a control device for three-dimensional control, the image-processing means can combine information from all the sensors with respect to the relative positions of the images in order to generate one movement vector and one turning vector. In this way, the position of the image-recording means can be unambiguously determined. In other words, the control device can carry out a digitisation of the movement performed by a hand when it moves the image-recording means in order to enable a computer to control an object on the basis of this movement.

In one embodiment, the image-processing means may be adapted to generate said control signals on the basis of the speed at which the image-recording means have been moved, the speed being determined from the relative positions of the images and the image-recording frequency.

Suitably, the receiver of the control signals should know that the control signals are control signals so that it will know how the signals are to be subsequently processed. Consequently, the image-processing means are preferably adapted to output said control signals in such a way that a receiver can identify the control signals as being intended for controlling an object. This can, for example, be effected by the use of a predetermined protocol.

An advantage of using an image-based control device is that it becomes possible to determine when the image-recording means are in a predetermined position, since this position can be defined by means of one or several images. For example, it is possible to detect when the image-recording means have returned to their original position. For this purpose, the image-processing means are adapted to store at least one reference image and to compare images recorded subsequently with this image in order to generate a signal in the case of an essentially complete overlap. For instance, the user can define a certain position as a reference position by clicking on the control device in this position.

If the image-recording means and the image-processing means are located in different physical casings, the image-recording means may advantageously comprise a transmitter for wireless transmission of images from the image-recording means to the image-processing means. Moreover, especially if the image-recording and the image-processing means are located in the same physical casing, it may be an advantage if the image-processing means comprise a transmitter for wireless outputting of the control signals, for example to a computer whose cursor is to be controlled. In both cases, the control device is very easy to use since no flex is required for the information transfer. For example, a user can have a personal image-recording means or control device and use it with different computers or receivers of the control signals. The transmitter can be an IR transmitter, a radio transmitter, which, for example, uses the so-called Bluetooth standard, or some other transmitter which is suitable for wireless information transfer between two units located fairly close to each other.

In a preferred embodiment, the control device is a computer mouse, i.e. a device which can be connected to a computer and be used for positioning a cursor in one, two, or several dimensions.

The control device can be used in a first absolute mode or in a second relative mode. In the absolute mode, the movement of the controlled object is proportional to the movement of the image-recording means. In other words, the object moves in a way that corresponds to the movement of the image-recording means, regardless of where these are located. In the relative mode, however, the control device is configured so that the speed or acceleration of the controlled object increases when the distance increases between the image-recording means and a predefined origin of coordinates. In this way, it becomes possible to achieve faster movement of the object by holding the image-recording means farther away from the predefined origin, while, at the same time, precision control can be achieved by holding the image-recording means closer to the origin.

According to a second aspect of the invention, it relates to a control device having image-recording means which are adapted to be turned, preferably manually, for controlling an object as a function of the turning of the image-recording means. The control device is adapted to record a plurality of images with partially overlapping contents when the image-recording means are being turned, the partially overlapping contents of the images enabling the determination of how the image-recording means have been turned. This control device is thus based on the same idea as the control device described above, but instead of controlling the object as a function of the movement of the image-recording means, it is controlled as a function of their turning. This control device may, for example, be a trackball. The embodiments discussed above are to a large extent also applicable to the turning control device, and the same advantages are obtained.

According to a third aspect of the invention, it relates to a method of controlling an object, comprising the steps of moving a control device; recording, with the aid of the control device, a plurality of images with overlapping contents during the movement of the control device; and determining the movement of the control device with the aid of the contents of the overlapping images. The same advantages are obtained as those described with respect to the above-mentioned devices.
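The relative mode described above only requires that the speed (or acceleration) of the controlled object grow with the distance between the image-recording means and the predefined origin. One possible gain law, purely illustrative (the linear form, the function name, and the parameters are assumptions; the text does not fix any particular law):

```python
import math

def cursor_velocity(pos, origin, base_gain=1.0, k=0.5):
    """Map a device position to an object velocity in the relative mode.

    Speed grows with the distance from the origin, so holding the device
    far from the origin moves the object fast, while holding it close
    allows precision control.
    """
    dx, dy = pos[0] - origin[0], pos[1] - origin[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return (0.0, 0.0)
    speed = base_gain * (1.0 + k * dist)
    # Move along the direction from the origin to the device.
    return (speed * dx / dist, speed * dy / dist)
```

In the absolute mode, by contrast, the object displacement would simply be proportional to the device displacement, with no dependence on where the device is held.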
Brief Description of the Drawings

The present invention will be described in more detail below by way of exemplifying embodiments with reference to the accompanying drawings, in which

Fig. 1 schematically shows an embodiment of a control device according to the invention;
Fig. 2 is a block diagram of the electronic circuitry part of an embodiment of the control device according to the invention;
Fig. 3 schematically shows a second embodiment of a control device according to the invention;
Fig. 4 is a flowchart illustrating the operation of a control device for two-dimensional control;
Fig. 5 schematically shows an "open box" in which the control device in Fig. 3 can be used;
Fig. 6 schematically shows a movement of the control device according to the invention from a point (x, y, z) to a point (x+δx, y+δy, z+δz) in an orthonormal coordinate system with the axes ex, ey, and ez;
Fig. 7 schematically shows which translation scalars are outputted from the respective sensors when the control device is being moved (the index shows which sensor is generating the respective scalars); and
Fig. 8 schematically shows how the control device is intended to be moved in the calibration mode.

Description of Preferred Embodiments

The control device according to the invention can be implemented in embodiments of essentially two main types. A first embodiment of the control device according to the invention will be described below, which embodiment is intended to be used as a two-dimensional mouse. Next, a second embodiment of the control device will be described, which embodiment is intended to be used as a three-dimensional mouse. Finally, the operation of the two-dimensional and the three-dimensional mouse will be described. In both embodiments described, the image-recording means and the image-processing means are located in the same physical casing, from which control signals are outputted. As mentioned above, the image-processing means can also be located in a separate physical casing. It is very simple for the skilled person to carry out this modification.
Design of the Control Device

In the first embodiment of the control device shown in Fig. 1, it comprises a casing 1 having approximately the same shape as a conventional highlighter pen. One short side of the casing has a window 2, by the intermediary of which images are read into the device. The window 2 is somewhat recessed in the casing in order not to wear against the underlying surface.

The casing 1 essentially contains an optics part 3, an electronic circuitry part 4, and a power supply part 5.

The optics part 3 comprises a light-emitting diode 6, a lens system 7, and image-recording means in the form of a light-sensitive sensor 8, which constitutes the interface with the electronic circuitry part 4.

The task of the LED 6 is to illuminate a surface which is currently located under the window in the case where the control device is held directly against a surface or very close thereto. A diffuser 9 is mounted in front of the LED 6 for diffusing the light.

The lens system 7 has the task of projecting an image of the surface located under the window 2 onto the light-sensitive sensor 8 as accurately as possible.

In this example, the light-sensitive sensor 8 comprises a two-dimensional, square CCD unit (CCD = charge coupled device) with a built-in A/D converter. Such sensors are commercially available. The sensor 8 is mounted at a small angle to the window 2 and on its own circuit board 11.

The power supply to the control device is obtained from a battery 12, which is mounted in a separate compartment 13 in the casing.

The block diagram in Fig. 2 schematically shows the electronic circuitry part 4. This is located on a circuit board and comprises a processor 20, which by the intermediary of a bus 21 is connected to a ROM 22, in which the programs of the processor are stored, to a read/write memory 23, which constitutes the working memory of the processor and in which the images from the sensor are stored, to a control logic unit 24, as well as to the sensor 8 and the LED 6. The processor 20, the bus 21, the memories 22 and 23, the control logic unit 24, as well as the associated software together constitute image-processing means.

The control logic unit 24 is in turn connected to a number of peripheral units, comprising a radio transceiver 26 for transferring information to/from an external computer, buttons 27, by means of which the user can control the image-recording means and which can also be used as the clicking buttons of a traditional mouse, as well as an indicator 29, e.g. a LED, indicating when the mouse is ready to be used. Control signals to the memories, the sensor, and the peripheral units are generated in the control logic unit 24. The control logic also handles generation and prioritisation of interrupts to the processor. The buttons 27, the radio transceiver 26, and the LED 6 are accessed by the processor writing and reading in a register in the control logic unit 24. The buttons 27 generate interrupts to the processor 20 when they are activated.

Fig. 3 shows a second embodiment of the control device according to the invention. Like the first embodiment, this embodiment comprises a pen-shaped casing 31. Besides the window 32 on one short side of the casing, the device has two additional windows 32' and 32". Each of the windows 32, 32', 32" is somewhat recessed in the casing so that it will not wear or scratch should the control device impinge upon a surface when it is in use, or when it is in the idle position.

As in the above case, the casing 31 essentially contains an optics part 33, an electronic circuitry part 34, and a power supply part 5.

The optics part 33 comprises a lens package (not shown) with three lens systems and a set of sensors (not shown) with three light-sensitive sensors which constitute the interface to the electronic circuitry part 34 for the windows 32, 32', and 32", respectively. There is no light-emitting diode in this embodiment. The control device is intended to be held at a distance from the surfaces which are being imaged and, consequently, in most cases, ambient light is sufficient to permit images to be recorded.

The lens systems have the task of projecting images of the surfaces at which the windows 32, 32', 32" are directed onto the light-sensitive sensors as accurately as possible.

As in the above embodiment, the light-sensitive sensors comprise two-dimensional, square CCD units with built-in A/D converters. Each sensor is mounted on its own circuit board.

In this embodiment, too, the power supply to the control device is obtained from a battery, which is mounted in a separate compartment in the casing.

In this second embodiment, the design of the electronic circuitry part is essentially the same as that described above with respect to the first embodiment. The electronic circuitry part is shared by all three sensors.

Application of the Device as a Two-dimensional Mouse

The device according to the first embodiment can be used as a mouse for inputting movement information, by means of which a cursor can be controlled on a computer screen.

The user directs the window 2 of the control device at a patterned surface, e.g. a mouse pad. He presses one of the buttons 27 to activate the image-recording means, whereupon the processor 20 commands the LED 6 to begin generating strobe pulses at a predetermined frequency, suitably at least 50 Hz.
Subsequently, the user passes 35 the control device over the surface in the same way as if it were a traditional mouse, whereupon images with partially overlapping contents are recorded by the sensor WO 99/60469 PCT/SE99/00719 13 8 and stored in the read/write memory 23. The images are stored as images, i.e. with the aid of a plurality of pixels, each having a grey scale value in a range from white to black. 5 The flowchart in Fig. 4 shows the operation of the two-dimensional mouse in more detail. In step 400, a starting image is recorded. In step 401, the next image is recorded. The contents of this image partially overlap the contents of the previous image. 10 As soon as an image has been recorded in step 401, the process begins of determining how it overlaps the previous image, step 402, i.e. in which relative position the best match is obtained between the contents of the images. This determination is carried out by translating 15 the images vertically and horizontally relative to each other, and by rotating the images relative to each other. For this purpose, every possible overlap position between the images is examined, at the pixel level, and an over lap measurement is determined as follows: 20 1) For each overlapping pixel position, the grey scale values of the two relevant pixels are added up if the latter are not white. Such a pixel position in which none of the pixels is white is designated a plus posi tion. 25 2) The grey scale sums for all the plus positions are added up. 3) The neighbours of each pixel position are exa mined. If an overlapping pixel position is not a neigh bour of a plus position and consists of a pixel which is 30 white and a pixel position which is not white, the grey scale value of the non-white pixel is subtracted, pos sibly multiplied by a constant, from the sum in point 2). 4) The overlap position providing the highest over lap measurement as stated above is selected. 35 Our Swedish patent application No. 
9704924-1 and the corresponding U.S. application No. 024 641 describe an alternative way of matching the images in order to find the best overlap position. The content of these applications is herewith incorporated by reference.
As soon as the best overlap position between the current image and the previous image has been determined, the previous image is discarded, whereupon the current image becomes the previous image in relation to the next image recorded.
By determining the relative position of the two images, a movement vector is obtained, which indicates how far and in which direction the image-recording means have been moved between the recording of the two images. If the mouse has also been turned between the two images, a measurement of this turning is also obtained. Subsequently, a control signal, which includes the movement vector and the measurement of turning, is transmitted, step 403, by the radio transceiver 26 to the computer for which the control device is operating as a mouse. The computer uses the movement vector and the measurement of turning for positioning the cursor on its screen. Subsequently, the flow returns to step 401.
In order to increase the speed, the steps can be partly carried out in parallel, e.g. by starting the recording of the next image while the current image is being matched with the previous image.
When the mouse is activated, the buttons 27 can be used as clicking buttons for inputting instructions to the computer.
Application of the Device as a Three-dimensional Mouse
The device according to the second embodiment can be used as a mouse for inputting movement information, by means of which a cursor can be controlled in three dimensions on a computer screen, i.e. in a space.
As described above, the three-dimensional mouse comprises three sensors 32, 32', 32" having two-dimensional, light-sensitive sensor surfaces.
The main axes of the sensors are orientated along the x-, y-, and z-axes in an orthogonal coordinate system and have a two-dimensional spatial resolution of n x n pixels and a time resolution of m images per second. Each lens system provides a field of vision with an angle of vision of v radians for the associated sensor surface.
When the device is in use, the mouse movements are carried out in an "open box" 50 according to Fig. 5, which is defined by at least two side walls 51 and 52, which are orientated at right angles in relation to each other, and a floor 53. It is also possible for the mouse to be held freely in space, but this requires more complicated calculation algorithms than the ones that will be described below.
When the device is in use, the above method of determining the relative position of images is used for each sensor. Accordingly, the operation in this case can also be described by means of the flowchart in Fig. 4, but instead of recording individual images, a set of three images is recorded simultaneously. One movement vector and one turning vector are thus generated with the aid of the images recorded by each light-sensitive sensor, which vectors describe the movement carried out by the mouse between the recording of two consecutive images. These vectors are then included in a control signal which is transmitted to the object which is to be controlled by means of the mouse.
Furthermore, to enable the successful use of the mouse, it is necessary that the light conditions be such that the light-sensitive sensors are capable of recording images of sufficiently high quality to permit their processing as described above.
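The pixel-level overlap search (points 1-4 in the two-dimensional case above) is reused here for each sensor. The sketch below is an illustrative reconstruction under stated assumptions: images are square lists of rows of grey scale values in which 0 represents white and larger values represent darker shades; rotation is omitted and only vertical and horizontal translations are examined; the names `overlap_score` and `best_overlap` are hypothetical.

```python
WHITE = 0  # assumed grey scale convention: 0 = white, larger values = darker

def overlap_score(img_a, img_b, dx, dy):
    """Score one candidate overlap position: img_b shifted by (dx, dy) against img_a."""
    n = len(img_a)  # images assumed square, n x n
    total = 0
    plus = set()    # "plus positions": overlapping positions where neither pixel is white
    for y in range(n):
        for x in range(n):
            bx, by = x - dx, y - dy
            if 0 <= bx < n and 0 <= by < n:
                a, b = img_a[y][x], img_b[by][bx]
                if a != WHITE and b != WHITE:
                    total += a + b          # points 1 and 2: sum the grey scale values
                    plus.add((x, y))
    for y in range(n):                      # point 3: penalty pass
        for x in range(n):
            bx, by = x - dx, y - dy
            if 0 <= bx < n and 0 <= by < n:
                a, b = img_a[y][x], img_b[by][bx]
                if (a == WHITE) != (b == WHITE):  # exactly one of the two pixels is white
                    near_plus = any((x + i, y + j) in plus
                                    for i in (-1, 0, 1) for j in (-1, 0, 1))
                    if not near_plus:
                        total -= a if a != WHITE else b  # possibly times a constant
    return total

def best_overlap(img_a, img_b, max_shift=2):
    """Point 4: pick the shift giving the highest overlap measurement."""
    shifts = [(dx, dy)
              for dx in range(-max_shift, max_shift + 1)
              for dy in range(-max_shift, max_shift + 1)]
    return max(shifts, key=lambda s: overlap_score(img_a, img_b, *s))
```

With a dark dot in `img_b` lying one pixel to the right of the corresponding dot in `img_a`, the search returns the shift (-1, 0) that brings the two dots into register.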
In order further to facilitate the reader's understanding of how the movement of the mouse can control the object, a description of the calculations carried out to determine the movement of the mouse will now be provided by way of example, with reference to Fig. 6. In the calculations below it is assumed that the image-matching algorithm is of a simple type which, for each sensor, merely calculates the translation in two mutually perpendicular directions between two images.
Suppose that the mouse is located in the position (x, y, z) and that it has a rotation which can be described by means of the orthonormal rotation matrix R. The x-axis of the mouse thus points in the direction R·ex, the y-axis points in the direction R·ey, and the z-axis points in the direction R·ez. Also suppose that between the recording of two images, the mouse carries out a translational motion and/or a rotational motion according to:
(x, y, z) -> (x + δx, y + δy, z + δz)
R -> R·δR
In the local coordinate system of the mouse, the translation vectors can be defined as shown in Fig. 7. The first sensor records movements in the x and y directions, the second sensor records movements in the y and z directions, and the third sensor records movements in the x and z directions. Consequently, for any triplet of consecutive images, the translation scalars (x1, y1, y2, z2, x3, z3) describe the detected movement of the mouse. The translation scalars consist of the outputs from the image-matching algorithm for each sensor.
In order to calculate the rotation of the mouse, the effect of a rotation upon the translation scalars is calculated. Suppose that the mouse is rotated through an angle a which is sufficiently small that sin a ≈ a. For the sake of clarity, it is also assumed that the rotation takes place about the z-axis, by αz radians. This rotation results in the scalars
y2 = (n/v)·αz and x3 = -(n/v)·αz
where n is the number of pixels along one side of the sensor and v is the angle of vision of the sensor surface, expressed in radians. Thus, the following applies to all the axes:
x1 =  (n/v)·αy
y1 = -(n/v)·αx
y2 =  (n/v)·αz
z2 = -(n/v)·αy
x3 = -(n/v)·αz
z3 =  (n/v)·αx
By knowing the values of the translation scalars, which are output signals from the image-matching algorithm, the number of pixels n of the sensor surface along one side and the angle of vision v of the sensor, it is thus possible to calculate a rotation vector (αx, αy, αz) for the rotation of the mouse about the x-, y- and z-axes.
Furthermore, in order to calculate the translational motion, one must know the functional distance from each sensor to the ambient geometry. The functional distance is a constant which relates the output from the image-matching algorithm to the translational motion. The functional distance is determined by means of a calibration which will be described below. In the special case where the mouse moves inside an "open box" 50 as described above, the functional distance corresponds to the geometrical distance from the middle of the mouse to the respective walls 51, 52, and 53 of the box 50.
For the sake of clarity, a translation of a distance δx along the x-axis is considered. The effect of this translation upon the scalars x1 and x3 will be, respectively,
x1 = n·δx / (2·d1·tan(v/2))
and
x3 = n·δx / (2·d3·tan(v/2))
Here, d1 and d3 are the functional distances from the mouse to the projected surfaces with respect to (x1, y1) and (x3, z3).
The following is obtained if this is generalised to all the axes:
x1 = n·δx / (2·d1·tan(v/2))
y1 = n·δy / (2·d1·tan(v/2))
y2 = n·δy / (2·d2·tan(v/2))
z2 = n·δz / (2·d2·tan(v/2))
x3 = n·δx / (2·d3·tan(v/2))
z3 = n·δz / (2·d3·tan(v/2))
By knowing the values of the translation scalars, which are obtained as output signals from the image-matching algorithm, the number of pixels n of the sensor surface along one side, the angle of vision v of the sensor and the functional distances d1-d3 to the projected surfaces, it is thus possible to calculate a translation vector (δx, δy, δz) for the translation of the mouse along the x-, y- and z-axes.
To sum up, the translation scalars (x1, y1, y2, z2, x3, z3) which are obtained in the image matching thus depend on the rotation as well as the translation of the mouse. Knowing these and the other parameters described above, a translation vector (δx, δy, δz) and a rotation vector (αx, αy, αz) can be obtained by solving the following system of equations, which is solvable:
x1 = n·δx / (2·d1·tan(v/2)) + (n/v)·αy
y1 = n·δy / (2·d1·tan(v/2)) - (n/v)·αx
y2 = n·δy / (2·d2·tan(v/2)) + (n/v)·αz
z2 = n·δz / (2·d2·tan(v/2)) - (n/v)·αy
x3 = n·δx / (2·d3·tan(v/2)) - (n/v)·αz
z3 = n·δz / (2·d3·tan(v/2)) + (n/v)·αx
These vectors are then included in a control signal which is transmitted to the object controlled by means of the mouse, which signal indicates the new position of the object.
Calibration, i.e. the calculation of the functional distances d1, d2, and d3, can be carried out by moving the mouse along the edges of the open box. The mouse is moved along the x-, y-, and z-axes according to a sequence A-B-C shown in Fig. 8.
Each movement gives rise to two equations, which together give the following system of equations:
d1 - 0.5 = n / (2·x1·tan(v/2))    d2 - 0.5 = n / (2·y2·tan(v/2))    d2 - 0.5 = n / (2·z2·tan(v/2))
d3 + 0.5 = n / (2·x3·tan(v/2))    d1 - 0.5 = n / (2·y1·tan(v/2))    d3 - 0.5 = n / (2·z3·tan(v/2))
This overdefined system of equations contains all the information required to calculate the values of the functional distances d1, d2, and d3.
Furthermore, by means of the mouse according to this embodiment, the user can choose at a certain time to store the images which the sensors are currently recording in a memory. Subsequently, each set of recorded images is compared to the stored set of images and, when a complete overlap exists, a signal is generated to the user. This enables precision control of an object, since the user can find his way back to the exact position at which the mouse was located on a previous occasion. Naturally, the same principles can be used in the case of two-dimensional control of an object.
In another application of the mouse, only the rotational motion is detected. In this case, no calibration is required and it is sufficient to solve the equation presented above in connection with the discussion concerning rotation. In this application, the mouse can, for example, be mounted on a helmet or the like which is worn by the user and which, for example, is used in various types of virtual reality applications.
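To make the final solving step concrete, the sketch below assembles the 6x6 system relating the six translation scalars to (δx, δy, δz, αx, αy, αz) and solves it by plain Gaussian elimination. The matrix entries follow one self-consistent reading of the equations in the text (the signs of the n/v rotation terms depend on how the three sensors are orientated), and all function names are illustrative rather than part of the described device.

```python
import math

def motion_matrix(n, v, d1, d2, d3):
    """6x6 matrix mapping (dx, dy, dz, ax, ay, az) to (x1, y1, y2, z2, x3, z3)."""
    k1, k2, k3 = (n / (2 * d * math.tan(v / 2)) for d in (d1, d2, d3))
    r = n / v
    #          dx   dy   dz    ax    ay    az
    return [[ k1,   0,   0,    0,    r,    0],   # x1
            [  0,  k1,   0,   -r,    0,    0],   # y1
            [  0,  k2,   0,    0,    0,    r],   # y2
            [  0,   0,  k2,    0,   -r,    0],   # z2
            [ k3,   0,   0,    0,    0,   -r],   # x3
            [  0,   0,  k3,    r,    0,    0]]   # z3

def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting; a 6x6 system is tiny."""
    m = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]   # augmented matrix
    for col in range(m):
        pivot = max(range(col, m), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(m):
            if r != col and M[r][col]:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][m] / M[i][i] for i in range(m)]

def recover_motion(scalars, n, v, d1, d2, d3):
    """scalars = (x1, y1, y2, z2, x3, z3); returns [dx, dy, dz, ax, ay, az]."""
    return solve(motion_matrix(n, v, d1, d2, d3), list(scalars))
```

Running a known motion forward through the matrix and then through `recover_motion` reproduces the original translation and rotation, which is the self-consistency that makes the system usable in the control signal.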

Claims (25)

1. A control device having image-recording means which are adapted to be moved, preferably manually, for controlling an object as a function of the movement of the image-recording means; c h a r a c t e r i s e d in that the control device is adapted to record a plurality of images with partially overlapping contents when the image-recording means are being moved, the partially overlapping contents of the images enabling the determination of how the image-recording means have been moved.
2. A control device according to claim 1, wherein the control device is adapted to control said object in a plane.
3. A control device according to claim 2, wherein the control device is adapted to control the angular position of said object in said plane.
4. A control device according to claim 2 or 3, further comprising a light-sensitive sensor means (8) having a two-dimensional sensor surface for recording the images.
5. A control device according to claim 1, wherein the control device is designed for controlling said object in a space.
6. A control device according to claim 5, wherein the control device is adapted to control the angular position of said object in said space.
7. A control device according to claim 5 or 6, further comprising at least two light-sensitive sensor means (8) having a two-dimensional sensor surface for recording said images in two different directions.
8. A control device according to claim 5 or 6, further comprising three light-sensitive sensor means (8) having a two-dimensional sensor surface for recording said images in three linearly independent directions.
9. A control device according to any one of claims 1-8, further comprising image-processing means (20-24) for providing control signals for controlling said object.
10. A control device according to claim 9, wherein the image-processing means (20-24) are adapted to determine the relative positions of the images with the aid of the partially overlapping contents for providing said control signals.
11. A control device according to any one of claims 5-8, further comprising image-processing means (20-24) which are adapted to determine the relative positions of the images with the aid of the partially overlapping contents simultaneously with respect to all the light-sensitive sensor means (8) for providing said control signals.
12. A control device according to claim 11, the control device furthermore having a calibration mode, in which the image-recording means are moved in a way that enables the image-processing means (20-24) to relate the relative positions of the images to an actual movement of the image-recording means.
13. A control device according to any one of claims 10-12, wherein the image-processing means (20-24) are adapted to generate said control signals on the basis of at least one movement vector obtained from the relative positions of the images.
14. A control device according to any one of claims 10-13, wherein the image-processing means (20-24) are adapted to generate said control signals on the basis of at least one turning indication obtained from the relative positions of the images.
15. A control device according to any one of claims 10-14, wherein the image-processing means (20-24) are adapted to generate said control signals on the basis of the speed at which the image-recording means have been moved, the speed being determined from the relative positions of the images.
16. A control device according to claims 9-15, wherein the image-processing means (20-24) are adapted to output said control signals in a way that enables a receiver to identify the control signals as being intended for controlling an object.
17. A control device according to any one of claims 9-16, wherein the image-processing means (20-24) are furthermore adapted to store at least one reference image and to compare images recorded subsequently with that image in order to generate a signal in the case of an essentially complete overlap.
18. A control device according to any one of claims 9-17, wherein the image-processing means (20-24) comprise a transmitter (26) for wireless outputting of the control signals.
19. A control device according to any one of claims 9-18, wherein the image-recording means comprise a transmitter (26) for wireless transmission of images to the image-processing means (20-24).
20. A control device according to any one of the preceding claims, wherein the control device is a mouse.
21. A control device according to any one of the preceding claims, wherein the device has a first operating mode in which the control device is adapted to control said object in a way that enables its movement to be proportional to the movement of the image-recording means.
22. A control device according to any one of the preceding claims, wherein the device has a second operating mode in which the control device is adapted to control said object so that the speed of its movement is proportional to the distance between the image-recording means and a predefined origin.
23. A control device having image-recording means which are adapted to be turned, preferably manually, for controlling an object as a function of the turning of the image-recording means; c h a r a c t e r i s e d in that the control device is adapted to record a plurality of images with partially overlapping contents when the image-recording means are being turned, the partially overlapping contents of the images enabling the determination of how the image-recording means have been turned.
24. A method of controlling an object, comprising the steps of
- moving a control device;
- recording, with the aid of the control device, a plurality of images with overlapping contents during the movement of the control device, and
- determining the movement of the control device with the aid of the contents of the overlapping images.
25. A method of controlling an object according to claim 24, further comprising the step of
- determining the relative position of the images with the aid of the partially overlapping contents for providing control signals for controlling the object.
AU43033/99A 1998-04-30 1999-04-30 Control device and method of controlling an object Ceased AU758514B2 (en)

Applications Claiming Priority (9)

Application Number Priority Date Filing Date Title
SE9801535A SE511855C2 (en) 1998-04-30 1998-04-30 Handwritten character recording device for characters, symbols, graphs, calligraphy
SE9801535 1998-04-30
US9132398P 1998-06-30 1998-06-30
US60/091323 1998-06-30
SE9803456A SE512182C2 (en) 1998-04-30 1998-10-09 Hand held input unit such as input pen for personal computer
SE9803456 1998-10-09
US10581698P 1998-10-27 1998-10-27
US60/105816 1998-10-27
PCT/SE1999/000719 WO1999060469A1 (en) 1998-04-30 1999-04-30 Control device and method of controlling an object

Publications (2)

Publication Number Publication Date
AU4303399A true AU4303399A (en) 1999-12-06
AU758514B2 AU758514B2 (en) 2003-03-20

Family

ID=27484810

Family Applications (1)

Application Number Title Priority Date Filing Date
AU43033/99A Ceased AU758514B2 (en) 1998-04-30 1999-04-30 Control device and method of controlling an object

Country Status (9)

Country Link
EP (1) EP1073946A1 (en)
JP (1) JP2002516429A (en)
KR (1) KR20010052283A (en)
CN (1) CN1303494A (en)
AU (1) AU758514B2 (en)
BR (1) BR9910572A (en)
CA (1) CA2331075A1 (en)
IL (1) IL139103A0 (en)
WO (1) WO1999060469A1 (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7054487B2 (en) 2000-02-18 2006-05-30 Anoto Ip Lic Handelsbolag Controlling and electronic device
US6839453B1 (en) 2000-05-16 2005-01-04 The Upper Deck Company, Llc Method and apparatus for authenticating unique items such as sports memorabilia
KR100408518B1 (en) * 2001-04-12 2003-12-06 삼성전자주식회사 Pen input device and Measuring method of coordinate
ES2425076T3 (en) 2002-11-20 2013-10-11 Koninklijke Philips N.V. User interface system based on pointing device
SE0303370D0 (en) 2003-12-16 2003-12-16 Anoto Ab Method, apparatus, computer program and storage medium for recording a movement of a user unit
US7136054B2 (en) 2004-01-06 2006-11-14 Microsoft Corporation Camera-pen-tip mapping and calibration
US7263224B2 (en) * 2004-01-16 2007-08-28 Microsoft Corporation Strokes localization by m-array decoding and fast image matching
KR100675830B1 (en) * 2004-03-11 2007-01-29 주식회사 애트랩 Image sensor, optic pointing device and motion value calculation method of it
US7536051B2 (en) * 2005-02-17 2009-05-19 Microsoft Corporation Digital pen calibration by local linearization
JP2009020718A (en) * 2007-07-12 2009-01-29 Nec Commun Syst Ltd Radio input device and equipment operation system
US8054512B2 (en) 2007-07-30 2011-11-08 Palo Alto Research Center Incorporated System and method for maintaining paper and electronic calendars
CN101859205A (en) * 2009-04-08 2010-10-13 鸿富锦精密工业(深圳)有限公司 Hand input device and hand input system
CN105387802A (en) * 2015-10-13 2016-03-09 东莞市微大软件科技有限公司 Method for controlling movement of worktable of automatic image measuring instrument

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0112415B1 (en) * 1982-12-22 1987-03-18 International Business Machines Corporation A method and apparatus for continuously updating a display of the coordinates of a light pen
CN1122925A (en) * 1994-11-07 1996-05-22 颜艮山 Instant look mouse scanner

Also Published As

Publication number Publication date
EP1073946A1 (en) 2001-02-07
IL139103A0 (en) 2001-11-25
CN1303494A (en) 2001-07-11
AU758514B2 (en) 2003-03-20
KR20010052283A (en) 2001-06-25
BR9910572A (en) 2001-01-16
JP2002516429A (en) 2002-06-04
WO1999060469A1 (en) 1999-11-25
CA2331075A1 (en) 1999-11-25

Similar Documents

Publication Publication Date Title
US6198485B1 (en) Method and apparatus for three-dimensional input entry
US7257255B2 (en) Capturing hand motion
US7817134B2 (en) Pointing device
WO2013035554A1 (en) Method for detecting motion of input body and input device using same
US6906699B1 (en) Input unit, method for using the same and input system
AU758514B2 (en) Control device and method of controlling an object
US7006079B2 (en) Information input system
US7825898B2 (en) Inertial sensing input apparatus
US20020118181A1 (en) Absolute optical position determination
KR20010052282A (en) Input unit, method for using the same and input system
CN110785729B (en) Electronic device for generating analog strokes and for digital storage of analog strokes and input system and method for digitizing analog recordings
US20020158848A1 (en) Optical position determination on plain paper
JP2006190212A (en) Three-dimensional position input device
US20230300290A1 (en) Information display system, information display method, and non-transitory recording medium
EP1380006B1 (en) Handwritten character recording and recognition device
JP4292927B2 (en) Pen-type data input device and program
EP1073945B1 (en) Device and method for recording hand-written information
JP2006268854A (en) Method and system for determining position of handheld object based on acceleration of handheld object
MXPA00010533A (en) Control device and method of controlling an object
SE512182C2 (en) Hand held input unit such as input pen for personal computer
KR20020085099A (en) Pen-type computer input device
JP2023136239A (en) Information processing device, information processing system, supporting system, and information processing method
CN115253275A (en) Intelligent terminal, palm machine, virtual system and space positioning method of intelligent terminal
MXPA00010548A (en) Device and method for recording hand-written information
SE511855C2 (en) Handwritten character recording device for characters, symbols, graphs, calligraphy

Legal Events

Date Code Title Description
FGA Letters patent sealed or granted (standard patent)