WO1999060469A1 - Control device and method of controlling an object - Google Patents

Control device and method of controlling an object Download PDF

Info

Publication number
WO1999060469A1
Authority
WO
WIPO (PCT)
Prior art keywords
control device
image
images
recording means
control
Prior art date
Application number
PCT/SE1999/000719
Other languages
French (fr)
Inventor
Christer FÅHRAEUS
Ola Hugosson
Petter Ericson
Original Assignee
C Technologies Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from SE9801535A external-priority patent/SE511855C2/en
Priority claimed from SE9803456A external-priority patent/SE512182C2/en
Application filed by C Technologies Ab filed Critical C Technologies Ab
Priority to CA002331075A priority Critical patent/CA2331075A1/en
Priority to KR1020007012071A priority patent/KR20010052283A/en
Priority to JP2000550020A priority patent/JP2002516429A/en
Priority to IL13910399A priority patent/IL139103A0/en
Priority to EP99952125A priority patent/EP1073946A1/en
Priority to AU43033/99A priority patent/AU758514B2/en
Priority to BR9910572-1A priority patent/BR9910572A/en
Publication of WO1999060469A1 publication Critical patent/WO1999060469A1/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • G06F3/0317Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545Pens or stylus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means

Definitions

  • The present invention relates to a control device having image-recording means which are adapted to be moved, preferably manually, for controlling an object as a function of the movement of the image-recording means.
  • The invention also relates to a method of controlling an object.
  • A known control device is the so-called computer mouse, which is used for positioning a cursor on a computer screen.
  • The positioning is carried out by the user passing the mouse over a surface, the hand movement indicating how the cursor should be positioned.
  • The mouse generates positioning signals indicating how the mouse has been moved and thus how the cursor should be moved.
  • The most common type of mouse has a ball on its underside, which turns as a result of friction against the surface when the mouse is passed over the same and which in this connection drives position sensors which in turn generate the positioning signals.
  • The mouse can also be used for providing instructions to the computer by the intermediary of one or more buttons on which the user clicks.
  • JP 09190277 shows an optical mouse having one CCD line sensor for the X-axis and one CCD line sensor for the Y-axis. Data recorded by means of the CCD line sensors at a certain time is compared with data recorded at a subsequent time, whereby the movement of the mouse in the X direction and in the Y direction can be determined.
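The comparison of line-sensor data at two instants can be sketched as a search for the lag with maximum cross-correlation. This is a minimal illustration, assuming a correlation-based matcher; the patent abstract cited above does not specify the matching algorithm:

```python
import numpy as np

def estimate_shift(prev, curr):
    """Estimate the 1D displacement (in pixels) between two line-sensor
    readouts: the lag with maximum cross-correlation is taken as the
    movement along that sensor's axis."""
    prev = prev - prev.mean()
    curr = curr - curr.mean()
    corr = np.correlate(curr, prev, mode="full")
    # Lag 0 corresponds to index len(prev) - 1 of the full correlation.
    return int(np.argmax(corr)) - (len(prev) - 1)
```

With one such estimate per CCD line sensor, the movement components in the X direction and in the Y direction between two recording instants are obtained.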
  • A mouse is thus used for controlling a virtual object.
  • There are also control devices whose structure is similar to that of a mouse, but which are used for controlling physical objects instead.
  • Such control devices exist for controlling objects in two dimensions, i.e. in a plane, or in three dimensions, i.e. in a space.
  • WO 98/11528 describes a control device which provides a computer with three-dimensional information.
  • The device is based on three accelerometers which are placed in mutually perpendicular directions and which are capable of measuring acceleration or inclination in one to three directions.
  • The device can, for example, be placed on the user's head or it can be hand-held.
  • A computer mouse for inputting three-dimensional information to a computer is described in US Patent 5,506,605.
  • This computer mouse is hand-held and is intended to be held freely in space. Furthermore, it can comprise sensors for measuring various physical properties which are subsequently interpreted by suitable electronic means, converted into digital format, and input to the computer.
  • The position of the mouse in space is determined by position sensors, which may be based on light, acceleration, gyroscopes, etc.
  • Use is made of an ultrasonic sensor and a magnetic sensor.
  • The computer can subsequently generate tactile feedback in the form of vibrations which, for example, provide the user with information concerning the location of the mouse in relation to its desired location.
  • The invention relates to a control device having image-recording means which are adapted to be moved by a user, preferably manually, for controlling an object, which may be physical or virtual, as a function of the movement of the image-recording means.
  • The image-recording means are adapted to record a plurality of images with partially overlapping contents when they are being moved, the partially overlapping contents enabling the determination of how the image-recording means have been moved.
  • The invention is thus based on the idea of using images for determining how a unit is being moved.
  • This technology can be used for two-dimensional as well as three-dimensional control. It is advantageous because it requires few sensors and no moving parts.
  • The entire movement information is contained in the overlapping contents of the images.
  • Since the device records images of the surroundings, an "absolute" position indication is obtained, making it possible to detect when the image-recording means are in a specific position, which, for example, is not possible when using control devices based on measuring acceleration.
  • Turning can also be detected and used for controlling an object.
  • The control device may be designed for controlling an object in a plane.
  • In that case, the control device is advantageously adapted to control the angular position of the object in the plane.
  • The image-recording means are advantageously provided with a light-sensitive sensor means having a two-dimensional sensor surface, a so-called area sensor, for recording the images.
  • A two-dimensional sensor surface refers to the fact that the sensor surface must be capable of imaging a surface with a matrix of pixels.
  • CCD sensors and CMOS sensors are examples of suitable sensors. A single sensor is thus sufficient for providing control in a plane.
  • Alternatively, the device may be designed for controlling an object in a space.
  • In this case, the control device is advantageously adapted to control the angular position of the object, in which connection the control can take place about three axes.
  • It is preferable for the image-recording means to comprise three sensors for recording the images in three, preferably perpendicular, directions. This enables the determination of the translation along three mutually perpendicular axes, as well as of the rotation about these axes, by means of relatively simple calculations.
  • The control device has image-processing means for providing control signals for controlling the object.
  • The image-processing means may be located in the same physical casing as the image-recording means, the output signals from this physical casing thus constituting the control signals for controlling the object which is to be controlled.
  • The image-processing means may also be located in another physical casing, for example in a computer whose cursor constitutes the object which is to be controlled, or in a computer which in turn controls, or forms part of, a physical object which is controlled by means of the control device, the output signals from the image-processing means constituting the control signals for controlling the object.
  • Control signals outputted from the image-processing means may require further processing before they can be used for direct control of the object.
  • The image-processing means are advantageously implemented with the aid of a processor and software, but can also be implemented completely with the aid of hardware.
  • The image-processing means are suitably adapted to determine the relative positions of the images with the aid of the partially overlapping contents for providing said control signals. If the control device is used for control in three dimensions, this is suitably carried out in parallel with respect to all the sensors. The distance and direction of the movement, and thus the current position, can be determined on the basis of the relative positions of the images.
  • The control device has a calibration mode, in which the image-recording means are moved in a way that enables the image-processing means to relate the relative positions of the images to an actual movement of the image-recording means.
  • The control device could instead be provided with a distance meter measuring the distance to the surfaces being imaged with the aid of the sensors, but that would, of course, be more expensive.
  • The image-processing means are suitably adapted to generate said control signals on the basis of at least one movement vector obtained from the relative positions of the images.
  • The image-processing means may also be adapted to generate said control signals on the basis of at least one turning indication obtained from the relative positions of the images.
  • The control signals can thus be used for controlling the turning of an object as well as its movement, which is an advantage compared to traditional mechanical computer mice.
  • The image-processing means can combine information from all the sensors with respect to the relative positions of the images in order to generate one movement vector and one turning vector. In this way, the position of the image-recording means can be unambiguously determined.
  • The control device can thus carry out a digitisation of the movement performed by a hand when it moves the image-recording means, in order to enable a computer to control an object on the basis of this movement.
  • The image-processing means may be adapted to generate said control signals on the basis of the speed at which the image-recording means have been moved, the speed being determined from the relative positions of the images and the image-recording frequency.
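The speed determination follows directly from those two quantities. A minimal sketch, with the displacement between two consecutive images expressed in an arbitrary length unit (function name and units are illustrative, not from the patent):

```python
import math

def speed_from_frames(dx, dy, frame_rate_hz):
    """Speed of the image-recording means: the displacement (dx, dy) found
    between two consecutive images, multiplied by the image-recording
    frequency (images per second)."""
    return math.hypot(dx, dy) * frame_rate_hz
```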
  • The receiver of the control signals should know that the control signals are control signals, so that it will know how the signals are to be subsequently processed. Consequently, the image-processing means are preferably adapted to output said control signals in such a way that a receiver can identify the control signals as being intended for controlling an object. This can, for example, be effected by the use of a predetermined protocol.
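A minimal sketch of such a predetermined protocol; the magic byte and the field layout are invented for illustration, since the patent only requires that the receiver can identify control signals as such:

```python
import struct

MAGIC = 0xC7           # assumed marker identifying a control signal
FMT = "<Bfff"          # marker, dx, dy, turning measurement (little-endian)

def encode_control(dx, dy, turn):
    """Frame a movement vector and turning measurement as a control signal."""
    return struct.pack(FMT, MAGIC, dx, dy, turn)

def decode_control(packet):
    """Parse a packet; reject anything not marked as a control signal."""
    magic, dx, dy, turn = struct.unpack(FMT, packet)
    if magic != MAGIC:
        raise ValueError("not a control signal")
    return dx, dy, turn
```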
  • An advantage of using an image-based control device is that it becomes possible to determine when the image-recording means are in a predetermined position, since this position can be defined by means of one or several images. For example, it is possible to detect when the image-recording means have returned to their original position.
  • To this end, the image-processing means are adapted to store at least one reference image and to compare subsequently recorded images with this image in order to generate a signal in the case of an essentially complete overlap. For instance, the user can define a certain position as a reference position by clicking on the control device in this position.
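One way to test for an essentially complete overlap between the stored reference image and the current image is normalised correlation at zero offset; the specific measure and the threshold here are assumptions, not taken from the patent:

```python
import numpy as np

def matches_reference(ref, img, threshold=0.98):
    """True when the current image essentially coincides with the stored
    reference image (normalised correlation at zero offset above a threshold)."""
    ref = ref.astype(float) - ref.mean()
    img = img.astype(float) - img.mean()
    denom = np.linalg.norm(ref) * np.linalg.norm(img)
    if denom == 0.0:
        return False  # a blank image cannot define a position
    return float((ref * img).sum()) / denom >= threshold
```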
  • The image-recording means may advantageously comprise a transmitter for wireless transmission of images from the image-recording means to the image-processing means.
  • The image-processing means may comprise a transmitter for wireless outputting of the control signals, for example to a computer whose cursor is to be controlled.
  • The control device is then very easy to use, since no flex is required for the information transfer.
  • A user can thus have a personal image-recording means or control device and use it with different computers or receivers of the control signals.
  • The transmitter can be an IR transmitter, a radio transmitter which, for example, uses the so-called Bluetooth standard, or some other transmitter which is suitable for wireless information transfer between two units located fairly close to each other.
  • The control device may be a computer mouse, i.e. a device which can be connected to a computer and used for positioning a cursor in one, two, or several dimensions.
  • The control device can be used in a first, absolute mode or in a second, relative mode.
  • In the absolute mode, the movement of the controlled object is proportional to the movement of the image-recording means.
  • The object thus moves in a way that corresponds to the movement of the image-recording means, regardless of where these are located.
  • In the relative mode, the control device is configured so that the speed or acceleration of the controlled object increases as the distance between the image-recording means and a predefined origin of coordinates increases. In this way, it becomes possible to achieve faster movement of the object by holding the image-recording means farther away from the predefined origin, while, at the same time, precision control can be achieved by holding the image-recording means closer to the origin.
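The distance-dependent gain can be sketched as follows; the linear gain law and the constants are illustrative assumptions, since the patent only states that speed or acceleration increases with the distance from the origin:

```python
import math

def scaled_velocity(dx, dy, px, py, base_gain=1.0, gain_per_unit=0.5):
    """Velocity applied to the controlled object: the raw movement (dx, dy)
    of the image-recording means, amplified with their distance from the
    predefined origin of coordinates ((px, py) is their current position)."""
    gain = base_gain + gain_per_unit * math.hypot(px, py)
    return dx * gain, dy * gain
```

Close to the origin the gain is near `base_gain` (precision control); far from it the same hand movement moves the object proportionally faster.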
  • A second aspect of the invention relates to a control device having image-recording means which are adapted to be turned, preferably manually, for controlling an object as a function of the turning of the image-recording means.
  • The control device is adapted to record a plurality of images with partially overlapping contents when the image-recording means are being turned, the partially overlapping contents of the images enabling the determination of how the image-recording means have been turned.
  • This control device is thus based on the same idea as the control device described above, but instead of controlling the object as a function of the movement of the image-recording means, the object is controlled as a function of their turning.
  • This control device may, for example, be a trackball.
  • A third aspect of the invention relates to a method of controlling an object, comprising the steps of moving a control device; recording, with the aid of the control device, a plurality of images with overlapping contents during the movement of the control device; and determining the movement of the control device with the aid of the contents of the overlapping images.
  • Fig. 1 schematically shows an embodiment of a control device according to the invention;
  • Fig. 2 is a block diagram of the electronic circuitry part of an embodiment of the control device according to the invention;
  • Fig. 3 schematically shows a second embodiment of a control device according to the invention;
  • Fig. 4 is a flowchart illustrating the operation of a control device for two-dimensional control;
  • Fig. 5 schematically shows an "open box" in which the control device in Fig. 3 can be used;
  • Fig. 6 schematically shows a movement of the control device according to the invention from a point (x, y, z) to a point (x+Δx, y+Δy, z+Δz) in an orthonormal coordinate system with the axes e_x, e_y, and e_z;
  • Fig. 7 schematically shows which translation scalars are outputted from the respective sensors when the control device is being moved (the index shows which sensor generates the respective scalars);
  • Fig. 8 schematically shows how the control device is intended to be moved in the calibration mode.
  • The control device according to the invention can be implemented in embodiments of essentially two main types.
  • A first embodiment of the control device according to the invention will be described below, which embodiment is intended to be used as a two-dimensional mouse.
  • Subsequently, a second embodiment of the control device will be described, which embodiment is intended to be used as a three-dimensional mouse.
  • Finally, the operation of the two-dimensional and the three-dimensional mouse will be described.
  • In both embodiments, the image-recording means and the image-processing means are located in the same physical casing, from which control signals are outputted.
  • The image-processing means can also be located in a separate physical casing. It is very simple for the skilled person to carry out this modification.

Design of the Control Device
  • The control device shown in Fig. 1 comprises a casing 1 having approximately the same shape as a conventional highlighter pen.
  • One short side of the casing has a window 2, by the intermediary of which images are read into the device.
  • The window 2 is somewhat recessed in the casing in order not to wear against the underlying surface.
  • The casing 1 essentially contains an optics part 3, an electronic circuitry part 4, and a power supply part 5.
  • The optics part 3 comprises a light-emitting diode 6, a lens system 7, and image-recording means in the form of a light-sensitive sensor 8, which constitutes the interface with the electronic circuitry part 4.
  • The task of the LED 6 is to illuminate a surface which is currently located under the window in the case where the control device is held directly against a surface or very close thereto.
  • A diffuser 9 is mounted in front of the LED 6 for diffusing the light.
  • The lens system 7 has the task of projecting an image of the surface located under the window 2 onto the light-sensitive sensor 8 as accurately as possible.
  • The light-sensitive sensor 8 is a CCD (charge coupled device) unit.
  • Such sensors are commercially available.
  • The sensor 8 is mounted at a small angle to the window 2 and on its own circuit board 11.
  • The power supply to the control device is obtained from a battery 12, which is mounted in a separate compartment 13 in the casing.
  • The block diagram in Fig. 2 schematically shows the electronic circuitry part 4.
  • This is located on a circuit board and comprises a processor 20, which by the intermediary of a bus 21 is connected to a ROM 22, in which the programs of the processor are stored; to a read/write memory 23, which constitutes the working memory of the processor and in which the images from the sensor are stored; to a control logic unit 24; as well as to the sensor 8 and the LED 6.
  • The processor 20, the bus 21, the memories 22 and 23, the control logic unit 24, and the associated software together constitute the image-processing means.
  • The control logic unit 24 is in turn connected to a number of peripheral units, comprising a radio transceiver 26 for transferring information to/from an external computer; buttons 27, by means of which the user can control the image-recording means and which can also be used as the clicking buttons of a traditional mouse; and an indicator 29, e.g. an LED, indicating when the mouse is ready to be used.
  • Control signals to the memories, the sensor, and the peripheral units are generated in the control logic unit 24.
  • The control logic unit also handles generation and prioritisation of interrupts to the processor.
  • The buttons 27, the radio transceiver 26, and the LED 6 are accessed by the processor writing and reading in a register in the control logic unit 24.
  • Fig. 3 shows a second embodiment of the control device according to the invention. Like the first embodiment, this embodiment comprises a pen-shaped casing 31. Besides the window 32 on one short side of the casing, the device has two additional windows 32' and 32". Each of the windows 32, 32', 32" is somewhat recessed in the casing so that it will not wear or scratch should the control device impinge upon a surface when it is in use, or when it is in the idle position.
  • The casing 31 essentially contains an optics part 33, an electronic circuitry part 34, and a power supply part 35.
  • The optics part 33 comprises a lens package (not shown) with three lens systems and a set of sensors (not shown) with three light-sensitive sensors which constitute the interface to the electronic circuitry part 34 for the windows 32, 32', and 32" respectively. There is no light-emitting diode in this embodiment.
  • The control device is intended to be held at a distance from the surfaces which are being imaged and, consequently, in most cases, ambient light is sufficient to permit images to be recorded.
  • The lens systems have the task of projecting images of the surfaces at which the windows 32, 32', 32" are directed onto the light-sensitive sensors as accurately as possible.
  • The light-sensitive sensors comprise two-dimensional, square CCD units with built-in A/D converters. Each sensor is mounted on its own circuit board.
  • The power supply to the control device is obtained from a battery, which is mounted in a separate compartment in the casing.
  • The design of the electronic circuitry part is essentially the same as that described above with respect to the first embodiment.
  • The electronic circuitry part is shared by all three sensors.
Application of the Device as a Two-dimensional Mouse

  • The device according to the first embodiment can be used as a mouse for inputting movement information, by means of which a cursor can be controlled on a computer screen.
  • The user directs the window 2 of the control device at a patterned surface, e.g. a mouse pad. He presses one of the buttons 27 to activate the image-recording means, whereupon the processor 20 commands the LED 6 to begin generating strobe pulses at a predetermined frequency, suitably at least 50 Hz. Subsequently, the user passes the control device over the surface in the same way as if it were a traditional mouse, whereupon images with partially overlapping contents are recorded by the sensor 8 and stored in the read/write memory 23.
  • The images are stored as images, i.e. with the aid of a plurality of pixels, each having a grey scale value in a range from white to black.
  • In step 400, a starting image is recorded.
  • In step 401, the next image is recorded.
  • The contents of this image partially overlap the contents of the previous image.
  • In step 402, it is determined how this image overlaps the previous image, i.e. in which relative position the best match is obtained between the contents of the two images. This determination is carried out by translating the images vertically and horizontally relative to each other, and by rotating the images relative to each other. For this purpose, every possible overlap position between the images is examined at the pixel level, and an overlap measurement is determined as follows: 1) For each overlapping pixel position, the grey scale values of the two relevant pixels are added up if neither of them is white. Such a pixel position, in which neither of the pixels is white, is designated a plus position. 2) The grey scale sums for all the plus positions are added up.
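The overlap search of step 402 can be sketched as follows for pure translation. The rotation search described above is omitted for brevity, and the grey scale convention (0 = white, larger values darker) is an assumption:

```python
import numpy as np

def best_overlap(prev, curr):
    """Exhaustive search for the translation at which curr best matches prev.

    For every vertical/horizontal offset, pixel positions where *both*
    images are non-white ("plus positions") contribute the sum of their
    grey values; the offset with the largest total wins.  The returned
    (dy, dx) is the displacement of the image content from prev to curr.
    """
    h, w = prev.shape
    best = (0, 0, -1.0)
    for dy in range(-h + 1, h):
        for dx in range(-w + 1, w):
            # Aligned windows for the hypothesis curr[y+dy, x+dx] == prev[y, x].
            p = prev[max(-dy, 0):h + min(-dy, 0), max(-dx, 0):w + min(-dx, 0)]
            c = curr[max(dy, 0):h + min(dy, 0), max(dx, 0):w + min(dx, 0)]
            plus = (p > 0) & (c > 0)
            score = float((p + c)[plus].sum())
            if score > best[2]:
                best = (dy, dx, score)
    return best[:2]
```

The brute-force double loop mirrors the "every possible overlap position" wording; a real implementation would typically restrict the search to small offsets around the previous match.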
  • The previous image is then discarded, whereupon the current image becomes the previous image in relation to the next image recorded.
  • In this way, a movement vector is obtained, which indicates how far and in which direction the image-recording means have been moved between the recordings of the two images. If the mouse has also been turned between the two images, a measurement of this turning is obtained as well. Subsequently, a control signal, which includes the movement vector and the measurement of turning, is transmitted, step 403, by the radio transceiver 26 to the computer for which the control device is operating as a mouse. The computer uses the movement vector and the measurement of turning for positioning the cursor on its screen. Subsequently, the flow returns to step 401. In order to increase the speed, the steps can be partly carried out in parallel, e.g. by starting the recording of the next image while the current image is being matched with the previous image. When the mouse is activated, the buttons 27 can be used as clicking buttons for inputting instructions to the computer.

Application of the Device as a Three-dimensional Mouse
  • The device according to the second embodiment can be used as a mouse for inputting movement information, by means of which a cursor can be controlled in three dimensions on a computer screen, i.e. in a space.
  • The three-dimensional mouse comprises three sensors, behind the windows 32, 32', and 32", having two-dimensional, light-sensitive sensor surfaces.
  • The main axes of the sensors are orientated along the x-, y-, and z-axes of an orthogonal coordinate system; each sensor has a two-dimensional spatial resolution of n x n pixels and a time resolution of m images per second.
  • Each lens system provides a field of vision with an angle of vision of v radians for the associated sensor surface.
  • The mouse movements are carried out in an "open box" 50 according to Fig. 5, which is defined by at least two side walls 51 and 52, orientated at right angles in relation to each other, and a floor 53. The mouse could also be held freely in space, but this requires more complicated calculation algorithms than the ones described below.
  • The above method of determining the relative position of images is used for each sensor. Accordingly, the operation in this case can also be described by means of the flowchart in Fig. 4, but instead of recording individual images, a set of three images is recorded simultaneously. One movement vector and one turning vector are thus generated with the aid of the images recorded by each light-sensitive sensor, which vectors describe the movement carried out by the mouse between the recordings of two consecutive images. These vectors are then included in a control signal which is transmitted to the object which is to be controlled by means of the mouse.
  • It is assumed that the light conditions are such that the light-sensitive sensors are capable of recording images of sufficiently high quality to permit their processing as described above.
  • The x-axis of the mouse thus points in the direction R·e_x, the y-axis in the direction R·e_y, and the z-axis in the direction R·e_z. Also suppose that, between the recordings of two images, the mouse carries out a translational motion and/or a rotational motion according to:
  • The translation vectors can be defined as shown in Fig. 7.
  • The first sensor records movements in the x and y directions, the second sensor records movements in the y and z directions, and the third sensor records movements in the x and z directions. Consequently, for any triplet of consecutive images, the translation scalars (x1, y1, y2, z2, x3, z3) describe the detected movement of the mouse.
  • The translation scalars consist of the outputs from the image-matching algorithm for each sensor.
  • Here, n is the number of pixels along one side of the sensor surface and v is the angle of vision of the sensor surface, expressed in radians.
  • The functional distance is a constant which relates the output from the image-matching algorithm to the translational motion.
  • The functional distance is determined by means of a calibration which will be described below. In the special case where the mouse moves inside an "open box" 50 as described above, the functional distance corresponds to the geometrical distance from the middle of the mouse to the respective walls 51, 52, and 53 of the box 50.
  • Here, d1 and d3 are the functional distances from the mouse to the projected surfaces with respect to (x1, y1) and (x3, z3). The following is obtained if this is generalised to all the axes:
  • a translation vector ( ⁇ x, ⁇ y, ⁇ z) and a rotation vector ( ⁇ x , ⁇ y , ⁇ 2 ) can be obtained by solving the following system of equations, which is solvable. These vectors are then included in a control signal which is transmitted to the object controlled by means of the mouse, which signal indicates the new position of the object.
  • Calibration i.e. the calculation of the functional distances d , d 2 , and d 3 , can be carried out by moving the mouse along the edges of the open box.
  • the mouse is moved along the x- , y- , and z-axes according to a sequence A-B- C shown in Fig. 8.
  • the user can choose at a certain time to store the images which the sensors are currently recording in a memory. Subsequently, each set of recorded images is compared to the stored set of images and when a complete overlap exists, a signal is generated to the user.
  • This enables precision control of an object since the user can find his way back to the exact position at which the mouse was located on a previous occasion.
  • the same principles can be used in the case of two-dimensional control of an object.
  • the mouse In another application of the mouse, only the rotational motion is detected. In this case, no calibration is required and it is sufficient to solve the equation presented above in connection with the discussion concerning rotation.
  • the mouse can, for example, be mounted on a helmet or the like which is worn by the user and which, for example, is used in various types of virtual reality applications.


Abstract

A control device, for example a computer mouse, has image-recording means which are adapted to be moved, preferably manually, for controlling an object, for example a cursor on a computer screen, as a function of the movement of the image-recording means. The control device is adapted to record a plurality of images with partially overlapping contents when the image-recording means are being moved, the partially overlapping contents of the images enabling image-processing means to generate control signals indicating how the image-recording means have been moved. A method of controlling and a control device based on the turning of the image-recording means are also shown.

Description

CONTROL DEVICE AND METHOD OF CONTROLLING AN OBJECT
Field of the Invention
The present invention relates to a control device having image-recording means which are adapted to be moved, preferably manually, for controlling an object as a function of the movement of the image-recording means. The invention also relates to a method of controlling an object.

Background of the Invention
Today, personal computers are usually equipped with a control device, a so-called computer mouse, which is used for positioning a cursor on the computer screen. The positioning is carried out by the user passing the mouse over a surface, the hand movement indicating how the mouse should be positioned. The mouse generates positioning signals indicating how the mouse has been moved and thus how the cursor should be moved. Presently, the most common type of mouse has a ball on its underside, which turns as a result of friction against the surface when the mouse is passed over the same and which in this connection drives position sensors which in turn generate the positioning signals. Normally, the mouse can also be used for providing instructions to the computer by the intermediary of one or more buttons on which the user clicks.
Optical computer mouses are also known. JP 09190277 shows an optical mouse having one CCD line sensor for the X-axis and one CCD line sensor for the Y-axis. Data recorded by means of the CCD line sensors at a certain time is compared with data recorded at a subsequent time, whereby the movement of the mouse in the X direction and in the Y direction can be determined.
A mouse is thus used for controlling a virtual object. However, there are other control devices whose structure is similar to that of a mouse, but which are used for controlling physical objects instead.
Furthermore, there are control devices for controlling objects in two dimensions, i.e. in a plane, or in three dimensions, i.e. in a space.
WO 98/11528 describes a control device which provides a computer with three-dimensional information. The device is based on three accelerometers which are placed in mutually perpendicular directions and which are capable of measuring acceleration or inclination in one to three directions. The device can, for example, be placed on the user's head or it can be hand-held.
A computer mouse for inputting three-dimensional information to a computer is described in US Patent 5,506,605. This computer mouse is hand-held and is intended to be held freely in space. Furthermore, it can comprise sensors for measuring various physical properties which are subsequently interpreted by suitable electronic means, converted into digital format, and input to the computer. The position of the mouse in space is determined by position sensors, which may be based on light, acceleration, gyroscopes, etc. In the embodiment described, use is made of an ultrasonic sensor and a magnetic sensor. On the basis of the input, the computer can subsequently generate tactile feedback in the form of vibrations which, for example, provide the user with information concerning the location of the mouse in relation to its desired location.

Summary of the Invention

It is an object of the present invention to provide an improved control device and an improved method of controlling an object which are suited for both two-dimensional and three-dimensional control of physical as well as virtual objects. This object is achieved by control devices according to claims 1 and 23 and by a method according to claim 24. Preferred embodiments are stated in the subclaims.

Thus, according to a first aspect, the invention relates to a control device having image-recording means which are adapted to be moved by a user, preferably manually, for controlling an object, which may be physical or virtual, as a function of the movement of the image-recording means. The image-recording means are adapted to record a plurality of images with partially overlapping contents when they are being moved, the partially overlapping contents enabling the determination of how the image-recording means have been moved.
The invention is thus based on the idea of using images for determining how a unit is being moved. This technology can be used for two-dimensional as well as three-dimensional control. It is advantageous because it requires few sensors and no moving parts. The entire movement information is contained in the overlapping contents of the images. Because the device records images of the surroundings, an "absolute" position indication is obtained, making it possible to detect when the image-recording means are in a specific position, which, for example, is not possible when using control devices based on measuring acceleration. In addition to movement, turning can also be detected and used for controlling an object.

In one embodiment, the control device is designed for controlling an object in a plane. In this case, the overlapping images enable the determination of not only the movement of the image-recording means but also their turning in the plane, which, for example, is not possible when using a traditional mouse with a ball. Accordingly, the control device is advantageously adapted to control the angular position of the object in the plane. When the device is designed for control in a plane, the image-recording means are advantageously provided with a light-sensitive sensor means having a two-dimensional sensor surface, a so-called area sensor, for recording the images. In this context, a two-dimensional sensor surface refers to the fact that the sensor surface must be capable of imaging a surface with a matrix of pixels. CCD sensors and CMOS sensors are examples of suitable sensors. A single sensor is thus sufficient for providing control in a plane.
In an alternative embodiment, the device is designed for controlling an object in a space. In this case, too, the control device is advantageously adapted to control the angular position of the object, in which connection the control can take place about three axes. In an economical embodiment, it may be sufficient for the device to have two light-sensitive sensors each having a two-dimensional sensor surface for recording said images in two different directions. However, for more precise control in space, it is preferable for the image-recording means to comprise three sensors for recording the images in three, preferably perpendicular, directions. This enables the determination of the translation along three mutually perpendicular axes as well as of the rotation about these axes by means of relatively simple calculations.
Suitably, the control device has image-processing means for providing control signals for controlling the object. The image-processing means may be located in the same physical casing as the image-recording means, the output signals from this physical casing thus constituting the control signals for controlling the object which is to be controlled. However, the image-processing means may also be located in another physical casing, for example in a computer whose cursor constitutes the object which is to be controlled, or in a computer which in turn controls, or forms part of, a physical object which is controlled by means of the control device, the output signals from the image-processing means constituting the control signals for controlling the object. In this context, it should be noted that the control signals outputted from the image-processing means may require further processing before they can be used for direct control of the object. The image-processing means are advantageously implemented with the aid of a processor and software, but can also be implemented completely with the aid of hardware.
The image-processing means are suitably adapted to determine the relative positions of the images with the aid of the partially overlapping contents for providing said control signals. If the control device is used for control in three dimensions, this is suitably carried out in parallel with respect to all the sensors. The distance and direction of the movement, and thus the current position, can be determined on the basis of the relative positions of the images. Advantageously, the control device has a calibration mode, in which the image-recording means are moved in a way that enables the image-processing means to relate the relative positions of the images to an actual movement of the image-recording means. As an alternative, the control device could be provided with a distance meter measuring the distance to the surfaces being imaged with the aid of the sensors, but that would, of course, be more expensive.
The image-processing means are suitably adapted to generate said control signals on the basis of at least one movement vector obtained from the relative positions of the images.
Additionally, or alternatively, the image-processing means may be adapted to generate said control signals on the basis of at least one turning indication obtained from the relative position of the images. The control signals can thus be used for controlling the turning of an object as well as its movement, which is an advantage compared to traditional mechanical computer mouses. In the case of a control device for three-dimensional control, the image-processing means can combine information from all the sensors with respect to the relative positions of the images in order to generate one movement vector and one turning vector. In this way, the position of the image-recording means can be unambiguously determined. In other words, the control device can carry out a digitisation of the movement performed by a hand when it moves the image-recording means in order to enable a computer to control an object on the basis of this movement.
In one embodiment, the image-processing means may be adapted to generate said control signals on the basis of the speed at which the image-recording means have been moved, the speed being determined from the relative positions of the images and the image-recording frequency.
Suitably, the receiver of the control signals should know that the control signals are control signals so that it will know how the signals are to be subsequently processed. Consequently, the image-processing means are preferably adapted to output said control signals in such a way that a receiver can identify the control signals as being intended for controlling an object. This can, for example, be effected by the use of a predetermined protocol.
An advantage of using an image-based control device is that it becomes possible to determine when the image-recording means are in a predetermined position, since this position can be defined by means of one or several images. For example, it is possible to detect when the image-recording means have returned to their original position. For this purpose, the image-processing means are adapted to store at least one reference image and to compare images recorded subsequently with this image in order to generate a signal in the case of an essentially complete overlap. For instance, the user can define a certain position as a reference position by clicking on the control device in this position.
If the image-recording means and the image-processing means are located in different physical casings, the image-recording means may advantageously comprise a transmitter for wireless transmission of images from the image-recording means to the image-processing means. Moreover, especially if the image-recording and the image-processing means are located in the same physical casing, it may be an advantage if the image-processing means comprise a transmitter for wireless outputting of the control signals, for example to a computer whose cursor is to be controlled. In both cases, the control device is very easy to use since no flex is required for the information transfer. For example, a user can have a personal image-recording means or control device and use it with different computers or receivers of the control signals. The transmitter can be an IR transmitter, a radio transmitter, which, for example, uses the so-called Bluetooth standard, or some other transmitter which is suitable for wireless information transfer between two units located fairly close to each other.
In a preferred embodiment, the control device is a computer mouse, i.e. a device which can be connected to a computer and be used for positioning a cursor in one, two, or several dimensions.
The control device can be used in a first absolute mode or in a second relative mode. In the absolute mode, the movement of the controlled object is proportional to the movement of the image-recording means. In other words, the object moves in a way that corresponds to the movement of the image-recording means, regardless of where these are located. In the relative mode, however, the control device is configured so that the speed or acceleration of the controlled object increases when the distance increases between the image-recording means and a predefined origin of coordinates. In this way, it becomes possible to achieve faster movement of the object by holding the image-recording means farther away from the predefined origin, while, at the same time, precision control can be achieved by holding the image-recording means closer to the origin.
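The absolute and relative modes can be illustrated with a small sketch. The linear distance-based gain used for the relative mode is an illustrative assumption; the text above only requires that speed or acceleration increase with distance from the predefined origin, without fixing a particular scaling law.

```python
def cursor_step(movement, position, origin, mode="absolute", gain=1.0):
    """Map a measured movement vector of the image-recording means
    to a cursor step.

    In absolute mode the step is simply proportional to the movement.
    In relative mode the step is additionally scaled by the distance
    between the device and a predefined origin, so the cursor moves
    faster far from the origin and allows precision control close to
    it. The linear scaling is an illustrative choice.
    """
    if mode == "absolute":
        return tuple(gain * m for m in movement)
    # relative mode: scale by the distance from the predefined origin
    dist = sum((p - o) ** 2 for p, o in zip(position, origin)) ** 0.5
    return tuple(gain * dist * m for m in movement)
```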
According to a second aspect of the invention, it relates to a control device having image-recording means which are adapted to be turned, preferably manually, for controlling an object as a function of the turning of the image-recording means. The control device is adapted to record a plurality of images with partially overlapping contents when the image-recording means are being turned, the partially overlapping contents of the images enabling the determination of how the image-recording means have been turned.
This control device is thus based on the same idea as the control device described above, but instead of controlling the object as a function of the movement of the image-recording means, it is controlled as a function of their turning. This control device may, for example, be a trackball. The embodiments discussed above are to a large extent also applicable to the turning control device, and the same advantages are obtained.
According to a third aspect of the invention, it relates to a method of controlling an object, comprising the steps of moving a control device; recording, with the aid of the control device, a plurality of images with overlapping contents during the movement of the control device; and determining the movement of the control device with the aid of the contents of the overlapping images. The same advantages are obtained as those described with respect to the above-mentioned devices.

Brief Description of the Drawings
The present invention will be described in more detail below by way of exemplifying embodiments with reference to the accompanying drawings, in which
Fig. 1 schematically shows an embodiment of a control device according to the invention;

Fig. 2 is a block diagram of the electronic circuitry part of an embodiment of the control device according to the invention;
Fig. 3 schematically shows a second embodiment of a control device according to the invention;
Fig. 4 is a flowchart illustrating the operation of a control device for two-dimensional control;
Fig. 5 schematically shows an "open box" in which the control device in Fig. 3 can be used;

Fig. 6 schematically shows a movement of the control device according to the invention from a point (x,y,z) to a point (x+δx, y+δy, z+δz) in an orthonormal coordinate system with the axes ex, ey, and ez;
Fig. 7 schematically shows which translation scalars are outputted from the respective sensors when the control device is being moved (index shows which sensor is generating the respective scalars); and

Fig. 8 schematically shows how the control device is intended to be moved in the calibration mode.

Description of Preferred Embodiments
The control device according to the invention can be implemented in embodiments of essentially two main types. A first embodiment of the control device according to the invention will be described below, which embodiment is intended to be used as a two-dimensional mouse. Next, a second embodiment of the control device will be described, which embodiment is intended to be used as a three-dimensional mouse. Finally, the operation of the two-dimensional and the three-dimensional mouse will be described. In both embodiments described, the image-recording means and the image-processing means are located in the same physical casing, from which control signals are outputted. As mentioned above, the image-processing means can also be located in a separate physical casing. It is very simple for the skilled person to carry out this modification.

Design of the Control Device
In the first embodiment of the control device shown in Fig. 1, it comprises a casing 1 having approximately the same shape as a conventional highlighter pen. One short side of the casing has a window 2, by the intermediary of which images are read into the device. The window 2 is somewhat recessed in the casing in order not to wear against the underlying surface.
The casing 1 essentially contains an optics part 3, an electronic circuitry part 4, and a power supply part 5.
The optics part 3 comprises a light-emitting diode 6, a lens system 7, and image-recording means in the form of a light-sensitive sensor 8, which constitutes the interface with the electronic circuitry part 4.
The task of the LED 6 is to illuminate a surface which is currently located under the window in the case where the control device is held directly against a surface or very close thereto. A diffuser 9 is mounted in front of the LED 6 for diffusing the light.
The lens system 7 has the task of projecting an image of the surface located under the window 2 onto the light-sensitive sensor 8 as accurately as possible.
In this example, the light-sensitive sensor 8 comprises a two-dimensional, square CCD unit (CCD = charge coupled device) with a built-in A/D converter. Such sensors are commercially available. The sensor 8 is mounted at a small angle to the window 2 and on its own circuit board 11. The power supply to the control device is obtained from a battery 12, which is mounted in a separate compartment 13 in the casing.
The block diagram in Fig. 2 schematically shows the electronic circuitry part 4. This is located on a circuit board and comprises a processor 20, which by the intermediary of a bus 21 is connected to a ROM 22, in which the programs of the processor are stored, to a read/write memory 23, which constitutes the working memory of the processor and in which the images from the sensor are stored, to a control logic unit 24, as well as to the sensor 8 and the LED 6. The processor 20, the bus 21, the memories 22 and 23, the control logic unit 24, as well as the associated software together constitute image-processing means.
The control logic unit 24 is in turn connected to a number of peripheral units, comprising a radio transceiver 26 for transferring information to/from an external computer, buttons 27, by means of which the user can control the image-recording means and which can also be used as the clicking buttons of a traditional mouse, as well as an indicator 29, e.g. a LED, indicating when the mouse is ready to be used. Control signals to the memories, the sensor, and the peripheral units are generated in the control logic unit 24. The control logic also handles generation and prioritisation of interrupts to the processor. The buttons 27, the radio transceiver 26, and the LED 6 are accessed by the processor writing and reading in a register in the control logic unit 24. The buttons 27 generate interrupts to the processor 20 when they are activated.
Fig. 3 shows a second embodiment of the control device according to the invention. Like the first embodiment, this embodiment comprises a pen-shaped casing 31. Besides the window 32 on one short side of the casing, the device has two additional windows 32' and 32". Each of the windows 32, 32', 32" is somewhat recessed in the casing so that it will not wear or scratch should the control device impinge upon a surface when it is in use, or when it is in the idle position.
As in the above case, the casing 31 essentially contains an optics part 33, an electronic circuitry part 34, and a power supply part 5.
The optics part 33 comprises a lens package (not shown) with three lens systems and a set of sensors (not shown) with three light-sensitive sensors which constitute the interface to the electronic circuitry part 34 for the windows 32, 32' and 32" respectively. There is no light-emitting diode in this embodiment. The control device is intended to be held at a distance from the surfaces which are being imaged and, consequently, in most cases, ambient light is sufficient to permit images to be recorded.
The lens systems have the task of projecting images of the surfaces at which the windows 32, 32', 32" are directed onto the light-sensitive sensors as accurately as possible.
As in the above embodiment, the light-sensitive sensors comprise two-dimensional, square CCD units with built-in A/D converters. Each sensor is mounted on its own circuit board.
In this embodiment, too, the power supply to the control device is obtained from a battery, which is mounted in a separate compartment in the casing. In this second embodiment, the design of the electronic circuitry part is essentially the same as that described above with respect to the first embodiment. The electronic circuitry part is shared by all three sensors.

Application of the Device as a Two-dimensional Mouse

The device according to the first embodiment can be used as a mouse for inputting movement information, by means of which a cursor can be controlled on a computer screen.
The user directs the window 2 of the control device at a patterned surface, e.g. a mouse pad. He presses one of the buttons 27 to activate the image-recording means, whereupon the processor 20 commands the LED 6 to begin generating strobe pulses at a predetermined frequency, suitably at least 50 Hz. Subsequently, the user passes the control device over the surface in the same way as if it were a traditional mouse, whereupon images with partially overlapping contents are recorded by the sensor 8 and stored in the read/write memory 23. The images are stored as images, i.e. with the aid of a plurality of pixels, each having a grey scale value in a range from white to black.

The flowchart in Fig. 4 shows the operation of the two-dimensional mouse in more detail. In step 400, a starting image is recorded. In step 401, the next image is recorded. The contents of this image partially overlap the contents of the previous image. As soon as an image has been recorded in step 401, the process begins of determining how it overlaps the previous image, step 402, i.e. in which relative position the best match is obtained between the contents of the images. This determination is carried out by translating the images vertically and horizontally relative to each other, and by rotating the images relative to each other. For this purpose, every possible overlap position between the images is examined, at the pixel level, and an overlap measurement is determined as follows:

1) For each overlapping pixel position, the grey scale values of the two relevant pixels are added up if the latter are not white. Such a pixel position in which none of the pixels is white is designated a plus position.

2) The grey scale sums for all the plus positions are added up.
3) The neighbours of each pixel position are examined. If an overlapping pixel position is not a neighbour of a plus position and consists of one pixel which is white and one which is not, the grey scale value of the non-white pixel is subtracted, possibly multiplied by a constant, from the sum in point 2).
4) The overlap position providing the highest overlap measurement as stated above is selected.

Our Swedish patent application No. 9704924-1 and the corresponding U.S. application No. 024 641 describe an alternative way of matching the images in order to find the best overlap position. The content of these applications is herewith incorporated by reference.
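Steps 1) to 4) above can be sketched as follows. This is a minimal illustration under stated assumptions: white pixels are stored as grey value 0 and darker pixels as larger values, the rotation search is omitted (only vertical and horizontal translations are tried), neighbourhoods are 4-connected, and the penalty constant is an illustrative choice.

```python
import numpy as np

def overlap_score(img_a, img_b, dx, dy, penalty=1.0):
    """Score one candidate overlap position between two grey-scale
    images, img_b being shifted by (dx, dy) relative to img_a."""
    h, w = img_a.shape
    # overlapping window in img_a coordinates
    ax0, ax1 = max(0, dx), min(w, w + dx)
    ay0, ay1 = max(0, dy), min(h, h + dy)
    a = img_a[ay0:ay1, ax0:ax1].astype(float)
    b = img_b[ay0 - dy:ay1 - dy, ax0 - dx:ax1 - dx].astype(float)

    plus = (a > 0) & (b > 0)            # step 1: neither pixel is white
    score = float((a + b)[plus].sum())  # step 2: sum over plus positions

    # step 3: penalise half-white pairs not adjacent to a plus position
    near_plus = np.zeros_like(plus)
    near_plus[1:, :] |= plus[:-1, :]
    near_plus[:-1, :] |= plus[1:, :]
    near_plus[:, 1:] |= plus[:, :-1]
    near_plus[:, :-1] |= plus[:, 1:]
    half = ((a > 0) ^ (b > 0)) & ~near_plus
    score -= penalty * float(np.maximum(a, b)[half].sum())
    return score

def best_overlap(img_a, img_b, max_shift):
    """Step 4: try every candidate translation and keep the best one."""
    candidates = ((overlap_score(img_a, img_b, dx, dy), (dx, dy))
                  for dx in range(-max_shift, max_shift + 1)
                  for dy in range(-max_shift, max_shift + 1))
    score, shift = max(candidates)
    return shift, score
```

The negated best shift then indicates how far and in which direction the image-recording means have moved between the two recordings, i.e. the movement vector included in the control signal.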
As soon as the best overlap position between the current image and the previous image has been determined, the previous image is discarded, whereupon the current image becomes the previous image in relation to the next image recorded.
By determining the relative position of the two images a movement vector is obtained, which indicates how far and in which direction the image-recording means have been moved between the recording of the two images. If the mouse has also been turned between the two images, a measurement of this turning is also obtained. Subsequently, a control signal, which includes the movement vector and the measurement of turning, is transmitted, step 403, by the radio transceiver 26 to the computer for which the control device is operating as a mouse. The computer uses the movement vector and the measurement of turning for positioning the cursor on its screen. Subsequently, the flow returns to step 401. In order to increase the speed, the steps can be partly carried out in parallel, e.g. by starting the recording of the next image while the current image is being put together with the previous image. When the mouse is activated, the buttons 27 can be used as clicking buttons for inputting instructions to the computer.

Application of the Device as a Three-dimensional Mouse
The device according to the second embodiment can be used as a mouse for inputting movement information, by means of which a cursor can be controlled in three dimensions on a computer screen, i.e. in a space.
As described above, the three-dimensional mouse comprises three sensors, behind the windows 32, 32' and 32", having two-dimensional, light-sensitive sensor surfaces. The main axes of the sensors are orientated along the x-, y-, and z-axes in an orthogonal coordinate system and have a two-dimensional spatial resolution of n x n pixels and a time resolution of m images per second. Each lens system provides a field of vision with an angle of vision of v radians for the associated sensor surface. When the device is in use, the mouse movements are carried out in an "open box" 50 according to Fig. 5, which is defined by at least two side walls 51 and 52 which are orientated at right angles in relation to each other, and a floor 53. It is also possible that the mouse can be held freely in space, but this requires more complicated calculation algorithms than the ones that will be described below.
When the device is in use, the above method of determining the relative position of images is used for each sensor. Accordingly, the operation in this case can also be described by means of the flowchart in Fig. 4, but instead of recording individual images, a set of images consisting of three images is recorded simultaneously. One movement vector and one turning vector are thus generated with the aid of the images recorded by each light-sensitive sensor, which vectors describe the movement carried out by the mouse between the recording of two consecutive images. These vectors are then included in a control signal which is transmitted to the object which is to be controlled by means of the mouse.
Furthermore, to enable the successful use of the mouse, it is necessary that the light conditions be such that the light-sensitive sensors are capable of recording images of sufficiently high quality to permit their processing as described above.
In order further to facilitate the reader's understanding of how the movement of the mouse can control the object, a description of the calculations carried out to determine the movement of the mouse will now be provided by way of example, with reference to Fig. 6. In the calculations below it is assumed that the image-matching algorithm is of a simple type which for each sensor merely calculates the translation in two mutually perpendicular directions between two images. Suppose that the mouse is located in the position (x, y, z) and that it has a rotation which can be described by means of the orthonormal rotation matrix R. The x-axis of the mouse thus points in the direction R·ex, the y-axis points in the direction R·ey, and the z-axis points in the direction R·ez. Also suppose that between the recording of two images, the mouse carries out a translational motion and/or a rotational motion according to:
(x, y, z) → (x+δx, y+δy, z+δz)
R → R·δR

In the local coordinate system of the mouse, the translation vectors can be defined as shown in Fig. 7. The first sensor records movements in the x and y directions, the second sensor records movements in the y and z directions, and the third sensor records movements in the x and z directions. Consequently, for any triplet of consecutive images, the translation scalars (x1, y1, y2, z2, x3, z3) describe the detected movement of the mouse. The translation scalars consist of the outputs from the image-matching algorithm for each sensor.
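The bookkeeping above, i.e. which sensor contributes which translation scalars, can be sketched as follows. The `match` callback is a hypothetical interface standing in for the per-sensor image-matching algorithm; only the assignment of its outputs to the six scalars follows the text.

```python
def collect_translation_scalars(match):
    """Gather the translation scalars (x1, y1, y2, z2, x3, z3) for one
    triplet of simultaneously recorded images, where match(i) returns
    the 2-D translation reported by the image-matching algorithm for
    sensor i.

    Sensor 1 records movement in x and y, sensor 2 in y and z, and
    sensor 3 in x and z, as defined in connection with Fig. 7.
    """
    x1, y1 = match(1)
    y2, z2 = match(2)
    x3, z3 = match(3)
    return x1, y1, y2, z2, x3, z3
```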
In order to calculate the rotation of the mouse, the effect of a rotation upon the translation scalars is calculated. Suppose that the mouse is rotated through an angle α which is sufficiently small that sin α ≈ α. For the sake of clarity, it is also assumed that the rotation takes place about the z-axis, by αz radians. This rotation results in the scalars
$$y_2 = \frac{n}{v}\,\alpha_z \qquad \text{and} \qquad x_3 = -\frac{n}{v}\,\alpha_z$$

where n is the number of pixels along one side of the sensor and v is the angle of vision of the sensor surface, expressed in radians. Thus, the following applies to all the axes, the sign pattern corresponding to the sensor orientations shown in Fig. 7:

$$\begin{pmatrix} x_1 \\ y_1 \\ y_2 \\ z_2 \\ x_3 \\ z_3 \end{pmatrix} = \frac{n}{v} \begin{pmatrix} 0 & 1 & 0 \\ -1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & -1 & 0 \\ 0 & 0 & -1 \\ -1 & 0 & 0 \end{pmatrix} \begin{pmatrix} \alpha_x \\ \alpha_y \\ \alpha_z \end{pmatrix}$$
By knowing the values of the translation scalars, which are output signals from the image-matching algorithm, the number of pixels n along one side of the sensor surface and the angle of vision v of the sensor, it is thus possible to calculate a rotation vector (αx, αy, αz) for the rotation of the mouse about the x-, y-, and z-axes.
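The rotation vector can thus be recovered from the six scalars with a few lines of arithmetic. The sketch below assumes one self-consistent sign convention for the sensor mounting (in a real device the signs are fixed by the construction shown in Fig. 7); since each rotation axis is observed by two sensors, the two observations are averaged:

```python
def rotation_vector(scalars, n, v):
    """Recover the rotation angles (alpha_x, alpha_y, alpha_z), in radians,
    from the six translation scalars (x1, y1, y2, z2, x3, z3).

    A small rotation alpha about an axis shifts the images of the two
    sensors whose optical axes are perpendicular to that axis by
    (n / v) * alpha pixels.  The sign pattern below is an assumed sensor
    mounting; averaging the two observations per axis suppresses noise.
    """
    x1, y1, y2, z2, x3, z3 = scalars
    k = v / n  # converts a pixel shift into radians
    alpha_x = -k * (y1 + z3) / 2.0
    alpha_y = k * (x1 - z2) / 2.0
    alpha_z = k * (y2 - x3) / 2.0
    return alpha_x, alpha_y, alpha_z
```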
Furthermore, in order to calculate the translational motion, one must know the functional distance from each sensor to the ambient geometry. The functional distance is a constant which relates the output from the image-matching algorithm to the translational motion. The functional distance is determined by means of a calibration which will be described below. In the special case where the mouse moves inside an "open box" 50 as described above, the functional distance corresponds to the geometrical distance from the middle of the mouse to the respective walls 51, 52, and 53 of the box 50.
For the sake of clarity, consider a translation of a distance δx along the x-axis. The effect of the translation on the scalars x1 and x3 will be, respectively,

$$x_1 = \frac{n\,\delta x}{2 d_1 \tan\frac{v}{2}} \qquad \text{and} \qquad x_3 = \frac{-\,n\,\delta x}{2 d_3 \tan\frac{v}{2}}$$
Here, d1 and d3 are the functional distances from the mouse to the projected surfaces with respect to (x1, y1) and (x3, z3). The following is obtained if this is generalised to all the axes:

$$\begin{pmatrix} x_1 \\ y_1 \\ y_2 \\ z_2 \\ x_3 \\ z_3 \end{pmatrix} = \begin{pmatrix} \dfrac{n}{2 d_1 \tan\frac{v}{2}} & 0 & 0 \\ 0 & \dfrac{n}{2 d_1 \tan\frac{v}{2}} & 0 \\ 0 & \dfrac{n}{2 d_2 \tan\frac{v}{2}} & 0 \\ 0 & 0 & \dfrac{n}{2 d_2 \tan\frac{v}{2}} \\ -\dfrac{n}{2 d_3 \tan\frac{v}{2}} & 0 & 0 \\ 0 & 0 & \dfrac{n}{2 d_3 \tan\frac{v}{2}} \end{pmatrix} \begin{pmatrix} \delta x \\ \delta y \\ \delta z \end{pmatrix}$$
By knowing the values of the translation scalars, which are obtained as output signals from the image-matching algorithm, the number of pixels n along one side of the sensor surface, the angle of vision v of the sensor and the functional distances d1-d3 to the projected surfaces, it is thus possible to calculate a translation vector (δx, δy, δz) for the translation of the mouse along the x-, y-, and z-axes.
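Inverting the relation above per axis gives the translation vector directly. The following sketch assumes, as in the example in the text, that only the x3 scalar is negated (the exact signs being a property of the sensor mounting), and averages the two sensors that observe each axis:

```python
import math

def translation_vector(scalars, distances, n, v):
    """Recover the translation (dx, dy, dz) from the six translation
    scalars, given the functional distances (d1, d2, d3), the sensor
    resolution n and the angle of vision v in radians.

    A translation t seen by a sensor at functional distance d produces a
    pixel shift of t * n / (2 * d * tan(v / 2)); inverting this for the
    two sensors observing each axis and averaging gives the motion.
    """
    x1, y1, y2, z2, x3, z3 = scalars
    d1, d2, d3 = distances
    c = 2.0 * math.tan(v / 2.0) / n  # pixel shift times distance -> length
    dx = c * (x1 * d1 - x3 * d3) / 2.0
    dy = c * (y1 * d1 + y2 * d2) / 2.0
    dz = c * (z2 * d2 + z3 * d3) / 2.0
    return dx, dy, dz
```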
To sum up, the translation scalars (x1, y1, y2, z2, x3, z3) which are obtained in the image matching thus depend on the rotation as well as the translation of the mouse. Knowing these and the other parameters described above, a translation vector (δx, δy, δz) and a rotation vector (αx, αy, αz) can be obtained by solving the following system of equations, which has a unique solution. These vectors are then included in a control signal which is transmitted to the object controlled by means of the mouse, which signal indicates the new position of the object.
$$\begin{pmatrix} x_1 \\ y_1 \\ y_2 \\ z_2 \\ x_3 \\ z_3 \end{pmatrix} = \begin{pmatrix} \dfrac{n}{2 d_1 \tan\frac{v}{2}} & 0 & 0 & 0 & \dfrac{n}{v} & 0 \\ 0 & \dfrac{n}{2 d_1 \tan\frac{v}{2}} & 0 & -\dfrac{n}{v} & 0 & 0 \\ 0 & \dfrac{n}{2 d_2 \tan\frac{v}{2}} & 0 & 0 & 0 & \dfrac{n}{v} \\ 0 & 0 & \dfrac{n}{2 d_2 \tan\frac{v}{2}} & 0 & -\dfrac{n}{v} & 0 \\ -\dfrac{n}{2 d_3 \tan\frac{v}{2}} & 0 & 0 & 0 & 0 & -\dfrac{n}{v} \\ 0 & 0 & \dfrac{n}{2 d_3 \tan\frac{v}{2}} & -\dfrac{n}{v} & 0 & 0 \end{pmatrix} \begin{pmatrix} \delta x \\ \delta y \\ \delta z \\ \alpha_x \\ \alpha_y \\ \alpha_z \end{pmatrix}$$
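Solving such a 6x6 linear system is routine once the matrix is built. The sketch below assumes one self-consistent (and invertible) sign convention for the sensor mounting; a real device would use the signs fixed by its construction per Fig. 7:

```python
import math
import numpy as np

def motion_matrix(distances, n, v):
    """Build the 6x6 matrix A relating the motion (dx, dy, dz, ax, ay, az)
    to the six translation scalars (x1, y1, y2, z2, x3, z3).

    Columns 0-2 carry the translation terms n / (2 * d_i * tan(v / 2));
    columns 3-5 carry the rotation terms n / v.  The sign pattern is an
    assumed sensor mounting chosen so that the matrix is invertible.
    """
    k1, k2, k3 = (n / (2.0 * d * math.tan(v / 2.0)) for d in distances)
    r = n / v
    return np.array([
        [ k1, 0.0, 0.0, 0.0,   r, 0.0],  # x1
        [0.0,  k1, 0.0,  -r, 0.0, 0.0],  # y1
        [0.0,  k2, 0.0, 0.0, 0.0,   r],  # y2
        [0.0, 0.0,  k2, 0.0,  -r, 0.0],  # z2
        [-k3, 0.0, 0.0, 0.0, 0.0,  -r],  # x3
        [0.0, 0.0,  k3,  -r, 0.0, 0.0],  # z3
    ])

def solve_motion(scalars, distances, n, v):
    """Solve A * motion = scalars for (dx, dy, dz, ax, ay, az)."""
    A = motion_matrix(distances, n, v)
    return np.linalg.solve(A, np.asarray(scalars, dtype=float))
```

A round trip (build the scalars from a known motion, then solve) recovers the motion exactly, which also confirms that this sign pattern yields a non-singular system.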
Calibration, i.e. the calculation of the functional distances d1, d2, and d3, can be carried out by moving the mouse along the edges of the open box. The mouse is moved along the x-, y-, and z-axes according to a sequence A-B-C shown in Fig. 8. Each movement gives rise to two equations, which together give the following system of equations:

$$d_1 - 0.5 = \frac{n}{2 x_1 \tan\frac{v}{2}}, \qquad d_2 - 0.5 = \frac{n}{2 y_2 \tan\frac{v}{2}}, \qquad d_2 + 0.5 = \frac{n}{2 z_2 \tan\frac{v}{2}}$$

$$d_3 + 0.5 = -\,\frac{n}{2 x_3 \tan\frac{v}{2}}, \qquad d_1 - 0.5 = \frac{n}{2 y_1 \tan\frac{v}{2}}, \qquad d_3 - 0.5 = \frac{n}{2 z_3 \tan\frac{v}{2}}$$

This overdefined system of equations contains all the information required to calculate the values of the functional distances d1, d2, and d3.
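An overdefined linear system of this form is naturally solved by least squares. In the sketch below, the ±0.5 offsets are taken to reflect the position of the mouse relative to the middle of the box during each edge movement (an assumption about the geometry of Fig. 8), and the x3 term is negated in line with the sign convention used for that scalar:

```python
import math
import numpy as np

def calibrate(cal_scalars, n, v):
    """Estimate the functional distances (d1, d2, d3) from the six
    translation scalars recorded during the calibration movements A-B-C.

    cal_scalars maps each scalar name ("x1", "y1", ...) to its recorded
    value.  Each relation has the form d_i +/- 0.5 = n / (2*s*tan(v/2));
    the overdetermined system is solved by least squares.
    """
    t = lambda scal: n / (2.0 * scal * math.tan(v / 2.0))
    # Each row selects which distance the equation constrains; b carries
    # the measured right-hand side minus the geometric +/- 0.5 offset.
    A = np.array([
        [1.0, 0.0, 0.0],  # d1 - 0.5 = t(x1)
        [0.0, 1.0, 0.0],  # d2 - 0.5 = t(y2)
        [0.0, 1.0, 0.0],  # d2 + 0.5 = t(z2)
        [0.0, 0.0, 1.0],  # d3 + 0.5 = -t(x3)
        [1.0, 0.0, 0.0],  # d1 - 0.5 = t(y1)
        [0.0, 0.0, 1.0],  # d3 - 0.5 = t(z3)
    ])
    s = cal_scalars
    b = np.array([
        t(s["x1"]) + 0.5,
        t(s["y2"]) + 0.5,
        t(s["z2"]) - 0.5,
        -t(s["x3"]) - 0.5,
        t(s["y1"]) + 0.5,
        t(s["z3"]) + 0.5,
    ])
    d, *_ = np.linalg.lstsq(A, b, rcond=None)
    return tuple(d)
```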
Furthermore, by means of the mouse according to this embodiment, the user can choose at a certain time to store the images which the sensors are currently recording in a memory. Subsequently, each set of recorded images is compared with the stored set of images and, when a complete overlap exists, a signal is generated to the user. This enables precision control of an object, since the user can find his way back to the exact position at which the mouse was located on a previous occasion. Naturally, the same principle can be used in the case of two-dimensional control of an object.

In another application of the mouse, only the rotational motion is detected. In this case, no calibration is required and it is sufficient to solve the equation presented above in connection with the discussion concerning rotation. In this application, the mouse can, for example, be mounted on a helmet or the like which is worn by the user and which, for example, is used in various types of virtual reality applications.
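The stored-reference comparison can be sketched as a per-image difference test. The direct pixel comparison and the tolerance threshold below are illustrative assumptions, since the text only requires detecting an essentially complete overlap between the current and stored image sets:

```python
import numpy as np

def matches_reference(current_set, reference_set, tolerance=2.0):
    """Signal whether the currently recorded set of images essentially
    coincides with a stored reference set.

    Rather than requiring bit-identical images, each current image is
    compared with its stored counterpart by mean absolute difference;
    the tolerance is a tuning parameter, not specified by the text.
    """
    return all(
        np.mean(np.abs(cur.astype(float) - ref.astype(float))) < tolerance
        for cur, ref in zip(current_set, reference_set)
    )
```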

Claims

1. A control device having image-recording means which are adapted to be moved, preferably manually, for controlling an object as a function of the movement of the image-recording means; c h a r a c t e r i s e d in that the control device is adapted to record a plurality of images with partially overlapping contents when the image-recording means are being moved, the partially overlapping contents of the images enabling the determination of how the image-recording means have been moved.
2. A control device according to claim 1, wherein the control device is adapted to control said object in a plane.
3. A control device according to claim 2, wherein the control device is adapted to control the angular position of said object in said plane.
4. A control device according to claim 2 or 3, further comprising a light-sensitive sensor means (8) having a two-dimensional sensor surface for recording the images.
5. A control device according to claim 1, wherein the control device is designed for controlling said object in a space.
6. A control device according to claim 5, wherein the control device is adapted to control the angular position of said object in said space.
7. A control device according to claim 5 or 6, further comprising at least two light-sensitive sensor means (8) having a two-dimensional sensor surface for recording said images in two different directions.
8. A control device according to claim 5 or 6, further comprising three light-sensitive sensor means (8) having a two-dimensional sensor surface for recording said images in three linearly independent directions.
9. A control device according to any one of claims 1-8, further comprising image-processing means (20-24) for providing control signals for controlling said object .
10. A control device according to claim 9, wherein the image-processing means (20-24) are adapted to determine the relative positions of the images with the aid of the partially overlapping contents for providing said control signals.
11. A control device according to any one of claims 5-8, further comprising image-processing means (20-24) which are adapted to determine the relative positions of the images with the aid of the partially overlapping contents simultaneously with respect to all the light-sensitive sensor means (8) for providing said control signals.
12. A control device according to claim 11, the control device furthermore having a calibration mode, in which the image-recording means are moved in a way that enables the image-processing means (20-24) to relate the relative positions of the images to an actual movement of the image-recording means.
13. A control device according to any one of claims 10-12, wherein the image-processing means (20-24) are adapted to generate said control signals on the basis of at least one movement vector obtained from the relative positions of the images.
14. A control device according to any one of claims 10-13, wherein the image-processing means (20-24) are adapted to generate said control signals on the basis of at least one turning indication obtained from the relative positions of the images.
15. A control device according to any one of claims 10-14, wherein the image-processing means (20-24) are adapted to generate said control signals on the basis of the speed at which the image-recording means have been moved, the speed being determined from the relative positions of the images.
16. A control device according to claims 9-15, wherein the image-processing means (20-24) are adapted to output said control signals in a way that enables a receiver to identify the control signals as being intended for controlling an object.
17. A control device according to any one of claims 9-16, wherein the image-processing means (20-24) are furthermore adapted to store at least one reference image and to compare images recorded subsequently with that image in order to generate a signal in the case of an essentially complete overlap.
18. A control device according to any one of claims 9-17, wherein the image-processing means (20-24) comprise a transmitter (26) for wireless outputting of the control signals.
19. A control device according to any one of claims 9-18, wherein the image-recording means comprise a transmitter (26) for wireless transmission of images to the image-processing means (20-24) .
20. A control device according to any one of the preceding claims, wherein the control device is a mouse.
21. A control device according to any one of the preceding claims, wherein the device has a first operating mode in which the control device is adapted to control said object in a way that enables its movement to be proportional to the movement of the image-recording means.
22. A control device according to any one of the preceding claims, wherein the device has a second operating mode in which the control device is adapted to control said object so that the speed of its movement is proportional to the distance between the image-recording means and a predefined origin.
23. A control device having image-recording means which are adapted to be turned, preferably manually, for controlling an object as a function of the turning of the image-recording means; c h a r a c t e r i s e d in that the control device is adapted to record a plurality of images with partially overlapping contents when the image-recording means are being turned, the partially overlapping contents of the images enabling the determination of how the image-recording means have been turned.
24. A method of controlling an object, comprising the steps of
- moving a control device;
- recording, with the aid of the control device, a plurality of images with overlapping contents during the movement of the control device, and
- determining the movement of the control device with the aid of the contents of the overlapping images.
25. A method of controlling an object according to claim 24, further comprising the step of
- determining the relative position of the images with the aid of the partially overlapping contents for providing control signals for controlling the object.
PCT/SE1999/000719 1998-04-30 1999-04-30 Control device and method of controlling an object WO1999060469A1 (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
CA002331075A CA2331075A1 (en) 1998-04-30 1999-04-30 Control device and method of controlling an object
KR1020007012071A KR20010052283A (en) 1998-04-30 1999-04-30 Control device and method of controlling an object
JP2000550020A JP2002516429A (en) 1998-04-30 1999-04-30 Control device and control method of object
IL13910399A IL139103A0 (en) 1998-04-30 1999-04-30 Control device and method of controlling an object
EP99952125A EP1073946A1 (en) 1998-04-30 1999-04-30 Control device and method of controlling an object
AU43033/99A AU758514B2 (en) 1998-04-30 1999-04-30 Control device and method of controlling an object
BR9910572-1A BR9910572A (en) 1998-04-30 1999-04-30 Control device and method of controlling an object

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
SE9801535-7 1998-04-30
SE9801535A SE511855C2 (en) 1998-04-30 1998-04-30 Handwritten character recording device for characters, symbols, graphs, calligraphy
US9132398P 1998-06-30 1998-06-30
US60/091,323 1998-06-30
SE9803456-4 1998-10-09
SE9803456A SE512182C2 (en) 1998-04-30 1998-10-09 Hand held input unit such as input pen for personal computer
US10581698P 1998-10-27 1998-10-27
US60/105,816 1998-10-27

Publications (1)

Publication Number Publication Date
WO1999060469A1 true WO1999060469A1 (en) 1999-11-25

Family

ID=27484810

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE1999/000719 WO1999060469A1 (en) 1998-04-30 1999-04-30 Control device and method of controlling an object

Country Status (9)

Country Link
EP (1) EP1073946A1 (en)
JP (1) JP2002516429A (en)
KR (1) KR20010052283A (en)
CN (1) CN1303494A (en)
AU (1) AU758514B2 (en)
BR (1) BR9910572A (en)
CA (1) CA2331075A1 (en)
IL (1) IL139103A0 (en)
WO (1) WO1999060469A1 (en)


Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100408518B1 (en) * 2001-04-12 2003-12-06 삼성전자주식회사 Pen input device and Measuring method of coordinate
US7263224B2 (en) * 2004-01-16 2007-08-28 Microsoft Corporation Strokes localization by m-array decoding and fast image matching
KR100675830B1 (en) * 2004-03-11 2007-01-29 주식회사 애트랩 Image sensor, optic pointing device and motion value calculation method of it
JP2009020718A (en) * 2007-07-12 2009-01-29 Nec Commun Syst Ltd Radio input device and equipment operation system
CN101859205A (en) * 2009-04-08 2010-10-13 鸿富锦精密工业(深圳)有限公司 Hand input device and hand input system
CN105387802A (en) * 2015-10-13 2016-03-09 东莞市微大软件科技有限公司 Method for controlling movement of worktable of automatic image measuring instrument

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0112415A1 (en) * 1982-12-22 1984-07-04 International Business Machines Corporation A method and apparatus for continuously updating a display of the coordinates of a light pen
CN1122925A (en) * 1994-11-07 1996-05-22 颜艮山 Instant look mouse scanner


Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7054487B2 (en) 2000-02-18 2006-05-30 Anoto Ip Lic Handelsbolag Controlling and electronic device
US6839453B1 (en) 2000-05-16 2005-01-04 The Upper Deck Company, Llc Method and apparatus for authenticating unique items such as sports memorabilia
US7027623B2 (en) 2000-05-16 2006-04-11 The Upper Deck Company, Llc Apparatus for capturing an image
US8971629B2 (en) 2002-11-20 2015-03-03 Koninklijke Philips N.V. User interface system based on pointing device
US8970725B2 (en) 2002-11-20 2015-03-03 Koninklijke Philips N.V. User interface system based on pointing device
US7865043B2 (en) 2003-12-16 2011-01-04 Anoto Ab Method, apparatus, computer program and storage medium for recording a movement of a user unit
EP1553482A1 (en) * 2004-01-06 2005-07-13 Microsoft Corporation Camera-pen-tip mapping and calibration
US7136054B2 (en) 2004-01-06 2006-11-14 Microsoft Corporation Camera-pen-tip mapping and calibration
KR101037232B1 (en) 2004-01-06 2011-05-25 마이크로소프트 코포레이션 Camera-pen-tip mapping and calibration
US7536051B2 (en) * 2005-02-17 2009-05-19 Microsoft Corporation Digital pen calibration by local linearization
US8054512B2 (en) 2007-07-30 2011-11-08 Palo Alto Research Center Incorporated System and method for maintaining paper and electronic calendars

Also Published As

Publication number Publication date
EP1073946A1 (en) 2001-02-07
KR20010052283A (en) 2001-06-25
BR9910572A (en) 2001-01-16
CA2331075A1 (en) 1999-11-25
AU4303399A (en) 1999-12-06
CN1303494A (en) 2001-07-11
IL139103A0 (en) 2001-11-25
AU758514B2 (en) 2003-03-20
JP2002516429A (en) 2002-06-04

Similar Documents

Publication Publication Date Title
US6198485B1 (en) Method and apparatus for three-dimensional input entry
US7257255B2 (en) Capturing hand motion
US20020085097A1 (en) Computer vision-based wireless pointing system
JP4007899B2 (en) Motion detection device
US7817134B2 (en) Pointing device
WO2013035554A1 (en) Method for detecting motion of input body and input device using same
US6906699B1 (en) Input unit, method for using the same and input system
US7006079B2 (en) Information input system
AU758514B2 (en) Control device and method of controlling an object
WO1999060468A1 (en) Input unit, method for using the same and input system
EP3343242B1 (en) Tracking system, tracking device and tracking method
US7825898B2 (en) Inertial sensing input apparatus
JP2006190212A (en) Three-dimensional position input device
EP1073945B1 (en) Device and method for recording hand-written information
MXPA00010533A (en) Control device and method of controlling an object
KR100446610B1 (en) Pen-type computer input device
SE512182C2 (en) Hand held input unit such as input pen for personal computer
JP5195041B2 (en) Pointing device, object recognition device, and program
SE513940C2 (en) Unit and input system with mouse function and input function and ways to use the unit
SE511855C2 (en) Handwritten character recording device for characters, symbols, graphs, calligraphy
MXPA00010548A (en) Device and method for recording hand-written information

Legal Events

Code Title Description
- WWE (Wipo information: entry into national phase): Ref document number 99806673.7, country of ref document CN
- AK (Designated states), kind code of ref document A1, designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZA ZW
- AL (Designated countries for regional patents), kind code of ref document A1, designated state(s): GH GM KE LS MW SD SL SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG
- DFPE: Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
- 121: Ep: the epo has been informed by wipo that ep was designated in this application
- WWE (Wipo information: entry into national phase): Ref document number 43033/99, country of ref document AU; Ref document number 139103, country of ref document IL
- WWE (Wipo information: entry into national phase): Ref document number PA/a/2000/010533, country of ref document MX
- ENP (Entry into the national phase): Ref document number 2331075, country of ref document CA
- WWE (Wipo information: entry into national phase): Ref document number 1020007012071, country of ref document KR
- WWE (Wipo information: entry into national phase): Ref document number 1999952125, country of ref document EP
- WWP (Wipo information: published in national office): Ref document number 1999952125, country of ref document EP
- REG (Reference to national code): Ref country code DE, ref legal event code 8642
- WWP (Wipo information: published in national office): Ref document number 1020007012071, country of ref document KR
- WWG (Wipo information: grant in national office): Ref document number 43033/99, country of ref document AU
- WWW (Wipo information: withdrawn in national office): Ref document number 1020007012071, country of ref document KR
- WWW (Wipo information: withdrawn in national office): Ref document number 1999952125, country of ref document EP