WO2012000541A2 - Device enabling a user to interact with a computer system - Google Patents

Device enabling a user to interact with a computer system

Info

Publication number
WO2012000541A2
WO2012000541A2 (PCT/EP2010/059227)
Authority
WO
WIPO (PCT)
Prior art keywords
code
display
computer system
processing unit
sensor
Prior art date
Application number
PCT/EP2010/059227
Other languages
French (fr)
Other versions
WO2012000541A3 (en)
Inventor
Jakob Eyvind Bardram
Juan David Hincapié RAMOS
Original Assignee
IT-Universitetet i København
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by IT-Universitetet i København filed Critical IT-Universitetet i København
Priority to PCT/EP2010/059227 priority Critical patent/WO2012000541A2/en
Publication of WO2012000541A2 publication Critical patent/WO2012000541A2/en
Publication of WO2012000541A3 publication Critical patent/WO2012000541A3/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected


Abstract

Disclosed is a device enabling a user to interact with a computer system via a display screen, said device comprising a housing that accommodates electronic circuitry and has a bottom face, where a device display faces said display screen when the device is positioned on the display screen on its bottom face, said device display being configured to display a code optically in two dimensions towards said display screen. The device is characterized by comprising a sensor configured to read a first code from an object in proximity of the device and to provide a sensor signal conveying the first code; and a processing unit coupled to the sensor to receive the sensor signal. The processing unit is coupled to the device display and is configured to convert the first code to a second code and to provide a display signal to the device display to display the second code optically.

Description

Device enabling a user to interact with a computer system
Field
This invention generally relates to input devices for data processing systems and to data processing systems utilizing such input devices.
Background
Input devices play an essential role in data processing systems. It is through use of input devices that users are allowed to interact with the systems to solve particular tasks such as writing a letter, obtaining information or performing calculations. Conventional input devices such as the computer mouse and computer keyboard have become everyday tools for people all over the world. The increase in the processing power of data processing systems has made input devices even more important, as for most common tasks it is no longer the processing time of the systems that determines the time it takes to solve a particular task, but the time it takes for a user to initiate the task.
The increasing use of touch screens to ease human interaction with data processing systems is evidence of this development. Touch screens have the ability to detect a touch event on the touch screen and to determine the position of the touch event. Conventional touch screens are designed to be touched by touch input devices, e.g. specially designed pens or simply a finger. Devices not specially designed for interaction with touch screen systems can only to a limited degree be used as input devices, as conventional touch screen systems lack the ability to determine the type of device touching the systems. Thereby information related to the devices touching the touch screen systems is unavailable. It thus remains a problem to allow a broader range of devices to be used as input devices.
Summary
According to a first aspect, there is provided a device enabling a user to interact with a computer system via a display screen of the computer system, said device comprising a housing that accommodates electronic circuitry and has a bottom face, where a device display faces said display screen when the device is positioned on the display screen on its bottom face, said device display being configured to display a code optically in two dimensions towards said display screen.
The device is characterized by comprising
- a sensor configured to read a first code from an object in proximity of the device and to provide a sensor signal conveying the first code; and - a processing unit coupled to the sensor to receive the sensor signal; wherein the processing unit is coupled to the device display and is configured for converting the first code into a second code and for providing a display signal to the device display to display the second code optically.
Consequently, a flexible input device is provided, allowing a broad range of products/devices comprising codes to be used as input devices for data processing systems. The first code may be any kind of code such as a bar code, a QR code or an RFID tag. The first code may comprise data related to a given product or device.
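As an illustration of the device-side conversion, the sketch below maps an arbitrary first code (here an RFID tag string) to a fixed-width bit pattern that a small device display could show optically. The hash-based mapping and the function name are illustrative assumptions; the patent deliberately leaves the conversion algorithm open ("any algorithm taking the first code as an input").

```python
import hashlib

def convert_first_to_second(first_code: str, n_bits: int = 32) -> list[int]:
    # Illustrative conversion only: hash the first code and truncate the
    # digest to a fixed-width bit pattern suitable for optical display.
    digest = hashlib.sha256(first_code.encode("utf-8")).digest()
    value = int.from_bytes(digest[: n_bits // 8], "big")
    return [(value >> i) & 1 for i in reversed(range(n_bits))]

second = convert_first_to_second("RFID:0x04A2B1")  # hypothetical tag ID
```

A real device would additionally reserve cells for positional markers and checksum bits, as described below.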
The device may comprise an internal energy source such as a battery, or means for receiving energy from an external energy source, e.g. by a wire or by induction. The sensor may be any sensor suitable for reading a given code, such as a barcode reader for reading a barcode, an RFID reader for reading an RFID tag or a camera for reading a QR code. The device display may be any display suitable for displaying information, such as LCD displays, light-emitting diode (LED) displays, electroluminescent displays, plasma display panels, HPA displays, thin-film transistor displays, organic light-emitting diode displays, surface-conduction electron-emitter displays, laser TVs, carbon nanotube displays, projectors, holography devices or nanocrystal displays.
The device display may be configured to transmit light in the visible and/or infrared, and/or ultraviolet spectrum.
The processing unit may generate the second code using any algorithm taking the first code as an input. In some embodiments, the sensor comprises a radio frequency receiver.
In some embodiments, the sensor comprises means for reading a Radio Frequency Identification, RFID, tag. The means for reading an RFID tag may be able to read any RFID tag, such as active RFID tags, passive RFID tags or battery-assisted passive RFID tags.
In some embodiments, the device is configured to display a marker from the bottom face of the housing to indicate the position and/or rotation of the device, so that a reader of the second code can determine the position and/or orientation of the device.
The marker may be displayed in the device display. The marker may be embedded in the second code. The marker may be a part of the display not controlled by the processing unit. The marker may be a light source or made of a reflective material.
The device may comprise any number of markers such as 1, 2, 3, 5, or 10. The markers may be positioned with a predetermined spatial relationship.
In some embodiments, the device comprises 3 markers positioned in a triangular pattern such that the interconnecting lines between the markers create a right-angled triangle.
By placing three markers in a triangular pattern, an effective way of determining both the position and the orientation of the input device is provided.
In some embodiments, the device display comprises a plurality of individual light sources arranged in a matrix, each individual light source having a diameter above approximately 1 mm.
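One way a code reader could exploit the right-angled marker layout is sketched below: the right-angle corner is identified as the vertex whose two edge vectors are closest to perpendicular, and one leg then gives a heading angle. This algorithm and the function name are assumptions for illustration; the patent does not prescribe how the reader analyses the markers.

```python
import math
from itertools import permutations

Point = tuple[float, float]

def pose_from_markers(p0: Point, p1: Point, p2: Point) -> tuple[Point, float]:
    # Find the right-angle corner: the vertex whose two edge vectors
    # have the smallest absolute dot product (closest to perpendicular).
    best = None
    for a, b, c in permutations((p0, p1, p2)):
        v1 = (b[0] - a[0], b[1] - a[1])
        v2 = (c[0] - a[0], c[1] - a[1])
        dot = abs(v1[0] * v2[0] + v1[1] * v2[1])
        if best is None or dot < best[0]:
            best = (dot, a, v1)
    _, corner, leg = best
    # Orientation from one leg of the right angle.
    angle = math.degrees(math.atan2(leg[1], leg[0]))
    return corner, angle
```

Note that with two equal legs the heading is ambiguous between them, so legs of different lengths (or a convention on marker ordering) are assumed to disambiguate the orientation fully.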
In some embodiments, each individual light source has a diameter above 1, 2, or 5 mm.
The individual light sources may be LEDs.
The matrix may be a rectangular matrix of any size. The LEDs may be any type of LEDs generating light with any wavelength, such as ultraviolet light, visible light or infrared light.
By using a plurality of individual light sources as a display, such as a plurality of LEDs, a simple and effective display is provided. Individual light sources are easily recognisable for a reader of the optical second code, thereby minimizing the risk of errors in the decoding process. For some tasks it may be insufficient to determine the type of a given device/product and the location of the input device. For such tasks an even more flexible input device is desirable.
In some embodiments, the device is configured with means configured to detect a user's touch or gesture and, in response to such a detection, to control operation of the device. Consequently, a user may interact with a computer system using the means configured to detect a user's touch or gesture. Thereby a user may control functionalities of a computer system both by reading codes from products using the device, by positioning the device in a given position and by using the means configured to detect a user's touch or gesture. Thereby a more flexible input device is provided.
The device may comprise any number of means for detecting a user's touch or gesture, such as 1, 2, 3, 4, 5, 10, or more. The means for detecting a user's touch or gesture may be buttons arranged on the device, e.g. 1, 2, 3, 4, 5, 10, or more buttons arranged on the device. The means for detecting a user's touch or gesture may be arranged anywhere on the device, such as on the top of the device or on the sides of the device.
The means for detecting a user's touch or gesture may control functionalities of the input device, such as functionalities of the processing unit or functionalities of a data processing system.
In some embodiments, the sensor is activated by the means for detecting a user's touch or gesture. This will allow a user of the device to activate the sensor by using means for detecting a user's touch or gesture such as a button on the device. Thereby the device has to search for a first code only when there is a first code present to be read. Thereby the power consumption of the input device is decreased.
In some embodiments, the device display is configured to display a part of the second code only when a display event occurs. A display event may occur when the sensor detects a new code, at fixed time intervals, e.g. every second, every 2 seconds, every 10 seconds, every minute or longer, or when a user activates a display event, e.g. by pushing a button or the like.
The part of the second code may be the entire second code. Markers in the second code for allowing a reader of the second code to detect the position and/or orientation of the device may not be affected by display events e.g. they may always be turned on when the device is turned on.
Consequently an even lower energy consumption can be achieved as the device display may display specific parts of the second code only when it is necessary.
Some devices/products may have a size and/or a shape making it difficult to fit a display onto them. That makes it difficult to use such devices/products as input devices for data processing systems such as table top display systems.
In some embodiments, the housing comprises a structure configured to carry a physical object.
In some embodiments, the structure is configured to carry a plurality of objects, each object comprising a code. The sensor may be configured to read the codes of the objects. The processing unit may be configured for converting the codes from the objects into a number of second codes and for providing a display signal to the device display to display the second codes. A plurality of codes from the objects may be converted into a single second code and/or a single code from one of the objects may be converted into a single second code. A plurality of second codes may be generated. The plurality of second codes from the objects may be shown sequentially by the device display.
In some embodiments, the structure configured to carry a physical object is a horizontal flat surface.
In some embodiments, the structure configured to carry a physical object is a concave surface. In some embodiments, said device comprises means for holding an elongated object in an upright position. In some embodiments, said elongated object is a test tube.
Consequently, using a device comprising a structure configured to carry a physical object, devices not previously suitable to be used as input devices for data processing systems, can now be used indirectly by them being placed on the device.
In some embodiments, the means for holding an elongated object in an upright position is a recess in the top of the device having a size allowing an elongated object to be supported. In some embodiments, said device comprises means for securing the physical object to the device.
In some embodiments, the means for supporting an elongated object in an upright position may be a ring fitted on an arm secured to the top surface of the device, the ring having a diameter configured to support an elongated object.
In some embodiments, the means for supporting an elongated object in an upright position may be a magnet configured to attract magnets fitted on elongated objects.
In some embodiments, the means for supporting an elongated object in an upright position may be a rack configured to carry a plurality of elongated objects in an upright position.
In some embodiments, the sensor is arranged to read the first code of the object while being carried or held by the device. In some embodiments, the processing unit is configured to embed bits that convey information in addition to that of the first code in the second code.
In some embodiments, the second code comprises a number of data bits, a number of checksum bits and a number of action bits, wherein the data bits are created by the processing unit in relation to a read first code, the checksum bits are created to allow a reader of the optical second code to verify the code, and the action bits are created by the processing unit independently of any read code. The data bits may comprise data corresponding to data of the read first code. The checksum bits may be any checksum bits allowing a reader of the optical second code to check the validity of the optical second code. The action bits may be created by the processing unit in relation to any action or event, such as a push on a button on the device or a state of the device, e.g. on, standby or off.
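The data/checksum/action-bit structure described above can be sketched as follows. The two-bit parity checksum is an assumed scheme for illustration only; the patent merely requires that a reader can verify the code, not any particular checksum.

```python
def build_second_code(data_bits: list[int], action_bits: list[int]) -> list[int]:
    # Assemble the three bit groups: data, checksum, action.
    # Assumed scheme: two checksum bits carrying the parity of the data
    # bits and its complement.
    parity = sum(data_bits) % 2
    checksum_bits = [parity, 1 - parity]
    return data_bits + checksum_bits + action_bits

def verify_second_code(frame: list[int], n_data: int = 8, n_check: int = 2) -> bool:
    # Reader-side check: recompute the parity and compare.
    data, check = frame[:n_data], frame[n_data:n_data + n_check]
    parity = sum(data) % 2
    return check == [parity, 1 - parity]

frame = build_second_code([1, 0, 1, 1, 0, 0, 1, 0], action_bits=[0, 1])
```

A single flipped bit in the data section makes `verify_second_code` fail, matching the error-detection role the checksum bits play for the optical reader.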
According to a second aspect, there is provided a computer system comprising:
- a device as described above, and
- a camera configured with a field of view covering at least a portion of the display screen and the device display;
- a computer unit configured to receive input at least from said camera, to process input from said camera and to display information on said display screen in response thereto.
The display screen may be any screen suitable for displaying information to a user of the system. The camera may be any camera suitable for observing the display surface. The camera may be configured to record light with any wavelength, such as ultraviolet light, visible light or infrared light. In some embodiments, the computer unit is further configured to control the display screen to display content in relation to the first code of an object.
In some embodiments, the computer system further comprises a projector for projecting an image onto the display screen. The display screen may be a matted glass surface.
By having a computer system as described above, codes of other devices or products may be directly read by the system, thereby allowing the system to obtain information related to the product. The position and/or orientation of the device, and possibly of the object or product carried by the device, may be determined. Thereby information related to the object or product carried by the device can be displayed in proximity of the object/product.
In some embodiments, the computer system is further configured to display a virtual button on the display screen in proximity of the device. The virtual button may have any shape or colour. The computer system may be configured to detect when a user touches a virtual button. The virtual button may control functionalities of the computer system. The virtual button may control information related to a product contained by the device. In some embodiments a plurality of virtual buttons is used.
In some embodiments, the display surface comprises means for detecting a user's touch or gesture, such as buttons. In some embodiments, the second processing unit is further configured to control the display to show information related to the read first data of the first code.
According to a third aspect, the invention relates to a method for interacting with a computer system using a device, said method comprising the steps of:
• reading a first code using said device;
• transforming said first code into a second code using said device;
• determining, using the computer system, the position of the device;
• reading the second code using the computer system;
• controlling functionalities of the computer system in relation to the read second code and the determined position of the input device.
Here and in the following, the terms 'processing means' and 'processing unit' are intended to comprise any circuit and/or device suitably adapted to perform the functions described herein. In particular, the above term comprises general- or special-purpose programmable microprocessors, Digital Signal Processors (DSP), Application Specific Integrated Circuits (ASIC), Programmable Logic Arrays (PLA), Field Programmable Gate Arrays (FPGA), special purpose electronic circuits, etc., or a combination thereof. The different aspects of the present invention can be implemented in different ways including the input devices, table top display system and methods for interacting with data processing systems described above and in the following, each yielding one or more of the benefits and advantages described in connection with at least one of the aspects described above, and each having one or more preferred embodiments corresponding to the preferred embodiments described in connection with at least one of the aspects described above and/or disclosed in the dependent claims.
Furthermore, it will be appreciated that embodiments described in connection with one of the aspects described herein may equally be applied to the other aspects.
Brief description of the drawings
The above and/or additional objects, features and advantages of the present invention will be further elucidated by the following illustrative and non-limiting detailed description of embodiments of the present invention, with reference to the appended drawings, wherein:
Figs 1a-c show a device according to an embodiment of the present invention.
Fig. 1d shows a display of a device according to an embodiment of the present invention.
Figs 2a-b show a device according to an embodiment of the present invention.
Fig. 3 shows a schematic drawing of a device according to an embodiment of the present invention.
Fig. 4 shows a side view of a computer system according to an embodiment of the present invention.
Fig. 5 shows a bottom view of a computer system according to an embodiment of the present invention.
Fig. 6 shows a computer system according to an embodiment of the present invention.
Fig. 7 shows a schematic drawing of a computer system according to an embodiment of the present invention.
Figure 8 shows a second code according to an embodiment of the present invention.
Figure 9 shows a flow chart for a method for interacting with a computer system according to an embodiment of the present invention.
Figs 10a-b show a device according to an embodiment of the present invention.
Figs 11a-b show a device according to an embodiment of the present invention.
Detailed description
In the following description, reference is made to the accompanying figures, which show, by way of illustration, how the invention may be practiced.
Figs 1a-c show a top view, a side view and a bottom view of a device according to an embodiment of the present invention. The device 101 comprises a sensor 104 configured to read a first code, a processing unit 108 and a device display 106 fitted in a housing 102. The device may optionally comprise means configured to detect a user's touch or gesture in the form of buttons 105, and a structure configured to carry a physical object in the form of a support surface 103 for supporting a second device comprising a first code. The support surface 103 is a horizontal flat surface positioned at the top of the housing 102, thereby allowing a device to be easily placed on the support surface 103 without sliding off. The buttons 105 are also positioned on top of the device. The device display 106 is positioned at the bottom of the device, opposite the support surface 103. Thereby the display is configured to allow a reader of the second code, positioned below the device, to read the second code.
Fig. 3 shows a schematic view of a device according to an embodiment of the present invention. The device 301 comprises a sensor 302 configured to read a first code, a processing unit 303 and a device display 304. The sensor is connected to the processing unit 303 and transmits data of a read first code to the processing unit 303 when an object having a first code is brought within the detection range of the sensor 302. The processing unit converts the first code into a second code using a predetermined algorithm. The predetermined algorithm may transfer the entire data of the first code into the second code, or a selected portion of the data. The algorithm may additionally add at least one marker to the second code, or the at least one marker may be a part of the device not controlled by the processing unit, to indicate the position of the device to a reader of the second code. The marker may always be positioned at the same position on the device. The processing unit 303 is connected to the device display 304 and transmits the second code to the display 304 so that the device display 304 can optically display the second code as an optical second code. The device display 304 may display the entire optical second code including the at least one positional marker. A part of the device display may not be controlled by the processing unit, e.g. a number of LEDs when the device display is a matrix of LEDs. The parts of the device display not controlled by the processing unit may be used as markers. The part of the display not controlled by the processing unit may be turned on automatically when the device is turned on.
Figs 2a-b show a top view and a side view of a device 202 holding a second device 206 according to an embodiment of the present invention. The second device 206 comprises a first code 209, e.g. an RFID tag. The second device 206 is a cylindrical container positioned on top of a support surface 203 of the device 202.
The device 202 comprises a sensor configured to read the first code 209, e.g. an RFID reader, a processing unit and a display. The device has the ability to convert the first code 209 of the second device 206 into an optical second code using the principle discussed in relation to figure 3.
Figs 4-5 show a side view and a bottom view of a computer system 501 according to an embodiment of the present invention. The computer system 501 comprises a planar display screen 502, two devices 506, 509, a camera 503, a processing unit 504 and a projector 505. The two devices 506, 509 are positioned on top of the planar display screen 502. Each of the two devices 506, 509 holds a cylindrical container 507, 510, each of the cylindrical containers 507, 510 comprising a code 508, 511. The devices 506, 509 convert the codes 508, 511 of the cylindrical containers 507, 510 into optical codes 512, 513 displayed by displays 514, 515 positioned at the bottom of the input devices 506, 509. The camera 503 is positioned below the planar display surface 502, arranged for observing the planar display surface 502 and the displays 514, 515 of the input devices 506, 509. This allows the camera 503 to detect the optical codes 512, 513. The camera 503 is connected to the processing unit 504 and transmits the recorded images to the processing unit 504. Using image analysis algorithms to analyse positional markers in the optical codes 512, 513, the processing unit 504 is able to determine the position of the input devices 506, 509 and to determine the data content of the optical codes 512, 513. The processing unit 504 is connected to the projector 505 and may control the projector to project content related to each of the two cylindrical containers 507, 510 in proximity of the input devices 506, 509 holding the cylindrical containers 507, 510.
Fig. 6 shows a top view of a computer system 601 according to an embodiment of the present invention. Shown is a display screen 602 whereon two devices 606, 609 are positioned. Each of the devices 606, 609 holds a cylindrical container 607, 610. Each cylindrical container 607, 610 comprises a code. Using the principle discussed in relation to figures 5 and 7, the computer system is able to determine the position and orientation of the devices, and the data content of the codes of the cylindrical containers.
Thereby specific content 603, 604 related to the two cylindrical containers 607, 610 can be displayed on the display surface in front of each of the devices 606, 609 holding the cylindrical containers 607, 610. Displayed on the planar display screen 602 is also content 605, 611 not related to any of the two cylindrical containers. Each of the devices 606, 609 comprises means configured to detect a user's touch or gesture in the form of two buttons 612, 613 and 614, 615. The buttons allow a user to communicate with the computer system; e.g. a click on a button may alter the information 603, 604 presented to the devices. In front of the device 609 are additionally shown two virtual buttons 615, 616. The two virtual buttons 615, 616 are generated by the computer system 601 and displayed on the display screen 602. The computer system 601 may be configured to detect when a user touches one of the virtual buttons. The virtual buttons 615, 616 may control functionalities of the computer system 601.
Fig. 7 shows a schematic view of a computer system 701 according to an embodiment of the present invention. The computer system 701 comprises a device 708, a camera 705, a computer unit 706, and a display screen 707. The device 708 comprises a sensor 702 configured to read a first code, a processing unit 703 and a device display 704. The sensor 702 is connected to the processing unit 703 and transmits data of a read first code to the processing unit 703 when a second device having a first code is brought within the detection range of the sensor 702. The processing unit 703 converts the data of the first code into a second code using a predetermined algorithm. The predetermined algorithm may transfer the entire data of the first code into the second code, or a selected portion of the data. The algorithm may additionally add at least one marker to the second code to indicate the position and/or rotation of the device, or the at least one marker may be a part of the device not controlled by the processing unit. The processing unit 703 is connected to the device display 704 and transmits the second code to the device display 704 so that the device display 704 is able to optically display the second code as an optical second code. The device display 704 may display the entire optical second code including the at least one marker. A part of the device display may not be controlled by the processing unit, e.g. a number of LEDs when the display is a matrix of LEDs. The parts of the display not controlled by the processing unit may be used as markers. The part of the display not controlled by the processing unit may be automatically turned on when the device is turned on. The markers may alternatively be placed in proximity of the device display 704. The device display 704 of the device 708 is arranged to be optically detectable by the camera 705 so that the optical second code displayed by the device display 704 can be read by the camera 705.
If the markers are not part of the device display, they may also be arranged to be optically detectable by the camera 705. The camera 705 is connected to the computer unit 706 and transmits the recorded images comprising the optical second code to the computer unit 706. Using image analysis algorithms, the computer unit 706 is capable of determining the data content of the optical second code and, using the at least one marker on the optical second code, the algorithms run in the computer unit 706 are able to determine the position of the device. The computer unit 706 is able to control the content of the display screen 707. This may be achieved by controlling a projector arranged to project images onto the display screen 707. The computer unit 706 may control the content of the display screen 707 in relation to the read data of the optical second code and the position of the device 708. Thereby information related to the first code of the second device may be presented on the display screen 707 in proximity of the second device. The optical second code may comprise a plurality of markers, allowing the image analysis algorithms run in the computer unit 706 to additionally determine the orientation of the device 708.
Fig. 8 shows an optical code 801 according to an embodiment of the present invention. The optical code is created by 64 bits arranged in an 8-by-8 quadratic pattern. The optical code comprises three markers 802 for indicating the position and/or orientation of the device, positioned in three corners such that the interconnecting lines between the markers 802 create a right-angled triangle. The optical code 801 further comprises four 8-bit data integers 803, 804, 805, 806, two checksum bits 807 and two action bits 808. The checksum bits are used by a reader of the optical code to check the validity of the optical code. Thereby, errors in the process of generating the optical code or reading the optical code can be detected. The action bits are free bits unrelated to any read first code. The action bits may be used to signal other events, such as the push of a button on an input device. When the optical code is displayed on an LED display, each bit may correspond to a single LED, so that a first state of a given bit can be shown by the LED being off and a second state of a given bit can be shown by the LED being on. It should be understood that any type of optical code having any shape can be used with the present invention.
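As one possible concrete layout for the code of Fig. 8 (the description fixes the bit counts but not the exact bit positions), the sketch below places the markers in three corners, the 32 data bits in four middle rows, and the checksum and action bits in a further row. The row/column assignments and the parity-based checksum scheme are assumptions made for illustration only.

```python
# Hypothetical layout of the 64-bit optical code of Fig. 8: an 8x8 grid with
# three corner markers, four 8-bit data integers, two checksum bits and two
# action bits. Exact bit positions and the parity checksum are assumptions.

def build_code(data_bytes, action_bits=(0, 0)):
    """Pack four 8-bit integers into an 8x8 bit matrix."""
    assert len(data_bytes) == 4 and all(0 <= b <= 255 for b in data_bytes)
    grid = [[0] * 8 for _ in range(8)]

    # Three markers in corners forming a right-angled triangle.
    for r, c in ((0, 0), (0, 7), (7, 0)):
        grid[r][c] = 1

    # Lay out the 32 data bits row-major in rows 2-5 (assumed region),
    # least significant bit of each byte first.
    bits = [(b >> i) & 1 for b in data_bytes for i in range(8)]
    for i, bit in enumerate(bits):
        grid[2 + i // 8][i % 8] = bit

    # Two checksum bits: parity of the low/high 16 data bits (assumed scheme).
    grid[6][0] = sum(bits[:16]) % 2
    grid[6][1] = sum(bits[16:]) % 2

    # Two action bits, e.g. signalling the push of a button.
    grid[6][2], grid[6][3] = action_bits
    return grid
```

On an LED-matrix display, each cell of the returned grid would map to one LED being on (1) or off (0), as described above.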
Figure 9 shows a flow chart for a method for interacting with a computer system using a device according to an embodiment of the present invention. In the first step 902, a first code is read using the device. Then, in step 903, the first code is transformed into a second code using the device. Next, in step 904, the computer system determines the position of the device. In step 905, the computer system reads the second code. Then, in step 906, functionalities of the computer system are controlled based on the read second code and the determined position of the device.

Figs. 10a-b show a device according to an embodiment of the present invention. The device 1001 is configured to hold an elongated object, such as a test tube, in an upright position. The device 1001 comprises a sensor 1005 configured to read a first code, a processing unit 1003 and a device display 1004, all arranged in a housing 1002.
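The image-analysis side of the method of Fig. 9 (step 904, determining the device's position, and the optional orientation estimate of Fig. 7) is left open by the description. One conceivable approach, sketched below under invented names, takes the detected centers of the three corner markers, uses their centroid as the device position, and derives a rotation angle from one leg of the right-angled marker triangle; the angle convention is an arbitrary choice of the sketch.

```python
# Hypothetical position/orientation recovery from the three corner markers
# (step 904 of Fig. 9). The marker triangle is right-angled, so the vertex
# at the right angle is the marker closest to the other two.

import math

def pose_from_markers(markers):
    """markers: three (x, y) centers of the detected markers in the camera
    image. Returns (position, angle_deg) for the device."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    # Right-angle vertex: smallest summed distance to the other markers.
    corner = min(markers, key=lambda p: sum(dist(p, q) for q in markers))
    others = [p for p in markers if p is not corner]

    # Position: centroid of the three markers.
    pos = (sum(p[0] for p in markers) / 3.0,
           sum(p[1] for p in markers) / 3.0)

    # Rotation: angle of one triangle leg relative to the image x-axis
    # (which leg is used fixes the angle convention of this sketch).
    leg = others[0]
    angle = math.degrees(math.atan2(leg[1] - corner[1], leg[0] - corner[0]))
    return pos, angle
```

Steps 905 and 906 would then decode the second code from the same camera frame and place related content on the display screen next to the computed position.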
Figs. 11a-b show a device according to an embodiment of the present invention. The device 1101 is configured to hold six elongated objects 1114, 1115, 1116, 1117, 1118, 1119, such as test tubes, in an upright position. The device 1101 comprises a sensor 1105 configured to read first codes, a processing unit 1103 and a device display 1104, all arranged in a housing 1102. Each of the elongated objects comprises a code 1120, 1121, 1122, 1123, 1124, 1125 that can be read by the sensor 1105. The elongated objects are positioned in six recesses 1106, 1107, 1108, 1109, 1110, 1111 in the housing 1102.
Although some embodiments have been described and shown in detail, the invention is not restricted to them, but may also be embodied in other ways within the scope of the subject matter defined in the following claims. In particular, it is to be understood that other embodiments may be utilised and structural and functional modifications may be made without departing from the scope of the present invention. In device claims enumerating several means, several of these means can be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims or described in different embodiments does not indicate that a combination of these measures cannot be used to advantage. It should be emphasized that the term "comprises/comprising" when used in this specification is taken to specify the presence of stated features, integers, steps or components, but does not preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.

Claims
1. A device enabling a user to interact with a computer system via a display screen of the computer system, said device comprising a housing that accommodates electronic circuitry and has a bottom face, where a device display faces said display screen when the device is positioned on the display screen on its bottom face, said device display being configured to display a code visually in two dimensions towards said display screen;
CHARACTERIZED in that said device comprises
- a sensor configured to read a first code from an object in proximity of the device and to provide a sensor signal conveying the first code; and
- a processing unit coupled to the sensor to receive the sensor signal; wherein the processing unit is coupled to the device display and is configured for converting the first code into a second code and for providing a display signal to the device display to display the second code visually.
2. A device according to claim 1, wherein the sensor comprises a radio frequency receiver.
3. A device according to any of the preceding claims, wherein the sensor comprises means for reading a Radio Frequency Identification, RFID, tag, and/or a barcode and/or a QR code.
4. A device according to any of the preceding claims, where the device is configured to display a marker from the bottom face of the housing to indicate the position and/or rotation of the device.
5. A device according to any of the preceding claims, wherein the display comprises a plurality of individual light sources arranged in a matrix, each individual light source having a diameter above approximately 1 mm.
6. A device according to any of the preceding claims, wherein the device is configured with means configured to detect a user's touch or gesture and, in response to such a detection, to control operation of the device.
7. A device according to any of the preceding claims, wherein the housing comprises a structure configured to carry a physical object.
8. A device according to claim 7, comprising means for holding an elongated object in an upright position.
9. A device according to claim 7 or 8, wherein the sensor is arranged to read the first code of the object while being carried or held by the device.
10. A device according to any of claim 7, 8 or 9, wherein the housing comprises a structure configured to carry a plurality of physical objects.
11. A device according to any of the preceding claims, wherein the
processing unit is configured to embed bits that convey information in addition to that of the first code in the second code.
12. A computer system comprising:
- a device according to any of the preceding claims; and
- a camera configured with a field of view covering at least a portion of the display screen and the device display;
- a computer unit configured to receive input at least from said camera, to process the input from said camera, and to display information on said display screen in response thereto.
13. A table top display system according to claim 12, wherein the second processing unit is further configured to control the display surface to display content in relation to the read second data of the optical second code and the determined position of the input device.
14. A table top display system according to any of claims 12 through 13, wherein the table top display system further comprises a projector for projecting an image onto the display surface.
15. A table top display system according to any of claims 12 through 14, wherein the second processing unit further is configured to control the display to show information related to the read first data of the first code.
16. A method for interacting with a computer system using a device, said method comprising the steps of:
• reading a first code using said device;
• transforming said first code into a second code using said device;
• determining, using the computer system, the position of the device;
• reading the second code using the computer system;
• controlling functionalities of the computer system in relation to the read second code and the determined position of the input device.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2010/059227 WO2012000541A2 (en) 2010-06-29 2010-06-29 Device enabling a user to interact with a computer system

Publications (2)

Publication Number Publication Date
WO2012000541A2 (en) 2012-01-05
WO2012000541A3 WO2012000541A3 (en) 2012-08-30

Family

ID=44624834


Country Status (1)

Country Link
WO (1) WO2012000541A2 (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7570249B2 (en) * 2005-03-30 2009-08-04 Microsoft Corporation Responding to change of state of control on device disposed on an interactive display surface




Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10725490

Country of ref document: EP

Kind code of ref document: A2