EP1405165A1 - Commande d'une unite munie d'un processeur - Google Patents

Commande d'une unite munie d'un processeur

Info

Publication number
EP1405165A1
Authority
EP
European Patent Office
Prior art keywords
command
address
processor
sensor device
graphical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP02741594A
Other languages
German (de)
English (en)
Inventor
Stefan Lynggaard
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anoto AB
Original Assignee
Anoto AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anoto AB
Publication of EP1405165A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545Pens or stylus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • G06F3/0317Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
    • G06F3/0321Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface by optically sensing the absolute position with respect to a regularly patterned surface forming a passive digitiser, e.g. pen optically detecting position indicative tags printed on a paper sheet

Definitions

  • the present invention relates to methods for controlling a unit provided with a processor, and to a device, a computer program product and a product kit for the same purpose.
  • Intelligent homes are dwellings, where one or more electronic devices can be controlled or monitored from units located outside the house. Examples of devices that can be controlled or monitored remotely are heating devices, for example in holiday cottages, fire alarms, flood alarms and burglar alarms. Other examples of devices that could be controlled remotely or monitored are computers, lighting, irrigation systems, TV, video, music centers, refrigerators, freezers, cookers, microwave ovens or washing machines.
  • a basic requirement is that the device can be provided with or can be connected to a processor which can receive, process and pass on information. The processor should then be capable of being connected to some form of communication network, such as the telephone network or the Internet.
  • the remote control unit is typically a telephone, mobile telephone or computer terminal, which communicates with the processor via the communication network.
  • the remote control unit is a telephone or mobile telephone which communicates directly with the processor by means of predefined key commands or codes.
  • the remote control unit is a computer or mobile telephone which communicates with the processor via a computer network, such as the Internet.
  • a problem with using a telephone as remote control terminal is that the user interface is based on pressing keys on a numeric keypad, which can lead to problems, for example in learning or remembering the codes that are to be used. Communicating by pressing keys on a numeric keypad is thus not particularly user-friendly.
  • a mobile telephone usually has limited data entry facilities on account of its small format. In addition, no or limited feedback is given in response to what has been keyed into the mobile telephone.
  • the object of the present invention is to solve the above problem completely or partially.
  • a method for controlling a unit provided with a processor comprises receiving at least one graphical notation in the form of positions representing a sensor device's movement across a base that is provided with a position-coding pattern, while the graphical notation was made.
  • the method further comprises identifying, based on the at least one graphical notation, at least one command for the unit provided with a processor, and receiving an address to the unit provided with a processor.
  • the method comprises controlling the unit provided with a processor by sending the at least one command to the address.
  • a graphical notation may be any writing and/or drawing which is made on a base using a sensor device that records positions based on the position-coding pattern provided on the base.
  • the graphical notation may be a single, continuous stroke, or a group of such strokes. Each stroke may be represented as a sequence of coordinate pairs coded by the position-coding pattern on the base.
  • the user input may be digitized without any additional operation on the part of the user, such as scanning the base or digitizing it in some other way.
  • such graphical notations may be used for drawing up a command structure or hierarchy on a base, such as a paper.
  • the command structure may comprise commands for controlling a unit provided with a processor. This provides a rapid, simple and easy-to-understand way for the user to control the unit provided with a processor. In addition, by noting down the commands, the user automatically obtains an easy-to-understand copy of what was entered into the unit.
  • An additional advantage of the present invention is that the user is not limited to one base which is specific to a particular type of command.
  • a unit provided with a processor can be a device which contains any form of processor, for example a microprocessor. Examples of such devices are computers, modern household appliances (dishwashers, microwave ovens, cookers/stoves, audio/video players, etc.), industrial machinery and other computer-controlled applications such as central heating installations, air conditioning installations, telephone systems and monitoring/alarm systems.
  • a base can be a device on which information can be noted down, usually a sheet of paper, a drawing board or similar medium which is provided with a position-coding pattern which makes possible electronic recording of what is noted down on the base.
  • a command can be such words, symbols, sub-addresses in computer networks, program names, command names, file names, storage addresses or symbols which represent particular operations, functions, operators, parameters and arguments that can be used individually or in combination for controlling a unit provided with a processor.
  • An address of the unit provided with a processor can be an address for communication with the unit provided with a processor. It can be a computer network address, such as a standard IP address, but other forms of address are possible. It can also be an address of a unit via which the unit provided with a processor communicates, for example a proxy server or a unit for short-range communication such as Bluetooth®. Thus it is also possible to utilize the invention when the user is in the vicinity of the unit provided with a processor without having to connect, for example, via a computer network. An address can also be expressed as a short name, which is associated with the address. For example, the word "home" can be an indication of a particular computer network address of a computer which is located in the user's home.
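  • As a purely illustrative sketch of the short-name mechanism described above (the table entries, names and address formats below are invented for the example, not taken from the patent), the resolution of a noted name such as "home" to a communication address could look as follows in Python:

      # Hypothetical address book: short names noted on the base map to the actual
      # address used for communication (network or short-range).
      ADDRESS_BOOK = {
          "home": "192.0.2.17",            # computer network (IP) address
          "my computer": "192.0.2.42",
          "printer": "00:11:22:33:44:55",  # short-range address, e.g. Bluetooth
      }

      def resolve_address(noted: str) -> str:
          """Return the noted string itself if it already is an address,
          otherwise look up the short name in the address book."""
          return ADDRESS_BOOK.get(noted.strip().lower(), noted)

      # Example: the word "home" noted on the base resolves to the stored address.
      assert resolve_address("home") == "192.0.2.17"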
  • Making graphical notations may, but does not necessarily, mean that a mark is left on the base. This may provide the advantage that the base will constitute a copy of what was entered into the unit provided with a processor. For the reuse of previously entered commands it can, however, be an advantage if no mark is left on the base, in order to make repeated use possible.
  • the at least one graphical notation may at least partly comprise handwritten characters.
  • the at least one graphical notation may be at least partly converted into a character coded format for identifying the at least one command.
  • identifying the at least one command may comprise identifying at least one graphical symbol from the at least one graphical notation, the graphical symbol representing the at least one command. In this way, user-friendly symbols, which speed up the making of the graphical notation, may be provided.
  • This variant may be combined with the above described character coded variant, by e.g. providing predefined symbols for frequently used commands or addresses, while less frequently used commands or addresses are to be noted as handwritten characters for conversion to character coded form.
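  • A minimal sketch of how this combined variant could be organized is given below. The function and symbol names are assumptions, and the two recognizers are only stubs to be supplied by a symbol matcher and an ICR/HWR engine:

      from typing import Callable, List, Optional, Tuple

      Stroke = List[Tuple[float, float]]  # one stroke = a sequence of coordinate pairs

      # Frequently used commands represented by predefined graphical symbols.
      PREDEFINED_SYMBOLS = {"cross": "delete", "checkmark": "confirm"}

      def identify_command(
          strokes: List[Stroke],
          recognize_symbol: Callable[[List[Stroke]], Optional[str]],  # e.g. template matching
          recognize_text: Callable[[List[Stroke]], str],              # e.g. an ICR/HWR engine
      ) -> str:
          # Try the predefined symbols first ...
          symbol = recognize_symbol(strokes)
          if symbol in PREDEFINED_SYMBOLS:
              return PREDEFINED_SYMBOLS[symbol]
          # ... and fall back to conversion into character-coded form.
          return recognize_text(strokes).strip().lower()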
  • identifying the at least one command may comprise detecting a command indicator, based on the at least one graphical notation.
  • identifying the at least one command may comprise identifying a subarea of the position-coding pattern, the subarea being essentially encircled by the command indicator. The subarea may be associated with the command.
  • a command indicator may be a graphically noted indication, which is recognized by the sensor device as an instruction to identify a command.
  • the instruction may be a symbol and in one embodiment of the invention, the symbol may encircle the command, and thus constitute a frame or any other drawn shape, which wholly or partially encircles the command.
  • the frame may have a specific appearance, which is recognized by the sensor device and thus interpreted as the indication that a command is being entered.
  • a subarea is an area of the position-coding pattern on the base, which area is delimited by e.g. the command indicator or by some other relation to the command, such as a certain area encircling the graphical notation. For example, a halo-like area surrounding the command may be defined as soon as a command is recognized.
  • the subarea may, via the position-coding pattern, be associated with the command, such that a command may be identified based on a recording of any pair of coordinates that falls within the subarea associated with that command.
  • the command indicator may have a dual function: defining the subarea and indicating that a command is being entered.
  • sending the at least one command to the address may be effected in response to a recording of a pair of coordinates within the subarea. This enables "reuse" of a previously noted and recorded command.
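  • The subarea mechanism can be pictured with the following non-authoritative Python sketch, which assumes that each command frame is reduced to an axis-aligned bounding box in the coordinate space of the position-coding pattern; the class and field names are invented for the example:

      from dataclasses import dataclass, field
      from typing import List, Optional, Tuple

      @dataclass
      class Subarea:
          x_min: float
          y_min: float
          x_max: float
          y_max: float

          def contains(self, x: float, y: float) -> bool:
              return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

      @dataclass
      class StoredCommand:
          label: str        # e.g. "format"
          address: str      # address the command was associated with when it was noted
          subarea: Subarea  # part of the pattern delimited by the command frame

      @dataclass
      class CommandStore:
          commands: List[StoredCommand] = field(default_factory=list)

          def lookup(self, coordinate: Tuple[float, float]) -> Optional[StoredCommand]:
              """Return the stored command whose subarea contains the recorded pair
              of coordinates, or None if it falls outside every subarea."""
              x, y = coordinate
              for command in self.commands:
                  if command.subarea.contains(x, y):
                      return command
              return None

      # Reuse: pointing inside a previously drawn command frame resends that command.
      store = CommandStore([StoredCommand("format", "my computer", Subarea(10, 20, 60, 35))])
      hit = store.lookup((42.0, 27.5))
      if hit is not None:
          print(f"send '{hit.label}' to '{hit.address}'")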
  • the address may be received in different manners.
  • receiving the address comprises receiving the address from a memory in the sensor device.
  • the address to the unit provided with a processor may be preprogrammed and stored in a memory in the sensor device.
  • receiving the address may comprise identifying the address based on the at least one graphical notation.
  • the address may be noted graphically on the base, recorded by the sensor device and optionally associated with one or more commands.
  • an association between the address and the at least one command may be identified based on the at least one graphical notation.
  • the association may indicate the relationship between the commands, or between the command and the address.
  • receiving at least one graphical notation comprises receiving at least three separable graphical notations representing the address, the at least one command and the at least one association between the address and the at least one command.
  • the graphical notations representing the address, the command and the association may be clearly distinguishable from each other.
  • the at least one association may be identified based on a separable graphical notation connecting the graphical notations representing the at least one command and the address, respectively.
  • the at least one association may be identified as a graphical notation having essentially the shape of a line extending between the graphical notations representing the at least one command and the address.
  • associations may also connect two commands, or a command and an address. The associations provide a simple and intuitive way of relating commands, addresses, etc. to each other.
  • the method may further comprise providing an electronic representation of a command hierarchy which comprises at least two commands and at least one association, each of the at least two commands and the at least one association being graphically represented on the base by nodes and arcs, respectively, and each of the at least two commands being associated with a respective subarea of the position-coding pattern.
  • a command hierarchy, or a command structure may be any hierarchy of commands for one or more units provided with a processor.
  • a command hierarchy typically has a root, which may be a command or an address to a unit provided with a processor.
  • the command hierarchy may have, but does not necessarily need to have, a plurality of branches, which may also be referred to as nodes, each branch or node, in turn, having a number of sub-branches, such that a tree-like structure is formed.
  • the command hierarchy may be large, thus comprising a large number of command levels and/or a large number of commands on each level.
  • the nodes may be connected by lines, such that each node has one superior node, but may have more than one subordinate node.
  • the lines connecting the nodes may be interpreted as the associations referred to above, while the nodes may be addresses or commands for controlling a unit provided with a processor. Naturally, one address may in turn have a number of sub-addresses to, e.g., different units provided with processors.
  • the inventive method may include forming at least one command string based on the command hierarchy, and controlling the unit provided with a processor by sending the command string to the address.
  • a command string is an instruction for a unit provided with a processor, which instruction is built up from more than one command and optionally an address.
  • a command string may include more than one command.
  • identifying the at least one command may comprise receiving a pair of coordinates from one of the subareas, the pair of coordinates representing a chosen command and identifying an electronic representation of the chosen command, wherein the command string is formed based on the chosen command and at least one hierarchically superior command.
  • the command hierarchy according to this embodiment may be represented on a base provided with a position- coding pattern.
  • the electronic representation of the command hierarchy may be at least partly provided through identifying the command hierarchy based on the at least one graphical notation. At least partly means that an existing command hierarchy, which is either preprinted or e.g. graphically noted by the user or someone else, may be expanded by the user adding further commands or addresses.
  • providing the electronic representation of the command hierarchy may at least partly comprise electronically receiving the electronic representation of the command hierarchy.
  • the base, on which the graphical notation is made, may be provided with a graphical representation of at least a part of the command hierarchy.
  • a base having a preprinted command structure may be provided together with an electronic version of the command structure, which is to be stored in the memory of the sensor device for future use.
  • providing the electronic representation of the command hierarchy may also comprise identifying, based on the at least one graphical notation, at least one further command and at least one further association, and storing, in a memory of the sensor device, the at least one command and the at least one further command, based on the at least one further association.
  • the predefined command structure may be expanded by the user adding e.g. further commands or by the user adding e.g. an address.
  • the command hierarchy may also be used by recording pairs of coordinates within predefined subareas, which are associated with commands in the command hierarchy. Based on such recordings, commands may be sent to the unit provided with a processor, as described above.
  • the electronic representation of the command hierarchy may be stored in a tree data structure in the memory of the sensor device.
  • a tree data structure may be any data structure for representing a hierarchy or a tree structure. Numerous such data structures are known.
  • command strings are formed in response to an indication of a command, such as a recording of a pair of coordinates within a subarea that is associated with a command.
  • the command string may be built up starting with the selected command and adding each hierarchically superior command, until the root is reached. Building the command string may also comprise adding separating characters, such as "/" etc., arranging the commands in a suitable order and adding the necessary parameters or switches.
  • the electronic representation of the command hierarchy may be stored in the form of at least one command string which is formable based on the electronic representation of the command hierarchy.
  • a plurality, or all, of the command strings that may be formed based on the command hierarchy may be stored in the form of more or less complete command strings.
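  • The following Python sketch, offered only as an illustration (the class and function names are not from the patent), shows one possible tree representation of such a command hierarchy, the formation of a command string by walking from a chosen command up to the root, and the alternative of enumerating every command string that is formable from the hierarchy:

      from typing import List, Optional

      class CommandNode:
          """One node of the command hierarchy: the root holds an address,
          the other nodes hold commands or parameters."""

          def __init__(self, label: str, parent: Optional["CommandNode"] = None) -> None:
              self.label = label
              self.parent = parent
              self.children: List["CommandNode"] = []
              if parent is not None:
                  parent.children.append(self)

      def form_command_string(chosen: CommandNode, separator: str = "/") -> str:
          """Build the command string from the chosen command and every hierarchically
          superior command, up to and including the root."""
          parts: List[str] = []
          node: Optional[CommandNode] = chosen
          while node is not None:
              parts.append(node.label)
              node = node.parent
          return separator.join(reversed(parts))

      def all_command_strings(node: CommandNode, separator: str = "/") -> List[str]:
          """Enumerate every command string formable from the hierarchy (one per leaf)."""
          if not node.children:
              return [node.label]
          strings: List[str] = []
          for child in node.children:
              strings.extend(node.label + separator + s
                             for s in all_command_strings(child, separator))
          return strings

      # Example corresponding to Fig. 2: "my computer" -> "format" -> "hard disk".
      root = CommandNode("my computer")
      fmt = CommandNode("format", parent=root)
      disk = CommandNode("hard disk", parent=fmt)
      assert form_command_string(disk) == "my computer/format/hard disk"
      assert all_command_strings(root) == ["my computer/format/hard disk"]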
  • a computer program product for controlling a unit provided with a processor.
  • the computer program product comprises instructions for a sensor device which, when executed, cause the sensor device to perform the above-described method.
  • a sensor device for controlling a unit provided with a processor comprises a signal processor for receiving positions representing the sensor device's movement across a base that is provided with a position-coding pattern.
  • the signal processor is arranged for receiving at least one graphical notation in the form of the positions, identifying, based on the at least one graphical notation, at least one command for the unit provided with a processor, receiving an address to the unit provided with a processor, and controlling the unit provided with a processor by sending the at least one command to the address.
  • the signal processor of the sensor device may be arranged to perform the method described above. The method may be implemented by means of special-purpose circuitry, by programmable microprocessors or by a combination thereof.
  • a method for controlling a unit provided with a processor comprises using a sensor device for recording positions representing the sensor device's movement across a base provided with a position-coding pattern, while a graphical notation is made on the base, noting graphically on the base, using the sensor device, at least one command to the unit provided with a processor, and sending the command to the unit provided with a processor for controlling this unit.
  • This method provides a user-friendly and intuitive way of controlling a unit provided with a processor.
  • the method can, for example, be used when a user wants to enter a command string into, for example, a computer or other unit provided with a processor connected to the sensor device, e.g. by wireless means.
  • the method enables a unit provided with a processor to be controlled without using a preexisting user interface, such as a preprinted base provided with command options. Instead, the user may use any base that is provided with a position-coding pattern, on which he or she makes graphical notations, which correspond to the desirable commands and sends the commands to the unit provided with a processor.
  • a product kit for controlling a unit provided with a processor comprises a control base, provided with a position-coding pattern, on which control base a command hierarchy, comprising at least two commands, is graphically represented, and on which control base each of the at least two commands is associated with a respective subarea of the position-coding pattern, and a computer program product comprising an electronic representation of the command hierarchy, whereby the at least two commands are identifiable based on a recording of a pair of coordinates within their respective associated subareas.
  • a product kit provides a way of integrating e.g. a household appliance into a remote control system for an intelligent home.
  • the electronic representation of the command hierarchy may be stored in the memory of the sensor device, and possibly completed by the user adding an address to which the commands are to be sent, thereby completing the installation of the appliance in the intelligent home.
  • the user may then either use the control base provided for selecting predefined commands, or draw up the command hierarchy on an arbitrary base and send a command string to the unit provided with a processor.
  • Fig. 1 schematically shows a system in which the present invention can be used.
  • Fig. 2 shows schematically a base with graphically noted commands and associations according to a first application of an embodiment of the present invention.
  • Fig. 3 shows schematically a base with several graphically noted commands and associations according to a second application of an embodiment of the present invention.
  • Fig. 4 shows schematically a base with several graphically noted commands and associations according to a third application of an embodiment of the present invention.
  • Fig. 5 shows schematically a sensor device for use in connection with the present invention.
  • Figs 6-12 are flow charts, which schematically illustrate methods for controlling a unit provided with a processor according to the invention.
  • the present invention is based on the general idea of controlling a unit provided with a processor by means of commands which are written on a position-coded base and which thereafter are sent to the unit provided with a processor.
  • Fig. 1 more specifically shows a base 1 in the form of a sheet of paper, on which commands can be written, a sensor device 2 with which the commands can be written on the base 1, recorded in electronic form and sent to a unit 3 provided with a processor, which in Fig. 1 is exemplified by a computer.
  • the sensor device 2 can communicate with the computer 3 in various ways.
  • One alternative is for the sensor device 2 to communicate directly with the computer 3, for example via a cable, an infrared link or a short-range radio link, such as according to the Bluetooth standard. This is illustrated in Fig. 1 by a broken line 4.
  • a second alternative is for the sensor device 2 to communicate with the computer via a local or global computer network 5, such as the Internet.
  • the sensor device 2 can be connected to the computer network 5 by means of a computer 6 which is permanently connected to the computer network, and with which the sensor device can communicate in, for example, any one of the ways mentioned above for communication with the computer 3. This is also shown by a broken line 4.
  • the sensor device can be connected to the computer network 5 by wireless means via a radio access point 7 which is reached, for example, via a mobile telephone 8, a hand-held computer 9, such as a PDA (Personal Digital Assistant), or a portable computer 10.
  • these units communicate with the radio access point 7 via each other, for example by the portable computer 10 or the hand-held computer 9 utilizing a modem in the mobile telephone 8 as a link to the radio access point 7, which can be a radio access point in some known system such as GSM, CDMA, GPRS or some other type of mobile communication network.
  • the sensor device 2 can itself have means of communication, making possible direct connection to the radio access point 7.
  • the base 1 is provided with a position-coding pattern P.
  • the position-coding pattern P is shown only schematically in Fig. 1 as a surface provided with dots. This position-coding pattern is used to record in electronic form what is written on the base.
  • Various types of position-coding pattern which can be used for this purpose are known. In US 5,477,012, for example, a position-coding pattern is shown where each position is coded by a unique symbol.
  • the position-coding pattern can be read off by a pen which detects the position code optically, decodes it and generates a pair of coordinates that describes the movement of the pen across the surface.
  • In WO 01/26032, four different displacements of a dot from a nominal position are used to code four different pairs of bits in the position-coding pattern.
  • a certain number of dots, for example 6 × 6 dots, codes a unique position.
  • the position can be calculated from the bit values corresponding to the dots.
  • the position-coding patterns in WO 00/73983 and WO 01/26032 can be detected optically by a pen that decodes the dots and generates a pair of coordinates for each set of, for example, 6 × 6 dots. If the position-coding pattern is read off while the pen is writing on the position-coding pattern, a sequence of pairs of coordinates that describes the movement of the pen across the position-coding pattern is obtained, which thus constitutes an electronic representation of what was written on the sheet of paper.
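  • To illustrate the principle only, the toy Python sketch below decodes a 6 × 6 window of dot displacements into a pair of coordinates. It is emphatically not the coding scheme of WO 00/73983 or WO 01/26032; it merely shows how four possible displacements per dot can carry two bits each, here with one bit assumed to contribute to the x coordinate and one to the y coordinate:

      from typing import List, Tuple

      # Assumed toy mapping: each displacement direction encodes one x bit and one y bit.
      DISPLACEMENT_BITS = {"up": (0, 0), "right": (0, 1), "down": (1, 0), "left": (1, 1)}

      def decode_window(displacements: List[List[str]]) -> Tuple[int, int]:
          """Decode a 6 x 6 window of dot displacements into an (x, y) position."""
          x_bits: List[int] = []
          y_bits: List[int] = []
          for row in displacements:
              for direction in row:
                  bx, by = DISPLACEMENT_BITS[direction]
                  x_bits.append(bx)
                  y_bits.append(by)
          # 36 dots give 36 bits per coordinate in this toy scheme.
          x = int("".join(map(str, x_bits)), 2)
          y = int("".join(map(str, y_bits)), 2)
          return x, y

      # A window of displacements as it might be extracted from one camera image.
      window = [["up", "right", "down", "left", "up", "right"] for _ in range(6)]
      print(decode_window(window))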
  • the base 1 is provided with a position-coding pattern of the type described in WO 01/26032.
  • the sensor device 2 can then also be of the type described in WO 01/26032. An example of the construction of such a device is described in the following with reference to Fig. 5.
  • It comprises a casing 11 which has approximately the same shape as a pen. At the end of the casing there is an opening 12. The end is intended to abut against or to be held a short distance from the surface on which the position determination is to be carried out.
  • the casing contains principally an optics part, an electronic circuitry part and a power supply.
  • the optics part comprises at least one light-emitting diode 13 for illuminating the surface which is to be imaged and a light-sensitive area sensor 14, for example a CCD or CMOS sensor, for recording a two-dimensional image.
  • the device can also contain an optical system, such as a mirror and/or lens system (not shown) .
  • the light-emitting diode can be an infrared light-emitting diode and the sensor can be sensitive to infrared light.
  • the power supply for the device is obtained from a battery 15, which is mounted in a separate compartment in the casing. It is also possible to obtain the power supply via a cable from an external power source (not shown) .
  • the electronic circuitry part contains a signal processor 16 which comprises a processor with primary memory and program memory.
  • the processor is programmed to read images from the sensor, to detect the position-coding pattern in the images and to decode this into positions in the form of pairs of coordinates, and to process the information thus recorded in electronic form in the way described in greater detail below for controlling the unit 3 provided with a processor.
  • the device also comprises a pen point 17, with the aid of which ordinary pigment-based writing can be written on the surface on which the position determination is to be carried out.
  • the pen point 17 can be extendable and retractable so that the user can control whether or not it is to be used. In certain applications the device does not need to have a pen point at all.
  • the pigment-based writing is suitably of a type that is transparent to infrared light, while the marks of the position-coding pattern suitably absorb infrared light.
  • the detection of the pattern is carried out without the above-mentioned writing interfering with the pattern.
  • the device may also comprise buttons 18, by means of which the device can be activated and controlled. It also has a transceiver 19 for wireless transmission, for example using infrared light, radio waves or ultrasound, of information to and from the device.
  • the device can also comprise a display 20 for displaying positions or recorded information.
  • the device can be divided between different physical casings, a first casing containing components which are required for recording images of the position-coding pattern and for transmitting these to components which are contained in a second casing and which carry out the position determination on the basis of the recorded image or images.
  • the sensor device 2 communicates with other units 8, 9, 10 or 6, by wireless means in a way known to those skilled in the art.
  • the communication between the units 8, 9, 10 and 6 respectively, the radio access point 7, the computer network 5 and the unit 3 provided with a processor also takes place in a way known to those skilled in the art.
  • the function of the sensor, and the application of the position-coding pattern on the base are also methods known to those skilled in the art.
  • a command can be a word, but can also be a symbol, provided that the sensor device is pre-programmed to recognize and identify a symbol as a command. Such pre-programming can be carried out by some form of learning, whereby a command is associated with a symbol, as will be described in more detail below.
  • One of the noted phrases 21 constitutes an indication of a computer network address to which the sensor device connects, e.g. the computer network address where the unit 3 provided with a processor is located.
  • the computer network address can also be some other type of address for computer communication, such as a Bluetooth® address.
  • Commands are noted on the base in a tree structure, in such a way that an address or an indication of the address constitutes the root and in such a way that commands constitute nodes in the tree structure.
  • a node is associated with another node by a line being drawn between them.
  • the commands 22, 23 can thus constitute conditions, operators, parameters or subordinate commands, as shown in Fig. 2. It is recognized that the tree structure with commands can vary in extent, from a single chain of commands to a large tree with many commands and subordinate commands.
  • a command frame 24 may be noted around each command.
  • the command frame 24 may work in such a way that it delimits a part of the base 1 which is to be associated with the command and also in such a way that it is recognized by the sensor device 2 and understood as an indication that a command is being entered and thus work as a command indicator. Between the commands 22, 23 or the command frames 24 lines may be drawn which indicate connections between the commands.
  • the shape of the command frames in Figs 2-4 constitutes only one example of how such frames can be designed. Other shapes are possible and different shapes can represent different types of commands or addresses.
  • On the base 1 there is also a "send" box 26 which indicates to the sensor device that the commands are to be sent to the unit provided with a processor.
  • the "send” box 26 can be either a pre-printed box comprising a specific, predefined part of the position-coding pattern which codes for the "send” function, or alternatively the "send” box 26 can be noted by the user on the base 1 and provided with a particular symbol or a particular command word which represents the "send” function. As another alternative, the "send” box can be omitted, by for example the sensor device connecting directly to the computer network address when it identifies a command box or an indication of a computer network address.
  • the user has noted the address "my computer" 21 with a sensor device 2, which indicates to the sensor device 2 the computer network address of the user's own computer.
  • the command frame 24 is noted, which defines the subarea of the position-coding pattern which after the input will be associated with the address by the sensor device.
  • the sensor device reads off the position-coding pattern and forms an electronic representation of the graphical image that the address 21 and the command frame 24 constitute.
  • the graphical image is then interpreted using OCR or ICR, so that an electronic form of the command "my computer" is obtained and is stored in a memory in the sensor device.
  • the address is stored together with an indication that the noted "my computer" is an address and not just a text string.
  • the address derived by the sensor device, for example 197.57.3.982, can be stored.
  • a representation of the surface on the base 1 which is enclosed by the command frame 24 is also stored in the sensor device and associated with the address "my computer", as an association with an IP address, here exemplified by 197.57.3.982.
  • the noted commands are recorded in electronic form, interpreted and stored together with the indication that they constitute commands and with the subareas of the base 1 with which they are associated.
  • when the user notes associations 25 between the commands 22, 23, these are also recorded and stored as indications of how the commands 22, 23 are related to each other.
  • a command string is formed in the sensor device from the stored commands 22, 23 and associations 25, the first component of the command string consisting of the address that constitutes a root in the tree-like command structure, that is the address "my computer".
  • Next, the associations 25 are followed until the last command "hard disk" 23 is reached, whereupon the command string is built up gradually and finally assumes, for example, the form my computer/format/hard disk, or an alternative form.
  • Command strings may be formed and stored in different ways. According to one alternative, each command string that may be formed from a given command structure or hierarchy may be stored. According to this alternative, new command strings are added as new commands or parameters are added to the command structure.
  • a tree structure that has been noted graphically and registered may be represented in any appropriate way in the sensor device, such as by means of any data structure for representing tree structures.
  • when a certain command is selected, e.g. by registering a pair of coordinates within the command frame on the base, the corresponding command string is formed from the marked command or parameter and all other commands or parameters up to and including the root.
  • the command string is sent to the computer network address that is indicated by the address "my computer", whereupon the unit 3 provided with a processor and connected to the computer network address executes the command and its hard disk is formatted.
  • the unit 3 provided with a processor then sends an acknowledgement to the sensor device 2 that the command has been carried out.
  • the acknowledgement can be presented to the user in the form of a sound, light or vibration signal in the sensor device or by being displayed on some other unit in the vicinity of the user, such as a mobile telephone or hand-held computer.
  • an address 30 and a number of commands 31, 32, 33, 34, 35, 36, surrounded by command frames 24 and connected by associations 25, have been noted in a similar way to that described above with reference to Fig. 2.
  • the commands have been noted and linked together into a tree-like structure, where an address "home" 30 indicates a computer network address, e.g. of a unit which is situated in the user's intelligent home, which unit is arranged to control one or more units provided with a processor in the user's home.
  • each unit provided with a processor in the user's home can have a network address and can be connected directly to the computer network, without any master unit as in the example.
  • a number of units provided with a processor are connected to the computer network address, which units are addressed by the commands 31, 32, 33, 34.
  • Other types of units are possible and are to be regarded as covered by the invention.
  • Each unit can then be provided with a number of commands or parameters for its control.
  • the tree structure is drawn out in full only for the temperature control of a heating system, but it is recognized that the structure can be extended as new units are added, or as existing units are provided with new functions. It is also recognized that the tree structure shown in Fig. 3 is drawn mechanically and that, for the purpose of illustration, it shows a larger part of the tree structure than what is necessary for the example.
  • the tree structure or command hierarchy may be hand-drawn, and only that part of the structure, that is those commands, which are to be used are drawn, as in Fig. 2.
  • predefined and printed or machine drawn command structures, or parts thereof are conceivable. These may e.g. be provided by the manufacturer of a certain household appliance. Such predefined command hierarchies may be expandable by e.g. allowing the user to add further parameters or commands. It is also possible that the manufacturer of a household appliance provides a complete, predefined command hierarchy containing all the necessary commands or parameters for an appliance or a group of appliances.
  • the command hierarchy may be provided in the form of a control base, e.g. a sheet of paper, provided with the position-coding pattern, on which the command hierarchy is preprinted, but where the address has not been filled out.
  • a corresponding electronic version of the command hierarchy may be provided in the form of software for installation in the sensor device. The software provides the necessary instructions for the sensor device to associate the subareas of the position-coding pattern with the proper command.
  • the user may install the appliance by filling in his address with the sensor device, at the proper position in the command hierarchy, thus making the appliance controllable by means of the sensor device.
  • Once the command hierarchy has been installed in the sensor device, the user may use it and add commands according to what has been described above.
  • the appliance manufacturer may provide its customer with a product kit comprising the control base and software for installation of the command hierarchy in the sensor device.
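  • A sketch of what such a product-kit installation could look like on the software side is given below. The file format, the field names and the subarea coordinates are all assumptions made for the example; the point is only that the kit ships a command hierarchy whose nodes are tied to subareas of the preprinted control base, and that the user completes it with an address:

      import json

      # Hierarchy as it might be shipped with the kit: nested commands, each tied to
      # the bounding box of its preprinted command frame on the control base.
      KIT_HIERARCHY = json.loads("""
      {
        "label": null,
        "children": [
          {"label": "heating", "subarea": [10, 40, 60, 55], "children": [
            {"label": "temperature", "subarea": [70, 40, 130, 55], "children": []}
          ]}
        ]
      }
      """)

      def install(hierarchy: dict, user_address: str) -> dict:
          """Complete the predefined hierarchy with the address noted by the user,
          making the appliance controllable via the sensor device."""
          completed = dict(hierarchy)
          completed["label"] = user_address
          return completed

      # "home" stands for whatever network address the user has noted on the control base.
      installed = install(KIT_HIERARCHY, "home")
      print(installed["label"], [c["label"] for c in installed["children"]])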
  • a command 35 which indicates the temperature parameter and a value command 36 for this are marked and connected by means of an association line 25.
  • the command string which is generated for setting the temperature of the heating installation is home/heating/temperature/20. The user sends the command string to the computer network address indicated by the address "home" by marking the node containing the temperature parameter and then the "send" box 26, whereupon the heating installation in the user's home sets the temperature to 20 degrees.
  • Similar command structures can be constructed for all the other units 31, 32, 33, 34, whereby setting parameters and values can be entered and sent to the units 31, 32, 33, 34.
  • Fig. 4 shows a third application of an embodiment of the present invention.
  • the input is carried out to programs in the user's computer 3 (Fig. 1).
  • an address 40 and a number of commands 41, 42, 43, 44 have been drawn on a base 1 and connected by associations 25 and marked with command frames 24.
  • a "send" box 26 is also arranged on the base 1.
  • a sketch 45 has also been noted on the base and recorded in electronic form.
  • the command structure in Fig. 4 comprises an address "my computer" 40 which indicates the address of the user's computer, and a graphics program 41, a word processing program 42 and a spreadsheet program 43 which are installed in the user's computer 3.
  • the command "import” 44 has been noted and recorded in electronic form using the sensor device.
  • the sketch 45 is connected by a line 47 to the command "import” 44.
  • the sketch 45 is stored in the sensor device in the form of graphical data, below called "image data". More specifically, the sketch is stored as a graphics file, e.g. a vector graphics file. This can be in a standard storage format such as .wmf (Windows® Meta File) or in a storage format specific to the sensor device.
  • the file is transferred to the unit provided with a processor before the command is executed or in association with the command being executed.
  • the following command string is sent to the unit 3 provided with a processor: my computer/graphics program/import/image data.
  • the unit 3 provided with a processor receives the command string and the image and causes the graphics program to import the image.
  • the importing into the graphics program can be carried out using the program's existing capability of being invoked with a command string comprising a file name.
  • if a storage format specific to the sensor device is used, it is necessary for the program that receives the image data to have been provided with functionality for handling the storage format of the sensor device.
  • a base 1 on which commands are noted can be used repeatedly by the commands being stored in a memory in the sensor device when the noting is carried out. This may mean that the user can indicate to the sensor device which command is to be carried out by pointing at a written-down command with the sensor device so that the sensor device can read off the position-coding pattern corresponding to the command.
  • the sensor device may identify this command in its memory and send it to the address with which the command was associated when the noting on the base was carried out. This may be done by simply recording a pair of coordinates within the proper subarea of the position-coding pattern, without marking any "send" box.
  • further commands or parameters defining a previously noted and recorded command may be added by being noted graphically on the base and associated with the command in question.
  • a dynamic command structure is obtained, which can be enlarged as new units are added or enlarged with new commands. Commands that have been added in this manner will thus constitute parts of the command hierarchy as described above.
  • the methods described here may be implemented as a computer program product, as shown in Figs 6 and 7.
  • the computer program product comprises a computer program which is stored in the program memory of the sensor device and is executed in its processor.
  • the method can be implemented completely or partially in the form of a product-specific circuit, such as for example an ASIC, an FPGA or in the form of digital or analogue circuits or in any suitable combination of these.
  • Fig. 6 is a schematic flow chart for a method according to the invention.
  • the method may be implemented in e.g. a computer program product.
  • a graphical notation is received in step 50 in the sensor device 2.
  • a command for the unit 3 provided with a processor is identified in step 51 based on the graphical notation.
  • the sensor device also receives in step 52 an address to the unit 3 provided with a processor.
  • in step 53, the command is sent to the address.
  • Step 53 may be initiated in different manners, by e.g. the sensor device 2 detecting a "send" box on the base, by receiving a send command or any other indication such as a button being depressed, etc.
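  • A minimal sketch of this flow follows, with the assumption that the notation arrives as strokes (sequences of coordinate pairs) and that the individual steps are supplied by the surrounding system; the function names below are placeholders, not the patent's:

      from typing import Callable, List, Tuple

      Stroke = List[Tuple[float, float]]

      def control_unit(
          receive_notation: Callable[[], List[Stroke]],      # step 50
          identify_command: Callable[[List[Stroke]], str],   # step 51
          receive_address: Callable[[], str],                # step 52
          send_command: Callable[[str, str], None],          # step 53
      ) -> None:
          """Run the flow of Fig. 6: record the notation, identify the command,
          obtain the address and send the command to it."""
          notation = receive_notation()
          command = identify_command(notation)
          address = receive_address()
          send_command(address, command)   # triggered by e.g. a "send" box or a button

      # Tiny demonstration with stubbed-in steps:
      control_unit(
          receive_notation=lambda: [[(1.0, 2.0), (3.0, 4.0)]],
          identify_command=lambda strokes: "format/hard disk",
          receive_address=lambda: "my computer",
          send_command=lambda address, command: print(f"to {address}: {command}"),
      )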
  • Fig. 7 is a schematic flow chart for a method according to an embodiment of the invention.
  • the step 51 for identifying the command for the unit 3 provided with a processor comprises a substep 51a of at least partly converting the graphical notation into a machine-readable character format by e.g. ICR (intelligent character recognition) or HWR (handwriting recognition), whereupon the command may be identified in step 51b based on the output from step 51a.
  • ICR: intelligent character recognition
  • HWR: handwriting recognition
  • a command that is written in plain text may be identified, either directly from the actual combination of machine readable characters, or by retrieving a related command from a database on the basis of the character combination.
  • the sensor device may be arranged to identify an address from such a character combination (step 52) .
  • Fig. 8 is a schematic flow chart for a method according to another embodiment of the invention.
  • the step 51 of identifying the command comprises a first substep 51c of identifying a graphical symbol, which may be predefined, and thus recognized, as being equivalent to a certain command.
  • the corresponding command is identified. For example, a cross mark ("X") could be interpreted as a "delete" command.
  • the sensor device may be arranged to identify an address from such a graphical symbol (step 52) .
  • Figs 7 and 8 may be combined, by the sensor device being capable of first determining whether a command or address is recorded in plain text or not, and then identifying the command or address based on either a character combination or a graphical symbol.
  • Fig. 9 is a schematic flow chart for a method according to yet another embodiment of the invention.
  • the step 51 of identifying the command further comprises the substep 51d of detecting a command indicator, such as the command indicators 24 of Figs 2-4.
  • a subarea of the position-coding pattern may be identified in step 51e.
  • the subarea may be associated with the command in step 51f, such that a recording of a pair of coordinates within the subarea will be interpreted by the sensor device as being equal to a recording of the associated command.
  • Fig. 10 is a schematic flow chart for a method according to a further embodiment of the invention.
  • the step 52a of receiving the address comprises receiving the address from a memory 55, which may be incorporated in the sensor device 2.
  • Fig. 11 is a schematic flow chart for a method according to another embodiment of the invention.
  • the command (in step 51), the address (in step 52b) and the association (in step 54) are all identified based on the graphical notation 50.
  • Fig. 12 illustrates an embodiment of a method according to the invention, in which a command hierarchy in step 57 has been provided in a memory 56 of the sensor device 2.
  • the command hierarchy may be provided by storing graphical notations that have been made on earlier occasions. It may also be provided by downloading into the memory 56 from e.g. an external memory medium, e.g. via any one of the communication paths discussed above in relation to Fig. 1.
  • commands may be e.g. associated with subareas of the position-coding pattern.
  • the step 51 of identifying the command may comprise a substep 51g of receiving a pair of coordinates from a subarea that is associated with the command.
  • the command associated with the subarea is identified.
  • Fig. 12 also illustrates that based on the command hierarchy, command strings may be formed.
  • the command strings may be formed while commands are being entered through graphical notations or in response to a command being identified through e.g. a graphical symbol or through a pair of coordinates within a predefined subarea.
  • the unit provided with a processor may be a unit that communicates with the sensor device, such as a computer, a PDA, etc.
  • such a unit may communicate via e.g. a computer network or via short-range communication such as Bluetooth® or IrDA.
  • the graphical notation may also comprise e.g. data in the form of e.g. figures, text, sketches or graphics, which is recorded by the sensor device 2 and associated with commands or addresses as described above with reference to Fig. 4.
  • the sensor device may be programmed to evaluate the graphical notations as they are received, based on their contents. For example, a plain text command, a graphical symbol or a command indicator may trigger the above-described method of identifying commands. All other, non-recognized data may be treated in any standard fashion, for example stored as strokes, i.e. sequences of coordinate pairs, in the memory of the sensor device 2.
  • the invention can also be varied in other ways within the scope of the appended claims.
  • many different types of position-coding pattern are conceivable, in addition to those shown herein.
  • the position-coding pattern does not necessarily need to be optically detectable. It could, for example, be readable by magnetic, capacitive, inductive, chemical or acoustic means. However, this would require a different type of sensor to be used.
  • the position-coding pattern in WO 01/26032 can code coordinates of a very large number of unique positions or points. It can be considered as though all these points together make up an imaginary surface which is considerably larger than any single base. This imaginary surface can be divided into different areas which are reserved for different applications. An area can, for example, be reserved for controlling units provided with a processor. Information defining such areas and functions connected thereto can be stored in the pen and utilized for controlling the function of the pen.
  • Another alternative is to allow commands to be noted within practically any part of the imaginary surface.
  • almost any surface which is provided with a position-coding pattern can be used for entering commands and for the associated control of a unit provided with a processor.
  • measures may possibly need to be taken to prevent interference, as certain areas can previously have been reserved for certain functions.
  • the address to which commands and data are sent can be identified by, for example, looking up addresses in a database external to the sensor device. Both commands and address could be sent to an external unit for interpretation and further processing. This interpretation can be carried out in the sensor device, in the unit provided with a processor, or in some other external unit, possibly dedicated to the purpose.
  • the address to which the command string is to be sent is determined in advance, for example, by the sensor device being pre-programmed with such information. According to this embodiment, it is not necessary to note an address graphically. At least one command can be noted, but command structures as described above can also be noted and stored according to this embodiment. Alternatively, the address to which the command string is to be sent may be associated with a specific area of the position-coding pattern, and thus with a specific base on which that area of the position- coding pattern is arranged. As an alternative to storing the commands as a dynamic structure, it is possible to store a plurality of command strings where each command string represents a conceivable combination of commands.
  • the form in which the commands and the address are sent can also vary: it is, for example, possible to send raw data in the form of the images which the sensor device takes of the base. It is also possible to send some form of processed, for example compressed, image data, a series of coordinates which has been derived from the images and which represents the movement of the sensor device across the base, or commands or address in character-coded format. Other ways of sending commands, addresses or data are not excluded.
  • the sending of address, commands and data to the unit provided with a processor can be initiated by the use of the "send" button, but can also be initiated in response to an indication being written down on the base. For example, the drawing of a frame around a command can initiate transmission. It is also possible to initiate transmission as soon as the sensor device has recorded a complete command string, or when a partial area within a frame is marked which surrounds an already written-down command. Further alternatives for initiating sending comprise, but are not limited to, voice control, depressing a button on the pen, etc.
  • entry of addresses and commands can be marked by, for example, the user pressing a button on the sensor device. This can precede the entry of a command, but the button can also be held depressed during the whole or part of the entry procedure.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Character Discrimination (AREA)
  • User Interface Of Digital Computer (AREA)
  • Labeling Devices (AREA)

Abstract

The present invention relates to a method for controlling a unit provided with a processor. The method comprises receiving at least one graphical notation in the form of positions representing a sensor device's movement across a base provided with a position-coding pattern while the graphical notation was made, identifying, based on the at least one graphical notation, at least one command for the unit provided with a processor, receiving an address to the unit provided with a processor, and controlling the unit provided with a processor by sending the at least one command to the address. The invention also relates to a computer program and a device for carrying out the method, to a method for controlling a unit provided with a processor using a sensor device, and to a product kit for controlling a unit provided with a processor.
EP02741594A 2001-06-25 2002-06-25 Commande d'une unite munie d'un processeur Withdrawn EP1405165A1 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
SE0102236A SE0102236L (sv) 2001-06-25 2001-06-25 Styrning av en processorförsedd enhet
SE0102236 2001-06-25
PCT/SE2002/001245 WO2003001357A1 (fr) 2001-06-25 2002-06-25 Commande d'une unite munie d'un processeur

Publications (1)

Publication Number Publication Date
EP1405165A1 true EP1405165A1 (fr) 2004-04-07

Family

ID=20284588

Family Applications (1)

Application Number Title Priority Date Filing Date
EP02741594A Withdrawn EP1405165A1 (fr) 2001-06-25 2002-06-25 Commande d'une unite munie d'un processeur

Country Status (4)

Country Link
EP (1) EP1405165A1 (fr)
CN (1) CN100541404C (fr)
SE (1) SE0102236L (fr)
WO (1) WO2003001357A1 (fr)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7916124B1 (en) 2001-06-20 2011-03-29 Leapfrog Enterprises, Inc. Interactive apparatus using print media
US7922099B1 (en) 2005-07-29 2011-04-12 Leapfrog Enterprises, Inc. System and method for associating content with an image bearing surface
US8261967B1 (en) 2006-07-19 2012-09-11 Leapfrog Enterprises, Inc. Techniques for interactively coupling electronic content with printed media
KR100841285B1 (ko) * 2006-09-18 2008-06-25 주식회사 펜래버레토리 표면상에 절대 위치 표시 패턴을 갖는 제조물 및 그 절대위치 표시 패턴의 형성 방법
CN108665504A (zh) * 2017-04-02 2018-10-16 田雪松 基于位置编码识别的终端控制方法

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5661506A (en) 1994-11-10 1997-08-26 Sia Technology Corporation Pen and paper information recording system using an imaging pen
US5852434A (en) 1992-04-03 1998-12-22 Sekendur; Oral F. Absolute optical position determination

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5477012A (en) * 1992-04-03 1995-12-19 Sekendur; Oral F. Optical position determination
JP3475048B2 (ja) * 1997-07-18 2003-12-08 シャープ株式会社 手書き入力装置
JP2000207115A (ja) * 1999-01-08 2000-07-28 Matsushita Electric Ind Co Ltd 入力ペン、それを用いた入力装置および携帯電子機器、並びにクリップ付きハンドストラップ機構の使用方法
WO2001016691A1 (fr) 1999-08-30 2001-03-08 Anoto Ab Bloc-notes
SE517445C2 (sv) * 1999-10-01 2002-06-04 Anoto Ab Positionsbestämning på en yta försedd med ett positionskodningsmönster

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5852434A (en) 1992-04-03 1998-12-22 Sekendur; Oral F. Absolute optical position determination
US5661506A (en) 1994-11-10 1997-08-26 Sia Technology Corporation Pen and paper information recording system using an imaging pen

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
DYMETMAN M; COPPERMAN M: "Intelligent paper", LECTURE NOTES IN COMPUTER SCIENCE, vol. 1375, March 1998 (1998-03-01), pages 392 - 406, XP002328425
See also references of WO03001357A1

Also Published As

Publication number Publication date
CN1520542A (zh) 2004-08-11
WO2003001357A1 (fr) 2003-01-03
SE0102236D0 (sv) 2001-06-25
CN100541404C (zh) 2009-09-16
SE0102236L (sv) 2002-12-26

Similar Documents

Publication Publication Date Title
US7202861B2 (en) Control of a unit provided with a processor
US20110279415A1 (en) Method and system for implementing a user interface for a device employing written graphical elements
US20060066591A1 (en) Method and system for implementing a user interface for a device through recognized text and bounded areas
CN103970409B (zh) 产生增强现实内容的方法和使用所述增强现实内容的终端
US20010035861A1 (en) Controlling and electronic device
KR100735554B1 (ko) 문자 입력 방법 및 이를 위한 장치
CN101697277B (zh) 一种实现智能无线麦克风多功能的方法、装置及系统
WO2006049574A1 (fr) Gestion de logique interne pour stylos electroniques
US20030046256A1 (en) Distributed information management
WO2007136846A2 (fr) Enregistrement et lecture de messages vocaux associés à une surface
AU7046700A (en) Notepad
EP1272971B1 (fr) Gestion d'informations
US20090127006A1 (en) Information Management in an Electronic Pen Arrangement
WO2001061454A8 (fr) Commande d'un dispositif electronique
CN102034341B (zh) 控制系统及其控制画面的产生方法
EP1405165A1 (fr) Commande d'une unite munie d'un processeur
CN101393493A (zh) 自动注册指定操作的手写笔迹的方法和装置
JP6367031B2 (ja) 電子機器遠隔操作システム及びプログラム
KR20030085268A (ko) 아이콘 형식의 유저 인터페이스를 갖는 리모콘
JPH10207908A (ja) ネットワークサービスアクセス管理装置
CN202795331U (zh) 具有声控按键功能的电子装置
CN106126103A (zh) 一种信号输入方法及装置
WO2002021252A1 (fr) Enregistrement electronique et communication d'informations
EP1681623A1 (fr) Interface utilisateur de dispositif par texte reconnu et zones bornées
EP1405167A1 (fr) Procede destine a connecter un dispositif de detection a une unite externe arbitraire

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20040126

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR

AX Request for extension of the european patent

Extension state: AL LT LV MK RO SI

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: ANOTO IP LIC HB

111L Licence recorded

Free format text: 0100 LEAPFROG ENTERPRISES INC.

Effective date: 20050530

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: ANOTO AB

17Q First examination report despatched

Effective date: 20071004

TPAC Observations filed by third parties

Free format text: ORIGINAL CODE: EPIDOSNTIPA

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: ANOTO AB

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20110720