RU2457532C2 - Input processing system for information processing apparatus


Info

Publication number: RU2457532C2
Application number: RU2008139959/08A
Authority: RU (Russia)
Prior art keywords: raster, image, scanner, surface, information
Other languages: Russian (ru)
Other versions: RU2008139959A
Inventor: Kenji Yoshida (JP)
Original Assignee: Kenji Yoshida
Priority to JP2006-066751
Priority to JP2006-314650
Priority to JP2007-060495
Application filed by Kenji Yoshida
Publication of RU2008139959A
Application granted
Publication of RU2457532C2

Abstract

FIELD: information technology.
SUBSTANCE: an icon on a medium carries a raster formed on the surface of the medium. The raster is read using a scanner connected to an information processing device, which converts the raster into each or one code value and coordinate value defined by the raster and outputs speech, a picture, a moving picture, a letter or symbol, or a program stored in the information processing device and corresponding to that code value and/or coordinate value, or outputs information for accessing a website corresponding to that value. In this way, output of previously prepared speech, pictures, moving pictures, letters or symbols, program launch, website access and the like can be performed.
EFFECT: broader functional capabilities owing to the formation of an icon on paper, a paper controller, a paper keyboard or a mouse pad - input systems that can input letters and symbols into a computer and perform operations with simple manipulation, replacing hardware such as a keyboard, mouse and graphics tablet.
42 cl 115 dwg

Description

FIELD OF TECHNOLOGY

The present invention relates to an input processing system for an information processing device using rasters made (printed) on a surface of a medium, such as a sheet of paper.

BACKGROUND ART

Computers are used in many situations in daily life. Their functions and purposes have expanded dramatically: in addition to conventional uses such as creating documents and performing calculations, they are now used to obtain necessary information by accessing web pages on the Internet and to purchase goods.

When using a computer, it is usually necessary to perform operations such as entering letters or characters using a keyboard and a mouse-type pointing device (hereinafter, the "mouse").

However, mastering the input of letters and characters using a keyboard requires rather complicated operations and a great deal of time. It is particularly difficult for elderly people and people with physical disabilities to use such devices. In addition, a "digital divide" arises - a gap in information and opportunity between people who can use a keyboard and mouse and master computers, and those who cannot.

In order to solve these problems, an information processing device and a service system have been proposed that can enter information into a computer using code patterns, such as a barcode or QR code (hereinafter, "barcode"), printed on the surface of the medium. The barcode is printed in a catalogue or on a web site provided by the relevant trader. When a user (operator) reads the barcode using a barcode reader connected to a computer, the user can obtain the necessary information or purchase goods (see, for example, Patent Document 1).

Patent Document 1. Japanese Patent Application Laid-Open No. 2005-4574.

DESCRIPTION OF THE INVENTION

PROBLEMS TO BE SOLVED BY THE INVENTION

However, in order to display barcodes, a certain area must be reserved on the surface of the medium, which limits the number of codes that can be placed on it. Compared with a keyboard, it is difficult to enter codes corresponding to different letters, signs and symbols. In addition, barcodes degrade the aesthetic appearance of the medium surface.

The aim of the present invention is to eliminate the aforementioned disadvantages and to create new input systems - an icon on paper, a paper controller, a paper keyboard and a mouse pad - which can input letters, signs and the like and perform operations with simple manipulation, in place of hardware such as a keyboard, mouse and graphics tablet.

MEANS FOR SOLVING PROBLEMS

To solve these problems, the following means are provided.

In accordance with a first aspect of the present invention, there is provided an input processing system for an information processing device, in which a raster that is created on the surface of a medium and in which each or one coordinate value and code value is defined in one format is read using a scanner connected to the information processing device, whereby an operating command defined by the raster is transmitted to input each or one coordinate value and code value into the central processor of the information processing device; the raster is printed on the surface of the medium, the raster on the surface of the medium is read using the scanner that reads rasters, and each or one coordinate value and code value corresponding to the raster is input to the central processor of the information processing device.

In accordance with a second aspect of the present invention, there is provided an input processing system for an information processing device, in which a raster created on the surface of a medium is read using a scanner connected to the information processing device and converted into an interrupt key code of a hardware keyboard defined by the raster, thereby generating key input interrupt processing in the central processor of the information processing device, the raster being created for each icon printed on the surface of the medium; when an icon for which a raster is created on the surface of the medium is to be scanned using the scanner that reads the raster, the scanner is tilted relative to the surface of the medium before or after reading the raster, the tilt is detected from the difference in light and shadow of the image read by the scanner, and key input interrupt processing determined in accordance with the direction of inclination of the scanner relative to the surface of the medium is generated.
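
As a hedged illustration of the tilt detection described above (not the patent's algorithm; it assumes that the side of the infrared frame toward which the pen-type scanner is tilted receives less illumination and appears darker), the tilt direction could be estimated from the brightness centroid of the captured image:

```python
import numpy as np

def estimate_tilt_direction(ir_frame: np.ndarray) -> float:
    """Estimate the tilt direction (radians) of a pen-type scanner.

    Assumption: the side of the field of view toward which the scanner
    is tilted is less illuminated and therefore darker, so the
    brightness centroid is displaced away from the tilt direction.
    """
    h, w = ir_frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    weights = ir_frame.astype(np.float64)
    total = weights.sum() or 1.0
    # Brightness-weighted centroid of the frame.
    cx = (xs * weights).sum() / total
    cy = (ys * weights).sum() / total
    # Displacement of the centroid from the frame centre.
    dx, dy = cx - (w - 1) / 2.0, cy - (h - 1) / 2.0
    # The shadowed side lies opposite the bright centroid, so the tilt
    # direction points away from the centroid displacement.
    return float(np.arctan2(-dy, -dx))
```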

In accordance with a third aspect of the present invention, there is provided an input processing system for an information processing device according to claim 2, wherein the scanner operation is recognized from a change in the difference in light and shadow of the image read by the scanner, and key input interrupt processing is generated in accordance with the scanner operation.

In accordance with a fourth aspect of the present invention, there is provided an input processing system for an information processing apparatus according to claim 2 or 3, wherein the key input interrupt processing includes changing the type of letter or character to be entered, a command to convert the letter or character, and moving the cursor.

In accordance with a fifth aspect of the present invention, there is provided a Japanese input system in which a raster created on the surface of a medium is read using a scanner connected to an information processing device and converted into an interrupt key code of a hardware keyboard defined by the raster, thereby generating key input interrupt processing in the central processor of the information processing device, the raster being created for each icon printed on the surface of the medium. When an icon for which a raster has been created on the surface of the medium is to be scanned using the scanner that reads the raster and a word consisting only of a vowel is entered, the raster on the icon is read by touching the tip of the scanner to the icon for which the code value corresponding to that vowel is defined as the raster. When an icon for which a raster has been created on the surface of the medium is to be scanned and a word including a consonant and a vowel is entered, the raster corresponding to the consonant is read by touching and pausing the reader provided at the tip of the scanner on the icon for which the code value corresponding to the consonant is defined as the raster; the reader of the scanner is then moved to the icon for which the code value corresponding to the vowel following the consonant is defined as the raster on the surface of the medium and is temporarily stopped at that icon to read its raster; and the reader provided at the tip of the scanner is then separated from the surface of the medium so that it can no longer recognize the raster, whereby a single letter or character, or several words or phrases, are entered.
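
A minimal sketch of the touch/slide/lift sequence described in this aspect, assuming a hypothetical scanner driver that reports "touch", "move" and "lift" events with the code value of the icon under the reader (names and events are illustrative, not the patent's interface):

```python
from dataclasses import dataclass, field
from typing import List, Optional

VOWELS = set("aiueo")

@dataclass
class KanaInput:
    pending_consonant: Optional[str] = None
    output: List[str] = field(default_factory=list)

    def on_touch(self, icon_code: str) -> None:
        # Touching a vowel icon enters that vowel directly;
        # touching a consonant icon stores it until a vowel is reached.
        if icon_code in VOWELS:
            self.output.append(icon_code)
        else:
            self.pending_consonant = icon_code

    def on_move(self, icon_code: str) -> None:
        # Sliding from a consonant icon to a vowel icon forms a syllable.
        if self.pending_consonant and icon_code in VOWELS:
            self.output.append(self.pending_consonant + icon_code)
            self.pending_consonant = None

    def on_lift(self) -> str:
        # Lifting the scanner ends the sequence and commits the input.
        return "".join(self.output)

# Example: "ka" is entered by touching "k", sliding to "a", and lifting.
pen = KanaInput()
pen.on_touch("k")
pen.on_move("a")
print(pen.on_lift())  # -> "ka"
```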

Thus, by touching an icon for which a raster has been created and then separating from it, it is possible to implement a completely different input system for entering letters and characters, one that cannot be implemented with a hardware keyboard.

According to a sixth aspect of the present invention, there is provided an information input device comprising: a voice input device that inputs the operator's speech information; a conversion device that analyzes the entered speech information and converts it into one or more candidate words formed by letters or characters corresponding to the entered speech information; a display device that displays the one or more candidate words obtained by the conversion; a scanner that reads a raster created on the surface of the medium, in which each or one coordinate value and code value for arbitrarily moving a cursor is rasterized so that one of the candidate words displayed on the display device can be selected; and a deciding device that converts the raster read by the scanner into a code value and enters the candidate word corresponding to that code value as the decided word.

Thus, when speech information is entered into the information processing device, the voice input can be supplemented by selectively reading the raster printed on the surface of the medium using the scanner, with reference to the candidate information (that is, candidate characters or a candidate menu) displayed on the screen of the information processing device.

In accordance with a seventh aspect of the present invention, there is provided an input processing system for an information processing device, in which a raster that is created on the surface of a medium and in which each or one coordinate value and code value is rasterized is read using a scanner connected to the information processing device, whereby an operating command defined by the raster is transmitted to the central processor of the information processing device; the raster is printed on the surface of the medium, and when the raster on the surface of the medium is to be read using the scanner that reads the raster, the tilt of the scanner with respect to the surface of the medium is detected from the difference in light and shadow of the image read by the scanner, and a display operation of the graphical user interface is performed in accordance with the direction of inclination of the scanner with respect to the surface of the medium.

In accordance with an eighth aspect of the present invention, there is provided an input processing system for an information processing apparatus according to claim 7, wherein the scanner operation is recognized from a change in the difference in light and shadow of the image read by the scanner, and the graphical user interface operation on the screen is performed in accordance with the scanner operation.

In accordance with a ninth aspect of the present invention, there is provided an input processing system for an information processing device according to claim 7 or 8, wherein the operation of the graphical user interface on the screen is a mouse-type operation, such as scrolling the screen, moving the cursor, pointing at an icon on the screen, a drag-and-drop operation, selecting a menu command, or specifying an input position for a letter, sign or similar element.

In accordance with a tenth aspect of the present invention, there is provided an input processing system for an information processing apparatus, in which a raster created on the surface of a medium is read using a scanner connected to the information processing apparatus and converted into an interrupt key code of a hardware keyboard defined by the raster, thereby generating key input interrupt processing in the central processor of the information processing device, the raster being printed on the surface of the medium as an icon together with concave and convex relief (embossed) points.

In accordance with an eleventh aspect of the present invention, there is provided an input processing system for the information processing apparatus of claim 10, wherein the raster and the embossed points indicating the raster are created as a pair in a predetermined area on the surface of the medium, and a block separating and delimiting that area is provided for each area.

In accordance with a twelfth aspect of the present invention, there is provided a remote controller for viewing, listening or recording, or for accessing a website, based on program information or website information printed on the surface of a medium, the remote controller comprising: an image former that optically reads a raster created by rasterizing a predetermined code value based on a predetermined algorithm for each area of program information or website information printed on the surface of the medium; a control device that analyzes the raster from the image read and transmitted by the image former and decodes the raster into the code value denoted by the raster; and a transmission device that transmits the decoded code value to a radio receiver, tuner, recording and playback device, player or network access device, a television set-top box for receiving broadcasts and network access, or a personal computer.

In accordance with a thirteenth aspect of the present invention, there is provided a remote controller having a raster obtained by rasterizing a predetermined code value based on a predetermined algorithm and created on an icon on the surface of a medium indicating a control button for a radio receiver, tuner, recording and playback device, player or network access device, a television set-top box for receiving broadcasts and network access, or a personal computer, the remote controller comprising: an image former that optically reads the raster; a control device that analyzes the raster from the image read and transmitted by the image former and decodes the raster into the code value denoted by the raster; and a transmission device that transmits the decoded code value to a radio receiver, tuner, recording and playback device, player or network access device, a television set-top box for receiving broadcasts and network access, or a personal computer.

In accordance with a fourteenth aspect of the present invention, there is provided a remote controller according to claim 12 or 13, wherein the image former is a reader configured integrally with the remote controller.

In accordance with a fifteenth aspect of the present invention, there is provided a remote controller according to claim 12 or 13, comprising: a stand constituting the main body of the remote controller, the stand comprising the control device and the transmission device; and a scanner connected to the stand by wire or wirelessly, wherein the scanner comprises the image former communicating with the control device.

In accordance with a sixteenth aspect of the present invention, there is provided a projected image and moving image control system comprising: a projection board on which a raster obtained by rasterizing each or one predetermined coordinate value and a predetermined code value based on a predetermined algorithm is created, one surface of the projection board being formed by an image display area for projecting a moving image or image, and a controller area for displaying icon images for controlling the moving image or image projected onto the image display area; a projector that projects a moving image or image at least onto the image display area; a reader that reads the raster created in the controller area; and a control device that analyzes the raster in the icon image created in the controller area and read by the reader, converts the raster into the coordinate value or code value denoted by the raster, outputs a control signal corresponding to the coordinate value or code value to the projector, and controls the output of the moving image or image signal displayed in the image display area.

According to a seventeenth aspect of the present invention, there is provided a projection image and moving image control system according to claim 16, wherein the projection board is configured such that a transparent sheet is glued to the surface of the white board by a bonding layer, and a raster is created between the transparent sheet and the bonding layer.

In accordance with an eighteenth aspect of the present invention, there is provided an information processing and display system comprising: a projection board on which a raster obtained by rasterizing each or one predetermined coordinate value and a predetermined code value based on a predetermined algorithm is created; a projector that projects onto the projection board at least an icon image representing the start of a program, and projects an image or a moving image for displaying the program stored in a storage device and corresponding to the icon image; a reader that reads the raster created on the projected icon image; and a control device that analyzes the raster created in the icon image and read by the reader, converts the raster into the coordinate value or code value denoted by the raster, and starts the program from the storage device corresponding to the coordinate value or code value as a trigger.

According to a nineteenth aspect of the present invention, there is provided a projected image and moving image control system or an information processing and display system according to claim 16 or 18, wherein the surface of the projection board on which the raster is created is different from the surface onto which the image, moving image or icon image is projected, and the projector is positioned as a rear projector relative to the projection board.

In accordance with a twentieth aspect of the present invention, there is provided a projected image and moving image control system or an information processing and display system according to claim 19, wherein the raster on the projection board is made of a material having absorption characteristics in the infrared region of the spectrum, and at least the surface of the projection board is provided with an infrared notch filter.

In accordance with a twenty-first aspect of the present invention, there is provided a paper controller creation system for printing an icon image displayed on a display device on the surface of a sheet of paper together with a raster corresponding to the icon image, the paper controller creation system being an icon image print control system comprising: a display device that creates and displays the icon image; a control device that associates the icon image displayed on the display device with each or one coordinate value and code value determined in advance, and issues a command to print the icon image and the raster; and a printing device that, on command from the control device, prints the icon image and the raster on the surface of a given medium.

In accordance with a twenty-second aspect of the present invention, there is provided a printing method for an information processing apparatus for printing a desktop screen displayed on a display device on the surface of a sheet of paper together with rasters, comprising: a step of defining coordinate values corresponding to the desktop screen; a step of creating a raster denoting the coordinates on the screen when the desktop screen is printed; a step of creating a raster that includes, in one format, a coordinate value and a code value denoting the function of a functional image, for example an icon image, on the desktop screen; and a step of printing the desktop screen together with the rasters.

In accordance with a twenty-third aspect of the present invention, there is provided a projected image and moving image control system or an information processing and display system according to any one of claims 16-20, wherein the raster created on the projection board is defined as rasters in which the coordinate value and code value are defined, predetermined matrix blocks are formed on the board, and the same code value is assigned within the same matrix block even though the coordinate value changes.

In accordance with a twenty-fourth aspect of the present invention, there is provided a projected image and moving image control system or an information processing and display system according to claim 23, wherein the icon image is located on one or more matrix blocks, and when the raster on the icon image is read, an image control command corresponding to the icon image is issued or a program corresponding to the icon image is started.

In accordance with a twenty-fifth aspect of the present invention, there is provided an input processing system for an information processing device, a Japanese input system, an information input device, a remote controller, a projected image and moving image control system, an information processing and display system, or an icon image print control system according to any one of claims 2, 5-7, 10-13, 16, 18 and 21, wherein the raster is created from a material with absorption in the infrared region of the spectrum, and the coordinate value and code value are defined in the raster in one format.

In accordance with a twenty-sixth aspect of the present invention, there is provided an input processing system for an information processing device, in which a raster that is created on the surface of a medium and in which the coordinate value and code value are defined in one format is read using a scanner connected to the information processing device, whereby an operating command defined by the raster is transmitted to input each or one coordinate value and code value into the central processor of the information processing device. The raster is printed on the surface of the medium; the raster on the surface of the medium is a set of predetermined points obtained by superimposing the raster on a controller or keyboard template, with raster points located at grid points at predetermined intervals in the horizontal and vertical directions, and with information points whose values are defined by how they are offset from the virtual grid point at the centre of the four surrounding raster points at grid points. The raster contains several information areas in which rasters containing X coordinate values, Y coordinate values and code values in a single raster format are printed; the raster on the surface of the medium is read using the scanner that reads the raster, and each or one coordinate value and code value corresponding to the raster is input to the central processor of the information processing device.

EFFECT OF THE INVENTION

The present invention makes it possible to create new input systems - a paper icon, a paper controller, a paper keyboard and a mouse pad - which can input letters, characters and the like into a computer and perform operations on the computer with simple manipulations, in place of hardware such as a keyboard, mouse and graphics tablet.

BRIEF DESCRIPTION OF THE DRAWINGS

1 is a block diagram illustrating a system configuration of a scanner and a computer.

Figure 2 is an exemplary diagram illustrating a raster in accordance with GRID 1;

Figure 3 is an enlarged diagram illustrating an example of information points on a raster in accordance with GRID 1.

FIG. 4 is an explanatory diagram showing an arrangement of information points in accordance with GRID 1.

Figure 5 shows another example in which information points and data defined by information points are displayed as bits in accordance with GRID 1.

Figure 6 shows an example in which information points and data defined by information points are displayed as bits in accordance with GRID 1; Fig. 6 (a) shows that two points are located, Fig. 6 (b) shows that four points are located, and Fig. 6 (c) shows that five points are located.

7 illustrates modifications of a raster in accordance with GRID 1; Fig. 7 (a) is a schematic diagram of the arrangement of six information points, Fig. 7 (b) is a schematic diagram of the arrangement of nine information points, Fig. 7 (c) is a schematic diagram of the arrangement of 12 information points, and Fig. 7 ( d) is a schematic representation of the location of 36 information points.

FIG. 8 is an explanatory diagram illustrating a point arrangement in accordance with GRID 1.

9 is an explanatory diagram explaining a format of rasters in accordance with one embodiment of the present invention.

10 is an explanatory diagram illustrating rasters in accordance with GRID 2.

11 is a diagram illustrating the relationship between points and grid lines in accordance with GRID 2.

12 is a diagram illustrating the order in which the information point is offset from the grid point in accordance with GRID 2.

13 is a raster diagram for explaining the collection of information using a difference in accordance with GRID 2.

14 is a diagram illustrating a correspondence between information bits, a data protection table, and true values in accordance with GRID 2.

15 is an explanatory diagram illustrating a usage state of a paper keyboard in accordance with one embodiment of the present invention.

Fig. 16 shows the front sides of the pages of the paper keyboard.

Fig. 17 shows the front sides of the pages of the paper keyboard.

FIG. 18 is an explanatory diagram illustrating a paper controller in accordance with one embodiment of the present invention.

Fig. 19 illustrates yet another embodiment of a paper controller for registering URLs (Uniform Resource Locators) on the Internet as bookmarks.

20 is an explanatory diagram illustrating a usage state of a paper controller.

21 is an explanatory diagram illustrating a usage state of a paper controller.

22 is a diagram illustrating a screen displayed on a monitor if the operation is performed using a paper controller.

23 is a diagram illustrating a screen displayed on a monitor if the operation is performed using a paper controller.

24 is a diagram for explaining a table used in an embodiment; Fig. 24 (a) shows a table of local indexes provided in a personal computer, and Fig. 24 (b) shows a table of a management server provided in a management server.

25 is a diagram for explaining yet another embodiment of a paper controller and illustrating a paper controller comprising guide blocks.

FIG. 26 is a diagram explaining yet another embodiment of a paper bookmark controller.

FIG. 27 is a sectional view of the paper controller shown in FIG. 25.

28 is an explanatory diagram illustrating a state in which the paper controller is separated from the guide blocks.

29 is a diagram illustrating another embodiment of a paper controller on which protrusions of embossed points are provided as well as rasters.

30 is a diagram illustrating another embodiment of a paper controller on which protrusions of embossed points are provided, as well as rasters.

Fig. 31 is a diagram explaining operations on the paper keyboard performed with the scanner; Fig. 31(a) is a diagram explaining the striking operation on the grid, Fig. 31(b) is a diagram explaining the tapping operation on the grid, and Fig. 31(c) is a diagram explaining the moving operation on the grid.

Fig. 32 is a diagram illustrating operations on the paper keyboard performed with the scanner; Fig. 32(a) is a diagram explaining the operation of grinding (rubbing) the grid to the right, and Fig. 32(b) is a diagram explaining the operation of grinding (rubbing) the grid to the left.

33 is an explanatory diagram illustrating a state of use of a mouse pad in accordance with one embodiment of the present invention.

Fig. 34 is a diagram illustrating a mouse pad; each of figures 34 (a) and 34 (b) shows a round mouse mat, and each of figures 34 (c) and 34 (d) shows a rectangular mouse mat.

Fig. 35 is a diagram illustrating one specific example of a mouse pad.

Fig. 36 is a diagram explaining a scroll operation of a web page according to a browser program by a scanner operation using a mouse pad.

Fig. 37 is a diagram explaining a scroll operation of a web page in accordance with a browser program by a scanner operation using a mouse pad.

Fig. 38 is a diagram explaining yet another embodiment of a mouse pad and showing a mouse pad having annular grooves.

Fig. 39 is a diagram illustrating another embodiment of a mouse pad and showing a mouse pad having radial grooves.

Fig. 40 is a diagram illustrating yet another embodiment of a paper keyboard and explaining a paper keyboard on which input is performed by touching and tearing-off (releasing) operations.

Fig. 41 is a diagram explaining a method of inputting a letter or character if the letter or character is entered using the paper keyboard shown in Fig. 40.

Fig. 42 is a diagram illustrating a specific example of inputting a letter or character using the paper keyboard shown in Fig. 40.

Fig. 43 is a diagram explaining the use as a help device for inputting a letter or character by speech recognition.

Fig. 44 is a diagram illustrating a scanner integrated with an infrared remote controller.

Fig. 45 is a diagram for explaining an operation of transmitting an infrared signal to a television using a remote controller so that the scanner can be placed on a stand.

Fig. 46 is a diagram illustrating an operation of transmitting an infrared signal to a set-top box to a television using a remote controller so that the scanner can be placed on a stand.

Fig. 47 is a diagram for explaining a paper controller having a remote controller function for a television and a set-top box control function.

Fig. 48 is a diagram for explaining a paper controller having a remote controller function for a television and a set-top box control function.

Fig. 49 is a diagram explaining the functions and operations of the paper controllers shown in Figures 47 and 48.

Fig. 50 is a diagram explaining the functions and operations of the paper controllers shown in Figures 47 and 48.

Fig. 51 is a diagram explaining the functions and operations of the paper controllers shown in Figures 47 and 48.

Fig. 52 is a diagram explaining the functions and operations of the paper controllers shown in Figs. 47 and 48.

Fig. 53 is a diagram explaining the functions and operations of the paper controllers shown in Figs. 47 and 48.

Fig. 54 is a diagram explaining the functions and operations of the paper controllers shown in Figures 47 and 48.

Fig. 55 is a diagram explaining the functions and operations of the paper controllers shown in Figures 47 and 48.

Fig. 56 is a diagram explaining the functions and operations of the paper controllers shown in Figures 47 and 48.

Fig. 57 is a diagram for explaining a paper controller for providing various services in a hotel.

Fig. 58 is a diagram for explaining a paper controller for controlling a music or video reproducing apparatus.

Fig. 59 is a diagram for explaining a paper controller for controlling a music or video reproducing apparatus.

Fig. 60 is a diagram explaining a method of using the paper controllers shown in Figs. 58 and 59.

Fig. 61 is a diagram illustrating a specific example displayed on the paper if paper controllers shown in Figs. 58 and 59 are used.

Fig. 62 is a diagram explaining the functions and operations of the paper controllers shown in Figs. 58 and 59.

Fig. 63 is a diagram explaining the functions and operations of the paper controllers shown in Figs. 58 and 59.

Fig. 64 is a diagram explaining the functions and operations of the paper controllers shown in Figs. 58 and 59.

Fig. 65 is a diagram explaining the functions and operations of the paper controllers shown in Figs. 58 and 59.

Fig.66 is a diagram explaining the functions and operations of the paper controllers shown in figures 58 and 59.

Fig. 67 is a diagram explaining the functions and operations of the paper controllers shown in Figures 58 and 59.

68 is a diagram for explaining a white board on which rasters are printed, and illustrating a state in which the controller and images are displayed by the projector.

69 is a longitudinal side view showing an expanded sectional composition of a white board.

70 is a diagram for explaining a white board on which rasters are printed, and illustrating a state in which icons are displayed using a projector.

71 is a diagram for explaining an acrylic board on which rasters are printed, and illustrating a state in which a thumbnail initial screen is displayed using a back projector.

72 is a longitudinal side view showing an expanded sectional composition of an acrylic board.

Fig. 73 is a diagram explaining the function for creating a paper keyboard by the user; Fig. 73(a) shows images displayed on a display device, and Fig. 73(b) shows a state in which the images are printed onto a sheet.

Fig. 74 is a diagram for explaining a GAM (graphical access method), which is one embodiment of the present invention.

FIG. 75 is a diagram for explaining a GAM, which is one embodiment of the present invention.

Fig. 76 is a diagram illustrating an order system for a restaurant menu, which is one embodiment of the present invention.

Fig.77 is a diagram illustrating an ordering system for a restaurant menu, which is one embodiment of the present invention.

78 is a diagram explaining a direction of a camera included in a scanner and a tilt of a scanner.

Fig. 79 is a diagram (1) for explaining a method of measuring a direction and an angle of inclination when performing interrupt processing of a key input or a graphical user interface operation by tilting a scanner.

80 is a diagram (2) for explaining a method of measuring a direction and an angle of inclination when performing interrupt processing of a key input or a graphical user interface operation by tilting a scanner.

Fig. 81 is a diagram for explaining a method for measuring a tilt direction when performing interrupt processing of a key input or operation of a graphical user interface by tilting a scanner.

82 is a diagram for explaining a method for measuring a tilt direction using a Fourier function when performing interrupt processing of a key input or a graphical user interface operation by tilting a scanner.

83 is a diagram for explaining a method of measuring a tilt direction using an nth degree equation when performing key input interrupt processing or a graphical user interface operation by tilting a scanner.

84 is a diagram for explaining a paper keyboard in which XY coordinate values are used as a mouse pad.

85 is a diagram for explaining a white board on which matrix blocks are created.

Fig. 86 is a diagram for explaining a raster format used on the white board shown in Fig. 85.

Fig. 87 is a diagram illustrating a correspondence table between code values and commands for the rasters used on the white board shown in Fig. 85.

88 is a diagram for explaining a print function of a desktop screen on a display and creating a paper keyboard.

89 is a diagram illustrating a correspondence table between code values and a startup program generated by printing a desktop screen on a display and creating a paper keyboard.

90 is a diagram for explaining a raster format generated by printing a desktop screen on a display and creating a paper keyboard.

EXPLANATIONS OF LETTERS OR NUMBERS

1 Raster
2 Key point
3 Information point
4 Raster point at grid point
5 Virtual grid point
CPU Central processing unit
MM Main storage device
USB I/F USB interface
HD Hard disk device
DISP Display device
KBD Keyboard
NW I/F Network interface
SCN Scanner

BEST MODE (S) FOR CARRYING OUT THE INVENTION

1 is a hardware block diagram illustrating a configuration of a personal computer and a scanner.

As shown in Fig. 1, the personal computer contains a central processing unit (CPU) as its main component, as well as a main storage device (MM), a hard disk device (HD), a display device (DISP) serving as an output means, and a keyboard (KBD) serving as an input means, all connected to the bus (BUS).

A scanner (SCN) serving as an imaging means is connected to the CPU via a USB interface (USB I/F).

Although the internal embodiment of this scanner (SCN) is not shown in the figures, the scanner (SCN) contains an infrared radiation device (light emitting diode), a filter that cuts off components with a given wavelength of reflected infrared light, and an imager (charge-coupled device (CCD) or a complementary metal oxide semiconductor (CMOS)) forming an image of reflected light. The scanner (SCN) is designed to form an image of light reflected from a paper keyboard or paper controller, and to process rasters printed on the surface of the medium as image data.

Right and left click buttons are provided on the side of the scanner (SCN) and can function as the right and left mouse buttons, respectively. Although Fig. 1 shows the right click button in the upper position and the left click button in the lower position, the button layout is not limited to the arrangement shown in Fig. 1.

Although not shown, not only a display device (DISP) but also a printer, a speaker and the like can be connected as output devices.

The bus (BUS) is connected via a network interface (NW I/F) to a public network (NW), such as the Internet, through which electronic raster image data, information in the form of letters or characters, information in the form of images, voice information, information in the form of moving images, programs and the like can be obtained.

Data such as application programs (for example, the raster analysis program used in one embodiment of the present invention), electronic raster image data, information in the form of letters or characters, information in the form of images, speech information, information in the form of moving images and various tables, as well as the operating system (OS), are recorded on the hard disk (HD).

When the central processing unit (CPU) receives, via the USB interface, an input signal obtained by the scanner (SCN) reading the raster image on the surface of the medium and converting it into a code value or coordinate value, the CPU reads the electronic raster image data, information in the form of letters or characters, information in the form of images, voice information, information in the form of moving images, programs and the like corresponding to the input signal from the hard disk (HD) and outputs the read data to an output device such as the display device (DISP) or a loudspeaker (not shown).

The code values or coordinate values read by the scanner (SCN) are described in more detail below.

Although not shown in detail, the scanner (SCN) contains an infrared irradiation means (infrared LED), an IR filter and an optical imaging device such as a CMOS sensor or CCD sensor, and has the function of forming an image of the light reflected from the surface of the medium. Rasters on the surface of the medium are printed with carbon-based black ink, and images and letters or signs other than the rasters are printed with non-carbon ink.

Since this carbon-based black ink absorbs infrared light, only the raster points appear as black in the image read by the optical imager.
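
As a hedged illustration of this property, the following sketch (not taken from the patent; the threshold value and the use of SciPy are assumptions) extracts candidate dot positions from an infrared frame in which only the carbon-ink raster points appear dark:

```python
import numpy as np
from scipy import ndimage  # assumed to be available for labelling

def extract_dot_centres(ir_frame: np.ndarray, threshold: int = 80):
    """Return (y, x) centres of dark raster dots in an 8-bit IR frame.

    Only the carbon-ink dots absorb infrared light, so a simple global
    threshold separates them from non-carbon printed content, which
    reflects IR and appears bright.
    """
    dark = ir_frame < threshold                     # dots are dark regions
    labels, n = ndimage.label(dark)                 # group pixels into dots
    centres = ndimage.center_of_mass(dark, labels, range(1, n + 1))
    return [(float(y), float(x)) for y, x in centres]
```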

Thus, since only the rasters are printed with carbon-based black ink, the rasters can be superimposed on conventional printing without visually affecting the images and letters or signs printed with non-carbon ink.

Although carbon-based black ink is mentioned as an ink having infrared absorption characteristics, the type of ink used to print rasters is not limited to carbon black; any other ink that responds to a specific wavelength can be used.

The image of the read rasters is analyzed by the central processing unit (CPU) in the scanner, converted into coordinate values or code values, and transmitted to the personal computer via the USB cable and the USB interface (USB I/F).

The central processing unit (CPU) of the personal computer looks up the received coordinate values or code values in a table and outputs the electronic raster image data, information in the form of letters or characters, information in the form of images, voice information or information in the form of moving images corresponding to these values from the display device (DISP) or a speaker (not shown).
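
A minimal sketch of this table lookup, assuming the table is keyed by the decoded code value and maps to a content type and location (the structure and helper functions are illustrative, not the patent's data format):

```python
from dataclasses import dataclass

@dataclass
class ContentEntry:
    kind: str   # "voice", "image", "movie", "text" or "program"
    path: str   # location of the stored content or program

# Illustrative table keyed by the code value decoded from the raster.
CONTENT_TABLE = {
    0x0101: ContentEntry("voice", "sounds/greeting.wav"),
    0x0102: ContentEntry("image", "pictures/map.png"),
    0x0203: ContentEntry("program", "apps/browser"),
}

def launch_program(path: str) -> None:
    print(f"launching {path}")           # stub: would start the program

def output_content(kind: str, path: str) -> None:
    print(f"output {kind} from {path}")  # stub: would play or display it

def dispatch(code_value: int) -> None:
    entry = CONTENT_TABLE.get(code_value)
    if entry is None:
        return                           # unknown code: ignore the input
    if entry.kind == "program":
        launch_program(entry.path)
    else:
        output_content(entry.kind, entry.path)

dispatch(0x0102)  # -> "output image from pictures/map.png"
```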

<Description of GRID 1 rasters>

The rasters used in the present invention are further described with reference to FIGS. 2-7.

2 is an exemplary diagram illustrating GRID 1, that is, one example of rasters in accordance with the present invention.

In these figures, the horizontal and vertical grid lines are given for convenience of description and do not appear on the actual printed surface. Preferably, the key points 2, information points 3, raster points 4 at grid points and the like, which constitute raster 1, are printed with carbon-based black ink having infrared absorption characteristics if the scanner used as the imaging means has an infrared irradiation means.

Fig. 3 is an enlarged view illustrating an example of information points on rasters and the information bits defined by the information points. Figs. 4(a) and 4(b) are exemplary views illustrating information points arranged around key points at the centre.

The method of inputting and outputting information in accordance with the present invention includes the steps of creating rasters 1, recognizing rasters 1, and outputting information and programs from the rasters 1. Namely, raster 1 is captured by a camera as image data; the raster points 4 at grid points are extracted first, then the key points 2 are extracted based on the fact that no raster point is found at the position where a raster point 4 at a grid point would initially be present, and then the information points 3 are extracted. Raster 1 is thereby digitized, the information area is extracted, the information is converted into numerical values, and on the basis of this numerical information the information and program associated with raster 1 are output. For example, information such as voice information, or a program, is output from raster 1 to an information output device, personal computer, personal digital assistant, portable telephone or the like.

In order to create each raster 1 in accordance with the present invention, very small dots for recognizing information such as voice information - that is, key points 2, information points 3 and raster points 4 at grid points - are arranged according to a predetermined rule based on the dot-code generation algorithm. As shown in Fig. 2, in a block of raster 1 representing information, 5 × 5 raster points 4 at grid points are arranged with reference to a key point 2, and information points 3 are arranged around central virtual grid points 5, each of which is surrounded by four raster points 4 at grid points. Each block defines arbitrary information in the form of numerical values. In the example of Fig. 2, four blocks of raster 1 (shown by thick lines) are arranged in parallel. Needless to say, the number of blocks of raster 1 is not limited to four.

One item of information and one program corresponding to one block may be output, or one item of information and one program corresponding to several blocks may be output.

The raster points 4 at grid points make it possible, when the camera captures raster 1 as image data, to correct distortion of the camera lens, oblique imaging, expansion or contraction of the sheet, curvature of the medium surface and printing distortion. In particular, a correction function (Xn, Yn) = f(Xn', Yn') is calculated that converts the four distorted raster points 4 at grid points back to the original square; the information points 3 are corrected by the same function, and the correct vectors of the information points 3 are obtained.
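
The patent does not fix a particular correction method; one common way to realize such a function f is a projective (homography) transform estimated from the four corner raster points, sketched below under that assumption:

```python
import numpy as np

def homography_from_square(distorted, side=1.0):
    """Estimate a 3x3 homography mapping four distorted corner points
    (given in the same order as the target corners) back to the corners
    of an axis-aligned square with the given side length."""
    target = [(0, 0), (side, 0), (side, side), (0, side)]
    rows = []
    for (xd, yd), (x, y) in zip(distorted, target):
        rows.append([xd, yd, 1, 0, 0, 0, -x * xd, -x * yd, -x])
        rows.append([0, 0, 0, xd, yd, 1, -y * xd, -y * yd, -y])
    # The smallest right singular vector gives the homography coefficients.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def correct_point(h, point):
    """Apply the correction (Xn, Yn) = f(Xn', Yn') to one information point."""
    x, y, w = h @ np.array([point[0], point[1], 1.0])
    return (x / w, y / w)
```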

If raster points 4 are located at grid points on raster 1, the distortion introduced by the camera is corrected in the image data obtained by capturing raster 1 with the camera. Moreover, even if the image data of raster 1 is captured by an ordinary camera with a high-distortion lens, raster 1 can be accurately recognized. In addition, even if raster 1 is read by a camera tilted relative to the surface of raster 1, raster 1 can be accurately recognized.

As shown in Fig. 2, key points 2 are obtained by offsetting, in a certain direction, the raster points 4 at the grid points present at the four corners of each block. Key points 2 are representative points of raster 1 for one block of information points 3. For example, the key points 2 are obtained by offsetting the raster points 4 at the grid points present at the four corners of each block of raster 1 upwards by 0.1 mm. If the information points 3 represent X and Y coordinate values, the coordinate positions are the points obtained by shifting the key points 2 downwards by 0.1 mm. However, this numerical value is not limited to 0.1 mm and may vary depending on the size of each block of raster 1.

Information points 3 are points for recognizing various items of information. They are arranged around a key point 2, which serves as a representative point, and each information point 3 is the end point of a vector whose start point is a virtual grid point 5, that is, the centre surrounded by four raster points 4 at grid points. For example, each information point 3 is surrounded by raster points 4 at grid points and, as shown in Fig. 3(a), is spaced 0.1 mm from the virtual grid point 5, its direction and length being expressed as a vector. The points are placed in eight directions obtained by rotating clockwise in steps of 45 degrees, so each point expresses three bits. Accordingly, one block of raster 1 can express 3 bits × 16 = 48 bits.

Fig. 3(b) illustrates a method of defining an information point 3 having two bits per grid for the raster shown in Fig. 2, that is, defining two-bit information by offsetting the information point 3 in the + direction or the × direction. In this case, one block could initially define 48-bit information; however, the data can be allocated in 32-bit units by dividing the block according to purpose. By combining the + direction and the × direction, a maximum of 2^16 (approximately 65,000) raster formats can be implemented.
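
A minimal sketch of the eight-direction, 3-bit encoding described above (45-degree steps and a 0.1 mm offset radius, as in the example; the function names are illustrative):

```python
import math

OFFSET = 0.1  # information-point offset from the virtual grid point, in mm

def encode_3bit(value: int, centre=(0.0, 0.0)):
    """Place an information point for a 3-bit value (0-7) by choosing one
    of eight directions in 45-degree steps around the virtual grid point."""
    angle = math.radians(45 * value)
    return (centre[0] + OFFSET * math.cos(angle),
            centre[1] + OFFSET * math.sin(angle))

def decode_3bit(point, centre=(0.0, 0.0)) -> int:
    """Recover the 3-bit value from the direction of the offset vector."""
    angle = math.degrees(math.atan2(point[1] - centre[1],
                                    point[0] - centre[0])) % 360
    return round(angle / 45) % 8

# One block of 16 information points then carries 16 * 3 = 48 bits.
assert all(decode_3bit(encode_3bit(v)) == v for v in range(8))
```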

In the example shown, one information point 3 expresses three bits by arranging the information point 3 in each of the eight directions.

However, the present invention is not limited to this example, and each information point 3 can express four bits by arranging the information point 3 in each of 16 directions. Needless to say, the layout can be changed in different ways.

The diameter of each of the key points 2, information points 3 and raster points 4 at grid points is preferably approximately 0.05 mm, in view of aesthetic appearance, printing accuracy with respect to paper quality, camera resolution and optimal digitization.

In addition, the distance between the raster points 4 at grid points is preferably approximately 0.5 mm in the vertical and horizontal directions, in view of the amount of information required in the imaging area and the possibility of erroneous recognition of the various points 2, 3 and 4. The offset of the key point 2 is preferably approximately 20% of the grid interval, in view of possible erroneous recognition with respect to the raster points 4 at grid points and the information points 3.

The distance between an information point 3 and the virtual grid point 5 surrounded by four raster points 4 at grid points is preferably approximately 15-30% of the interval between adjacent virtual grid points 5. If the distance is smaller than this range, the points tend to be recognized as a large clump, which worsens the aesthetic appearance of raster 1. Conversely, if the distance is larger than this range, it becomes difficult to determine which of the neighbouring virtual grid points 5 is the centre of the vector to which the information point 3 belongs.

For example, as shown in Fig. 4(a), the information points I1-I16 are arranged clockwise starting from the centre of the block, the interval between adjacent grids is 0.5 mm, and 3 bits × 16 = 48 bits are expressed within an area of 2 mm × 2 mm.

In each block, sub-blocks having independent information content that is not affected by the other information content can additionally be provided. These sub-blocks are shown in Fig. 4(b). The sub-blocks [I1, I2, I3, I4], [I5, I6, I7, I8], [I9, I10, I11, I12] and [I13, I14, I15, I16], each formed by four information points 3, carry independent data (3 bits × 4 = 12 bits) in their information points 3. By dividing a block into sub-blocks in this way, error checking can easily be performed for each sub-block.
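
A hedged sketch of per-sub-block error checking: the patent does not specify the check, so a simple even-parity bit over each 12-bit sub-block is assumed here purely for illustration:

```python
from typing import List

def split_subblocks(bits48: List[int]) -> List[List[int]]:
    """Split the 48 bits of one block (16 points x 3 bits) into the four
    12-bit sub-blocks [I1-I4], [I5-I8], [I9-I12] and [I13-I16]."""
    return [bits48[i:i + 12] for i in range(0, 48, 12)]

def parity_ok(subblock: List[int]) -> bool:
    """Assumed check: the last bit of each sub-block is an even-parity
    bit over the preceding eleven bits."""
    return sum(subblock[:-1]) % 2 == subblock[-1]

def check_block(bits48: List[int]) -> List[bool]:
    # One result per sub-block, so a damaged area only affects its own sub-block.
    return [parity_ok(sb) for sb in split_subblocks(bits48)]
```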

Preferably, the directions of the vectors (directions of rotation) of the information points 3 are set at equal intervals of 30-90 degrees.

Fig. 5 shows another example in which information points 3 and the data defined by the information points 3 are expressed as bits.

In addition, if two types of vectors - long and short vectors from the virtual grid point 5 surrounded by raster points 4 at grid points - are prepared for each information point 3, and eight directions are provided for each of the two vector lengths, an information point 3 can express four bits. Preferably, the long vector has a length of about 25-30% of the distance between adjacent virtual grid points 5, and the short vector has a length of about 15-20% of that distance. However, the distance between the centres of the long-vector and short-vector positions of the information points 3 is preferably greater than the diameter of these points.

For aesthetic appearance, the number of information points 3 surrounded by four raster points 4 at grid points is preferably one. However, if the amount of information is to be increased at the expense of aesthetic appearance, one bit can be assigned to each vector direction and an information point 3 can be expressed by several points, in which case more information can be expressed. For example, with vectors in eight directions on a concentric ring, an information point 3 surrounded by four raster points 4 at grid points can express 2^8 items of information, and 16 information points in one block can express 2^128 items of information.

Figure 6 shows an example in which information points and data defined by information points are displayed as bits. Fig. 6 (a) shows that two points are located, Fig. 6 (b) shows that four points are located, and Fig. 6 (c) shows that five points are located.

7 illustrates modifications to a raster. Fig. 7 (a) is a schematic diagram of the arrangement of six information points, Fig. 7 (b) is a schematic diagram of the arrangement of nine information points, Fig. 7 (c) is a schematic diagram of the arrangement of 12 information points, and Fig. 7 ( d) is a schematic representation of the location of 36 information points.

Raster 1, shown in figures 2 and 4, illustrates an example of the location of 16 (4 × 4) information points 3 in one block. However, the number of information points 3 located in one block is not limited to 16, but can vary in different ways. For example, depending on the amount of necessary information or the resolution of the camera, six (2 × 3) information points 3 are located in one block, as shown in Fig. 7 (a), nine (3 × 3) information points 3 are located in one block as shown in Fig. 7 (b), 12 (3 × 4) data points 3 are located in one block, as shown in Fig. 7 (c), and 36 information points 3 are located in one block, as shown in Fig. 7 (d).

8-9 are exemplary views illustrating the relationship between rasters, code values, and identifiers.

Each raster is formed by 4 × 4 block regions, and each of the blocks is divided into regions C1-0 to C31-30. Fig. 9 shows the dot code format of the respective regions.

Fig. 9(a) shows an example in which the raster is formed only by code values: the code values corresponding to the dots of the raster in the regions shown in Fig. 8 are registered in regions C0-C27, and a parity check is registered in regions C28-C30.

Fig. 9(b) shows an example in which X and Y coordinates are recorded as well as code values. Namely, in Fig. 8, the X coordinates, Y coordinates and code values are recorded in regions C0-C7, C8-C15 and C16-C27, respectively.

Thus, in this embodiment, the coordinates X and Y, as well as code values, can be recorded in the raster.

In addition, FIG. 9 (c) shows a format in which coordinate indices are recorded, as well as X and Y coordinates. A coordinate index is an area where a page number or similar designation of a sheet of paper serving as a medium is recorded, and where the identifier or the page number for identifying the medium itself, for which the X and Y coordinates are registered, can be registered as a raster.

As can be seen, flexible formats can be used for rasters in accordance with the present invention, such as a format in which only code values are registered, a format in which code values and X and Y coordinates are registered, or a format in which X and Y coordinates and coordinate indices are registered.
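
A hedged sketch of packing one 32-bit block in the style of Fig. 9(b): the X, Y and code-value field widths follow the regions stated above, while the use of C28-C30 as parity and C31 as a spare bit is an assumption of this sketch:

```python
def pack_block(x: int, y: int, code: int) -> int:
    """Pack one 32-bit information block in the style of Fig. 9(b):
    C0-C7 = X coordinate, C8-C15 = Y coordinate, C16-C27 = code value.
    Using C28-C30 as parity (as in Fig. 9(a)) and C31 as spare is an
    assumption for this sketch."""
    assert 0 <= x < 256 and 0 <= y < 256 and 0 <= code < 4096
    word = x | (y << 8) | (code << 16)
    parity = bin(word).count("1") % 8        # assumed 3-bit parity field
    return word | (parity << 28)

def unpack_block(word: int):
    x = word & 0xFF
    y = (word >> 8) & 0xFF
    code = (word >> 16) & 0xFFF
    return x, y, code

assert unpack_block(pack_block(12, 34, 567)) == (12, 34, 567)
```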

<Description of GRID 2 rasters>

Next, with reference to figures 10-14, the basic principle of rasters is described in accordance with GRID 2. GRID 2 is an algorithm for arranging points using the difference method.

As shown in Fig. 10, coordinate grid lines (y1-y7 and x1-x5) are constructed virtually at predetermined intervals in the X and Y directions. The intersections of the coordinate grid lines are called grid points. In this embodiment, there are four blocks (four grids) in each of the X and Y directions, each being a minimum block (one grid) surrounded by four grid points; thus 4 × 4 = 16 blocks (16 grids) are arranged in the X and Y directions, and one information block is formed by 16 blocks (16 grids). The definition of an information block as a unit of 16 blocks is only an example, and it goes without saying that one information block can be formed by an arbitrary number of blocks.

The four corner points that make up the rectangular area of this information block are taken as corner raster points (x1y1, x1y5, x5y1 and x5y5) (the circled points in Fig. 10). The four corner raster points are made to coincide with the corresponding grid points.

By identifying the four corner raster points coinciding with the corresponding grid points, an information block can be recognized. However, even if the information block can be recognized, its orientation cannot be determined from the corner raster points alone. If the orientation of the information block cannot be recognized, the following problem arises: if the same information block is rotated by 90, -90 or 180 degrees and then scanned, the information read from it is completely different, even though it is the same information block.

Accordingly, vector points (key points) are arranged at grid points inside or near the rectangular region of the information block. In Fig. 10, the point (x0y3) surrounded by a triangle is a key point (vector point); this key point is located at the first grid point on the vertical through the midpoint of the coordinate grid line forming the upper side of the information block. In the same way, the lower key point of the information block is located at the first grid point (x4y3) on the vertical through the midpoint of the coordinate grid line forming the lower side of the information block.

In this embodiment, the grid distance is set to 0.25 mm. Thus, one side of the information block is 0.25 mm × 4 grids = 1 mm, and the area of the information block is 1 mm × 1 mm = 1 mm². Within this area, 14 bits of information can be stored; if two of the 14 bits are used for control data, 12 bits of information can be stored. Setting the grid distance to 0.25 mm is just one example; it can be freely changed, for example within 0.25-0.5 mm or more.

In GRID 2, the information points are alternately offset from the grid points in the X or Y direction. The diameter of each information point is preferably about 0.03-0.05 mm or more, and the offset of each information point from its grid point is preferably set to about 15-25% of the grid distance. This offset value also serves only as an example and need not lie within these limits; generally speaking, however, if the offset exceeds 25% of the grid distance, the raster tends to become visible as a distinct pattern.

Namely, since rows in which the information points are offset vertically from the grid points (in the Y direction) alternate with rows in which they are offset horizontally (in the X direction), the distribution of the points is evened out, and the points do not produce moire or stand out as a visible pattern. Therefore, the printed sheet surface retains an aesthetically pleasing appearance.

With this arrangement principle, information points are always located alternately on the grid lines in the Y direction (see Fig. 11). It follows that, for reading rasters, it is sufficient to identify the coordinate grid lines alternately located in the Y direction or X direction, which simplifies and speeds up the recognition algorithm in the information processing device.

In addition, if the rasters are deformed by curvature of the surface of a sheet of paper or similar medium, the coordinate grid lines are often not exactly straight. In this case, however, the grid lines curve smoothly and remain close to straight lines, and it is therefore relatively easy to identify them. In this respect, the algorithm is robust against deformation of the sheet surface and against deflection and distortion of the reading optical system.

Fig. 12 explains what an information point means. In Fig. 12, the symbol + denotes a grid point, and the symbol • denotes a point (information point). An information point located in the -Y direction relative to its grid point is assumed to mean 0, and an information point located in the +Y direction to mean 1.

With reference to FIG. 13, the specific state of the location of the information points and the reading algorithm are described below.

In Fig. 13, the information point indicated by the circled number 1 (hereinafter "information point (1)") is offset in the positive direction relative to the grid point (x2y1) and thus means "1". The information point (2) (indicated by the circled number 2 in Fig. 13) is offset in the +Y direction relative to the grid point (x3y1) and thus means "1". The information point (3) (indicated by the circled number 3 in Fig. 13) is offset in the -X direction relative to the grid point (x4y1) and thus means "0". The information point (4) (indicated by the circled number 4 in Fig. 13) means "0", and the information point (5) (indicated by the circled number 5 in Fig. 13) means "0".

In the case of the raster shown in FIG. 13, the information points (1) to (17) have the following meanings.

(1) = 1

(2) = 1

(3) = 0

(4) = 0

(5) = 0

(6) = 1

(7) = 0

(8) = 1

(9) = 0

(10) = 1

(11) = 1

(12) = 0

(13) = 0

(14) = 0

(15) = 0

(16) = 1

(17) = 1

In this embodiment, the values of the information bits are calculated using an information collection algorithm based on the difference method, which will be described below. Alternatively, the information points may be output as information bits without further processing. Alternatively, the true value of each information bit can be calculated using the data protection table, which will be described later.

With reference to FIG. 13, a method for collecting information using the raster difference method in accordance with this embodiment will then be described.

In the description of this embodiment, the number in parentheses means the number circled (enclosed in a circle) in FIG. 13, and the number in square brackets is the number in the square in FIG. 13.

In this embodiment, the values of the 14 bits in an information block are expressed by the differences between the respective neighboring information points. For example, the first bit is obtained by calculating the difference between information point (1) and information point (5), which is located +1 grid from information point (1) in the X direction; that is, [1] = (5) - (1). In this case, information point (5) means "0" and information point (1) means "1", so the first bit [1] means 0 - 1, that is, "1". Similarly, the second bit [2] is expressed as [2] = (6) - (2), and the third bit [3] is expressed as [3] = (7) - (3).

In the following difference formulas, the value is taken as absolute.

[1] = (5) - (1) = 0-1 = 1

[2] = (6) - (2) = 1-1 = 0

[3] = (7) - (3) = 0-0 = 0

Then the fourth bit [4] is determined by calculating the difference between the information point (8), located directly below the vector point, and the information point (5). Accordingly, the fourth bit [4] through the sixth bit [6] are determined by calculating the differences between information points spaced one grid apart in the +X direction and one grid apart in the +Y direction, respectively.

In this case, the fourth bit [4] through the sixth bit [6] can be calculated according to the following formulas:

[4] = (8) - (5) = 1-0 = 1

[5] = (9) - (6) = 0-1 = 1

[6] = (10) - (7) = 1-0 = 1

In addition, the seventh bit [7] through the ninth bit [9] are determined by calculating the differences between information points spaced one grid apart in the +X direction and one grid apart in the -Y direction, respectively.

In this case, the seventh bit [7] through the ninth bit [9] can be calculated according to the following formulas:

[7] = (12) - (8) = 0-1 = 1

[8] = (13) - (9) = 0-0 = 0

[9] = (14) - (10) = 0-1 = 1

The tenth bit [10] through the twelfth bit [12] are determined by calculating the differences between information points spaced one grid apart in the +X direction, and are expressed respectively by the following formulas.

[10] = (15) - (12) = 0-0 = 0

[11] = (16) - (13) = 1-0 = 1

[12] = (17) - (14) = 1-0 = 1

Finally, the thirteenth bit [13] and the fourteenth bit [14] are determined by calculating the differences between information point (8) and the information points located +1 grid and -1 grid in the X direction from information point (8), and are expressed respectively by the following formulas.

[13] = (8) - (4) = 1-0 = 1

[14] = (11) - (8) = 1-1 = 0
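The difference computation just described can be sketched in a few lines of Python; the point values are those read from Fig. 13, and for 0/1 values the absolute difference used below is equivalent to subtraction modulo 2.

# A minimal sketch of the difference method of Fig. 13.  p[k] holds the
# value (0 or 1) of information point (k); the pairs reproduce the formulas
# [1] = (5)-(1) ... [14] = (11)-(8).

p = {1: 1, 2: 1, 3: 0, 4: 0, 5: 0, 6: 1, 7: 0, 8: 1, 9: 0,
     10: 1, 11: 1, 12: 0, 13: 0, 14: 0, 15: 0, 16: 1, 17: 1}

PAIRS = [(5, 1), (6, 2), (7, 3),        # bits [1]-[3]
         (8, 5), (9, 6), (10, 7),       # bits [4]-[6]
         (12, 8), (13, 9), (14, 10),    # bits [7]-[9]
         (15, 12), (16, 13), (17, 14),  # bits [10]-[12]
         (8, 4), (11, 8)]               # bits [13]-[14]

bits = [abs(p[a] - p[b]) for a, b in PAIRS]
print(bits)  # -> [1, 0, 0, 1, 1, 1, 1, 0, 1, 0, 1, 1, 1, 0]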

Although these bits from the first [1] to the fourteenth [14] can be used as real data, that is, as true values, a data protection table corresponding to these 14 bits can be provided to guarantee security; true values are then obtained by determining the key parameters corresponding to these 14 bits and performing addition, multiplication, or a similar operation between these key parameters and the corresponding real data.

In this case, the true value T can be calculated as Tn = [n] + Kn (where n is 1-14, Tn is the true value, [n] is the real value, Kn is the key parameter, and the addition is taken modulo 2). A data protection table storing these key parameters can be recorded in a read-only memory (ROM) included in the optical reader.

For example, the following key parameters are set in the data protection table:

K1 = 0

K2 = 0

K3 = 1

K4 = 0

K5 = 1

K6 = 1

K7 = 0

K8 = 1

K9 = 1

K10 = 0

K11 = 0

K12 = 0

K13 = 1

K14 = 1

In this case, the true values of T1-T14 can accordingly be calculated as follows:

T1 = [1] + K1 = 1 + 0 = 1

T2 = [2] + K2 = 0 + 0 = 0

T3 = [3] + K3 = 0 + 1 = 1

T4 = [4] + K4 = 1 + 0 = 1

T5 = [5] + K5 = 1 + 1 = 0

T6 = [6] + K6 = 1 + 1 = 0

T7 = [7] + K7 = 1 + 0 = 1

T8 = [8] + K8 = 0 + 1 = 1

T9 = [9] + K9 = 1 + 1 = 0

T10 = [10] + K10 = 0 + 0 = 0

T11 = [11] + K11 = 1 + 0 = 1

T12 = [12] + K12 = 1 + 0 = 1

T13 = [13] + K13 = 1 + 1 = 0

T14 = [14] + K14 = 0 + 1 = 1

Fig. 14 illustrates the correspondence between information bits, data protection table, and true values described above.
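By way of illustration, the same correspondence can be reproduced in a few lines; the bit values and key parameters are the example values listed above, and the addition is taken modulo 2.

# A minimal sketch of applying the data protection table: Tn = ([n] + Kn) mod 2.

bits = [1, 0, 0, 1, 1, 1, 1, 0, 1, 0, 1, 1, 1, 0]   # [1]..[14]
K    = [0, 0, 1, 0, 1, 1, 0, 1, 1, 0, 0, 0, 1, 1]   # K1..K14

true_values = [(b + k) % 2 for b, k in zip(bits, K)]
print(true_values)  # -> [1, 0, 1, 1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 1]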

The above describes an example of obtaining information bits from information points and obtaining true values by accessing the data protection table. Conversely, if the rasters are created from true values, the value of the nth bit [n] can be calculated as [n] = Tn - Kn (again modulo 2).

For example, if T1 = 1, T2 = 0 and T3 = 1, bits from the first [1] to the third [3] can be calculated using the following formulas

[1] = 1-0 = 1

[2] = 0-0 = 0

[3] = 1-1 = 0

Bits from the first [1] to the third [3] are expressed respectively by the following difference formulas:

[1] = (5) - (1)

[2] = (6) - (2)

[3] = (7) - (3)

If the initial values are given (1) = 1, (2) = 1 and (3) = 0, points (5) - (7) can be calculated as follows:

(5) = (1) + [1] = 1 + 1 = 0

(6) = (2) + [2] = 1 + 0 = 1

(7) = (3) + [3] = 0 + 0 = 0

Although the calculation is not shown, the values of points (8) - (14) can be calculated in a similar way, and these points can be located depending on their respective values.

It should be noted that the initial values of points (1) - (3) are arbitrary random numbers (0 or 1).

Thus, by adding the values of information bits [1]-[3] to the chosen starting points (1)-(3), the values of points (5)-(7), located on the next coordinate grid line in the Y direction, can be calculated. Similarly, by adding the values of information bits [4]-[6] to the values of points (5)-(7), the values of points (8)-(10) can be calculated. By adding the values of information bits [7]-[9] to these values, the values of points (12)-(14) can be calculated, and by adding the values of information bits [10]-[12], the values of points (15)-(17).

The values of points (4) and (11) can be calculated by subtracting the information bit [13] from the calculated point (8) and adding the information bit [14] to the point (8).

Thus, in accordance with this embodiment, the locations of the points on coordinate grid line Yn are determined from the locations of the points on coordinate grid line Y(n-1), and these calculations are repeated successively until the locations of all information points have been calculated.
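The encoding direction just described can likewise be sketched as follows; the bit values [1]-[14] are again the example values, the starting points (1)-(3) are chosen at random, and the additions are taken modulo 2. Each resulting value of 0 or 1 then determines whether the corresponding information point is offset in the negative or positive direction from its grid point.

import random

# A minimal sketch of placing information points from bits [1]-[14]
# (the difference method run in reverse).

bits = {n + 1: b for n, b in enumerate([1, 0, 0, 1, 1, 1, 1, 0, 1, 0, 1, 1, 1, 0])}
p = {}

# Points (1)-(3): arbitrary random starting values (0 or 1).
for k in (1, 2, 3):
    p[k] = random.randint(0, 1)

# Each next row of points is the previous row plus three information bits.
rows = [((5, 6, 7),    (1, 2, 3),    (1, 2, 3)),     # points (5)-(7)  from bits [1]-[3]
        ((8, 9, 10),   (5, 6, 7),    (4, 5, 6)),     # points (8)-(10) from bits [4]-[6]
        ((12, 13, 14), (8, 9, 10),   (7, 8, 9)),     # points (12)-(14) from bits [7]-[9]
        ((15, 16, 17), (12, 13, 14), (10, 11, 12))]  # points (15)-(17) from bits [10]-[12]
for targets, sources, bit_ids in rows:
    for t, s, b in zip(targets, sources, bit_ids):
        p[t] = (p[s] + bits[b]) % 2

# Points (4) and (11) are derived from point (8) with bits [13] and [14].
p[4]  = (p[8] - bits[13]) % 2
p[11] = (p[8] + bits[14]) % 2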

(Paper keyboard)

Figs. 15-17 are diagrams explaining a paper keyboard as one embodiment of the present invention.

Fig. 15 is an explanatory diagram illustrating a paper keyboard with which various inputs and operations of a personal computer are performed by using a scanner (SCN) to read rasters printed on one surface of the paper keyboard serving as a medium (medium surface). This paper keyboard looks like a book bound along one main side. A keyboard layout (key images) is printed on each page surface of the book.

In particular, as shown in Fig. 16, several rectangular image areas are provided that mimic the keys of a personal computer and in which characters of the Japanese syllabary (hiragana) or letters of the alphabet (for example, "a" and "i" in hiragana, "A" and "B") or words consisting of several letters or characters (for example, "SEND" and "YES") are printed.

In each of the rectangular image areas, the interrupt key code value corresponding to the letter or character printed in that area (one letter or character per rectangular image area) is registered as a raster. These interrupt key code values are the same as the code values assigned to the corresponding letters or characters on the keys of a hardware keyboard.

For example, if a scanner (SCN) reads a raster of a rectangular image area in which the letter of the alphabet "A" is printed, the code value of the interrupt key created by pressing the "A" key on the hardware keyboard is entered into the personal computer (information processing device).

As a function that the hardware keyboard does not provide, words such as greetings, for example "HAPPY TO SEE YOU", "LONG YEAR SEEED" and "THANKS", are printed in rectangular image areas, and in the corresponding areas the code values of the character strings matching these words are printed as rasters. Although the character strings can be printed as rasters of the key code values exactly as they are, it is also possible to print as a raster the code value of an input command with given digits and to store the corresponding character input information in advance in the index table described below with reference to Fig. 24.

Fig. 16 shows rectangular areas of images in which words such as "BROWSE WEB SITE" and "SEND EMAIL" are printed. In the first case, the code value of the start command of the browser program is printed, and in the second case, the code value of the start command of the email delivery program is printed.
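The behaviour of the key images and command images described above can be sketched as a small dispatch table; the code values, key codes and program names used below are hypothetical placeholders chosen only for illustration, not values defined by the present system.

import subprocess

# A minimal sketch of dispatching a code value read from the paper keyboard.

KEY_CODES = {"code:A": 0x41, "code:B": 0x42}           # interrupt key codes
STRINGS   = {"code:thanks": "THANKS"}                   # prepared character strings
PROGRAMS  = {"code:browse": ["firefox"],                # launch the browser program
             "code:mail":   ["thunderbird", "-compose"]}

def handle_code(code, send_key, send_text):
    if code in KEY_CODES:
        send_key(KEY_CODES[code])          # behaves like a hardware key press
    elif code in STRINGS:
        send_text(STRINGS[code])           # enters a whole character string
    elif code in PROGRAMS:
        subprocess.Popen(PROGRAMS[code])   # starts the associated program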

In Fig.16, the location of the keyboard is made in the order of the Japanese syllabary or in alphabetical order. However, the location of the keyboard is not limited to the location shown in FIG. 16, and may be identical to the location of the keyboard according to Japanese industry standard.

In Fig. 16, in the respective rectangular regions of the images on the paper keyboard, the above coordinate values as well as code values are registered as rasters.

If rasters are arranged on the surface of the paper keyboard (medium surface), a code value can be used instead of keyboard input, or a coordinate value can be used instead of mouse or tablet input. To choose between a code value and a coordinate value, a rectangular image area labeled "CHANGE CODE / COORDINATE" may be provided on the paper keyboard; a code value for switching between code-value use and coordinate-value use can be printed as a raster in this area, so that scanning this rectangular image area switches the input between entering a code value and entering a coordinate value.

In addition, the code values printed as rasters in the rectangular image areas can be given a meaning that differs from their normal meaning when a different reading procedure is used.

For example, if the read coordinate value changes while the raster in the rectangular image area "A" is read continuously for a predetermined time (the scanner taps on the rectangular image area "A", that is, the tip of the scanner moves repeatedly in the vertical direction so as to touch the medium surface and separate from it again), or if the scanner performs a striking operation (the scanner rubs the key image in the longitudinal or transverse direction), the input can be switched to a state similar to that of pressing the Shift key on a hardware keyboard.

In particular, if the scanner (SCN) images the lowercase "a" key, the interrupt key code corresponding to the letter "a" is entered into the personal computer; if the scanner (SCN) then performs a tapping operation, the central processing unit (CPU) of the personal computer detects the change in the image read by the scanner according to the program, converts the interrupt code value corresponding to the lowercase "a" into the interrupt code value corresponding to the uppercase "A", and supplies the resulting interrupt code value to an application program such as a word processor.

In addition, when a tapping operation is performed, it can be detected by reading the raster only the first time and thereafter evaluating only the light intensities registered by the CMOS sensor of the scanner (SCN).

In addition to this tapping operation, the central processing unit (CPU) of the personal computer can perform the following operation before supplying the code value to the application program: if the scanner dwells on one rectangular image area for a certain time or longer while reading the raster, the CPU determines that the letter is the uppercase "A"; if the scanner dwells on it for less than that time, the CPU determines that the letter is the lowercase "a"; the CPU then supplies the corresponding code value to the application program.

In addition, the input can be switched between uppercase and lowercase input, and the corresponding code value supplied to the application program, when the coordinate value changes while the same code value is read in a rectangular image area (a scratching operation of the scanner), or when a tilt of the scanner is detected from a change in the distribution of light intensities within the read image.
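One possible sketch of the dwell-time rule described above is given below; the threshold value and the callback structure are assumptions of this sketch.

import time

# A minimal sketch of the dwell-time rule: if the scanner stays on the same
# key image at least DWELL_S seconds, the uppercase letter is supplied,
# otherwise the lowercase one.  The threshold value is an assumption.

DWELL_S = 1.0

class CaseSelector:
    def __init__(self):
        self.current_letter = None
        self.since = None

    def on_read(self, letter, now=None):
        """Called every time the scanner decodes a raster under the tip."""
        now = now if now is not None else time.monotonic()
        if letter != self.current_letter:
            self.current_letter, self.since = letter, now

    def on_lift(self, now=None):
        """Called when the scanner is lifted; returns the character to emit."""
        now = now if now is not None else time.monotonic()
        if self.current_letter is None:
            return None
        ch = self.current_letter
        return ch.upper() if now - self.since >= DWELL_S else ch.lower()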

Fig. 31 (b) is a diagram illustrating a tapping operation on a grid.

In particular, during the grid tapping operation, the scanner (SCN) is held perpendicular to the raster image, moved vertically, and the icon (in this example, the key image in the form of the letter "A" of the alphabet) on the medium surface is tapped.

Figs. 31(a) and 31(c) are explanatory diagrams illustrating a grid striking operation with the scanner (SCN).

A grid striking operation means that the operator moves the scanner over a raster image so that the scanner rubs the surface of the raster image several times. The user (operator) performs the grid striking operation on the icon (in this example, the key image in the form of the letter "A" of the alphabet) on the medium surface. This operation makes it possible to switch the letter supplied to the application program between the uppercase interrupt code "A" and the lowercase interrupt code "a".

Fig. 32 is an explanatory diagram illustrating a grid grinding operation with the scanner (SCN).

The grid grinding operation means rotating the rear end of the scanner (the upper end of the scanner in Fig. 32) while the same icon (in this example, the key image in the form of the letter "A" of the alphabet) on the medium surface is being imaged. Grinding the grid in the rightward direction relative to the sheet surface (clockwise) may be referred to as "grinding the grid to the right", and grinding in the leftward direction (counterclockwise) as "grinding the grid to the left".

As shown in Fig. 78, if the central processing unit recognizes the light and shadow of the image read by the scanner, and the areas of light and shadow change relative to the center of image formation, the central processing unit can recognize that the scanner is being operated as shown in Fig. 32. By this scanner operation it is possible, for example, to change the keyboard case, to issue a control command, or to convert the entered letter.

In addition, another example of an operation in which the light and shadow of the image read by the scanner change is the grid swaying operation (not shown). The grid swaying operation means repeatedly tilting the scanner forward and backward.

Fig. 78 is a diagram explaining the relationship between the tilt and the angle of the scanner.

The rasters on the images on the keys are printed superimposed in the same direction as the longitudinal direction of the sheet surface. As shown in FIG. 78 (a), the angle between the direction of the raster and the direction of the camera in the scanner is taken as α. As shown in FIG. 78 (b), the angle between the tilt of the scanner and the direction of the camera when the user tilts the scanner is adopted as β. In this case, the angle γ between the direction of the raster and the tilt of the scanner corresponds to the angle at which the scanner is tilted relative to the images on the keys. Namely, the angle γ is expressed by the following formula:

γ = α + β

Figs. 79-83 are diagrams explaining methods of calculating, for the above-described scanner operations, the light and shadow of the image read by the scanner and the tilt direction.

As shown in FIG. 78 (b), the inclination of the scanner (imaging device) relative to the vertical direction of the surface of the medium (the image on the keys) can be recognized by the difference in brightness in the field of view of the image formation of the scanner.

As shown in FIG. 80 (a), the tilt direction of the scanner corresponds to the angle between the scanner and the bitmap. The direction in which the user tilts the scanner can be determined as follows.

Calibration is performed first. The scanner is placed perpendicular to the raster image, and in this position the brightness of cells 1-48 shown in Fig. 79 is measured; Fig. 79 shows the peripheral region of the scanner's imaging area. The measured brightness is taken as BL0(i), where i is the number of the cell whose brightness was measured. For example, the brightness of the 24th cell is denoted BL0(24).

Two LEDs are arranged in the scanner. Even if the scanner is placed perpendicular to the raster image, the cells near the LEDs and the cells farther away from the LEDs differ in brightness. This is why calibration is performed.

The brightness is then measured with the scanner tilted. As shown in Fig. 80(a), the brightness of cells 1-48 is measured with the scanner tilted in a certain direction, and the brightness of cell i is taken as BL(i). For each of the cells, the difference between BL(i) and BL0(i) is calculated. Then the following is determined:

Max_i (BL0(i) - BL(i))

When the scanner is tilted, the area in the direction opposite to the tilt is dark. Since the LEDs are tilted in the direction in which the scanner is tilted, the distance between the region opposite to the tilt and the LEDs is greater than the distance between the region in the tilt direction and the LEDs. Accordingly, as shown in Fig. 80(b), the scanner is tilted in the direction opposite to the cell with the maximum difference.

As a result, the scanner tilt direction is determined.
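A minimal sketch of this first method is given below: the cell with the largest drop in brightness relative to the calibration is found, and the tilt direction is taken as the opposite direction. The assumption that the 48 cells are spaced evenly around the periphery is made only for this sketch.

import math

# A minimal sketch of the tilt-direction estimate of Figs. 79-80.
# bl0[i] is the calibrated brightness of cell i (scanner vertical),
# bl[i] the brightness measured with the scanner tilted.

def tilt_direction(bl0, bl):
    n = len(bl0)                                   # e.g. 48 cells
    darkest = max(range(n), key=lambda i: bl0[i] - bl[i])
    dark_angle = 2 * math.pi * darkest / n         # direction of the darkest cell
    return (dark_angle + math.pi) % (2 * math.pi)  # tilt is the opposite direction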

With reference to FIGS. 79-80, another method for determining the direction and angle of inclination by performing calibration is described below.

Calibration is performed first. The scanner is placed perpendicular to the bitmap, the brightness of cells 1-48 shown in Fig. 79 is measured, and the brightness of cell i is taken as BL0 (i).

Then the scanner is tilted at an angle of 45 degrees and rotated about the pen tip as an axis, as shown in Fig. 80. The brightness of cell i when the scanner is tilted toward the position of cell i is taken as BL45(i). The brightness BL45(i) is measured for cells 1-48. These operations complete the calibration.

Then the brightness of cells 1-48 is measured when the user tilts the scanner, and the brightness of cell i is taken as BL(i), where i = 1 to n (n = 48). The following calculation is then performed:

Max_i [ (BL0(i) - BL(i)) / (BL0(i) - BL45(i)) ]

BL0(i) - BL45(i) is a constant. Moreover, when BL0(i) - BL(i) takes its maximum value, that is, when BL(i) is the minimum value,

(BL0(i) - BL(i)) / (BL0(i) - BL45(i))

is the maximum value. As already noted, the region in the direction opposite to the scanner tilt is the darkest. The direction opposite to cell i therefore corresponds to the tilt direction of the scanner.

In addition, according to the following formula, the scanner angle is determined:

θ = 45° × (BL0(i) - BL(i)) / (BL0(i) - BL45(i))

In the above formula, a linear relationship between the angle θ and the brightness is assumed. Certainly, accuracy can be improved by approximating the angle θ as changing in a trigonometric or similar function. For this, the angle is expressed by the following formula:

sin θ / sin 45° = (BL0(i) - BL(i)) / (BL0(i) - BL45(i)), that is, θ = arcsin [ sin 45° × (BL0(i) - BL(i)) / (BL0(i) - BL45(i)) ]
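A minimal sketch of this second, calibration-based estimate under the linear relation is given below; it evaluates the normalized brightness drop for every cell and converts the largest value into an angle. Noise-free measurements and BL0(i) > BL45(i) for all cells are simplifying assumptions of the sketch.

# A minimal sketch of the angle estimate with the 45-degree calibration.
# bl0[i]: brightness with the scanner vertical, bl45[i]: with the scanner
# tilted 45 degrees towards cell i, bl[i]: current measurement.

def tilt_angle_deg(bl0, bl45, bl):
    ratios = [(bl0[i] - bl[i]) / (bl0[i] - bl45[i]) for i in range(len(bl0))]
    i_max = max(range(len(ratios)), key=lambda i: ratios[i])
    return 45.0 * ratios[i_max], i_max   # estimated angle and the darkest cell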

Fig. 82 illustrates a method of measuring the tilt direction using a Fourier series.

As shown in Fig. 81, eight cells, cells 1-8, are defined as measurement points, and the brightness of these cells is measured.

Each sinusoidal function is expressed as follows:

αj sin(jθ + βj)

There are two unknowns in this equation.

Accordingly, if there are n measurement points, there are n discrete points. In this case, the sum of n/2 sinusoidal functions is calculated, and this sum corresponds to the brightness BL(i) on a radius from the center of analysis. In particular, BL(i) is expressed as follows:

BL(i) = Σ j=1..n/2 [ αj sin(jθi + βj) ]   (where θi is the angular position of measurement point i)

In this equation, n = 2m (where n is the number of measurement points).

In this embodiment, eight measurement points are defined, and therefore n = 8. Accordingly, by combining the four sinusoidal functions, the Fourier coefficients α1-α4 and β1-β4 are calculated. The brightness BL(i) on a radius from the center of analysis is expressed by the sum of the four sinusoidal functions.

As can be understood from these formulas, the angle θ at which BL (i) has a minimum value corresponds to the darkest position and direction, and the direction 180 degrees opposite is the direction of the scanner tilt.
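Under a series form of this kind, the direction of minimum brightness can be sketched as follows; extracting the coefficients with numpy's real FFT and evaluating the curve on a dense grid are implementation choices of this sketch, not part of the original description.

import numpy as np

# A minimal sketch of the Fourier-series method of Fig. 82.  The eight
# brightness samples bl[0..7], taken at evenly spaced angles around the
# analysis centre, are expanded into harmonic components; the smooth curve
# is then scanned for its darkest angle, and the tilt direction is taken
# 180 degrees away.

def tilt_direction_fourier(bl, resolution=360):
    bl = np.asarray(bl, dtype=float)            # 8 samples -> 4 harmonics
    n = len(bl)
    coeffs = np.fft.rfft(bl) / n
    theta = np.linspace(0.0, 2 * np.pi, resolution, endpoint=False)
    curve = np.full_like(theta, coeffs[0].real)
    for j in range(1, len(coeffs)):
        w = 1.0 if j == n // 2 else 2.0         # Nyquist term counted once
        curve += w * (coeffs[j].real * np.cos(j * theta)
                      - coeffs[j].imag * np.sin(j * theta))
    darkest = theta[np.argmin(curve)]
    return (darkest + np.pi) % (2 * np.pi)      # tilt direction in radians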

Fig. 83 illustrates a method of measuring the tilt direction by solving an equation of the nth degree.

The graph in FIG. 83 represents an nth power function. When using the function of the nth degree, the brightness BL (i) at a radius from the center of analysis is expressed as follows:

BL(i) = α1(θ - β1) · α2(θ - β2) · ... · αj(θ - βj), where j = n/2, n = 2m

As shown in Fig. 81, since the number of measurement points in this embodiment is eight, eight solutions need to be obtained. Each factor includes two unknowns, αj and βj; therefore, to obtain α1-α4 and β1-β4, four equations must be solved.

Solving these equations, we obtain the angle θ at which BL (i) has a minimum value. The position at the angle θ is the darkest position, and the direction 180 degrees opposite the direction of the angle θ is the direction of the scanner tilt.

With the measurement methods shown in Figs. 82 and 83, the tilt of the scanner relative to the vertical of the key images cannot be measured. Therefore, the specific angle of the scanner can be obtained by combining these methods with the measurement method shown in Figs. 79-80.

In addition, as described with reference to Fig. 78, if the rasters on the medium surface are read with the scanner, the tilt of the scanner relative to the medium surface is recognized from the difference in light and shadow of the image read by the scanner. In this case, a graphical user interface operation on the screen can be performed depending on the direction of the scanner tilt relative to the medium surface.

As shown in FIG. 78, if a central processing unit (CPU) recognizes light and shadow of an image read by a scanner, and areas of light and shadow move to the opposite side of the center of the image, the central processing unit (CPU) can determine that the scanner is inclined relative to the surface of the medium.

On the other hand, if the light and shadow of the read image changes with rotation relative to the center of the image, the central processing unit (CPU) determines that the scanner is performing a mesh grinding operation (see FIG. 32).

Further, if the light and shadow of the read image repeatedly shift forward and backward relative to the center of the image, the central processing unit determines that the scanner is being repeatedly tilted forward and backward (the grid swaying operation). In response to this scanner operation, a graphical user interface operation such as moving the cursor can be performed on the display screen, or the screen can be scrolled.

Specific examples of graphical user interface operations on the screen include mouse-type operations such as scrolling the screen, moving the cursor, selecting icons or pictograms on the screen, a drag-and-drop operation, selecting a menu command (item), and an operation specifying the input position of a letter, character, or the like.

(Paper controller)

Figs. 18-30 are diagrams explaining a paper controller as one embodiment of the present invention.

As shown in Fig. 18, commands for a browser program (for example, Internet Explorer (trade name) of Microsoft Corporation) are printed on the surface of the paper controller (medium surface), that is, on the surface of paper or a similar medium, as areas for accessing the Internet. As shown in Fig. 18, icon areas are printed on the paper controller indicating a "USER REGISTRATION (OPERATOR)" command, a command for moving the cursor when browsing web pages, a URL copy/link command, a command for operating the registration/editing panel displayed on the display device, a command for opening or closing the registration/editing panel, and a command for deleting a URL or link from the registration/editing panel. Rasters indicating the command codes are printed in the respective icon areas. For example, in the "UP Δ (triangle up)" area of the scroll icons for viewing web pages, an interrupt code for moving up the screen displayed by the browser program is registered. In the "DOWN Δ (triangle down)" area, an interrupt code for moving down the screen displayed by the browser program is registered.

Fig. 19 illustrates a paper controller for registering URLs on the Internet as bookmarks. Rectangular areas (icon areas) bearing the letter G of the alphabet are arranged in nine rows and eleven columns. In these 99 icon areas, rasters of different code values are respectively registered. In addition, icon areas representing categories are provided on the right, in nine rows and two columns.

FIG. 20 is an explanatory diagram illustrating a state in which a scanner (SCN) reads rasters printed on a surface of a paper controller (media surface) explained with reference to FIGS. 18 and 19, while performing various operations of a personal computer.

The main body of the paper controller is made of a sheet of paper or synthetic resin and has a layered structure in which a printing surface including rasters is formed on the upper surface of the main body of the paper controller and a transparent protective sheet is laid over the printing surface. Needless to say, this protective sheet is not always essential, and the printing surface may be left exposed.

FIG. 21 shows that the pictogram areas of the paper controller explained with reference to FIG. 19 are made as removable stickers affixed to a diary or similar notebook related to speech data, music data, and the like.

Fig. 24 (a) shows a table of local indices provided on a hard disk (HD) device of a personal computer.

As shown in Fig. 24(a), the code numbers and the instructions denoted by the point codes are made to correspond to each other in the local index table. In particular, the contents of the index table can be classified into an area related to IDs (for members) for registering the commands executed when the IDs of the rasters registered on tags are read (first area: the area indicated by "ID (for members)" in Fig. 24(a)), an area in which each code number obtained by reading and converting a raster of the paper controller is associated with an access destination (second area: the area indicated by "paper controller" in Fig. 24(a)), and an area in which each code number is associated with a destination for registered content (third area: the area indicated by "medium" in Fig. 24(a)).

For example, as an example of using the first area, if the first digit of the code number is 1 as a result of reading a tag raster, the central processing unit (CPU), referring to this index table in accordance with the analysis program, recognizes that the information comes from a tag. In this case, the central processing unit (CPU) accesses the point code management server and the index table (the management server table shown in Fig. 24(b)) in the point code management server.

As an example of using the second area, if the code numbers obtained by reading rasters of the paper controller are 00001-00004 and so on, the central processing unit (CPU) accesses the files corresponding to the respective code numbers.

For example, as shown in Fig. 24(a), a set consisting of a drive name, a startup file and a parameter is registered as an access destination. In particular, if the raster read from the paper controller is code number 00001, this code number corresponds to the e-mail application, and a parameter meaning "create a new e-mail" is set. The e-mail program is thus started in a state in which a new e-mail can be created.

Further, if the raster read from the paper controller is code number 00002, it is intended to start the video file playback device, and the playback software registered in the personal computer is launched.

In addition, like a hardware keyboard, a string of characters can be directly entered from a paper controller. For example, if code number 00003 is read, then the letter “A” or “B” of the alphabet is entered for a particular application, and a character code is supplied to the application.

As an example of using the third area, if a raster printed in a mail-order catalog or the like (medium) is read and the raster is code number 00100 or higher, the URL corresponding to this code number is accessed (BROWSE WEB SITE), a program is executed, or a movie file is launched (played).

In this case, if the scanner reads a raster and converts it into a code number that is not in the index table, the central processing unit (CPU) accesses the management server through the network.

The management server contains a user database for managing personal information and a server index table (see Fig. 24(b)). Personal information corresponding to the code number read from each tag is registered in the server table for managing personal information (not shown). If no personal information is registered for a code number having 1 as its first digit, the central processing unit (CPU) of the management server downloads an initial registration program to the personal computer. In accordance with this initial registration program, personal information about the user, such as address, name and telephone number, is entered. On the basis of the entered personal information, the user database of the management server is built up.

In particular, personal information corresponding to the tag is registered in the user database of the management server, thereby providing easy access to the network and performing authentication processing, such as ordering.

A table similar to the local index table described with reference to Fig. 24(a) is also created in the management server.

This management server table is a table that supplements the code numbers registered in the local index table. If the code number obtained as a result of reading by the scanner is not in the local index table, the management server table is accessed.

For example, if the result of reading by the scanner indicates the code number 00200, which is not in the local index table, the central processing unit (CPU) of the personal computer, in accordance with the program, enters the management server through the network and accesses the management server table.

In the management server table, code number 00200 defines access to the specified URL (BROWSE WEB SITE), so that the personal computer accesses the URL (BROWSE WEB SITE).

If the code number is, for example, 00201 and means distribution of streaming data, the personal computer accesses the distribution server to download the streaming data.

In this case, the personal computer downloads not only the streaming data but also the corresponding contents of the management server table into its own index table.

Therefore, subsequently, even if the scan result indicates the code number 00201, the personal computer can perform processing using only the local index table without access to the management server table.
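The look-up-and-fall-back behaviour just described can be sketched as follows; the code numbers, actions and the query_management_server() helper are hypothetical placeholders for illustration.

import subprocess, webbrowser

# A minimal sketch of the local index table with fall-back to the
# management server table and local caching of the result.

local_index = {
    "00001": ("program", ["mail-client", "--compose"]),   # new e-mail
    "00002": ("program", ["video-player"]),               # video playback
    "00003": ("string",  "AB"),                           # direct character input
    "00100": ("url",     "http://example.com/catalog"),   # BROWSE WEB SITE
}

def query_management_server(code):
    """Placeholder for the network request to the management server table."""
    raise NotImplementedError

def handle(code, send_text):
    entry = local_index.get(code)
    if entry is None:
        entry = query_management_server(code)   # look up the server table ...
        local_index[code] = entry               # ... and cache it locally
    kind, payload = entry
    if kind == "program":
        subprocess.Popen(payload)
    elif kind == "string":
        send_text(payload)
    elif kind == "url":
        webbrowser.open(payload)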

Figs. 25-28 are diagrams illustrating a paper controller in accordance with another embodiment.

The paper controller shown in these figures is almost similar to the paper controller described with reference to FIGS. 18-20, except that a guide block is provided corresponding to each predetermined area of the icon.

As shown in FIG. 27, an additional plastic sheet is provided on the upper surface of the main body of the paper controller, part of which protrudes on the side of the open surface and forms a guide block in the form of a rib.

Preferably, this guide block is just high enough for the operator holding the scanner to detect it as a slight obstruction in the sliding direction when sliding the tip of the scanner (the lower tip in Fig. 27) over the surface of the top sheet (moving the tip of the scanner along the sheet surface). The operator can continue to move the tip along this surface and deliberately move it over this guide block.

Thanks to these guide blocks, the operator can place the scanner on the intended icon area without looking, once he has learned the positional relationship between the guide blocks and the icon areas on the main body of the paper controller. For example, in Fig. 25, up to four icon areas are provided in each of the rectangular areas surrounded by guide blocks (for example, "URL", "COMMUNICATION", "ALL URLs" and "ALL COMMUNICATIONS" in the middle area of the left column of Fig. 25). By moving the scanner until it comes to rest toward the upper left, upper right, lower left or lower right (moving the scanner into the four corners of each guide block), the user (operator) can precisely stop the scanner on each icon area so that the scanner reads the code value of that area, without looking at the printing surface of the paper controller in his hands.

These guides can be implemented as protrusions formed on the cards by embossing or the like. Alternatively, as shown in Fig. 28, they can be made as separate plastic guides so that only the cards are replaced while the guides are kept.

As shown in Figs. 29 and 30, embossed dots as well as a raster are provided in each area surrounded by the guide blocks of this paper controller. Thanks to the arrangement of the embossed dots and the raster in one area, the same input efficiency as for a sighted person can be ensured even when a blind user (operator) uses this scanner.

In particular, in FIG. 30, a raster is printed at the top of each of the specified rectangular areas of the medium (for example, a sheet of paper or a sheet of synthetic resin), and embossed dots are provided at the bottom thereof. In addition, rectangular areas are surrounded by walls (blocks). Moreover, even a blind user can touch different areas by touch, moving the tip of the scanner through these blocks.

In this embodiment, the regions in which the rasters are printed and the regions in which the raised dots are printed are made separately. However, the present invention is not limited to this example. Needless to say, rasters and bumps can be printed superimposed in one area.

(Mouse pad)

Figs. 33-39 are diagrams illustrating a mouse pad in accordance with one embodiment of the present invention.

Fig. 33 is an explanatory diagram illustrating a mouse pad system for performing various operations of a personal computer using a scanner (SCN) to read rasters printed on one surface of a mouse pad serving as a medium (medium surface).

Like the paper controller and paper keyboard described above, this mouse pad is made of a sheet of paper or synthetic resin and has a layered structure in which a printing surface is formed on the upper surface of the main body of the mouse pad and a transparent protective sheet is laid over the printing surface. Needless to say, this protective sheet is not always essential, and the printing surface may be left exposed.

As shown in FIG. 34 (a), the printing surface has an inner circular region and an annular outer circumferential region.

In the inner circular area, a coordinate value and code A are printed as a raster. In the outer circumferential area, a coordinate value and a code are printed as a raster. If the mouse pad is used to enter coordinates, coordinates can be entered over the entire area, including the ring, as with a graphics tablet.

Fig. 34 (b) shows that in the outer circumferential region, image regions are provided in which the alphabetic code values are registered.

This mouse pad is not always round: it can be rectangular, as shown in figures 34 (c) and 34 (d).

Fig. 35 shows that areas for input commands to the personal computer are arranged in the annular outer circumferential region. In each input command area, a code value for a computer operation is printed as a raster. The mouse pad thus combines the functions of a conventional mouse pad and the functions of the paper controller described above.

In FIG. 35, each functional area (the area indicated by the circled number in FIG. 35) functions as described below. In the Detailed Description section, the numbers in parentheses are used instead of the corresponding circled numbers.

(1) SELECT RANGE

The user touches the icon (functional area) with the scanner and moves the cursor inside the inner frame by the scanner operation. Having determined the starting point, the user lifts the scanner (SCN) from the icon. If the user again touches the icon, moves the cursor, determines the end point and lifts the scanner (SCN) from the icon, the text selected in the meantime is highlighted in blue and becomes active.

(2) COPY

If the user touches the icon (functional area) with the scanner, the text in the selected range is stored in memory. The copied text is placed at the top of the memory list.

(3) CUT

When the user touches the icon (functional area) with the scanner, the text in the selected range is deleted and stored in memory. The cut text is placed at the top of the memory list.

(4) DEFINE THE POSITION FOR THE INSERT

If the cursor is not in input mode, the user touches the icon (functional area) with the scanner, moves the cursor using either the cursor keys → ← ↑ ↓ or a scanner operation inside the inner frame, and lifts the scanner, thereby determining the position for insertion.

(5) INSERT

If the user touches the icon (functional area) with the scanner, the text stored in memory and currently active is inserted at the cursor position in input mode.

(6) REMOVE

If the user touches the icon (functional area) with the scanner, the text previously selected as a range is deleted. If no text has previously been selected as a range, the text after the cursor position in input mode is deleted one letter or character at a time. If the scanner is held down for two seconds or longer, letters or characters are deleted continuously until the user lifts the scanner from the icon.

(7) RETURN BY ONE CHARACTER WITH ERASURE

If the user touches the icon (functional area) with the scanner, the text before the cursor position in input mode is deleted one letter or character at a time. If the scanner is held down for two seconds or longer, letters or characters are deleted continuously until the user lifts the scanner from the icon.

(8) GAP (TRANSFER) LINE

If the user touches the icon (functional area) with the scanner, a line break is performed, and the cursor position in input mode moves to the beginning of a new line.

(9) CANCEL

When the user touches the icon (functional area) with the scanner, the current mode is canceled, and the personal computer returns to the standby state if the user has touched icon (functional area) (1), (4) or (15) but performs no further action.

(10) CANCEL THE PREVIOUS OPERATION

When the user touches the icon (functional area) with the scanner, the last operation is undone and the previous state is restored. The return to a previous state can be repeated several times.

(11) CURSOR →

(12) CURSOR ←

(13) CURSOR ↑

(14) CURSOR ↓

If the user touches one of these icons (functional areas) with the scanner, the cursor position in input mode moves by one letter or character in the direction of the arrow. If the scanner is held down for two seconds or longer, the cursor position moves continuously in the direction of the arrow. If a drop-down menu is displayed, touching icon (functional area) (13) or (14) moves the active item of the displayed menu up or down, respectively.

(15) DISPLAY MEMORY

When this icon is touched, the list of texts that have been selected and copied or cut is displayed, newest first. Touching icon (functional area) (11) or (12) moves the active item up or down, respectively. All texts are kept until the active item is deleted with (6).

(16) ENTER

If the cursor position is shifted to a predetermined position and a command is present in this position, this command is executed when this icon (functional area) is touched. A kana-kanji conversion or similar conversion may be performed. This icon (functional area) by function corresponds to the usual ENTER key.

Fig. 36 (a) -36 (d) and Fig. 37 (a) -37 (b) illustrate a web page scroll operation in accordance with an Internet browser program by a scanner operation using this mouse pad.

Fig. 38 (a) is a plan view of a three-dimensional mouse pad, and Fig. 38 (b) is a sectional view of a three-dimensional mouse pad.

The mouse pad has annular grooves designed to allow the operator holding the scanner to recognize the difference between the areas by touch.

These grooves can be not only annular, as shown in Fig. 38, but also radial, as shown in Fig. 39.

(Another paper controller)

The figure shows a newly proposed keyboard serving as an input medium of this kind.

On this keyboard, the images of the respective keys are arranged fan-wise around the "H", "DOUBLE", "Y", "CONVERT" and "INPUT" keys. The images of the respective keys are placed in offset positions so that they do not lie on a straight line.

As images of the respective keys, the vowels ("A", "I", "U", "E" and "O") are located closer to "H", "DOUBLE", "Y", "CONVERT" and "ENTER", and the consonants ("K", "S", "T", "N", "M", "Y", "R" and "W") are located farther from them.

Rasters, each having a code value and XY coordinates recorded in the same format as shown in Fig. 9(b), are respectively printed overlaid on the images of these keys.

XY coordinates can be determined independently of the icons, or can be determined for the entire surface of the medium.

When using this keyboard, letters or characters can be entered by touching the scanner (SCN) to the medium surface and lifting it off again. For example, in order to enter "KASA" (meaning the kanji for UMBRELLA), the scanner first reads part of the image of the "K" key. The scanner (SCN) then rubs (slides) over the paper keyboard in the order "A" → "S" → "A". The movement between the keys can be recognized from the changes in the coordinates of the rasters printed overlaid on the medium. Then the scanner is lifted (raised up) from the image of the "A" key. The central processing unit (CPU) of the personal computer recognizes the "conversion command" from the entry of the Roman letters "KASA" and starts the operation in accordance with the recognition program, issuing the conversion command to the application program (Japanese character input program) for the personal computer or the like. As a result, "KASA" is displayed in kanji at the cursor position on the display device. If the entered letters need to be converted into a Japanese character, the scanner can also read "CONVERT".

In addition, to enter "TOKKYO" (see Figs. 42(1)-42(5)), part of the image of the "T" key is read by the scanner. The scanner then moves sequentially to "O" → "K" → "DOUBLE" → "Y" → "O", and either lifts off (rises) from the image of the key of the last letter "O" or moves further to "CONVERT". Here, "DOUBLE" is the area to be read when the previous letter is to be entered twice in a row.

The central processing unit (CPU) of the personal computer recognizes the "conversion command" from the entry of the Roman letters "TOK(DOUBLE)YO", either by the lifting of the scanner or by the subsequent reading of "CONVERT", in accordance with the recognition program, and sends the conversion command to the application program (Japanese character input program) for the personal computer or a similar device. As a result, "TOKKYO" is displayed at the cursor position on the display device as the kanji meaning PATENT.
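The stroke recognition described for "KASA" and "TOKKYO" can be sketched as follows: consecutive raster readings are grouped by key code, "DOUBLE" repeats the previous letter, and lifting the scanner (or reading "CONVERT") closes the romaji string that is handed to the conversion program. The key codes and the convert() callback are placeholders of this sketch.

# A minimal sketch of assembling a romaji string from a sliding-scanner
# stroke over the fan-shaped keyboard.  readings is the sequence of key
# codes decoded while the scanner slides over the key images.

def assemble_romaji(readings):
    keys = []
    for code in readings:
        if keys and keys[-1] == code:
            continue                 # still on the same key image
        keys.append(code)
    out = []
    for k in keys:
        if k == "DOUBLE" and out:
            out.append(out[-1])      # repeat the previous letter ("K" -> "KK")
        elif k != "DOUBLE":
            out.append(k)
    return "".join(out)

def on_lift(readings, convert):
    romaji = assemble_romaji(readings)   # e.g. "TOKKYO"
    convert(romaji)                      # hand over to the conversion program

# assemble_romaji(["T", "O", "O", "K", "DOUBLE", "Y", "O"]) -> "TOKKYO"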

Another figure shows a list of rules for conversion to Japanese. However, the rules are not limited to this list.

Fig. 43 illustrates the use of a speech reader as an aid.

Fig. 43 presents a system that assists speech input in Japanese. If the user (operator) utters a speech sound into a microphone, the central processing unit (CPU) of the personal computer analyzes the entered audio information and displays the conversion candidates on the display device. Fig. 43 shows an example in which the user (operator) utters the sound "ISHI". The conversion candidates corresponding to the speech sound "ISHI" are displayed on the display device, that is, "1 ISHI" (meaning the kanji for INTENTION), "2 ISHI" (meaning the kanji for STONE), "3 ISHI" (meaning the kanji for WILL), "4 ISHI" (meaning the kanji for MEDICAL DOCTOR) and "5 ISHI" (meaning the kanji for the WISH OF A DECEASED PERSON).

From the candidates displayed on the display device, the user (operator) selects the candidate number and scans the icon area of this number (for example, "2") on the paper controller (paper keyboard). Rasters of the encoded numbers are printed in the icon areas of these numbers, respectively. By the scanner operation, the encoded number is entered into the personal computer. The central processing unit (CPU) of the personal computer determines from the entered code the number associated with the selected candidate and supplies the converted character (for example, "ISHI" meaning the kanji for STONE) corresponding to this number to the application program.

Figures 44-56 show examples of using a paper keyboard as input means for an infrared remote controller.

In these examples, the scanner is combined with a remote controller. Fig. 44 (a) shows a structure in which a scanner is provided at the end of a remote controller, and Fig. 44 (b) shows a structure in which a scanner is provided on a surface of a remote controller, opposite the surface on which the operation panel is formed.

Suppose that a user (operator) scans a field of a radio / television program in a newspaper using a remote controller scanner. In the radio / television program field of the newspaper, channels and broadcasting stations are displayed in the XY direction, and program names, artists and contents are printed as information in the form of letters or characters. View / play reservation codes are printed as rasters. When scanning one of these rasters using a scanner, the remote controller reads the reservation code assigned to each program and transmits this reservation code to the set-top box (STB) or infrared receiver in the main TV case.

Fig. 45 shows a remote controller made in such a way that the scanner can be placed on a stand (cradle). The stand contains a central processing unit (CPU) that analyzes the signal read by the scanner and generates an infrared signal, a battery (BAT), and so on.

Fig. 46 is an explanatory diagram for a case of making a reservation or recording a set-top box program for satellite broadcasting or Internet broadcasting using a scanner (SCN) and the stand shown in Fig. 45.

The scanner (SCN) can be connected to the stand not only through wired communication, as shown in figures 45 and 46, but also through wireless communication.

Fig. 47 shows an example of a paper controller used for the remote controller shown in Figures 45 and 46. Fig. 48 shows an example of a paper controller used for the remote controller controlling the set-top box.

Figures 49-56 show the correspondence of the code values of the corresponding functional areas (areas or pictograms in which the rasters are printed) of the paper controller shown in Fig. 47 with the execution commands for televisions (TVs) and set-top boxes for televisions indicated by these codes.

For example, if the “POWER” area printed on the front side shown in FIG. 47 is read by the scanner, the raster printed in this area is read and converted to a code value, and a power-on signal is transmitted to the television or set-top box.

Another figure presents a paper controller in accordance with one embodiment, implemented as a medium for controlling a set-top box installed in a hotel guest room.

Symbols meaning English, Chinese, Korean and Japanese are printed on the paper controller, and rasters are printed over the corresponding symbols. Control signals are supplied from the remote controller via wireless or optical communication, and the set-top box connected to the TV can thus perform the operations indicated by the corresponding symbols.

Figures 58 and 59 are examples of a paper controller (paper keyboard) controlling a music or video playback device. Although a music or video reproducing apparatus is not described in detail, video and speech can be recorded or reproduced using a scanner (SCN) and a paper controller (paper keyboard) to operate the music or video reproducing apparatus. Rasters are also entered in the corresponding command areas of the paper controller (paper keyboard). As shown in FIG. 59, it is possible to compose a paper controller (paper keyboard) with which letters or characters can be entered.

Figures 60-67 show the correspondence of the code values of the corresponding functional areas (regions or pictograms in which the rasters are printed) of the paper controller shown in figures 58 and 59 to execution instructions for music or video playback devices indicated by these codes.

Figs. 68-70 show an example of using a whiteboard as the medium surface. Rasters are also printed on this whiteboard. It is assumed that the rasters printed on the whiteboard are point codes (see Fig. 9) meaning coordinate values.

As shown in Fig. 70, the projector projects a predetermined image onto the whiteboard on which rasters meaning coordinate values are printed. The projector is connected to a personal computer (not shown). If an arbitrary position on the whiteboard is read using the proposed scanner (see Fig. 1) connected to this personal computer, the raster at this position is read by the scanner (SCN) and converted into a coordinate value in the personal computer. An index table (see Fig. 24), in which coordinate values correspond to commands, addresses and the like, is recorded on the hard disk of the personal computer. By accessing the index table, the information or command defined for each corresponding address is displayed or executed accordingly.
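The coordinate-based look-up on the whiteboard can be sketched as a simple hit test over rectangular regions; the region bounds and the associated actions below are hypothetical placeholders.

# A minimal sketch of resolving a coordinate value read from the whiteboard
# into a command via an index table of rectangular regions.

regions = [
    ((0,   0, 200, 100), "launch:browser"),     # icon projected top-left
    ((0, 120, 200, 220), "video:play"),
    ((0, 240, 200, 340), "video:pause"),
]

def resolve(x, y):
    for (x0, y0, x1, y1), action in regions:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return action
    return None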

As shown in Fig. 69, a bonding layer is provided on the surface of the whiteboard, and a transparent sheet with rasters printed on one side is adhered to the bonding layer, the printed side facing the bonding layer.

Accordingly, the rasters themselves are protected by the transparent sheet. Even if the tip of the scanner or a whiteboard marker touches the surface, the rasters do not deteriorate.

In the example of FIG. 70, if the scanner reads the raster on the area displayed as an icon on a white board, the raster is converted to a coordinate value in the personal computer, and an application program registered in advance in accordance with the coordinate value is launched.

Alternatively, as shown in FIG. 68, the image of the remote controller can be projected onto the left side of the white board, and the moving image controlled by the remote controller can be played on its right side.

In this case, if the scanner reads the part corresponding to the projected image of each button of the remote controller, the coordinate value corresponding to that projected image is read by the personal computer, and the operation corresponding to the coordinate value, for example playback, fast forward, rewind or pause of the video, is performed, which makes it possible to control the projected video (image).

Fig. 71 is an example in which a translucent acrylic board (screen board) is used instead of a whiteboard. A rear projector located behind the acrylic board projects the desktop screen of a personal computer or a video (image).

On this screen board, a sheet of infrared notch filter is glued with a bonding layer to the surface of the acrylic board facing the rear projector, and a transparent sheet is glued with a bonding layer to the opposite surface. Rasters denoting coordinate values are printed on the side of the transparent sheet that faces the bonding layer.

When the sheet of infrared notch filter is glued to the surface of the screen board facing the rear projector, the infrared component of the light emitted by the rear projector is cut off, and light noise of the infrared component does not reach the scanner side from the projector. This makes it possible to maintain high raster reading accuracy.

In Fig. 71, if the scanner reads an image of a part of the icon of the browser program, the central processing unit (CPU) of the personal computer refers to a correspondence table (not shown) of coordinate values and processing commands, recognizes that this part lies at the position at which the icon of the browser program is projected, and executes the processing command corresponding to that coordinate (in this example, launches the browser program).

A further drawing shows an example in which the user (operator) creates the paper keyboard described above. In order to create a paper keyboard, the image information of the paper keyboard is edited on the screen, a mask is created, for example by cutting out part of the area, and raster codes are assigned to the mask so that the user (operator) can use a keyboard with a free layout.

A program is provided that allows the user (operator) to freely remove, add and arrange functional icons in an on-screen application, and the screen image is printed together with the rasters or printed on a sheet on which the rasters are already printed. In this way it is possible to implement a paper keyboard on which all the execution commands for the application functions of a word processor, spreadsheet programs and the like are laid out to the user's (operator's) order and printed.

This can reduce the number of function buttons located on the screen and make the on-screen interfaces for the word processor, spreadsheet and application programs quite simple.

Figures 74 and 75 show one embodiment that uses the raster reader described above, an index table, a management server table and the like, presented in the form of a distribution document serving as a technical introduction.

As shown in this distribution document (Figs. 74 and 75), the present method can be implemented as Grid Onput (brand name).

Figures 74 and 75 show an example in which the scanner is used with a personal computer (PC) as a hardware tool together with GAM (Grid Application Manager, the name of the application program installed on the hard disk drive of the personal computer).

In Figures 74 and 75, items (1) to (5) and (7) are examples of actual operations. Namely, as shown in (1) of Fig. 74, the user runs on the personal computer (PC) an installation program recorded on a compact disc read-only memory (CD-ROM) or downloaded by accessing a distribution server on the Internet, and registers the GAM and driver programs as resident programs in the OS (operating system). In addition, the user installs onto the hard disk drive (HD) application programs grouped under the Grid Application Manager (GAM) and content data such as images and video.

The scanner then connects to the USB (Universal Serial Bus) terminal, and the resident driver program recognizes the scanner.

When the scanner reads the tag surface, the captured image (raster) of the tag surface is transferred to the personal computer (PC) via the USB cable and loaded into the video memory (video RAM). The GAM, executed by the central processing unit (CPU), decodes the captured image (raster) into a code (code number) according to the algorithm described above (GRID 1 or GRID 2).

If the scanner scans the tag for the first time, a screen prompting the user to enter personal information corresponding to this tag is displayed on the display device (DISP) of the personal computer (PC), and the user, following this screen, registers personal information such as name, address and credit card number. Personal information entered in this way is recorded in the management server table shown in Fig. 24 and described above and is used for subsequent authentication.

Namely, when the personal computer (PC) is subsequently turned on, the tag is scanned by the scanner, the management server performs authentication, and after this authentication is completed, GAM is launched.

Then the scanner scans (reads) paper or a paper controller (paper keyboard) on which the rasters are printed, the read image data of the rasters is entered into the personal computer, and point codes (code numbers), each consisting of a 32-bit string, are decoded.

Then, in accordance with the point codes (code numbers), the GAM point code management table (index table) is accessed.

If the point codes (code numbers) are already registered in the index table, it is recognized that the content data corresponding to the point codes is already installed on the personal computer (PC), and the content data is read and reproduced. If the content data is images or video, the video playback application or image display program corresponding to the content data is launched, and the video or images are displayed on the display device (DISP).

If an Internet address (URL) is registered for the point code (code number) in the index table, a browser program (such as Internet Explorer from Microsoft Corporation) is launched and the address is accessed.

In addition, as shown in (5) of Fig. 74, if the point code (code number) obtained by reading a raster is not registered in the local point code management table (index table) in the personal computer, an inquiry is made to the point code management server on the Internet. If the point code (code number) is registered in the table of the point code management server, the following starts automatically: (1) downloading of content from the indicated Web server, namely server A, (2) distribution of moving images as streaming data, namely distribution from server B, which serves as the streaming data distribution server, or (3) viewing of Web pages, namely downloading of the Web files indicated by the address (URL) of server C.

Then, when the content data is downloaded to the personal computer (PC), additional data (pairs of code numbers and addresses) from the point code management table (index table) needed to start the content data is also downloaded. This data is thereafter managed by the point code management table (index table) in the personal computer.

Therefore, when the same code number is subsequently read, the content data downloaded to the hard disk drive (HD) of the personal computer is reproduced in accordance with the point code management table (index table), including the newly added data, without repeated access to server A, B or C on the Internet.
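
The following Python sketch summarizes this local-first flow under assumed names; the local entries, the server response and the resolve_remote helper are hypothetical, and in the actual system the remote inquiry would go to the point code management server on the Internet.

# Assumed local index table installed together with GAM (illustrative entries only).
local_index = {11: ("content", "movie_011.mp4")}

def resolve_remote(code_number):
    # Placeholder for the inquiry to the point code management server on the Internet.
    remote_table = {42: ("url", "http://example.com/content/42")}   # assumed response
    return remote_table.get(code_number)

def handle_code(code_number):
    entry = local_index.get(code_number)
    if entry is None:
        entry = resolve_remote(code_number)
        if entry is None:
            return None                    # code number registered nowhere
        local_index[code_number] = entry   # cache the downloaded index data locally
    return entry

print(handle_code(11))   # served from the local index table
print(handle_code(42))   # fetched once from the server, then answered locally thereafter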

Figures 76 and 77 explain an example of using a paper controller for an order system in a catering facility, such as a restaurant.

As shown in these drawings, a menu in which rasters of different code numbers are printed according to menu items is placed on each table in the restaurant, and a computer terminal including a display device (DISP) is installed at the edge of the table.

A short-range wireless communication system, such as Bluetooth, is built into the scanner, and the code numbers and item number information read from the menu can be transmitted between the scanner and the computer terminal.

Code numbers corresponding to menu items and item number information read by the scanner are transmitted to the computer terminal. The central processing unit (CPU) of the computer terminal creates an order signal, in which a table number is added to the code numbers and information on the numbers of menu items, and transmits this order signal to the order server.

The order server extracts the table number, the code numbers corresponding to the menu items and the item number information from the order signal and creates an order for the kitchen. In particular, the order server displays the table number, the menu items corresponding to the code numbers and the item numbers on the display device in the kitchen, and the cook can start preparing the order.
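
A minimal Python sketch of composing and unpacking such an order signal is given below; the field names, the interpretation of the item number information as order quantities, and the table and menu codes are assumptions made only for illustration.

# Hypothetical order signal: the terminal appends its table number to the
# (menu code number, quantity) pairs read from the menu rasters.
def make_order_signal(table_no, items):
    return {"table": table_no, "items": items}

def kitchen_view(order):
    # The order server extracts the same fields for the kitchen display.
    lines = ["Table %d:" % order["table"]]
    for code, qty in order["items"]:
        lines.append("  menu code %d x %d" % (code, qty))
    return "\n".join(lines)

order = make_order_signal(table_no=7, items=[(1001, 2), (1005, 1)])
print(kitchen_view(order))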

An example has been described in which the table number is added by the computer terminal when the order signal is created. Alternatively, a seal or similar label on which a raster denoting the table number is printed may be attached in advance to the surface of the computer terminal stand or to the surface of each table. When the surface of this seal is read by the scanner placed on that table, the table number is associated with the computer terminal.

Several scanners can be laid out on the table so that several people can place orders at the same time.

Fig. 84 shows an example of a paper keyboard in which XY coordinate values are used for a mouse pad.

Fig. 84(a) shows a case in which a mouse pad area is provided on part of the paper keyboard.

In this embodiment, only code values are registered in the rasters printed on the key images, while both code values and XY coordinate values are registered in the rasters printed in the mouse pad area. If the user moves the scanner upwards in the mouse pad area, the screen scrolls up. Similarly, if the user moves the scanner downwards, the screen scrolls down. The same applies to movement to the right and to the left.

Fig. 84(b) shows a case in which the entire paper keyboard is used as the mouse pad area.

In this embodiment, both a code value corresponding to the content of the key and a coordinate value are registered in every key image. If the user taps the scanner twice or more at an arbitrary place on the paper keyboard and then moves the scanner upwards, the screen scrolls up. Similarly, if the user taps the scanner twice or more and moves the scanner downwards, the screen scrolls down. The same applies to movement to the right and to the left.
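
A possible way of turning successively read coordinate values into scroll commands is sketched below in Python; the threshold, the command names and the sign convention (Y increasing upwards) are assumptions and do not come from the patent.

# Derive a scroll direction from two successive coordinate values read in mouse-pad mode.
def scroll_command(prev_xy, cur_xy, threshold=1.0):
    dx = cur_xy[0] - prev_xy[0]
    dy = cur_xy[1] - prev_xy[1]
    if abs(dx) < threshold and abs(dy) < threshold:
        return None                                   # movement too small to act on
    if abs(dy) >= abs(dx):
        return "scroll_up" if dy > 0 else "scroll_down"
    return "scroll_right" if dx > 0 else "scroll_left"

print(scroll_command((10.0, 10.0), (10.2, 14.5)))     # -> scroll_up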

Fig. 85 illustrates rasters created on the projection board in which coordinate values and code values are defined in one format. Predetermined matrix blocks are formed on the board, and the same matrix block is assigned the same code value even though the coordinate values change within it.

In this embodiment, the icon image is positioned so that it spans one or more matrix blocks. When the reader reads the raster of the icon image, the image corresponding to the icon image is controlled accordingly, or the program corresponding to the icon image is launched accordingly.

Fig. 86 shows the relationship between the code values and the XY coordinate values of the rasters in each matrix block on the whiteboard.

Fig. 86(a) is a table illustrating the values defined in the 32 bits of raster regions C0 through C31. As shown in Fig. 86(a), regions C0-C7 denote the Y coordinate, regions C8-C15 denote the X coordinate, regions C16-C29 denote the code value, and regions C30-C31 denote parity check bits.
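
Under the assumption that C0 is the least significant bit, the 32-bit value of Fig. 86(a) can be unpacked as in the following Python sketch; the bit order and the parity rule are assumptions, since only the field boundaries are stated above.

# Unpack the 32-bit raster value into Y coordinate, X coordinate, code value and parity bits.
def unpack_dot_value(value):
    return {
        "y":      value         & 0xFF,        # C0-C7
        "x":      (value >> 8)  & 0xFF,        # C8-C15
        "code":   (value >> 16) & 0x3FFF,      # C16-C29 (14 bits)
        "parity": (value >> 30) & 0x3,         # C30-C31
    }

sample = (0b01 << 30) | (0x012A << 16) | (0x40 << 8) | 0x20
print(unpack_dot_value(sample))   # {'y': 32, 'x': 64, 'code': 298, 'parity': 1}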

These numerical values are arranged in the grid areas shown in Fig. 86(b), and Fig. 86(c) shows the specific rasters.

Fig. 87 shows a correspondence table between code values and commands, provided on the hard disk drive (HD) of the personal computer. If the point code corresponding to the raster read by the scanner is, for example, 11 or 12, video playback stops. If the point code is 13, video playback is paused.

Figures 88-90 are diagrams explaining a method of creating a paper keyboard on which the icons of the desktop screen are printed by capturing and printing the desktop screen.

In this embodiment, a paper keyboard output program is provided which, when the desktop screen is captured by pressing the “PRINT SCREEN” key (prt sc) on the keyboard (KBD) or a similar device, prints the pictograms and rasters onto the area of a sheet serving as the print medium.

Fig. 88(a) shows the desktop screen on the display device. This desktop screen displays, for example, icon images for a word processor, the Internet and a spreadsheet, as well as the START button.

When the desktop screen is captured, the personal computer recognizes the position at which each icon is displayed on the desktop screen and calculates the coordinate value of that position. The XY coordinates on the desktop screen are brought into correspondence with the XY coordinates on the print sheet, and rasters corresponding to these icons are created. Each raster includes, in one format, dots denoting the coordinates on the screen and a code value representing the function of the corresponding icon. An overlay-and-print operation is then performed to superimpose the generated rasters onto the images of the desktop screen.
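
The coordinate mapping step can be illustrated by the following Python sketch; the screen resolution, sheet size and icon positions are assumed values, and the raster generation itself is omitted.

# Map the screen position of each icon to the XY coordinate system of the print sheet.
SCREEN_W, SCREEN_H = 1024, 768      # desktop resolution (assumption)
SHEET_W, SHEET_H   = 210.0, 297.0   # A4 sheet in millimetres (assumption)

def screen_to_sheet(x_px, y_px):
    """Scale a desktop-screen pixel position to a position on the print sheet."""
    return (x_px * SHEET_W / SCREEN_W, y_px * SHEET_H / SCREEN_H)

icons = {"word_processor": (64, 64), "spreadsheet": (64, 160), "internet": (64, 256)}
for name, (x, y) in icons.items():
    print(name, "->", screen_to_sheet(x, y))   # a raster would be generated at this position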

FIG. 88 (b) shows a paper keyboard on which a desktop screen and rasters are printed.

Fig. 89 is a diagram explaining a table that illustrates the correspondence between code values and start-up programs. When rasters are created by the above processing, a table is created on the hard disk drive (HD) in which the code values of the rasters are matched with the start-up programs indicated by the corresponding pictograms (pictogram functions). For example, if a raster is created that corresponds to the icon denoting the word processor and the code value 0001 is assigned to this raster, a table is created in which code value 0001 corresponds to the Warpro.exe launcher. The same applies to the icons denoting the Internet and the spreadsheet. Once this table has been created, the spreadsheet program is launched if the user performs a mouse click with the scanner on, for example, the spreadsheet icon image on the paper keyboard shown in Fig. 88(b).
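
A sketch of such a code-value-to-program table in Python is shown below; apart from Warpro.exe, which is named in the text, the code values, program names and the launch step are assumptions.

# Hypothetical correspondence table between raster code values and start-up programs.
LAUNCH_TABLE = {
    1: "Warpro.exe",        # code value 0001, word processor (named in the text)
    2: "Browser.exe",       # Internet icon (assumed name)
    3: "Spreadsheet.exe",   # spreadsheet icon (assumed name)
}

def on_scanner_click(code_value):
    program = LAUNCH_TABLE.get(code_value)
    if program is None:
        print("no program registered for code value", code_value)
        return
    print("launching", program)    # a real system would start the program here

on_scanner_click(1)   # -> launching Warpro.exe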

Fig. 90 is a diagram illustrating a format of the above rasters. Since the format of the rasters is similar to the format described above, its description is not given.

Thus, when the icons of the desktop screen are printed in advance, these icons can easily be designated. Even if the icons of the desktop screen are not visible on the screen because, for example, several programs are already running, the user can easily launch each program by clicking the scanner on the icon image on the printed paper keyboard.

INDUSTRIAL APPLICABILITY

The present invention can be used in an input system for an information processing device, such as a personal computer, television or player.

Claims (42)

1. The input processing system for the information processing device, characterized in that
the raster, which is created on the surface of the medium and in which each or one of a coordinate value and a code value is defined in one format, is read using a scanner connected to the information processing device, a working command being transmitted to enter each or one of the coordinate value and the code value defined by the raster into the central processor of the information processing device, the raster being printed on the surface of the medium, and
the raster on the surface of the medium is read using a scanner that reads the raster, and each or one coordinate value and code value is entered into the central processor of the information processing device.
2. The input processing system for the information processing device, characterized in that
the raster that is created on the surface of the medium is read using a scanner connected to the information processing device and converted into the code value of the interrupt key on the hardware keyboard defined by the raster, while generating interrupt processing of key input in the central processor of the information processing device, the raster created for each icon printed on the surface of the medium, and,
if an icon for which a raster has been created on the surface of the medium is to be scanned using a scanner that reads the raster, then before or after reading the raster the inclination of the scanner relative to the surface of the medium is detected from the difference in light and shadow of the image read by the scanner, and key input interrupt processing is generated as defined in accordance with the direction of inclination of the scanner relative to the surface of the medium.
3. The input processing system for the information processing device according to claim 2, characterized in that
the scanner operation is recognized by changing the difference in light and shadow of the image read by the scanner, and in accordance with the scanner operation, key input interruption processing is generated.
4. The input processing system for the information processing device according to claim 2 or 3, characterized in that
key input interrupt processing is a change in the type of the letter or character being entered, a command to convert a letter or character, or movement of the cursor.
5. The input system in Japanese, characterized in that
the raster created on the surface of the medium is read using a scanner connected to the information processing device and converted into the interrupt key code on the hardware keyboard defined by the raster, while generating interrupt key input processing in the central processor of the information processing device, the raster being created for each icon printed on the surface of the medium,
if an icon for which a raster has been created on the surface of the medium is to be scanned using a scanner that reads the raster and a word consisting only of a vowel is entered, the raster on the icon is read by touching the tip of the scanner to the icon for which the code value corresponding to that vowel is defined as a raster, and,
if an icon for which a raster has been created on the surface of the medium is to be scanned using a scanner that reads the raster and a word including a consonant and a vowel is entered, the raster corresponding to that consonant is read by touching and stopping the reader provided on the tip of the scanner on the icon for which the code value corresponding to the consonant is defined as a raster, the scanner reader is then moved to the icon for which the code value corresponding to the vowel following the consonant is defined as a raster on the surface of the medium and is temporarily stopped at the icon corresponding to that vowel to read its raster, and the reader provided on the tip of the scanner is then separated from the surface of the medium so that it can no longer recognize the raster, whereby one letter or character, or several words or phrases, are entered.
6. The input system in Japanese according to claim 5, characterized in that
the raster is created from a material with absorption in the infrared region of the spectrum, and the coordinate value and code value are defined in the raster in the same format.
7. An information input device comprising:
a voice input device that inputs operator voice information;
a conversion device that analyzes the entered speech information and converts the entered speech information into one or more candidate words formed by letters or characters corresponding to the entered speech information;
a display device that displays one or more candidate words obtained or obtained by conversion;
a scanner that reads a raster which is created on the surface of the medium and in which each or one of a coordinate value and a code value is defined as a raster for arbitrarily moving a cursor so as to select one of the candidate words displayed on the display device; and
a resolver that converts the raster read by the scanner into a code value and enters a candidate word corresponding to the code value as a resolved word.
8. The information input device according to claim 7, characterized in that
the raster is created from a material with absorption in the infrared region of the spectrum, and the coordinate value and code value are defined in the raster in the same format.
9. An input processing system for an information processing device, characterized in that
a raster that is created on the surface of the medium and in which each or one of a coordinate value and a code value is defined as a raster is read using a scanner connected to the information processing device, a working command defined by the raster being transmitted to the central processor of the information processing device, the raster being printed on the surface of the medium, and,
if the raster on the surface of the medium needs to be read using a scanner that reads the raster, the inclination of the scanner relative to the surface of the medium is detected by the difference in light and shadow of the image read by the scanner, and a graphic user interface operation is performed on the screen in accordance with the direction of the inclination of the scanner relative to the surface of the medium.
10. The input processing system for the information processing device according to claim 9, characterized in that
the raster is created from a material with absorption in the infrared region of the spectrum, and the coordinate value and code value are defined in the raster in the same format.
11. The input processing system for the information processing device according to claim 9, characterized in that
the scanner operation is recognized by changing the difference in light and shadow of the image read by the scanner, and the operation of the graphical user interface on the screen is performed in accordance with the operation of the scanner.
12. The input processing system for the information processing device according to claim 9, characterized in that
the operation of the graphical user interface on the screen is an operation controlled by a mouse, such as scrolling the screen, moving the cursor, designating an icon on the screen, a drag-and-drop operation, selecting a menu command, or issuing a command specifying the letter or character input position.
13. An input processing system for an information processing device, characterized in that
the raster created on the surface of the medium is read using a scanner connected to the information processing device and converted into the code of an interrupt key on the hardware keyboard defined by the raster, interrupt processing of key input being generated in the central processor of the information processing device, the raster being printed as an icon with concave and convex parts of embossed dots on the surface of the medium.
14. The input processing system for the information processing device according to item 13, wherein
the raster is created from a material with absorption in the infrared region of the spectrum, and the coordinate value and code value are defined in the raster in the same format.
15. The input processing system for the information processing device according to 14, characterized in that
a raster and relief dots representing the raster are created as a pair in a given region on the surface of the medium, and a block that separates and delimits this region is provided for each region.
16. A remote controller for viewing and listening to or making backup recordings based on program information printed on a surface of a medium, the remote controller comprising:
an image generator that optically reads a raster created by rasterizing a predetermined code value based on a predetermined algorithm for each area of program information printed on the surface of the medium;
a control device that analyzes the raster from the image read by the image former and transmitted from the image former, and decrypts the raster into a code value denoted by the raster; and
a transmission device that transmits the decrypted code value to a radio receiver, tuner, recording and playing device, player, set-top box to a television for receiving broadcasts or a personal computer.
17. The remote controller according to clause 16, characterized in that
the raster is created from a material with absorption in the infrared region of the spectrum, and the coordinate value and code value are defined in the raster in the same format.
18. A remote controller for accessing a site based on site information printed on a surface of a medium, the remote controller comprising:
an image generator that optically reads a raster created by rasterizing a predetermined code value based on a predetermined algorithm for each area of site information printed on the surface of the medium;
a control device that analyzes the raster from the image read by the image former and transmitted from the image former, and decrypts the raster into a code value denoted by the raster; and
a transmission device that transmits the decrypted code value to a network access device, a network access box, or a personal computer.
19. The remote controller according to p. 18, characterized in that
the imager is a reader made as a unit with a remote controller.
20. The remote controller according to claim 18, characterized in that it also contains:
a stand, which is the main body of the remote controller, and the stand comprises a control device and a transmission device; and
a scanner connected to the stand with wires or wireless, and the scanner contains an imager that communicates with the control device.
21. The remote controller according to p. 18, characterized in that
the raster is created from a material with absorption in the infrared region of the spectrum, and the coordinate value and code value are defined in the raster in the same format.
22. A remote controller having a raster obtained by rasterizing a given code value based on a given algorithm and created on an icon on the surface of a medium that means a control button for a radio receiver, tuner, recording and playing device, player or network access device, TV set-top box for receiving broadcasts and access to a network or personal computer, the remote controller comprising:
an imager that optically reads the raster;
a control device that analyzes the raster from the image read by the image former and transmitted from the image former, and decodes the raster into a code value denoted by the raster; and
a transmission device that transmits the decrypted code value to a radio, tuner, recording and playing device, player or network access device, a set-top box for television to receive broadcasts and network access, or a personal computer.
23. The remote controller according to item 22, wherein
the raster is created from a material with absorption in the infrared region of the spectrum, and the coordinate value and code value are defined in the raster in the same format.
24. The remote controller according to item 22, wherein
the imager is a reader made as a unit with a remote controller.
25. The remote controller according to item 22, characterized in that it also contains:
a stand, which is the main body of the remote controller, and the stand comprises a control device and a transmission device; and
a scanner connected to the stand with wires or wireless, and the scanner contains an imager that communicates with the control device.
26. A control system for a projected image and a moving image, comprising:
a projection board on which a raster is created, obtained by rasterizing each or one given coordinate value and a given code value based on a given algorithm;
wherein the projection board has one surface formed by an image display area for projecting a moving image or image and a controller area for displaying a pictogram image for controlling the moving image or image projected onto the image display area;
a projector for projecting a moving image or image onto at least the image display area;
a reader that reads a raster created in the zone of the controller; and
a control device that analyzes the raster in the image of the icon created in the controller area and read by the reader, converts the raster into a coordinate value or a code value denoted by a raster, outputs a control signal corresponding to the coordinate value or code value to the projector and controls the output signal of the moving image or an image displayed in the image display area.
27. The control system for the projected image and the moving image according to p. 26, characterized in that
the surface of the projection board on which the raster is created differs from the surface onto which the image is projected, the moving image of the icon, and the projector is located as a reverse projector relative to the projection board.
28. The control system for the projected image and the moving image according to p. 26, characterized in that
the raster on the projection board is made of a material having an absorption characteristic in the infrared region of the spectrum, and an infrared filter is provided at least on the surface of the projection board from the projector.
29. The control system for the projected image and the moving image according to p. 26, characterized in that
the raster created on the projection board is a raster in which a coordinate value and a code value are defined in one format, and
predetermined matrix blocks are formed on the board, the same matrix block being assigned the same code value even though the coordinate value changes within it.
30. The control system of the projected image and the moving image according to p. 26, characterized in that
the icon image is placed on one or more matrix blocks, and when the reader reads the raster of the icon image, a command is issued to control the image corresponding to the icon image or to launch a program corresponding to the icon image.
31. The control system of the projected image and the moving image according to p. 26, characterized in that
the raster is created from a material with absorption in the infrared region of the spectrum, and the coordinate value and code value are defined in the raster in the same format.
32. The control system for the projected image and the moving image according to p. 26, characterized in that
the projection board is made in such a way that a transparent sheet is glued to the surface of the white board with a bonding layer, and a raster is created between the transparent sheet and the bonding layer.
33. A system for processing and displaying information, comprising:
a projection board on which a raster is created, obtained by rasterizing each or one given coordinate value and a given code value based on a given algorithm;
a projector that projects an image of an icon representing at least the start of a program onto a projection board and projects an image or a moving image to display a program installed in a storage device corresponding to the image of the icon;
a reader that reads the raster created on the projected image of the icon; and
a control device that analyzes the raster created in the icon image and read by the reader, converts the raster into a coordinate value or a code value denoted by the raster, and, by a start signal, launches the program from the storage device corresponding to the coordinate value or code value.
34. The system for processing and displaying information according to clause 33, wherein
the raster is created from a material with absorption in the infrared region of the spectrum, and the coordinate value and code value are defined in the raster in the same format.
35. The information processing and display system according to claim 33, wherein
the surface of the projection board on which the raster is created is different from the surface onto which the image, moving image or icon image is projected, and the projector is located as a back projector relative to the projection board.
36. The system for processing and displaying information according to clause 35, wherein
the raster on the projection board is made of a material having an absorption characteristic in the infrared region of the spectrum, and an infrared notch filter is provided at least on the surface of the projection board from the projector.
37. The system for processing and displaying information according to p. 33, characterized in that
the raster created on the projection board is a raster in which a coordinate value and a code value are defined in one format, and
predetermined matrix blocks are formed on the board, the same matrix block being assigned the same code value even though the coordinate value changes within it.
38. The information processing and display system according to clause 37, wherein the pictogram image is placed on one or more matrix blocks and when the reader reads the pictogram image raster, a command is issued to control the image corresponding to the pictogram image.
39. An icon image print control system for printing an icon image displayed on a display device onto the surface of a sheet of paper together with a raster corresponding to the icon image, comprising:
a display device that creates and displays an icon image;
a control device that links the image of the icon displayed on the display device with each or one coordinate value and a code value determined in advance, and issues a command to print the image of the icon and raster; and
a printing device that, on command from the control device, prints the icon image and the raster on the surface of a given medium.
40. The icon image print control system according to claim 39, characterized in that
the raster is created from a material with absorption in the infrared region of the spectrum, and the coordinate value and code value are defined in the raster in the same format.
41. A printing method for an information processing device for printing a desktop screen displayed on a display device on a surface of a sheet of paper together with a raster, comprising the following steps:
the stage at which coordinate values corresponding to the desktop screen are displayed;
the stage at which a raster denoting the coordinates on the screen is created when the desktop screen is printed;
the stage at which a raster is created that includes, in one format, coordinate values and a code value denoting the function of a functional image or the like, such as an icon image on the desktop screen; and
the stage at which the desktop screen is printed along with the rasters.
42. An input processing system for an information processing device, characterized in that
a raster that is created on the surface of the medium and whose coordinate value and code value are defined in one format is read using a scanner connected to the information processing device, a working command being transmitted to enter each or one of the coordinate value and the code value defined by the raster into the central processor of the information processing device, the raster being printed on the surface of the medium,
the raster on the surface of the medium is a set of predetermined dots obtained by superimposing onto a controller or keyboard template a raster in which raster dots are arranged at grid points at predetermined intervals in the horizontal and vertical directions and information dots are arranged whose values are determined by how the information dots are displaced from a virtual grid point located at the center surrounded by four of the raster dots at the grid points around that virtual grid point, and the raster contains several information areas in which printed rasters contain X coordinate values, Y coordinate values and code values in raster format, and
the raster on the surface of the medium is read using a scanner that reads the raster, while entering each or one coordinate value and the code value corresponding to the raster into the central processor of the information processing device.
RU2008139959/08A 2006-03-10 2007-03-12 Input processing system for information processing apparatus RU2457532C2 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
JP2006066751 2006-03-10
JP2006-066751 2006-03-10
JP2006-314650 2006-11-21
JP2006314650 2006-11-21
JP2007-060495 2007-03-09

Related Child Applications (1)

Application Number Title Priority Date Filing Date
RU2012106761/08A Division RU2012106761A (en) 2006-03-10 2012-02-27 Input processing system for information processing device

Publications (2)

Publication Number Publication Date
RU2008139959A RU2008139959A (en) 2010-04-20
RU2457532C2 true RU2457532C2 (en) 2012-07-27

Family

ID=39654815

Family Applications (1)

Application Number Title Priority Date Filing Date
RU2008139959/08A RU2457532C2 (en) 2006-03-10 2007-03-12 Input processing system for information processing apparatus

Country Status (2)

Country Link
JP (10) JP4135116B2 (en)
RU (1) RU2457532C2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9965044B2 (en) 2014-01-26 2018-05-08 Huawei Device (Dongguan) Co., Ltd. Information processing method, apparatus, and device
RU2669449C2 (en) * 2012-12-05 2018-10-12 Кенджи Йошида Facility management system control interface

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8245931B2 (en) 2008-08-12 2012-08-21 Think Laboratory Co., Ltd Information display system and dot pattern printing sheet used for same
KR101562674B1 (en) * 2008-11-10 2015-10-23 주식회사 알티캐스트 Method for controlling broadcasting receiving terminal using code and apparatus therefor
JP4291404B1 (en) * 2008-11-14 2009-07-08 健治 吉田 Broadcasting control system
JP4385169B1 (en) 2008-11-25 2009-12-16 健治 吉田 Handwriting input system, handwriting input sheet, the information input system, the information input help sheet
JP5440054B2 (en) * 2009-09-15 2014-03-12 大日本印刷株式会社 Operation sheet creation system and program thereof
JP2011244331A (en) * 2010-05-20 2011-12-01 Dainippon Printing Co Ltd Data input system and data input program
KR101019142B1 (en) * 2010-07-08 2011-03-03 주식회사 네오랩컨버전스 Content managing method in network, and web-server used therein
EP2410406A1 (en) * 2010-07-23 2012-01-25 Anoto AB Display with coding pattern
JP2012073819A (en) * 2010-09-29 2012-04-12 Dainippon Printing Co Ltd Stroke display system and program
JP5948731B2 (en) * 2011-04-19 2016-07-06 富士ゼロックス株式会社 Image processing apparatus, image processing system, and program
KR101766835B1 (en) 2011-05-04 2017-08-09 에스프린팅솔루션 주식회사 Image forming apparatus and method for controlling thereof
US20150229792A1 (en) * 2012-09-11 2015-08-13 Kenji Yoshida Document camera
JP5544609B2 (en) * 2012-10-29 2014-07-09 健治 吉田 Handwriting input / output system
JP5848230B2 (en) * 2012-11-12 2016-01-27 グリッドマーク株式会社 Handwriting input / output system, handwriting input sheet, information input system, information input auxiliary sheet
CN106104431A (en) * 2013-12-27 2016-11-09 株式会社Ip舍路信 Information input assistance sheet

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SU1659558A1 (en) * 1985-05-11 1991-06-30 Ленинградский технологический институт целлюлозно-бумажной промышленности Method of producing printed matter on paper for blind persons
RU2198428C2 (en) * 1996-11-01 2003-02-10 К Текнолоджиз Аб Pen and process of recording
US20040189668A1 (en) * 2003-03-27 2004-09-30 Microsoft Corporation Visual and scene graph interfaces
JP2005004574A (en) * 2003-06-13 2005-01-06 Dt Research Japan Kk Information processing apparatus whose operation can be controlled by bar-code system and service provision system using the apparatus

Family Cites Families (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03110680A (en) * 1989-09-25 1991-05-10 Konica Corp Electronic image filing device
JPH0495119A (en) * 1990-08-07 1992-03-27 Sony Corp Information input device and recording sheet used for the same
JPH0643988A (en) * 1992-07-23 1994-02-18 Nec Corp Pen touch keyboard management system
CA2346231A1 (en) * 2000-05-08 2001-11-08 Internet Number Corporation Method and system for accessing information on a network using message aliasing functions having shadow callback functions
US6594406B1 (en) * 1996-12-20 2003-07-15 Xerox Corporation Multi-level selection methods and apparatus using context identification for embedded data graphical user interfaces
JPH10283572A (en) * 1997-04-04 1998-10-23 Victor Co Of Japan Ltd Pos accumulation managing device
JPH11232026A (en) * 1998-02-16 1999-08-27 Canon Inc Image processor
JP2930934B2 (en) * 1998-04-27 1999-08-09 三菱電機株式会社 Restaurant for service system
JP3475235B2 (en) * 1999-03-08 2003-12-08 東京農工大学長 Display content control method of the display device
US6813039B1 (en) * 1999-05-25 2004-11-02 Silverbrook Research Pty Ltd Method and system for accessing the internet
SE516522C2 (en) * 1999-05-28 2002-01-22 Anoto Ab Position determining product for digitization of drawings or handwritten information, obtains displacement between symbol strings along symbol rows when symbol strings are repeated on symbol rows
AT467167T (en) * 1999-08-30 2010-05-15 Anoto Ab Personal organizer
JP2001092705A (en) * 1999-09-21 2001-04-06 Pioneer Electronic Corp File system and method
SE517445C2 (en) * 1999-10-01 2002-06-04 Anoto Ab Positioning on a surface provided with a position-coding pattern
WO2001031570A2 (en) * 1999-10-27 2001-05-03 Digital Ink, Inc. Tracking motion of a writing instrument
SE0000939L (en) * 2000-02-18 2001-08-19 Anoto Ab Inenhetsarrangemang
JP2001338115A (en) * 2000-03-23 2001-12-07 Olympus Optical Co Ltd Method for market research, printed matter to be used for the method and information resource to be used for market research
JP4850995B2 (en) * 2000-04-20 2012-01-11 アイシン・エィ・ダブリュ株式会社 Touch operation input device
JP4776832B2 (en) * 2000-10-19 2011-09-21 キヤノン株式会社 Coordinate input device and coordinate plate of image input device
JP2002149331A (en) * 2000-11-15 2002-05-24 Canon Inc Coordinate plate, coordinate input device and coordinate input/output device
JP2002367031A (en) * 2001-06-04 2002-12-20 Kokuyo Co Ltd Order reception support system, writing utensil, order sheet, and order reception supporting method
JP2003345503A (en) * 2002-05-23 2003-12-05 Dainippon Printing Co Ltd Slip for electronic pen
JP2006190270A (en) * 2002-09-26 2006-07-20 Kenji Yoshida Icon formed on medium
BR0314650A (en) * 2002-09-26 2005-08-02 Kenji Yoshida Methods and playback devices and insertion / emission information, portable electronic toy figure of unity, support mouse, mouse, electronic information device, tablet, and computer executable program
JP4629303B2 (en) * 2002-10-07 2011-02-09 大日本印刷株式会社 Calculation processing system, calculation processing system server device, calculation processing program, and electronic pen form
SE523931C2 (en) * 2002-10-24 2004-06-01 Anoto Ab Information processing system arrangement for printing on demand of position-coded base, allows application of graphic information and position data assigned for graphical object, to substrate for forming position-coded base
BRPI0318184B1 (en) * 2003-03-17 2015-09-29 Kenji Yoshida input and output method using a dot pattern
JP4125640B2 (en) * 2003-06-10 2008-07-30 Necインフロンティア株式会社 Self-order terminal
JP4589619B2 (en) * 2003-09-03 2010-12-01 株式会社リコー Paper document information operation system and information manipulation methods
JP3852435B2 (en) * 2003-10-07 2006-11-29 ソニー株式会社 An information processing apparatus and method, a display method, and a recording medium
JP4037844B2 (en) * 2004-04-20 2008-01-23 株式会社タカラトミー Information providing system
JP4565975B2 (en) * 2004-11-16 2010-10-20 大日本印刷株式会社 Document and how to create them for an electronic pen
JP3830956B1 (en) * 2005-09-14 2006-10-11 健治 吉田 Information output device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SU1659558A1 (en) * 1985-05-11 1991-06-30 Ленинградский технологический институт целлюлозно-бумажной промышленности Method of producing printed matter on paper for blind persons
RU2198428C2 (en) * 1996-11-01 2003-02-10 К Текнолоджиз Аб Pen and process of recording
US20040189668A1 (en) * 2003-03-27 2004-09-30 Microsoft Corporation Visual and scene graph interfaces
RU2324229C2 (en) * 2003-03-27 2008-05-10 Майкрософт Корпорейшн Visual and three-dimensional graphic interfaces
JP2005004574A (en) * 2003-06-13 2005-01-06 Dt Research Japan Kk Information processing apparatus whose operation can be controlled by bar-code system and service provision system using the apparatus

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2669449C2 (en) * 2012-12-05 2018-10-12 Кенджи Йошида Facility management system control interface
US9965044B2 (en) 2014-01-26 2018-05-08 Huawei Device (Dongguan) Co., Ltd. Information processing method, apparatus, and device
RU2662408C2 (en) * 2014-01-26 2018-07-25 Хуавей Дивайс (Дунгуань) Ко., Лтд. Method, apparatus and data processing device

Also Published As

Publication number Publication date
RU2008139959A (en) 2010-04-20
JP2016053962A (en) 2016-04-14
JP2011238260A (en) 2011-11-24
JP2008154211A (en) 2008-07-03
JP4135117B2 (en) 2008-08-20
JP4135116B2 (en) 2008-08-20
JP5735901B2 (en) 2015-06-17
JP2012086570A (en) 2012-05-10
JP6030728B2 (en) 2016-11-24
JP2009003952A (en) 2009-01-08
JP2008152755A (en) 2008-07-03
JP2010003305A (en) 2010-01-07
JP5848405B2 (en) 2016-01-27
JP2013254526A (en) 2013-12-19
JP5156851B2 (en) 2013-03-06
JP4391572B2 (en) 2009-12-24
JP2014238846A (en) 2014-12-18
JP2008152756A (en) 2008-07-03

Similar Documents

Publication Publication Date Title
Arai et al. PaperLink: a technique for hyperlinking from real paper to electronic content
US7479950B2 (en) Manipulating association of data with a physical object
EP0652505B1 (en) Input/display integrated information processing device
EP1236159B1 (en) Viewer with code sensor
US7800587B2 (en) Touch-type key input apparatus
US9304602B2 (en) System for capturing event provided from edge of touch screen
US5852434A (en) Absolute optical position determination
KR101016981B1 (en) Data processing system, method of enabling a user to interact with the data processing system and computer-readable medium having stored a computer program product
USRE40880E1 (en) Optical system for inputting pointer and character data into electronic equipment
US8022942B2 (en) Dynamic projected user interface
JP4752887B2 (en) Information processing apparatus, information processing method, and computer program
US20090167706A1 (en) Handheld electronic device and operation method thereof
US20020075240A1 (en) Virtual data entry device and method for input of alphanumeric and other data
US7423660B2 (en) Image display apparatus, method and program
US20050162398A1 (en) Touch pad, a stylus for use with the touch pad, and a method of operating the touch pad
US20060033725A1 (en) User created interactive interface
US8451242B2 (en) Keyboard with input-sensitive display device
KR101895503B1 (en) Semantic zoom animations
KR101328766B1 (en) System, and method for identifying a rendered documents
US7131061B2 (en) System for processing electronic documents using physical documents
US20090135147A1 (en) Input method and content displaying method for an electronic device, and applications thereof
US20070276814A1 (en) Device And Method Of Conveying Meaning
US6698660B2 (en) Electronic recording and communication of information
CN102637089B (en) The information input device
US20050052558A1 (en) Information processing apparatus, information processing method and software product

Legal Events

Date Code Title Description
TK4A Correction to the publication in the bulletin (patent)

Free format text: AMENDMENT TO CHAPTER -FG4A- IN JOURNAL: 21-2012 FOR TAG: (57)