CN1782975A - Apparatus and method of processing information input using a touchpad - Google Patents

Apparatus and method of processing information input using a touchpad Download PDF

Info

Publication number
CN1782975A
CN1782975A CN200510131012.9A CN200510131012A
Authority
CN
China
Prior art keywords
character
touch pad
coordinate
input
coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN200510131012.9A
Other languages
Chinese (zh)
Inventor
尹盛暋
金范锡
李容薰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Publication of CN1782975A publication Critical patent/CN1782975A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Apparatus and method of processing touchpad input information are provided. The method includes mapping an input region of a touchpad to a display region as absolute coordinates, converting contact location coordinates into absolute coordinates when a pointing tool touches the input region, and moving a mouse pointer displayed on the display region according to the converted contact location coordinates. The input region of the touchpad is mapped to a display region as absolute coordinates such that information can be directly input using the touchpad.

Description

Apparatus and method of processing information input using a touchpad
Technical field
The present general inventive concept relates to an apparatus and method of processing information input using a touchpad, and more particularly, to an apparatus and method of processing information input using a touchpad that enable a user to input information directly using the touchpad by mapping the touchpad to a predetermined display region as absolute coordinates.
Background art
A user interface device (hereinafter referred to as an input device) allows a user to input desired information into a computer. The keyboard is one widely used example of an input device. A keyboard comprises a plurality of keys, each of which outputs a key signal mapped to a numeral or character, so that the user can easily input desired information into the computer. In particular, when editing a document on a computer, the keyboard allows the user to input desired characters efficiently, and the computer industry has developed various technologies to enhance the user experience and make computers more versatile.
Besides the keyboard, pointing devices such as a mouse, a touchpad, or a touch screen are also frequently used as input devices. A pointing device provides convenience to the user when moving a cursor (e.g., a mouse pointer) displayed on a display unit (e.g., a computer monitor) or when selecting a specific icon.
In recent years, engineers have developed technologies, such as the Microsoft Input Method Editor (IME), in which information input through a pointing device is recognized as characters. For example, in conjunction with a document-editing application module, the IME recognizes information input through a pointing device as a character and provides the recognized character to the document-editing application module.
This technology is convenient and flexible when a keyboard is used to create documents containing characters such as Chinese, Japanese, or Arabic characters, which would otherwise have to be composed from alphanumeric input and converted afterwards. The technology can be particularly useful when the pronunciation of an input character is difficult, or when the user does not know the exact pronunciation of a character but can still input it stroke by stroke, as with phonetic and ideographic characters.
However, the conventional technology has disadvantages.
First, to input a character, the user moves the mouse pointer while pressing a button on the mouse. In this case, the user inputs the character using his or her wrist, and the number of strokes required to input the character makes the input process very inefficient. In addition, imprecise strokes made with the mouse may produce wrong characters. In particular, the more strokes a complicated character requires, the lower the character recognition efficiency becomes when a mouse is used. For these reasons, the conventional technology does not adequately solve the problem of efficient character recognition.
Meanwhile, a touchpad is a pointing device, like a mouse, that is widely used in lightweight, small-sized notebook computers. Compared with input made using a mouse, input made on a touchpad with a marking tool such as a finger, a stylus, or a pen can be recognized and used as characters more effectively.
However, because the touchpad performs the same functions as a mouse, the user must press a mouse button provided with the touchpad while inputting characters in order to distinguish ordinary mouse-pointer movement from mouse-pointer movement intended for character input.
A conventional operation of inputting characters using a touchpad, in which the IME is linked with a document editor 110, will now be described with reference to Fig. 1.
The user inputs characters through an IME input window 120 using the IME application, and edits the document using the document editor 110. When the IME input window 120 is displayed, the user drags the marking tool while it is in contact with the touchpad (1), thereby moving the mouse pointer 130 displayed on the display unit to the IME input window 120.
Take as an example the input of the Hangul (Korean) character '가' (Ka), which is composed of three elements, namely 'ㄱ', 'ㅣ', and '-'. (In the original publication the element glyphs appear as figure references.)
After moving the mouse pointer 130 to the IME input window 120, the user presses the mouse button while dragging the marking tool on the touchpad, and inputs the first element 'ㄱ' (2).
To input the second element 'ㅣ', the mouse pointer 130 must be moved to position 'a'. To do so, the user releases the pressure applied to the mouse button, drags the marking tool on the touchpad, and thereby moves the mouse pointer 130 to position 'a' (3).
When the mouse pointer 130 is at position 'a' on the display unit, the user presses the mouse button while dragging the marking tool on the touchpad, and inputs the second element 'ㅣ' (4).
To input the third element '-', the user releases the pressure applied to the mouse button, drags the marking tool on the touchpad, and then moves the mouse pointer 130 to position 'b' (5).
When the mouse pointer 130 is at position 'b', the user drags the marking tool on the touchpad while pressing the mouse button, and inputs the third element '-' (6).
In the prior art, when the user inputs characters using the touchpad, the user must operate the mouse button while repeatedly dragging the marking tool, inputting the character and moving the mouse pointer at the same time. Over time, this mode of operation becomes increasingly burdensome for the user. Accordingly, as the number of strokes in a character increases, the inconvenience associated with inputting characters through the touchpad inevitably increases as well. This is because the touchpad corresponds to the entire display region of the display unit as relative coordinates.
Meanwhile, in the case of a touch screen, the user can input characters directly on the touch screen, just as if writing with an actual pen. However, a touch screen is an expensive pointing device and is therefore unsuitable for the inexpensive personal computers (PCs) widely used by ordinary consumers.
Japanese Patent Laid-Open Publication No. 2003-196007 (Character Input Device) discloses a technology that displays a virtual keyboard on a display unit and allows the user to move the mouse pointer using a touchpad and input the characters mapped to the virtual keyboard. However, for languages with a large number of base characters, it is still difficult to map all of the base characters onto the keys provided on a virtual keyboard. In addition, because the user must look for each desired character on the virtual keyboard one by one, users who are not very skilled may find the virtual keyboard inconvenient.
Therefore, as in the case of a touch screen, a better technology is needed so that a user's information can be input directly using a touchpad.
Summary of the invention
The present general inventive concept provides an apparatus and method of processing information input with a touchpad, which enable a user to input information directly using the touchpad by mapping the touchpad to a predetermined display region as absolute coordinates.
Additional aspects and advantages of the present general inventive concept will be set forth in part in the description which follows, will in part be obvious from the description, and can in part be learned by practice of the general inventive concept.
The foregoing and other aspects of the present general inventive concept may be achieved by providing a method of processing touchpad input information, the method comprising: mapping the input region of a touchpad to a predetermined display region as absolute coordinates; converting contact position coordinates into absolute coordinates when a pointing member contacts the input region; and moving a mouse pointer displayed on the display region according to the converted contact position coordinates.
The foregoing and other aspects of the present general inventive concept may also be achieved by providing a method of recognizing characters from information input with an input device capable of detecting contact and producing contact position coordinates, the method comprising: defining correspondences between a plurality of position coordinates of the input device and a plurality of absolute coordinates of a display; converting the contact position coordinates produced by the input device into absolute display coordinates; displaying the absolute coordinates; and recognizing a character according to the maximum degree of correlation between a series of the coordinates and a reference character from among a plurality of reference characters.
The foregoing and other aspects of the present general inventive concept may also be achieved by providing a method of processing a position indicated within a predetermined region, the method comprising: mapping an input region of the predetermined region to a display region of a display as absolute coordinates; converting a position coordinate within the predetermined region into an absolute coordinate when a position is indicated; and moving a pointer on the display correspondingly to the converted position coordinate.
The foregoing and other aspects of the present general inventive concept may also be achieved by providing an apparatus for processing touchpad input information, the apparatus comprising: a coordinate setting unit to map the position coordinates of the input region of a touchpad to a display region as absolute coordinates; a coordinate conversion unit to convert the position coordinate at which a marking tool contacts the input region into the corresponding absolute coordinate; and a mouse pointer control unit to move a mouse pointer displayed on the display region according to the converted contact position coordinates.
The foregoing and other aspects of the present general inventive concept may also be achieved by providing an apparatus for recognizing characters from information input with an input device capable of detecting contact and outputting contact position coordinates, the apparatus comprising: a display; a conversion unit to convert the contact position coordinates detected by the input device into absolute display coordinates; a grouping unit to form groups from series of absolute coordinates and to control the display of the coordinate groups on the display; and a recognition unit to recognize a character according to the maximum degree of correlation between a coordinate group and a reference character from among a plurality of reference characters.
The foregoing and other aspects of the present general inventive concept may also be achieved by providing an apparatus for processing a position indicated within a predetermined region, the apparatus comprising: a mapping unit to map an input region of the predetermined region to a display region of a display as absolute coordinates; a conversion unit to convert a position coordinate within the predetermined region into an absolute coordinate when a position is indicated; and a display to display movement of a pointer corresponding to the converted position coordinate.
Description of drawings
These and/or other aspects and advantages of the present general inventive concept will become apparent and more readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
Fig. 1 illustrates a conventional method of inputting characters using a touchpad;
Fig. 2 is a block diagram of an apparatus for inputting information using a touchpad according to an embodiment of the present general inventive concept;
Fig. 3 is a block diagram of the control unit shown in Fig. 2;
Fig. 4 illustrates movement of a mouse pointer according to an embodiment of the present general inventive concept;
Fig. 5 is a flowchart of a method of processing touchpad input information according to an embodiment of the present general inventive concept;
Fig. 6 is a flowchart of a method of recognizing characters according to an embodiment of the present general inventive concept; and
Fig. 7 is a flowchart of a method of recognizing characters according to another embodiment of the present general inventive concept.
Embodiment
Reference will now be made in detail to the embodiments of the present general inventive concept, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. The embodiments are described below with reference to the drawings in order to explain the present general inventive concept.
Fig. 2 is a block diagram of an apparatus for inputting information using a touchpad according to an embodiment of the present general inventive concept.
The apparatus of Fig. 2 comprises a touch panel unit 210, a key input unit 220, a control unit 230, and a display unit 240. The apparatus further comprises a storage unit 250, a recognition unit 260, and an image generation unit 270.
The touch panel unit 210 comprises a touchpad 212 and a coordinate processing unit 214. When a marking tool contacts the input region of the touchpad 212, the touchpad 212 detects the contact point and outputs an analog signal produced by the contact to the coordinate processing unit 214. The coordinate processing unit 214 then produces a digital signal carrying the contact position coordinates of the marking tool in contact with the touchpad 212, and outputs the digital signal to the control unit 230.
For example, when the touchpad 212 is of the pressure-sensitive type, it is composed of two resistive sheets overlapping each other with a fine gap between them. When the marking tool contacts the touchpad 212, the resistive sheets contact each other at that point and current flows between them. In response to the contact of the marking tool, the touchpad 212 produces an analog signal and outputs the signal to the coordinate processing unit 214. The coordinate processing unit 214 extracts information about the corresponding contact position and outputs the information as a digital signal. Therefore, when the marking tool is dragged while in contact with the touchpad 212 (more precisely, with the contact region of the touchpad 212), the coordinate processing unit 214 can detect the movement path of the contact point, produce the contact position coordinates corresponding to the movement path, and output the produced contact position coordinates to the control unit 230.
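The pipeline just described, in which successive contact events are turned into a digital movement path and handed onward, can be sketched as follows. This is a minimal illustration in Python; the class and method names are invented for the sketch and do not come from the patent.

```python
from dataclasses import dataclass


@dataclass
class ContactSample:
    """One raw contact event reported by the pressure-sensitive pad."""
    x: int  # horizontal position on the pad, in pad units
    y: int  # vertical position on the pad, in pad units


class CoordinateProcessor:
    """Collects contact samples into the movement path of one drag,
    playing the role of the coordinate processing unit 214."""

    def __init__(self):
        self.path = []

    def on_contact(self, sample: ContactSample):
        # Each analog contact event becomes one digital (x, y) coordinate.
        self.path.append((sample.x, sample.y))

    def emit(self):
        # The accumulated path is handed to the control unit when read out.
        path, self.path = self.path, []
        return path


proc = CoordinateProcessor()
for x, y in [(10, 10), (12, 11), (15, 13)]:
    proc.on_contact(ContactSample(x, y))
print(proc.emit())  # [(10, 10), (12, 11), (15, 13)]
```
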
However, the touchpad employed in the present general inventive concept is not limited to the pressure-sensitive type, and may be any other type of device capable of detecting contact and outputting contact position coordinates.
The touch panel unit 210 may comprise at least one mouse button 216 having the same shape and function as a conventional mouse button.
The key input unit 220 may comprise at least one key, and outputs a key signal corresponding to a pressed key to the control unit 230. Each key signal is mapped to a numeral, a character, or input information having a specific function. Accordingly, the user can operate the key input unit 220 to set the touchpad input mode to a relative coordinate mode or an absolute coordinate mode.
The control unit 230 can move the mouse pointer displayed on the display unit 240 in response to signals output from the touch panel unit 210.
More specifically, the control unit 230 may comprise a coordinate setting unit 232, a coordinate conversion unit 234, and a mouse pointer control unit 236, as shown in Fig. 3.
If the touchpad input mode is the relative coordinate mode, the coordinate setting unit 232 sets the touchpad 212 and the entire display region of the display unit 240 to correspond to each other as relative coordinates. In this case, if the marking tool is dragged while in contact with the touchpad 212, the coordinate conversion unit 234 converts the contact position coordinates into relative coordinates corresponding to the change between the contact positions of the marking tool before and after the drag operation. The mouse pointer control unit 236 moves the mouse pointer displayed on the display unit 240 according to the converted contact position coordinates.
In this case, movement of the mouse pointer using the touchpad 212 is performed in the same way as in the conventional method. That is, in the relative coordinate mode, the position of the mouse pointer displayed on the display unit 240 changes only while the marking tool is in contact with some point of the touchpad 212. Therefore, in the relative coordinate mode, the marking tool must be dragged while in contact with the touchpad 212 in order to change the position of the mouse pointer.
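In the relative coordinate mode, only the change between two successive contact positions matters, regardless of where on the pad the drag occurs. A minimal sketch of this conversion (the function name and signature are invented for illustration):

```python
def relative_move(pointer, prev_contact, cur_contact):
    """Move the pointer by the delta between two successive contact
    positions, as in the conventional relative coordinate mode."""
    dx = cur_contact[0] - prev_contact[0]
    dy = cur_contact[1] - prev_contact[1]
    return (pointer[0] + dx, pointer[1] + dy)


# Dragging from (10, 10) to (14, 13) on the pad shifts the pointer by
# (+4, +3) from wherever the pointer currently is on the display.
print(relative_move((100, 200), (10, 10), (14, 13)))  # (104, 203)
```

Note that lifting the marking tool and setting it down elsewhere produces no delta, which is why repeated drags are needed to cover a long distance in this mode.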
If the touchpad input mode is the absolute coordinate mode, the coordinate setting unit 232 sets the touchpad 212 (more precisely, the input region of the touchpad 212) and a particular display region of the display unit 240 to correspond to each other as absolute coordinates. Thus, the touchpad 212 is mapped 1:1 to the particular display region.
In this case, the coordinate conversion unit 234 converts the contact position coordinates input from the touch panel unit 210 into absolute coordinates. The mouse pointer control unit 236 controls movement of the mouse pointer on the display region mapped to the touchpad 212 according to the converted contact position coordinates. An example of this is shown in Fig. 4.
Referring to Fig. 4, if absolute coordinates are set, the mouse pointer 310 is restricted to the display region 242 to which the touchpad 212 is mapped as absolute coordinates. Accordingly, the mouse pointer 310 on the display region 242 moves, according to the absolute position coordinates, along the same path as the drag path 340 of the marking tool 330 across the touchpad 212. In this case, relative to the drag path 340 of the marking tool 330, the movement path 320 of the mouse pointer 310 is scaled by the ratio of the size of the display region 242 to the size of the touchpad 212.
Unlike in the relative coordinate mode, in the absolute coordinate mode the mouse pointer 310 can be placed on the coordinates of the display region 242 corresponding to the contact point as soon as the marking tool 330 contacts the touchpad 212.
Meanwhile, the display region 242 to which the touchpad 212 is mapped as absolute coordinates may correspond to the entire display region of the display unit 240 or to a partial display region of the display unit 240. Thus, when the user executes a particular application on the computer, the mouse pointer 310 may be restricted to a display region such as a pop-up window displayed upon execution, for example when using the Microsoft Windows family of computer operating systems (OS). The execution state of the application is displayed in this window. In this way, the user can move the mouse pointer 310 directly within the corresponding display region using the touchpad 212.
The mouse pointer control unit 236 displays the movement path of the mouse pointer 310 on the display region to which the touchpad 212 is mapped according to absolute coordinates. For example, when the user drags the marking tool 330 across the touchpad 212, as shown in Fig. 4, the movement path 320 of the mouse pointer 310 can be visibly displayed to the user.
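The absolute mapping described above amounts to scaling each pad coordinate by the region/pad size ratio and offsetting it by the region's origin. A minimal sketch, with invented names, under the assumption of a simple rectangular region:

```python
def to_absolute(contact, pad_size, region_origin, region_size):
    """Map a pad coordinate to the display region mapped 1:1 to the pad.
    The pointer lands at the region position proportional to where the
    marking tool touched the pad (scaled by the region/pad size ratio)."""
    px, py = contact
    sx = region_size[0] / pad_size[0]
    sy = region_size[1] / pad_size[1]
    return (region_origin[0] + round(px * sx),
            region_origin[1] + round(py * sy))


# A 200x150 pad mapped to a 400x300 window whose top-left is (50, 40):
# touching the pad's centre places the pointer at the window's centre
# immediately, with no dragging required first.
print(to_absolute((100, 75), (200, 150), (50, 40), (400, 300)))  # (250, 190)
```

Because every pad point has exactly one display counterpart, a drag path on the pad reproduces the same shape in the display region, which is what makes direct handwriting input possible.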
When the selected operation mode is the absolute coordinate mode, the control unit 230 converts the contact position coordinates output from the touch panel unit 210 into absolute coordinates, and the storage unit 250 stores the converted contact position coordinates. In this case, the storage unit 250 stores, as one group, the contact position coordinates converted into absolute coordinates by the control unit 230 before recognition is performed by the recognition unit 260 or image generation is performed by the image generation unit 270. Accordingly, when contact position coordinates output from the touch panel unit 210 are converted into absolute coordinates by the control unit 230 after recognition by the recognition unit 260 or image generation by the image generation unit 270 has been performed, the storage unit 250 stores the converted contact position coordinates as a new group. The combination of contact position coordinates stored as one group in the storage unit 250 has the same coordinate values as the coordinates of the movement path of the mouse pointer displayed on the display region to which the touchpad 212 is mapped according to absolute coordinates.
The recognition unit 260 recognizes a character using the combination of contact position coordinates stored as one group in the storage unit 250. To this end, the recognition unit 260 may store standard characters serving as references for recognizing various characters. The recognition unit 260 searches for the standard character having the maximum degree of correlation with the contact position coordinates, and identifies the found standard character as the character or mark the user intended to input. The recognition unit 260 may use conventional character recognition techniques to perform the recognition.
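One simple way to realize the "maximum degree of correlation" search described above is to compare the stored coordinate group against reference templates and pick the best-scoring one. The sketch below scores templates by negative mean point distance; this measure, the equal-length assumption, and all names are illustrative stand-ins, not the patent's actual recognition method.

```python
import math


def score(group, template):
    """Higher is better: negative mean distance between corresponding
    points of the input group and a reference template (both sequences
    are assumed to have the same length in this sketch)."""
    d = sum(math.dist(p, q) for p, q in zip(group, template))
    return -d / len(template)


def recognize(group, references):
    """Return the reference character with the maximum correlation score."""
    return max(references, key=lambda ch: score(group, references[ch]))


refs = {
    "-": [(0, 5), (5, 5), (10, 5)],   # horizontal stroke template
    "|": [(5, 0), (5, 5), (5, 10)],   # vertical stroke template
}
# A slightly wobbly horizontal drag still matches the '-' template best.
print(recognize([(0, 4), (5, 6), (10, 5)], refs))  # "-"
```

A production recognizer would first resample and normalize the stroke group before matching, but the selection-by-maximum-correlation structure stays the same.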
The recognition operation of the recognition unit 260 may be performed when the marking tool has not contacted the touchpad 212 for longer than a threshold time interval. Alternatively, the recognition operation may be performed when a recognition command is input using the key input unit 220, the touch panel unit 210, or another user interface unit (not shown).
The image generation unit 270 generates an image corresponding to the movement path of the mouse pointer displayed on the display region to which the touchpad 212 is mapped according to absolute coordinates. Similarly to the character recognition operation, the image data generation operation may also be performed when the marking tool has not contacted the touchpad 212 for longer than a threshold time interval, or when the user inputs an image data generation command.
The generated image data can be stored in the storage unit 250 and displayed on the display unit 240 at the user's request.
The method of processing touch pad input information of the embodiment of the general conception according to the present invention is described now with reference to accompanying drawing.
The embodiment that Fig. 5 shows the general conception according to the present invention handles the process flow diagram of the method for touch pad input information.
In operation S110, the touch pad input pattern initially is set by the user.Can use key-press input unit 220, touch panel unit 210 or other user interface section (not shown) to import input pattern order is set.
In operation S120, coordinate is provided with unit 232 and determines whether input pattern is the absolute coordinates pattern.Be set to the absolute coordinates pattern if determine input pattern, then at operation S130, coordinate is provided with unit 232 input area and the predetermined display area on display unit 240 of touch pad 212 is mapped as absolute coordinates.So, with the input area 1:1 of touch pad 212 be mapped to predetermined viewing area.
The S140 marking tools contacts touch pad 212 and from touch pad 212 outgoing position coordinates, then touch panel unit 210 outputs to coordinate transformation unit 234 with the contact position coordinate if operating.At operation S150, coordinate transformation unit 234 will be an absolute coordinates from the contact position coordinate conversion of touch panel unit 210 outputs then.At operation S160, mouse pointer control module 236 is according to the contact position coordinate of being changed from the mapping area of touch pad 212 by coordinate transformation unit 234, rolling mouse pointer 310 on viewing area 234.
If set input pattern is not absolute coordinates pattern but relative coordinate pattern, then at operation S165, coordinate is provided with unit 232 touch pad 212 is mapped as the whole viewing area of display unit 240, as relative coordinate.
If at operation S170 marking tools contact touch pad 212 and from touch pad 212 outgoing position coordinates, then touch panel unit 210 outputs to coordinate transformation unit 234 with the contact position coordinate and coordinate transformation unit 234 will be a relative coordinate from the contact position coordinate conversion of touch panel unit 210 outputs.At operation S190, mouse pointer control module 236 according to by the contact position coordinate of coordinate transformation unit 234 conversions on display unit 240 rolling mouse pointer 310.
If the mouse pointer 310 is moved in the absolute coordinate mode in operation S160, the mouse pointer control unit 236 can display the movement path of the mouse pointer on the display area using the display unit 240.
According to an embodiment of the present general inventive concept, the contact position coordinates converted by the coordinate conversion unit 234 can be stored, and the stored contact position coordinates can later be recognized as a character. This case will now be described with reference to FIGS. 6 and 7.
FIG. 6 is a flowchart illustrating a method of recognizing a character according to an embodiment of the present general inventive concept.
In operation S210, if contact position coordinates are output from the touch pad unit 210 in the absolute coordinate mode, then in operation S220 the coordinate conversion unit 234 converts the contact position coordinates into absolute coordinates.
In this case, in operation S230, the storage unit 250 stores the contact position coordinates converted into absolute coordinates by the coordinate conversion unit 234.
If the user does not input a recognition command in operation S240, operations S210 through S230 are performed repeatedly. During these operations, the storage unit 250 stores the contact position coordinates most recently converted by the coordinate conversion unit 234 together with the previously stored contact position coordinates as one group.
If the user inputs a recognition command in operation S240, then in operation S250 the recognition unit 260 recognizes a character using the contact position coordinates stored as one group in the storage unit 250.
If the coordinate conversion unit 234 converts new contact position coordinates after the character recognition, the storage unit 250 stores the converted contact position coordinates as a new group. Accordingly, all contact position coordinates converted before another character recognition process is performed are stored in the same group.
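The grouping behaviour of FIG. 6 — accumulate converted coordinates until a recognition command arrives, then start a fresh group — can be sketched as follows. `StrokeStore` and the pluggable `recognizer` callable are illustrative names, not taken from the patent:

```python
class StrokeStore:
    """Accumulates converted contact coordinates into groups; a
    recognition command closes the current group (FIG. 6 behaviour)."""

    def __init__(self):
        self.groups = [[]]  # the last list is the open (current) group

    def add(self, coord):
        # Every coordinate converted before the next recognition command
        # lands in the same group.
        self.groups[-1].append(coord)

    def recognize(self, recognizer):
        # Recognition command: pass the closed group to the recognizer
        # and open a new group for subsequent coordinates.
        group = self.groups[-1]
        self.groups.append([])
        return recognizer(group)
```

For example, `store.recognize(my_character_recognizer)` would hand the whole accumulated stroke to a recognizer, after which newly converted coordinates are stored as a new group, matching the paragraph above.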
FIG. 7 is a flowchart illustrating a method of performing character recognition according to another embodiment of the present general inventive concept.
In operation S310, if contact position coordinates are output from the touch pad unit 210 in the absolute coordinate mode, then in operation S320 the coordinate conversion unit 234 converts the contact position coordinates into absolute coordinates.
In this case, in operation S330, the storage unit 250 stores the contact position coordinates converted into absolute coordinates by the coordinate conversion unit 234.
If contact position coordinates are not output from the touch pad unit 210 in operation S310, then in operation S340 the recognition unit 260 waits, for up to a threshold time interval, for new contact position coordinates to be output from the touch pad unit 210. If the waiting time does not exceed the threshold time interval, operations S310 through S340 are repeated, and the storage unit 250 stores the contact position coordinates most recently converted by the coordinate conversion unit 234 together with the previously stored contact position coordinates as one group.
If the waiting time exceeds the threshold time interval in operation S350, then in operation S360 the recognition unit 260 recognizes a character using the contact position coordinates stored in the storage unit 250.
If the coordinate conversion unit 234 converts new contact position coordinates after the character recognition, the storage unit 250 stores the converted contact position coordinates as a new group. Accordingly, all contact position coordinates converted before another character recognition process is performed are stored in the same group.
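The FIG. 7 variant replaces the explicit recognition command with a timeout: coordinates separated by more than the threshold time interval fall into separate groups. A sketch under that reading; `segment_by_timeout` and the `(timestamp, coordinate)` event format are invented for illustration:

```python
def segment_by_timeout(events, threshold):
    """Split (timestamp, coord) contact events into groups wherever the
    gap between successive contacts exceeds the threshold interval."""
    groups = []
    current = []
    last_t = None
    for t, coord in events:
        if last_t is not None and t - last_t > threshold:
            # Gap exceeded the threshold: the previous group is complete
            # and ready for character recognition (operation S360).
            groups.append(current)
            current = []
        current.append(coord)
        last_t = t
    if current:
        groups.append(current)
    return groups
```

With a 0.5 s threshold, two touches 0.1 s apart stay in one group, while a touch arriving 0.9 s later starts a new one, mirroring operations S340 through S360.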
In this way, according to embodiments of the present general inventive concept, the dragging operations (3 and 5) of the pointing tool used to change the position of the mouse pointer in the character input process illustrated in FIG. 1 can be omitted. Therefore, the user can input characters directly by writing on the touch pad with a pen or with his or her finger.
After the character recognition, new contact position coordinates converted into absolute coordinates by the control unit 230 are stored in the storage unit 250 as a new group.
Although the character recognition process has been described with reference to FIGS. 6 and 7, digits and other symbols can be recognized by the same operations described above.
According to another embodiment of the present general inventive concept, a group of contact position coordinates stored in the storage unit 250 and the displayed movement path of the mouse pointer can be stored as image data. In this case, an image generation operation using the image generation unit 270 can be performed instead of the character recognition operation S250 illustrated in FIG. 6 or the character recognition operation S360 illustrated in FIG. 7.
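Storing a coordinate group as image data, as this embodiment describes, can be as simple as rasterizing the contact points into a bitmap that a later image generation or recognition step could consume. `path_to_bitmap` is an illustrative sketch only, not the patent's image generation unit:

```python
def path_to_bitmap(coords, width, height):
    # Render a group of absolute contact coordinates into a binary
    # bitmap: 1 where the stroke touched, 0 elsewhere.
    bitmap = [[0] * width for _ in range(height)]
    for x, y in coords:
        if 0 <= x < width and 0 <= y < height:
            bitmap[int(y)][int(x)] = 1
    return bitmap
```

A real implementation would also interpolate between sampled points so fast strokes do not leave gaps; that detail is omitted here for brevity.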
As described above, in the apparatus and methods of processing touch pad input information according to the various embodiments of the present general inventive concept, the input area of the touch pad is mapped to the display area as absolute coordinates, so that the user can input information directly using the touch pad.
Although several embodiments of the present general inventive concept have been illustrated and described, it will be appreciated by those skilled in the art that changes may be made to these embodiments without departing from the principles and spirit of the general inventive concept, the scope of which is defined in the appended claims and their equivalents.

Claims (18)

1. A method of processing touch pad input information, the method comprising:
mapping an input area of a touch pad to a predetermined display area as absolute coordinates;
converting contact position coordinates into absolute coordinates when a pointing tool contacts the input area; and
moving a mouse pointer displayed on the display area according to the converted contact position coordinates.
2. The method according to claim 1, further comprising displaying a movement path of the mouse pointer on the display area corresponding to a series of contact position coordinates.
3. The method according to claim 2, further comprising:
storing the converted contact position coordinates; and
recognizing a character using the stored contact position coordinates.
4. The method according to claim 3, further comprising displaying the recognized character.
5. The method according to claim 4, wherein the displayed recognized character replaces the displayed mouse pointer path.
6. The method according to claim 3, wherein the character recognition is performed when a user requests the character recognition or when the pointing tool does not contact the touch pad for longer than a threshold time interval.
7. The method according to claim 2, further comprising storing the movement path of the mouse pointer as image data.
8. the method for processing indicated position in predetermined zone, this method comprises:
The input area of presumptive area is mapped to the viewing area of display, as absolute coordinates;
When indicating positions, the position coordinates in presumptive area is converted to absolute coordinates; And
Along with the position coordinates corresponding display moving hand of indicated conversion.
9. An apparatus to process touch pad input information, the apparatus comprising:
a coordinate setting unit to map position coordinates of an input area of a touch pad to a display area as absolute coordinates;
a coordinate conversion unit to convert position coordinates at which a pointing tool contacts the touch pad into corresponding absolute coordinates; and
a mouse pointer control unit to move a mouse pointer displayed on the display area according to the converted contact position coordinates.
10. The apparatus according to claim 9, wherein the mouse pointer control unit displays a movement path of the mouse pointer on the display area.
11. The apparatus according to claim 10, further comprising:
a storage unit to store the converted contact position coordinates; and
a recognition unit to recognize a character using the stored contact position coordinates.
12. The apparatus according to claim 11, wherein the character recognition is performed when a user requests the character recognition or when the pointing tool does not contact the touch pad for longer than a threshold time interval.
13. The apparatus according to claim 10, further comprising an image generation unit to store the movement path of the mouse pointer as image data.
14. An apparatus to recognize a character from information input using an input device capable of detecting contact and outputting contact position coordinates, the apparatus comprising:
a display;
a conversion unit to convert the contact position coordinates detected by the input device into absolute display coordinates;
a grouping unit to form groups from series of absolute coordinates and to control display of the coordinate groups on the display; and
a recognition unit to recognize a character according to a maximum degree of correlation between a coordinate group and a reference character from among a plurality of reference characters.
15. The apparatus according to claim 14, further comprising:
a storage unit to store an image requested to be displayed or a series of recognized characters.
16. The apparatus according to claim 14, further comprising:
a switch to allow a user to select between an absolute coordinate mode and a relative coordinate mode, wherein relative coordinates are used instead of absolute coordinates in the relative coordinate mode.
17. The apparatus according to claim 14, further comprising:
a post-processing document unit to process the recognized characters into a document formed of the reference characters.
18. An apparatus to process a position indicated in a predetermined area, the apparatus comprising:
a mapping unit to map an input area of the predetermined area to a display area of a display as absolute coordinates;
a conversion unit to convert position coordinates in the predetermined area into absolute coordinates when a position is indicated; and
a display to display movement of a pointer corresponding to the converted coordinates of the indicated position.
CN200510131012.9A 2004-12-03 2005-12-02 Apparatus and method of processing information input using a touchpad Pending CN1782975A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR101245/04 2004-12-03
KR1020040101245A KR100678945B1 (en) 2004-12-03 2004-12-03 Apparatus and method for processing input information of touchpad

Publications (1)

Publication Number Publication Date
CN1782975A true CN1782975A (en) 2006-06-07

Family

ID=36573628

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200510131012.9A Pending CN1782975A (en) 2004-12-03 2005-12-02 Apparatus and method of processing information input using a touchpad

Country Status (4)

Country Link
US (1) US20060119588A1 (en)
JP (1) JP2006164238A (en)
KR (1) KR100678945B1 (en)
CN (1) CN1782975A (en)

Families Citing this family (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2916545B1 (en) * 2007-05-23 2009-11-20 Inst Pour Le Dev De La Science METHOD FOR LOCATING A TOUCH ON A SURFACE AND DEVICE FOR IMPLEMENTING SAID METHOD
US8065624B2 (en) * 2007-06-28 2011-11-22 Panasonic Corporation Virtual keypad systems and methods
TW200907768A (en) * 2007-08-09 2009-02-16 Asustek Comp Inc Portable apparatus and rapid cursor positioning method
US20090096749A1 (en) * 2007-10-10 2009-04-16 Sun Microsystems, Inc. Portable device input technique
JP4372188B2 (en) * 2007-12-21 2009-11-25 株式会社東芝 Information processing apparatus and display control method
US20090167723A1 (en) * 2007-12-31 2009-07-02 Wah Yiu Kwong Input devices
JP2011521520A (en) * 2008-04-16 2011-07-21 ワイコフ, リチャード ダレル Portable multimedia receiver and transmitter
EP2277307A2 (en) * 2008-04-16 2011-01-26 Emil Stefanov Dotchevski Interactive display recognition devices and related methods and systems for implementation thereof
US8681106B2 (en) 2009-06-07 2014-03-25 Apple Inc. Devices, methods, and graphical user interfaces for accessibility using a touch-sensitive surface
JP2011134278A (en) * 2009-12-25 2011-07-07 Toshiba Corp Information processing apparatus and pointing control method
US8704783B2 (en) 2010-03-24 2014-04-22 Microsoft Corporation Easy word selection and selection ahead of finger
US9292161B2 (en) * 2010-03-24 2016-03-22 Microsoft Technology Licensing, Llc Pointer tool with touch-enabled precise placement
US8707195B2 (en) 2010-06-07 2014-04-22 Apple Inc. Devices, methods, and graphical user interfaces for accessibility via a touch-sensitive surface
GB2481606B (en) * 2010-06-29 2017-02-01 Promethean Ltd Fine object positioning
US8452600B2 (en) 2010-08-18 2013-05-28 Apple Inc. Assisted reader
US8751971B2 (en) 2011-06-05 2014-06-10 Apple Inc. Devices, methods, and graphical user interfaces for providing accessibility using a touch-sensitive surface
US9317196B2 (en) 2011-08-10 2016-04-19 Microsoft Technology Licensing, Llc Automatic zooming for text selection/cursor placement
KR20130105044A (en) * 2012-03-16 2013-09-25 삼성전자주식회사 Method for user interface in touch screen terminal and thereof apparatus
US8881269B2 (en) 2012-03-31 2014-11-04 Apple Inc. Device, method, and graphical user interface for integrating recognition of handwriting gestures with a screen reader
CN103019588A (en) * 2012-11-26 2013-04-03 中兴通讯股份有限公司 Touch positioning method, device and terminal
US9483171B1 (en) * 2013-06-11 2016-11-01 Amazon Technologies, Inc. Low latency touch input rendering
JP6149604B2 (en) * 2013-08-21 2017-06-21 ソニー株式会社 Display control apparatus, display control method, and program
CN104516620A (en) * 2013-09-27 2015-04-15 联想(北京)有限公司 Positioning method and electronic device
DK201670580A1 (en) 2016-06-12 2018-01-02 Apple Inc Wrist-based tactile time feedback for non-sighted users
TWI638300B (en) * 2017-09-28 2018-10-11 義隆電子股份有限公司 Touch input method and computer system using the same
US10996761B2 (en) 2019-06-01 2021-05-04 Apple Inc. User interfaces for non-visual output of time

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2973925B2 (en) * 1996-05-31 1999-11-08 日本電気株式会社 Touchpad input device
US6128007A (en) * 1996-07-29 2000-10-03 Motorola, Inc. Method and apparatus for multi-mode handwritten input and hand directed control of a computing device
KR19990059505A (en) * 1997-12-30 1999-07-26 구자홍 Pen input method and device using a portable information terminal
EP0944162B1 (en) * 1998-03-19 2009-02-18 Alcatel Lucent Auto-synchronized DC/DC converter and method of operating same
KR100503056B1 (en) * 1998-04-23 2005-09-09 삼성전자주식회사 Touch pad processing apparatus, method thereof and touch pad module in computer system
US6246220B1 (en) * 1999-09-01 2001-06-12 Intersil Corporation Synchronous-rectified DC to DC converter with improved current sensing
JP2001117713A (en) 1999-10-19 2001-04-27 Casio Comput Co Ltd Data processor and storage medium
JP4878667B2 (en) * 2000-02-17 2012-02-15 富士通コンポーネント株式会社 Coordinate detection apparatus and coordinate detection method

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101098533B (en) * 2006-06-26 2012-06-27 三星电子株式会社 Keypad touch user interface method and mobile terminal using the same
CN101498984B (en) * 2008-02-01 2011-07-13 致伸科技股份有限公司 Computer cursor control system and method for controlling cursor movement
CN102077156A (en) * 2008-06-27 2011-05-25 微软公司 Virtual touchpad
US8754855B2 (en) 2008-06-27 2014-06-17 Microsoft Corporation Virtual touchpad
CN101661777B (en) * 2008-08-27 2013-03-06 索尼株式会社 Playback apparatus, playback method and program
CN101963858A (en) * 2009-07-22 2011-02-02 义隆电子股份有限公司 Positiong method for controlling cursor on screen by using touch device
CN102934049A (en) * 2010-06-09 2013-02-13 微软公司 Indirect user interaction with desktop using touch-sensitive control surface
US11068149B2 (en) 2010-06-09 2021-07-20 Microsoft Technology Licensing, Llc Indirect user interaction with desktop using touch-sensitive control surface
CN102934049B (en) * 2010-06-09 2016-06-08 微软技术许可有限责任公司 Touch sensitivity control surface and desktop is used to carry out indirect user mutual
CN102169641A (en) * 2010-12-29 2011-08-31 西安交通大学 Digital image display equipment with interactive information inputted in a wireless way
CN103176691B (en) * 2011-12-23 2015-11-04 株式会社电装 Display system, display device, operating equipment and function select equipment
CN103176691A (en) * 2011-12-23 2013-06-26 株式会社电装 Display system, display apparatus, manipulation apparatus and function selection apparatus
CN103513916B (en) * 2012-06-21 2017-10-20 泛泰株式会社 Using touch input come the apparatus and method of control terminal
CN103513916A (en) * 2012-06-21 2014-01-15 株式会社泛泰 Apparatus and method for controlling a terminal using a touch input
CN103809890A (en) * 2012-11-13 2014-05-21 联想(北京)有限公司 Information processing method and electronic equipment
US9811185B2 (en) 2012-11-13 2017-11-07 Beijing Lenovo Software Ltd. Information processing method and electronic device
CN103353804B (en) * 2013-07-03 2016-06-22 深圳雷柏科技股份有限公司 A kind of cursor control method based on touch pad and device
CN103353804A (en) * 2013-07-03 2013-10-16 深圳雷柏科技股份有限公司 Cursor control method and device based on touch tablet
US10037091B2 (en) 2014-11-19 2018-07-31 Honda Motor Co., Ltd. System and method for providing absolute coordinate and zone mapping between a touchpad and a display screen
US10496194B2 (en) 2014-11-19 2019-12-03 Honda Motor Co., Ltd. System and method for providing absolute coordinate and zone mapping between a touchpad and a display screen
US11307756B2 (en) 2014-11-19 2022-04-19 Honda Motor Co., Ltd. System and method for presenting moving graphic animations in inactive and active states
CN108227968A (en) * 2018-02-08 2018-06-29 北京硬壳科技有限公司 Method and device for controlling cursor

Also Published As

Publication number Publication date
KR20060062416A (en) 2006-06-12
KR100678945B1 (en) 2007-02-07
US20060119588A1 (en) 2006-06-08
JP2006164238A (en) 2006-06-22

Similar Documents

Publication Publication Date Title
CN1782975A (en) Apparatus and method of processing information input using a touchpad
CN1269014C (en) Character input device
CN201156246Y (en) Multiple affair input system
CN101174190B (en) Software keyboard entry method for implementing composite key on screen of electronic equipments
US20110291940A1 (en) Data entry system
CN1777858A (en) Unambiguous text input method for touch screens and reduced keyboard systems
CN1358299A (en) Data entry device recording input in two dimensions
CN101427202B (en) Method and device for improving inputting speed of characters
CN1524212A (en) Text entry method and device therefor
TW200530901A (en) Text entry system and method
JP2011501312A (en) Character and number input device and input method for communication terminal
CN101822032A (en) Apparatus and method for inputting characters / numerals for communication terminal
KR20080052438A (en) Using sequential taps to enter text
CN1668994A (en) Information display input device and information display input method, and information processing device
CN102272699A (en) Gesture detection zones
KR20120001697A (en) Key input interface method
CN106227449A (en) Input control method based on sense of touch vision technique and system
US20140354550A1 (en) Receiving contextual information from keyboards
WO2017112714A1 (en) Combination computer keyboard and computer pointing device
CN1815410A (en) Capacitive information input device and method for electronic device
CN104503591A (en) Information input method based on broken line gesture
CN101872266B (en) Input processing device
JP2016071623A (en) Electronic apparatus, method and program
CN115917469A (en) Apparatus and method for inputting logograms into electronic device
CN1242313C (en) Computer input system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication