CN107066137A - Apparatus and method for providing a user interface - Google Patents
Apparatus and method for providing a user interface
- Publication number
- CN107066137A (Application No. CN201611178513.7A)
- Authority
- CN
- China
- Prior art keywords
- touch
- finger
- area
- user
- controller
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0412—Digitisers structurally integrated in a display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/04166—Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0421—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/044—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04101—2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04104—Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Abstract
Provided are an apparatus and method for providing a user interface. The apparatus includes a display unit, a sensor, and a controller. The display unit displays at least one graphical user interface (GUI). The sensor generates a sensor signal according to a user's finger touch input. The controller receives the sensor signal from the sensor, identifies a touch area and an adjacent area based on the received sensor signal, generates information based on the shape of the touching finger from the identified touch area and adjacent area, and controls the display unit to display the GUI according to the generated finger-shape information.
Description
This application is a divisional application of the patent application entitled "Apparatus and method for providing a user interface", filed on November 25, 2009, with Application No. 200910225143.1.
Technical field
The present invention relates to an apparatus and method for providing a user interface, and to a computer-readable recording medium storing a program for providing the user interface.
Background Art
With the development of sensors and related software technologies, the user interfaces of various electronic devices (for example, desktop computers, laptop computers, palmtop computers, personal digital assistants (PDAs), portable media players (PMPs), and mobile phones) have become more user-friendly in both use and design. Touch-based user interfaces are widely used; such an interface allows a user to perform an operation by touching the screen of a display device, thereby executing a corresponding function.
However, conventional touch-based user interfaces have a limitation: touching an icon can only execute the single instruction associated with it, and multiple kinds of user touch input are not supported.
Summary of the Invention
Exemplary embodiments of the present invention provide a user-friendly interface capable of performing various inputs.
Exemplary embodiments of the present invention also provide an intelligent user interface that enables input to be performed quickly on a small-sized display device.
Further features of the present invention will be set forth in the description that follows; in part they will be apparent from the description, or may be learned by practice of the invention.
An exemplary embodiment of the present invention discloses an apparatus that uses information based on the type of touching finger. The apparatus includes a display unit, a sensor, and a controller. The display unit displays at least one graphical user interface (GUI). The sensor generates a sensor signal according to a user's finger touch input. The controller receives the sensor signal from the sensor, identifies a touch area and an adjacent area based on the received sensor signal, generates finger-shape information from the identified touch area and adjacent area, and controls the display unit to display the GUI according to the generated finger-shape information.
An exemplary embodiment of the present invention discloses a method for providing a user interface using information based on the type of touching finger. The method includes: displaying at least one graphical user interface (GUI) on a screen; if a user's finger touch is input on the screen, identifying the finger touch area and an adjacent area on the screen based on a sensor signal generated by at least one sensor included in the display device; generating finger-shape information from the identified touch area and adjacent area; and changing and displaying the GUI according to the finger-shape information.
An exemplary embodiment of the present invention discloses a computer-readable recording medium that records an executable program for providing a user interface. The program includes: instructions for displaying at least one graphical user interface (GUI) on a screen; instructions for identifying a finger touch area and an adjacent area based on a sensor signal generated by a sensor when a user's finger touch is input on the screen; instructions for generating finger-shape information from the identified touch area and adjacent area; and instructions for changing and displaying the GUI according to the finger-shape information.
It should be understood that both the foregoing general description and the following detailed description are exemplary and explanatory, and are intended to provide further explanation of the invention as claimed.
Brief Description of the Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the invention and, together with the description, serve to explain the principles of the invention.
Fig. 1 is a block diagram showing a user interface providing apparatus according to an exemplary embodiment of the present invention.
Fig. 2 is a flowchart describing a method for providing a user interface according to an exemplary embodiment of the present invention.
Fig. 3a is a perspective view showing a finger touch action on the apparatus of Fig. 1 according to an exemplary embodiment of the present invention.
Fig. 3b shows finger images of the finger touch action of Fig. 3a according to an exemplary embodiment of the present invention.
Fig. 4 is a flowchart showing a method for providing a user interface using finger-type information according to an exemplary embodiment of the present invention.
Fig. 5 shows a first example of instruction execution on the apparatus of Fig. 1 according to an exemplary embodiment of the present invention.
Fig. 6 shows a second example of instruction execution on the apparatus of Fig. 1 according to an exemplary embodiment of the present invention.
Fig. 7 shows a third example of instruction execution on the apparatus of Fig. 1 according to an exemplary embodiment of the present invention.
Fig. 8 shows a fourth example of instruction execution on the apparatus of Fig. 1 according to an exemplary embodiment of the present invention.
Fig. 9 is a perspective view showing a finger touch action on a user interface providing apparatus that uses finger-type information according to an exemplary embodiment of the present invention.
Fig. 10 shows a first example of instruction execution on the apparatus of Fig. 9 according to an exemplary embodiment of the present invention.
Fig. 11 shows a second example of instruction execution on the apparatus of Fig. 9 according to an exemplary embodiment of the present invention.
Fig. 12 is a perspective view showing a user interface providing apparatus that uses finger-type information according to an exemplary embodiment of the present invention.
Fig. 13 shows instruction execution on the apparatus of Fig. 11 according to an exemplary embodiment of the present invention.
Fig. 14 shows a first example of an output screen of the apparatus of Fig. 1 according to an exemplary embodiment of the present invention.
Fig. 15 shows a second example of an output screen of the apparatus of Fig. 1 according to an exemplary embodiment of the present invention.
Fig. 16 shows a third example of an output screen of the apparatus of Fig. 1 according to an exemplary embodiment of the present invention.
Fig. 17 shows a fourth example of an output screen of the apparatus of Fig. 1 according to an exemplary embodiment of the present invention.
Fig. 18 shows a fifth example of an output screen of the apparatus of Fig. 1 according to an exemplary embodiment of the present invention.
Fig. 19 shows an example of an output screen of the apparatus of Fig. 1 according to an exemplary embodiment of the present invention.
Detailed Description
In exemplary embodiments of the present invention, the term "graphical user interface (GUI)" refers to a graphic displayed on the screen of a display device. A GUI includes graphic objects that can be displayed on the screen, for example, icons, items, thumbnails, full-screen images, and so on. A GUI also includes a screen composed of graphic objects.
The present invention is described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments of the invention are shown. The invention may, however, be embodied in many different forms and should not be construed as limited to the exemplary embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. In the drawings, the sizes and relative sizes of layers or regions are exaggerated for clarity. Like reference numerals denote like elements throughout the drawings.
In the following description, for ease of description, the target whose function is executed by a finger touch action is referred to as an icon. However, for easy identification of various programs, instructions, or data files, small pictures or symbols corresponding to those programs, instructions, or data files may also be formed and displayed on the screen. When searching for pictures or images, the icon may include a thumbnail produced by reducing the displayed size of a picture or image. When the user performs a touch action, the target displayed on the screen (such as a picture or image) can execute a predetermined operation. That is, the icon can serve as a concept replacing the GUI.
Hereinafter, exemplary embodiments of the present invention are described in detail with reference to the accompanying drawings.
Fig. 1 is a block diagram showing a user interface providing apparatus that uses information based on the type of finger (hereinafter referred to as "finger type") according to an exemplary embodiment of the present invention.
The user interface providing apparatus 10 may include a display unit 11, an optical sensor 12, and a controller 13.
Display unit 11 can display at least one icon corresponding to an instruction that can be executed by a finger touch (hereinafter referred to as a "touch action"). Sensor 12 can obtain a sensor signal of a touch action occurring on display unit 11 and output the sensor signal to controller 13. In an embodiment of the present invention, sensor 12 may be an optical sensor or a touch sensor. Sensor 12 may be configured in contact with display unit 11 and may be formed as multiple layers. Sensor 12 may be an optical detecting sensor having light emitting diodes (LEDs) and phototransistors arranged facing each other in matrix form, which obtains a finger image by emitting and receiving infrared rays. When a touch action occurs, the optical sensor estimates the amount of light reflected from the finger, i.e., the amount of reflected light. The estimated amount of reflected light can be used as data for identifying the touch area and the adjacent area. The optical sensor performs its detection operation in the following manner: when a touch action occurs on the screen, the image of the touching object is estimated through the photodiodes, and the effective touch area where the touching object actually contacts the screen is detected. The effective touch area is converted into a digital image. The digital image is analyzed into coordinates of the effective touch area, so as to identify the touched position. The touched position then enables execution of the various functions associated with the icon corresponding to the touched location.
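The pipeline just described — digitize the effective touch area, reduce it to a coordinate, and execute the function of the icon at that coordinate — can be sketched as follows. This is an illustrative sketch only; names such as `Icon`, `centroid`, and `hit_test` are assumptions, not terms from the patent:

```python
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

@dataclass
class Icon:
    name: str
    # Bounding box of the icon on screen: (left, top, right, bottom).
    bounds: Tuple[int, int, int, int]
    # Function executed when the icon is touched.
    action: Callable[[], str]

def centroid(touch_pixels: List[Tuple[int, int]]) -> Tuple[int, int]:
    """Reduce the digitized effective touch area to a single coordinate."""
    xs = [p[0] for p in touch_pixels]
    ys = [p[1] for p in touch_pixels]
    return (sum(xs) // len(xs), sum(ys) // len(ys))

def hit_test(icons: List[Icon], point: Tuple[int, int]) -> Optional[Icon]:
    """Find the icon whose bounds contain the touched position, if any."""
    x, y = point
    for icon in icons:
        left, top, right, bottom = icon.bounds
        if left <= x <= right and top <= y <= bottom:
            return icon
    return None

icons = [Icon("mail", (0, 0, 40, 40), lambda: "open mail"),
         Icon("camera", (50, 0, 90, 40), lambda: "open camera")]
# Pixels of the effective touch area reported by the sensor.
touched = [(52, 10), (55, 12), (58, 14)]
icon = hit_test(icons, centroid(touched))
result = icon.action() if icon else None
```

A touch whose pixels cluster inside an icon's bounds thus dispatches that icon's function, which is the behavior the paragraph above attributes to the touched position.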
In an exemplary embodiment of the present invention, sensor 12 is implemented as a touch sensor. In this case, the touch sensor is preferably a capacitive touch sensor. The touch sensor performs the following detection operation. When a touch action occurs on the screen, charge is lost from the touched part of the screen, i.e., current flows through the touched part. The touch sensor detects the position of the lost charge and the amount of charge lost, and then identifies the touch area and the adjacent area.
In an embodiment of the present invention, sensor 12 may be configured as a combination of an optical sensor and a touch sensor. In this case, the touch sensor may be a capacitive touch sensor or a resistive touch sensor. Sensor 12 may generate a sensor signal by sensing the touch action and obtaining an image of the touch area. The sensor signal can be used as data for identifying the touch area and the adjacent area.
In an exemplary embodiment of the present invention, sensor 12 may also include an A/D converter for converting analog signals into digital signals.
When the user's finger touches the screen, sensor 12 (for example, an optical sensor) estimates the amount of reflected light. Sensor 12 generates a sensor signal containing information on the amount of reflected light and outputs it to controller 13. That is, sensor 12 obtains a finger image based on the estimated amount of reflected light, generates a sensor signal containing the obtained finger image, and then outputs the sensor signal to controller 13. If sensor 12 is implemented as a touch sensor, the touch sensor detects the change in charge lost from the screen (the amount of lost charge), generates a sensor signal containing information on the amount of lost charge, and then outputs the sensor signal to controller 13.
A description of controller 13 is provided below with reference to Fig. 2.
Fig. 2 is a flowchart of a method for providing a user interface according to an exemplary embodiment of the present invention. In this embodiment of the present invention, it is assumed that controller 13 controls display unit 11 to display at least one GUI.
Controller 13 controls sensor 12 and determines whether a touch action of the user has occurred on the screen (201). If sensor 12 detects a user touch action on the screen in step 201, sensor 12 generates a sensor signal and outputs it to controller 13 (202). For example, if sensor 12 is implemented as an optical sensor, the sensor signal contains information on the amount of reflected light. If sensor 12 is implemented as a touch sensor, the sensor signal contains information on the change in charge lost on the screen (i.e., the amount of lost charge).
When controller 13 receives the sensor signal from sensor 12 in step 202, controller 13 identifies the finger touch area and the adjacent area (203).
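Steps 201–203 amount to a polling sequence: check whether a touch occurred, obtain the sensor signal, and identify the two regions from it. The following is a minimal sketch under assumed names (`poll_sensor` and `identify_regions` are illustrative, and the reflected-light values and threshold are placeholders, not values from the patent):

```python
def poll_sensor(frame):
    """Steps 201/202: return a sensor signal (here, a 2D grid of
    reflected-light readings) if any touch activity occurred, else None."""
    return frame if any(v > 0 for row in frame for v in row) else None

def identify_regions(signal, threshold):
    """Step 203: split readings into a touch area (>= threshold) and
    an adjacent area (below threshold but nonzero)."""
    touch, adjacent = [], []
    for y, row in enumerate(signal):
        for x, v in enumerate(row):
            if v >= threshold:
                touch.append((x, y))
            elif v > 0:
                adjacent.append((x, y))
    return touch, adjacent

# A bright central reading (fingertip contact) surrounded by weaker
# readings (the finger hovering just above the surface).
frame = [[0, 30, 0],
         [30, 90, 30],
         [0, 30, 0]]
signal = poll_sensor(frame)
touch, adjacent = (identify_regions(signal, threshold=80)
                   if signal is not None else ([], []))
```

The strongly reflecting center is classified as the touch area and the weaker ring around it as the adjacent area, matching the region identification described for step 203.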
For example, if sensor 12 is implemented as an optical sensor, light reflection occurs mainly in the region of the screen contacted by the finger (referred to as the touch area), and light reflection also occurs, in a relatively smaller amount, in the region adjacent to the touch area but not in direct contact with the user's finger (referred to as the adjacent area). Controller 13 identifies the region where the sensed reflected light is mainly distributed as the finger touch area. Similarly, controller 13 identifies the region where the sensed reflected light is less than that of the finger touch area as the finger adjacent area. That is, controller 13 can identify the touch area and the adjacent area by comparing the amount of light reflected from a certain region of the screen with a preset light amount serving as a reference value. If the amount of light reflected from a certain region is equal to or greater than the preset light amount, controller 13 determines that the region is the touch area. On the other hand, if the amount of light reflected from a certain region is less than the preset light amount, controller 13 determines that the region is the adjacent area. In an embodiment of the present invention, a first reference light amount and a second reference light amount are preset; if the amount of light reflected in a certain region is equal to or greater than the first reference light amount, controller 13 can identify the region as the touch area. If the amount of light reflected in the region is less than the first reference light amount but equal to or greater than the second reference light amount, controller 13 can identify the region as the adjacent area. Otherwise, i.e., if the amount of light reflected in a certain region is less than the second reference light amount, controller 13 can identify the region as an open area.
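The two-threshold rule just described (at or above the first reference light amount: touch area; between the second and the first: adjacent area; below the second: open area) can be expressed directly. The threshold values below are arbitrary placeholders, not values from the patent:

```python
def classify_region(reflected_light: float,
                    first_ref: float = 100.0,
                    second_ref: float = 40.0) -> str:
    """Classify a screen region by its reflected-light reading.

    >= first_ref            -> "touch"    (finger in contact)
    [second_ref, first_ref) -> "adjacent" (finger nearby, not touching)
    <  second_ref           -> "open"     (no finger)
    """
    if reflected_light >= first_ref:
        return "touch"
    if reflected_light >= second_ref:
        return "adjacent"
    return "open"
```

The same three-way classification would apply per region of the sensed frame, giving the controller the touch/adjacent/open partition it needs for the next step.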
Alternatively, if the sensor 12 is implemented as a touch sensor, the sensor detects the amount of charge lost on the screen. The amount of lost charge is large in the region (referred to as the touch area) where the user's finger touches the screen. On the other hand, the amount of lost charge is relatively small in the region (referred to as the adjacent area) that is adjacent to the touch area but not touched by the user's finger. The controller 13 recognizes the region where the amount of lost charge is large as the finger touch area. Similarly, the controller also recognizes the region where the amount of lost charge is less than that in the finger touch area as the finger adjacent area. That is, the controller 13 can identify the touch area and the adjacent area by comparing the amount of charge lost in a given region of the screen with a preset charge amount serving as a reference value. If the amount of charge lost in a given region is equal to or greater than the preset charge amount, the controller can determine that the region is a touch area. On the other hand, if the amount of charge lost in a given region is less than the preset charge amount, the controller 13 determines that the region is an adjacent area. In an embodiment of the present invention, since a first reference charge amount and a second reference charge amount are preset, if the amount of charge lost in a given region is equal to or greater than the first reference charge amount, the controller 13 can recognize the region as a touch area. If the amount of charge lost in a given region is less than the first reference charge amount but equal to or greater than the second reference charge amount, the controller 13 can recognize the region as an adjacent area. Otherwise, that is, if the amount of charge lost in a given region is less than the second reference charge amount, the controller 13 recognizes the region as an open area.
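The two-threshold classification described above, which applies equally to reflected light quantity (optical sensor) and to lost charge quantity (touch sensor), can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation; the reference values are hypothetical.

```python
# Hypothetical first and second reference quantities (the patent only
# states that such values are preset, not what they are).
FIRST_REF = 100   # at or above this: finger touches the region
SECOND_REF = 40   # at or above this (but below FIRST_REF): finger is near

def classify_region(sensed_value, first_ref=FIRST_REF, second_ref=SECOND_REF):
    """Return 'touch', 'adjacent', or 'open' for one screen region,
    given the sensed quantity (reflected light or lost charge)."""
    if sensed_value >= first_ref:
        return "touch"      # finger directly contacts the screen here
    if sensed_value >= second_ref:
        return "adjacent"   # finger hovers near, but does not touch
    return "open"           # finger neither touches nor neighbours this region

# Example: a row of sensed values across the screen
row = [10, 55, 120, 130, 60, 15]
labels = [classify_region(v) for v in row]
```

Running the classifier over a row of sensed values yields a touch area flanked by adjacent areas, with open areas beyond, matching the region layout the controller is described as recognizing.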
The controller 13 produces finger-type-based information according to the identified touch area and the identified adjacent area (204). In an embodiment of the present invention, the finger-type-based information includes the position of the user's finger, the touch form of the touching finger, and the type of the touching finger. The controller 13 detects the direction vector of the touching finger based on the touch area and the adjacent area. The controller 13 also obtains the angle of the direction vector, and estimates and produces the position of the user's finger based on the angle of the direction vector. The controller 13 can estimate the horizontal length and the vertical length of the touch area, and produce information on the touch area by determining the size of the touch area from the horizontal length and the vertical length. Alternatively, the controller 13 detects the direction vector of the touching finger based on the touch area and the adjacent area, obtains the angle of the direction vector, and produces touching-finger type information by determining, from the angle of the direction vector, whether the touching finger is a right-hand finger or a left-hand finger.
The controller 13 controls the display unit 11, based on the produced finger-type-based information, to change and display the GUI (205). In an embodiment of the present invention, the controller 13 can rotate, move, or magnify the GUI on the display unit 11.
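One plausible way to compute the direction vector of the finger from the adjacent area toward the touch area, as the controller 13 is described as doing, is to join the centroids of the two regions. The representation of regions as lists of (x, y) cells, and all function names, are our assumptions rather than anything stated in the patent.

```python
import math

def centroid(cells):
    """Mean (x, y) position of a region given as a list of cells."""
    xs = [c[0] for c in cells]
    ys = [c[1] for c in cells]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def direction_angle(adjacent_cells, touch_cells):
    """Angle in degrees (0-360) of the vector pointing from the
    adjacent-area centroid toward the touch-area centroid."""
    ax, ay = centroid(adjacent_cells)
    tx, ty = centroid(touch_cells)
    return math.degrees(math.atan2(ty - ay, tx - ax)) % 360.0

# A finger pointing "up" the screen: the adjacent area (the covered but
# untouched part of the finger) lies below the touch area.
adjacent = [(5, 0), (5, 1)]
touch = [(5, 4), (5, 5)]
angle = direction_angle(adjacent, touch)   # 90 degrees
```

With y increasing upward, a user at the bottom of the display produces angles near 90 degrees, which is consistent with the 45 to 135 degree band assigned to the bottom of the display unit later in the description.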
Fig. 3A is a perspective view showing a finger touch action on the device of Fig. 1, and Fig. 3B shows a finger image of the finger touch action of Fig. 3A.
Referring to Fig. 3A and Fig. 3B, when a finger touch action is performed on the screen by a touching finger 20, the sensor 12 produces a sensor signal. If the sensor is implemented as an optical sensor, the sensor 12 can obtain an image of the touching finger 20. The finger image can be obtained such that the finger part touching the screen and the finger part not touching the screen are distinguished by different contrasts.
The controller 13 can detect the finger touch area 31 and the adjacent area 32 based on the sensor signal, where the finger touch area 31 refers to the region where the finger 20 touches the screen, and the adjacent area 32 refers to the region located over the screen but not touched by the finger 20. The controller 13 can also recognize the open area 33, which is neither the finger touch area 31 nor the adjacent area 32. Then, the controller 13 can produce information associated with the position of the user, the touching part of the finger 20, and the finger type. The controller 13 can detect the direction vector 34 based on the finger touch area 31 and the adjacent area 32. Then, the controller 13 can determine whether the left hand or the right hand is currently being used. In an embodiment of the present invention, the controller 13 can also detect and determine the touched icon based on the finger touch area 31. When an instruction corresponding to the determined icon is performed, the controller 13 can execute the instruction according to the produced finger-type-based information.
Fig. 4 is a flowchart showing a method of providing a user interface using finger-type-based information in the case where the sensor 12 is implemented as an optical sensor, according to an exemplary embodiment of the present invention.
Referring to Fig. 4, the controller 13 can determine whether the optical sensor 12 detects a touch action of the finger 20 on the display unit 11 (20).
If the touch action of the finger 20 is detected, the optical sensor 12 can obtain an image of the whole finger 20 touching the display unit 11 (30).
Then, the controller 13 can determine the finger touch area 31 and the adjacent area 32 based on the image of the whole finger 20 obtained by the optical sensor 12 (40).
Then, by analyzing the coordinates of the finger touch area 31, the controller 13 can determine which icon displayed on the display unit 11 is the touched icon (50).
The controller 13 can determine what kind of instruction is associated with the determined icon (501). The controller 13 can produce the finger-type-based information from the finger touch area 31 and the adjacent area 32 according to the determined instruction.
If the instruction associated with the determined icon is an instruction that uses the positional information of the user, the controller 13 can calculate the direction vector 34 of the finger 20 from the finger touch area 31 and the adjacent area 32 (511). Specifically, the controller 13 can calculate the direction vector 34 of the finger 20 pointing from the adjacent area 32 toward the finger touch area 31 (see Fig. 3B). The controller 13 can determine the angle of the direction vector 34 (512). The angular range of the direction vector 34 is 0° to 360°; 0° may correspond to the horizontal axis running from the left side to the right side of the display unit 11.
The controller 13 can estimate the position of the user (for example, referring to Table 1), and can accordingly produce the positional information of the user (513). Table 1 can be stored in the device 10 by any suitable means, or supplied to the device 10.
【Table 1】
Angle of direction vector | Positional information of user |
45° to 135° | Bottom of display unit |
Over 135° to 225° | Right side of display unit |
Over 225° to 315° | Top of display unit |
Over 315° to 45° | Left side of display unit |
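The mapping from the direction-vector angle to the user's side of the display can be written directly from the table above. This is a hedged sketch of that lookup; the function name and the wrap-around handling for the "over 315° to 45°" band are our own choices.

```python
def user_position(angle):
    """Map the finger direction-vector angle (degrees, any value) to the
    user's side of the display, following the table's four bands."""
    a = angle % 360.0
    if 45.0 <= a <= 135.0:
        return "bottom"
    if 135.0 < a <= 225.0:
        return "right"
    if 225.0 < a <= 315.0:
        return "top"
    # remaining band: over 315 degrees, wrapping through 0, up to 45
    return "left"
```

For a user standing at the bottom of the display, the finger points "up" the screen (angle near 90°), so the lookup returns "bottom", as the table specifies.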
As shown in Table 1, if the angle of the direction vector 33 of the finger 20 is in the range 45° to 135°, the positional information of the user can represent the bottom of the display unit 11. If the angle of the direction vector 33 of the finger 20 is over 135° up to 225°, the positional information of the user can represent the right side of the display unit 11. If the angle is over 225° up to 315°, the positional information of the user can represent the top of the display unit 11. If the angle is over 315° up to 45°, the positional information of the user can represent the left side of the display unit 11.
The controller 13 performs rotation, movement, and magnification of the image corresponding to the determined icon according to the positional information of the user produced in step 513 (61). For example, if the positional information of the user indicates the bottom of the display unit 11, in some cases the controller 13 rotates the image corresponding to the determined icon toward the position of the user at the bottom of the display unit 11. The controller 13 can also move the image toward the bottom of the display unit 11, and/or magnify and display the image at the bottom of the display unit.
If the instruction corresponding to the determined icon is an instruction that uses the touch form of the finger, the controller 13 can determine the horizontal length (a) and the vertical length (b) of the finger touch area 31 (521). Formula 1 can be used to calculate the ratio (c) of the vertical length (b) to the horizontal length (a).
【Formula 1】
c = b / a
The controller 13 can determine whether the touch area 31 is large or small according to the calculated value c. For example, as shown in Table 2, if c is equal to or greater than 1, the finger touch area 31 can be determined to be large, and if c is less than 1, the finger touch area 31 can be determined to be small. Accordingly, the controller 13 can produce finger touch form information corresponding to the calculated value c (522). Table 2 can be stored in the device 10 by any suitable means, or supplied to the device 10.
【Table 2】
c | Finger touch area |
b/a ≥ 1 | Large |
b/a < 1 | Small |
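Formula 1 and Table 2 together amount to a one-line classification of the touch form. The sketch below shows that computation; the function name is ours, and the lengths would in practice come from the bounding box of the recognized touch area.

```python
def touch_form(vertical_b, horizontal_a):
    """Classify the finger touch area as 'large' or 'small' from the
    ratio c = b / a (Formula 1), using the Table 2 cutoff at c = 1."""
    c = vertical_b / horizontal_a
    return "large" if c >= 1 else "small"
```

A finger lying flat on the screen produces an elongated touch area (b greater than a, so c ≥ 1, "large"), while a fingertip-only touch produces a short area (c < 1, "small"), matching the two touch actions contrasted later in Fig. 9.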
The controller 13 can execute the instruction associated with the determined icon according to the produced finger touch form (62). For example, when the icon is a menu bar, if the finger touch area 31 is "large", the next-level menu is displayed, and if the finger touch area 31 is "small", the previous-level menu is displayed.
If the instruction corresponding to the determined icon is an instruction that uses the finger type information, the controller 13 can determine whether the adjacent area 32 has been determined (530). If the adjacent area 32 has been determined, the controller 13 can calculate the direction vector 34 of the finger 20 pointing from the adjacent area 32 toward the finger touch area 31 (531).
Then, the controller 13 can determine the angle of the direction vector 34 (532).
Generally, executing different instructions according to the finger type makes rapid input actions possible on a mobile terminal whose display unit is relatively small. The user of a portable terminal can therefore generally be assumed to be located at the bottom of the display unit 11, and the angular range of the direction vector 33 is then 0° to 180°.
The controller 13 can determine whether the touching hand is the right hand or the left hand based on the determined angle of the direction vector 33 (for example, referring to Table 3), and can accordingly produce finger type information (535). Table 3 can be stored in the device 10 by any suitable means, or supplied to the device 10.
【Table 3】
Vector angle | Finger type information |
0° to 90° | Left hand |
90° to 180° | Right hand |
In Table 3, if the angle of the direction vector 33 is in the range 0° to 90°, the type of the finger 20 is a left-hand finger. If the angle of the direction vector 33 is in the range 90° to 180°, the type of the finger 20 is a right-hand finger.
If the adjacent area 32 is not determined in step 530, the controller 13 can determine whether the position of the finger touch area 31 is on the right side or the left side of the display unit, and can produce the finger type information accordingly (533).
For example, if the position of the finger touch area 31 is on the left side of the display unit, the finger 20 can be determined to be a left-hand finger. If the position of the finger touch area 31 is on the right side, the finger 20 can be determined to be a right-hand finger.
After determining whether the finger 20 is a right-hand finger or a left-hand finger (step 533 or step 535), the controller 13 can execute the instruction associated with the determined icon according to the produced finger type information (63). For example, when the icon is a numeric key and the type of the finger 20 is a right-hand finger, the right-side character can be input. If the type of the finger 20 is a left-hand finger, the left-side character can be input.
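The two branches above, Table 3 when the direction vector is available (step 535) and the left/right fallback when the adjacent area was not detected (step 533), can be combined in one function. This is a hedged sketch under our own naming; the handling of the exact 90° boundary is an assumption, since the table assigns 90° to both rows.

```python
def finger_type(angle=None, touch_x=None, screen_width=None):
    """Return 'left' or 'right' for the touching hand.

    If the direction-vector angle (0-180, user assumed at the bottom of
    the display) is known, apply Table 3. Otherwise fall back to whether
    the touch area lies in the left or right half of the display."""
    if angle is not None:
        # Table 3: 0-90 degrees -> left hand, 90-180 degrees -> right hand
        return "left" if angle < 90.0 else "right"
    # Fallback (step 533): edge touch with no adjacent area detected
    return "left" if touch_x < screen_width / 2 else "right"
```

A finger reaching in from the lower right (angle above 90°) is classified as a right hand, and a touch at the left edge with no adjacent area still resolves to a left hand via the fallback.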
Embodiment 1
Various instructions can thus be executed, according to the produced finger-type-based information, for the instruction corresponding to the touched icon. Examples of instruction execution are provided below.
Fig. 5 shows an example of instruction execution on the device of Fig. 1. In Fig. 5, a picture corresponding to a touched icon can be rotated, moved, and magnified, and then displayed. The picture can then be placed at a position on the display unit corresponding to the position of the user.
As shown in (a) of Fig. 5, pictures can be displayed in reduced thumbnail form, and the icons can be arranged at random. When the user 35 touches an icon 40 on the display unit 11, the controller 13 can calculate the direction vector 33 of the finger image of the finger 20, measure the vector angle, and estimate the position of the user 35. In (a) of Fig. 5, the direction vector 33 of the finger 20 is in the range 45° to 135°; accordingly, it can be determined that the position of the user 35 corresponds to the bottom of the display unit 11. When the controller 13 has determined the position of the user 35 from the finger-direction-based information, the picture corresponding to the touched icon 40 can be rotated, moved, magnified, and displayed so as to correspond to the position of the user 35, as shown in (b) of Fig. 5.
In a traditional touch user interface, performing such an instruction requires selecting the icon, dragging the icon to the position of the user, and then rotating the icon by hand; according to the exemplary embodiment of the present invention, however, these actions are performed by a single touch action on the icon. In addition, according to the exemplary embodiment of the present invention, since the position of the user is used as part of the finger-type-based information, the exemplary embodiment of the present invention is useful when many users perform touch actions on one display unit.
Fig. 6 shows a second example of instruction execution on the device of Fig. 1. In Fig. 6, the icons shown in the first example of Fig. 5 can form a menu of function options.
As shown in (a) of Fig. 6, when the user 35 touches, from among multiple icons representing function options, an icon 50 (for example, one representing a schedule), the controller 13 can magnify and display a calendar 52 for managing the schedule, in correspondence with the determined user position.
Fig. 7 shows a third example of instruction execution on the device of Fig. 1. In Fig. 7, the picture corresponding to the touched icon is displayed rotated so as to correspond to the position of the user.
As shown in (a) of Fig. 7, when the user 35 touches an icon 60, the controller 13 can rotate and display the icon 60 according to the positional information of the user determined by the controller 13, as shown in (b) of Fig. 7. In Fig. 7, however, the position of the user can be determined by the controller 13 from the specific detected direction vector of the finger 20, regardless of whether the direction vector corresponds to the right side, left side, top, or bottom of the display unit 11 (as in the first example). Therefore, the position of the user can be the starting point of the direction vector of the finger 20, and the icon 60 can be rotated to correspond to the position (that is, the direction vector) of the user.
Fig. 8 shows a fourth example of instruction execution on the device of Fig. 1. In Fig. 8, a screen image displayed in a predetermined orientation on the display unit 11 is rotated so as to correspond to the position of the user.
In (a) of Fig. 8, the screen image 70 can initially be displayed so as to correspond to a user located at the left side of the display unit 11. As shown in (a) of Fig. 8, when a touch action is performed by the finger 20 of a user located at the bottom of the display unit 11, the controller 13 can produce the positional information of that user, and, as shown in (b) of Fig. 8, the controller 13 can rotate and display the screen image 70 so as to correspond to the user located at the bottom of the display unit 11.
Embodiment 2
Fig. 9 is a perspective view showing finger touch actions on a device that provides a user interface using finger-type-based information, according to an exemplary embodiment of the present invention.
In the example shown in (a) of Fig. 9, most of the underside of the finger 20 touches the display unit 11, and the finger touch area 81 can be relatively large. In the example shown in (b) of Fig. 9, only the tip of the finger 20 touches the display unit 11, and the finger touch area 82 can be relatively small. Accordingly, different instructions can be executed according to the finger touch action.
Fig. 10 shows a first example of instruction execution on the device of Fig. 9. In Fig. 10, the previous-level menu or the next-level menu can be displayed according to the finger touch form.
Referring to Fig. 10, if the finger touch area 81 is "large", the next-level menu corresponding to the touched icon can be displayed. Although not shown in Fig. 10, if the finger touch area is "small", the previous-level menu corresponding to the touched icon can be displayed.
Fig. 11 shows a second example of instruction execution on the device of Fig. 9. In Fig. 11, an input instruction corresponding to a right-button/left-button operation of a conventional mouse can be executed according to the finger touch form.
Referring to Fig. 11, if the finger touch area 82 is "small", a menu can be displayed in a manner similar to right-clicking a conventional mouse while the mouse pointer is located on the respective icon. Although not shown in Fig. 11, if the finger touch area is "large", the instruction corresponding to the touched icon can be executed in a manner similar to left-clicking a conventional mouse while the mouse pointer is located on the respective icon.
Fig. 10 shows, according to the touch type of the finger, the display of the previous-level or next-level menu of an instruction executed in a manner similar to right-clicking or left-clicking a conventional mouse. However, the exemplary embodiments of the present invention are not limited to this, and may also be applied to the input of characters. For example, when the finger touch area 81 is "large", the right-side character of the respective icon can be input, and when the finger touch area 81 is "small", the left-side character can be input.
Embodiment 3
Fig. 12 is a perspective view of a device using finger-type-based information, according to an exemplary embodiment of the present invention.
Referring to Fig. 12, the controller 13 can use the direction vector 33 of the detected finger to determine whether the touching finger is a right-hand finger 111 or a left-hand finger 112. In addition, in some cases, because the touch action is performed at an edge of the display unit 11 (for example, at the right or left side), the non-touch (adjacent) area 32 of the finger cannot be determined, and therefore the direction vector 33 of the touching finger 20 cannot be determined. If the direction vector 33 cannot be determined, the controller can determine whether the touching finger is a right-hand finger 111 or a left-hand finger 112 based on whether the finger touch area 31 lies toward the right side or the left side of the display unit 11.
Fig. 13 shows instruction execution on the device of Fig. 12 when a character array for inputting characters is displayed on the display unit 11.
Referring to Fig. 13, for example, if the controller 13 determines that the icon 123 is touched by a right-hand finger 111, the character "+" can be input; if the controller 13 determines that the icon 123 is touched by a left-hand finger 112, the character "-" can be input. If the controller 13 determines that the icon 121 is touched by a right-hand finger 111, the character "]" can be input; if the controller 13 determines that the icon 121 is touched by a left-hand finger 112, the character "[" can be input.
Thus, according to the exemplary embodiment of the present invention, the finger type can be determined by a single touch action, and different instructions can then be executed. Therefore, when inputting characters on a device with a small display (for example, a mobile device), the traditional input method, in which a left-side character is input by one touch action and a right-side character is input by two touch actions, is unnecessary. Characters can therefore be input faster.
Fig. 13 shows the case of inputting the right-side or left-side character of the respective icon according to the finger type. However, the exemplary embodiments of the present invention are not limited to this, and can also be applied, for example, to displaying the previous-level or next-level menu, or to displaying a menu or executing an instruction in a manner similar to right-clicking or left-clicking a conventional mouse, as explained above. For example, if the touching finger type is a right-hand finger 111, a menu can be displayed, similar to the menu displayed when an icon is clicked with the right mouse button. If the touching finger type is a left-hand finger 112, the instruction corresponding to the touched icon can be executed, similar to when the left mouse button is clicked.
Embodiment 4
Referring to Fig. 14 to Fig. 18, a description is provided of the operation of the user interface device when the user's finger touches the display unit 11 and a movement is input on the display unit 11. In an embodiment of the present invention, the movement of the user's finger corresponds to a two-dimensional motion in which the finger draws a circle around the finger touch area.
Fig. 14 shows a first example of the output screen of the device of Fig. 1 according to an exemplary embodiment of the present invention.
(a) of Fig. 14 shows a screen on which a picture is displayed full-screen on the display unit 11, where the user's finger touches the display unit 11 and performs a circle-drawing motion around the finger touch area on the display unit 11.
The sensor 12 detects the motion of the user's finger, produces a sensor signal, and outputs the sensor signal to the controller 13. The controller 13 recognizes the changing finger touch area 31 and the changing adjacent area 32, and detects the changing direction vector 34 of the finger based on the recognized finger touch area 31 and the recognized adjacent area 32. Then, the controller 13 compares the direction vector of the finger before the motion of the user's finger is input with the direction vector of the finger after the motion of the user's finger has been input, estimates the change in the angle of the direction vector of the finger, and performs the function corresponding to the estimated change in the angle of the direction vector. The controller 13 can determine whether the change in the angle of the direction vector of the finger is equal to or greater than a preset angle. Only when the change in the angle of the direction vector of the finger is equal to or greater than the preset angle can the controller determine that an event has occurred, and then perform the corresponding function. In an embodiment of the present invention, the controller 13 can also estimate the speed of change of the angle of the direction vector of the finger. If the user's finger performs the circle-drawing motion quickly, the controller 13 can estimate the speed of change of the angle of the direction vector of the finger, and then perform the corresponding function.
As shown in Fig. 14, screen (b) magnifies the picture displayed on screen (a). In an embodiment of the present invention, when the user's finger draws a circle clockwise, the display device can magnify and display the image on the screen, and when the user's finger draws a circle counter-clockwise, it can reduce and display the image. If the user's finger draws a circle rapidly on the screen, the display device can also zoom the image in or out rapidly. On the other hand, if the user's finger draws a circle on the screen and then remains at that position, the display device can zoom the image in or out progressively with a preset period.
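The event logic just described, comparing the direction-vector angle before and after the motion, ignoring changes below a preset angle, and mapping the rotation direction to a zoom action, can be sketched as follows. The threshold value and the assumption that angles are measured counter-clockwise (so a clockwise stroke decreases the angle) are ours, not the patent's.

```python
def zoom_step(prev_angle, new_angle, min_delta=15.0):
    """Decide a zoom action from the change in the finger's
    direction-vector angle during a circle-drawing motion.

    Clockwise rotation (angle decreasing, under a counter-clockwise
    angle convention) zooms in; counter-clockwise zooms out; changes
    smaller than the hypothetical preset angle min_delta raise no event."""
    delta = new_angle - prev_angle
    # Normalise to (-180, 180] so wrap-around through 0/360 is handled.
    while delta <= -180.0:
        delta += 360.0
    while delta > 180.0:
        delta -= 360.0
    if abs(delta) < min_delta:
        return "none"   # change below the preset angle: no event occurs
    return "zoom_in" if delta < 0 else "zoom_out"
```

The speed-based variant described above would additionally divide the angle change by the elapsed time and scale the zoom step accordingly.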
Fig. 15 shows a second example of the output screen of the device according to an exemplary embodiment of the present invention.
As shown in Fig. 15, screen (a) shows a state in which the user's finger touches an item and then draws a circle around the touched point.
Screen (a) shows a part of the whole item, the remainder being cut off because of the screen size. In this case, the controller 13 can perform a function of copying the cut-off part of the item. When the user opens the MMS message composition window, loads a web page in a multitasking operation, touches an item on the web page, and then performs the circle-drawing motion, the item is copied automatically and then registered as an attachment to the MMS message, or pasted into the MMS message. As shown in Fig. 15, screen (b) shows a state in which the selected item is automatically pasted into the MMS message composition window.
Fig. 16 shows a third example of the output screen of the device of Fig. 1 according to an exemplary embodiment of the present invention.
As shown in Fig. 16, screen (a) shows a state in which the finger touches an icon of a menu containing submenus. When the user's finger touches a particular icon, the touched icon displays its submenus, for example "twit", "blog", and "facebook", according to the finger-type-based information. As shown in screen (a), the finger type is such that the fingertip points from the lower right toward the upper left. As shown in Fig. 16, screen (b) shows a state in which, after the finger draws a circle clockwise on screen (a), the fingertip points from the lower left toward the upper right. In this case, the submenu icons are also arranged from lower left to upper right. The user can thus move or arrange the GUI while the finger draws a circle on the screen.
Fig. 17 shows a fourth example of the output screen of the device of Fig. 1 according to an exemplary embodiment of the present invention.
As shown in Fig. 17, screen (a) shows a state in which the user's finger touches the playback icon at the lower left of a screen displaying multimedia content and performs a circle-drawing motion. In this case, the controller 13 performs a fast-forward or rewind function, and controls the display unit 11 to display a playback-information scroll bar. For example, if the user's finger draws a circle clockwise, the controller 13 performs the fast-forward function; if the user's finger draws a circle counter-clockwise, the controller 13 performs the rewind function.
As shown in Fig. 17, screen (b) shows a state in which the user's finger touches the volume icon at the lower right of a screen displaying multimedia content and performs a circle-drawing motion. In this case, the controller 13 performs a volume-increase or volume-decrease function, and controls the display unit 11 to display a volume-information scroll bar. For example, if the user's finger draws a circle clockwise, the controller 13 performs the volume-increase function; if the user's finger draws a circle counter-clockwise, the controller 13 performs the volume-decrease function.
Fig. 18 shows a fifth example of the output screen of the device of Fig. 1 according to an exemplary embodiment of the present invention.
As shown in Fig. 18, screen (a) shows a state in which the user's finger touches a particular icon. In this case, the controller 13 controls the display unit 11 to execute the touched icon and output the corresponding function. If only a touch action is input, the controller 13 recognizes the action as a select or execute event, similar to the click action of the left mouse button.
As shown in Fig. 18, screen (b) shows a state in which the user's finger touches a particular icon and then performs a circle-drawing motion. In this case, the controller 13 controls the display unit 11 to display a menu of copy/paste submenus. If a touch action and a circle-drawing motion are input, the controller 13 recognizes the action as a menu-output event, similar to the click action of the right mouse button.
It should be understood that the invention is not limited to these embodiments. For example, the screen can display multiple icons. If the user's finger touches one of the multiple icons, the controller executes its corresponding function. Alternatively, if the user's finger touches one of the multiple icons and then draws a circle around that icon, the controller can switch the present mode to a mode that allows a selection to be made among the multiple icons.
Embodiment 5
Referring to Fig. 19, a description is provided of the operation of the user interface when the user's finger touches the display unit 11 and a movement is input on the display unit 11. In an embodiment of the present invention, the movement of the user's finger corresponds to a three-dimensional motion in which the finger is slowly bent so as to change the finger touch area.
Fig. 19 shows an example of the output screen of the device of Fig. 1 according to a further exemplary embodiment of the present invention.
As shown in Fig. 19, screen (a) is similar to the screen of Fig. 10. Screen (a) shows a state in which the whole tip portion of the user's finger, from the fingertip to the first joint, touches the screen. The controller 13 detects the size of the finger touch area 81, and then controls the display unit 11 to output the corresponding GUI. As shown in screen (a), the controller 13 controls the display unit 11 to display the submenu icons. It is then assumed that the touch action is performed only by the fingertip of the bent finger.
As shown in Fig. 19, screen (b) is similar to the screen of Fig. 11, and shows a state in which only a part of the tip portion of the user's finger touches the screen. The controller 13 detects the size of the finger touch area 82, and then controls the display unit 11 to output the corresponding GUI. As shown in screen (b), the controller 13 controls the display unit 11 to display the previous-level menu icon. Here it is assumed that the touch action is performed only by the fingertip of the bent finger. That is, the controller 13 can recognize the change in the size of the finger touch area, and then control the display unit to change and display the GUI.
The method of executing instructions according to information on a finger detected by an optical touch device may be recorded as a program in a computer-readable recording medium. The program for detecting a finger and executing instructions includes: an instruction for displaying at least one graphical user interface (GUI) on the display unit 11; an instruction for recognizing, when a user's finger touches the screen, the finger touch area and the adjacent area on the screen based on the sensor signal produced by the input sensor 12; an instruction for generating information on the shape of the touching finger from the recognized touch area 31 and the recognized adjacent area 32; and an instruction for changing and displaying the GUI according to the information on the shape of the touching finger.
If the sensor 12 is implemented as an optical sensor, the program may include an instruction ordering the sensor 12 to detect a touch action of the finger 20 on the display unit 11 and to obtain an image of the finger 20, an instruction ordering the controller 13 to determine the finger touch area 31 and the adjacent area 32 based on the finger image obtained by the sensor 12, and an instruction ordering the controller 13 to determine the touched icon based on the finger touch area 31. The program also includes an instruction ordering the controller 13 to generate an instruction corresponding to the detected icon based on the finger type information determined from the finger touch area 31 and the adjacent area 32, and an instruction ordering the controller 13 to execute the instruction corresponding to the detected icon according to the finger type information. The finger type information may include the position of the user, the touch form of the finger, and the finger type. In these exemplary embodiments, the instructions can be executed according to the information described above.
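The flow described above (recognizing a touch area and an adjacent area from an optical-sensor image, then resolving the touched icon) can be sketched in Python. The signal encoding (2 for firm fingertip contact, 1 for the hovering finger body) and the icon table are illustrative assumptions, not part of the patent:

```python
def segment_input_area(image):
    """Split a sensor image into a touch area (strong signal, value 2) and
    an adjacent area (weak signal, value 1). Returns two lists of (x, y)."""
    touch, adjacent = [], []
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            if value == 2:
                touch.append((x, y))
            elif value == 1:
                adjacent.append((x, y))
    return touch, adjacent

def touched_icon(touch_area, icons):
    """Return the name of the icon whose bounding box (x0, y0, x1, y1)
    contains the centroid of the touch area, or None if no icon matches."""
    cx = sum(x for x, _ in touch_area) / len(touch_area)
    cy = sum(y for _, y in touch_area) / len(touch_area)
    for name, (x0, y0, x1, y1) in icons.items():
        if x0 <= cx <= x1 and y0 <= cy <= y1:
            return name
    return None
```

A controller implementing the described program would then select the instruction registered for the returned icon and execute it using the finger-shape information.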
As described above, according to the present invention, the user interface allows various user-friendly inputs to be performed on the display device. In addition, a user can provide input quickly and accurately through the user interface of a device equipped with a small-sized display.
It will be appreciated by those skilled in the art that various changes and modifications can be made to the present invention without departing from its spirit and scope. Therefore, the present invention covers such variants and modifications as fall within the scope of the claims and their equivalents.
Claims (20)
1. A portable device, comprising:
a display unit configured to display at least one object;
an input unit configured to receive a user input associated with a touch on the display unit, the input unit comprising an optical sensor and a touch sensor; and
a controller configured to:
identify, based on the received user input, a user input area comprising a touch area and an adjacent area,
identify an object in the touch area as a touched object,
generate touch information corresponding to the user input area, the touch information comprising at least one of user position information, finger touch form information, and finger type information,
select an instruction associated with the touched object from a plurality of executable instructions, and
process the touched object using the touch information according to the selected instruction.
2. The portable device as claimed in claim 1, wherein the touch area corresponds to a touched portion of the user input area associated with the touch, and the adjacent area corresponds to an untouched portion of the user input area associated with the touch.
3. The portable device as claimed in claim 1, wherein the controller is further configured to:
determine a direction vector of the touch based on the user input area;
obtain an angle associated with the direction vector; and
generate the user position information by estimating a user position based on the angle.
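The direction-vector step of claim 3 can be illustrated with a short sketch: the vector runs from the centroid of the adjacent area (the finger body) to the centroid of the touch area (the fingertip), and its angle yields a coarse estimate of which screen edge the user occupies. The coordinate system (mathematical, y up) and the edge mapping are assumptions for illustration:

```python
import math

def centroid(points):
    """Arithmetic mean of a list of (x, y) points."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return sum(xs) / len(xs), sum(ys) / len(ys)

def touch_angle(touch_area, adjacent_area):
    """Angle in degrees [0, 360) of the direction vector pointing from the
    adjacent area (finger body) toward the touch area (fingertip)."""
    ax, ay = centroid(adjacent_area)
    tx, ty = centroid(touch_area)
    return math.degrees(math.atan2(ty - ay, tx - ax)) % 360

def estimate_user_edge(angle):
    """Map the pointing angle to the screen edge the user likely occupies:
    a fingertip pointing right implies a user on the left, and so on."""
    if angle < 45 or angle >= 315:
        return "left"
    if angle < 135:
        return "bottom"
    if angle < 225:
        return "right"
    return "top"
```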
4. The portable device as claimed in claim 3, wherein the controller is configured to process the touched object by performing at least one of rotating, moving, and enlarging the touched object based on the user position information and the selected instruction.
5. The portable device as claimed in claim 1, wherein the controller is further configured to:
determine a horizontal length and a vertical length associated with the touch area; and
generate the finger touch form information associated with the touch by determining a size of the touch area based on the horizontal length and the vertical length.
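One plausible reading of claim 5 in code: take the bounding box of the touch area, use its horizontal and vertical lengths to compute a size, and classify the touch form from that size. The threshold separating a fingertip touch from a flat-finger touch is an assumed example value in sensor pixels:

```python
def finger_touch_form(touch_area, fingertip_max_size=25):
    """Derive finger touch form information from the bounding box of the
    touch area, given as a list of (x, y) pixels."""
    xs = [p[0] for p in touch_area]
    ys = [p[1] for p in touch_area]
    width = max(xs) - min(xs) + 1    # horizontal length
    height = max(ys) - min(ys) + 1   # vertical length
    size = width * height
    form = "fingertip" if size <= fingertip_max_size else "flat_finger"
    return {"width": width, "height": height, "size": size, "form": form}
```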
6. The portable device as claimed in claim 5, wherein the controller is configured to process the touched object by executing the selected instruction based on the finger touch form information.
7. The portable device as claimed in claim 1, wherein the controller is further configured to:
determine a direction vector of the touch based on the user input area;
obtain an angle associated with the direction vector; and
generate the finger type information by determining, based on the angle, whether the touch is associated with a right hand or a left hand.
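A minimal sketch of the right-hand/left-hand decision in claim 7, under an assumed sign convention (the fingertip direction angle in degrees, as in the earlier direction-vector claims): a finger leaning toward the left half of the angle circle suggests a right hand reaching in from the right, and vice versa.

```python
def finger_type_from_angle(angle):
    """Guess the touching hand from the fingertip direction angle.
    The 90..270-degree band for 'right hand' is an illustrative
    convention, not taken from the patent."""
    a = angle % 360
    return "right_hand" if 90 < a <= 270 else "left_hand"
```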
8. The portable device as claimed in claim 7, wherein the controller is configured to process the touched object by executing the selected instruction based on the finger type information.
9. The portable device as claimed in claim 1, wherein the optical sensor is configured to detect the user input based on an amount of light reflected in association with the touch, and the touch sensor is configured to detect the user input based on an amount of charge changed in association with the touch,
wherein, to identify the user input area, the controller is configured to convert the touch area into a digital image.
10. The portable device as claimed in claim 1, wherein the controller is further configured to:
receive, from the input unit, a user input corresponding to a movement associated with the touch;
identify the user input area based on the user input corresponding to the movement;
determine a direction vector associated with the touch based on the user input area;
estimate a change in the angle associated with the direction vector; and
perform a function corresponding to the change in the angle.
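The movement handling in claim 10 (estimating the change of the direction-vector angle and invoking a matching function) might look like the following; the threshold and function names are illustrative, not taken from the claims:

```python
def angle_delta(previous, current):
    """Smallest signed difference between two angles, in degrees,
    wrapped into the range (-180, 180]."""
    return (current - previous + 180) % 360 - 180

def function_for_rotation(delta, threshold=10):
    """Map an estimated angle change to a UI function."""
    if delta > threshold:
        return "rotate_counterclockwise"
    if delta < -threshold:
        return "rotate_clockwise"
    return "no_op"
```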
11. The portable device as claimed in claim 1, wherein the controller is further configured to:
receive, from the input unit, a user input corresponding to a movement associated with the touch;
estimate a change in the touch area based on the user input corresponding to the movement; and
perform a function corresponding to the change in the touch area.
12. The portable device as claimed in claim 1, wherein the controller is further configured to identify an open area different from the user input area, the open area corresponding to a non-user input area.
13. A method for processing a user interface in a portable device, the method comprising:
displaying at least one object on a display unit;
receiving a user input associated with a touch on the display unit, the user input being detected by at least one of an optical sensor and a touch sensor;
identifying, based on the received user input, a user input area comprising a touch area and an adjacent area;
identifying an object in the touch area as a touched object;
generating touch information corresponding to the user input area, the touch information comprising at least one of user position information, finger touch form information, and finger type information;
selecting an instruction associated with the touched object from a plurality of executable instructions; and
processing the touched object using the touch information according to the selected instruction.
14. The method as claimed in claim 13, wherein generating the touch information further comprises:
determining a direction vector of the touch based on the user input area;
determining an angle of the direction vector; and
generating the user position information by estimating a user position based on the angle.
15. The method as claimed in claim 13, wherein processing the touched object comprises performing at least one of rotating, moving, and enlarging the touched object based on the user position information and the selected instruction.
16. The method as claimed in claim 13, wherein generating the touch information further comprises:
determining a horizontal length and a vertical length associated with the touch area;
determining a size associated with the touch area based on the horizontal length and the vertical length; and
generating the finger touch form information associated with the touch based on the determined size.
17. The method as claimed in claim 13, wherein generating the touch information comprises:
determining a direction vector of the touch based on the user input area;
determining an angle associated with the direction vector;
determining, based on the determined angle, whether the touch is associated with a right hand or a left hand; and
generating the finger type information based on the determination.
18. The method as claimed in claim 13, wherein generating the touch information comprises:
if no adjacent area is identified, determining whether a position associated with the touch area corresponds to a left side or a right side of the display unit; and
generating the finger type information based on the determination.
19. The method as claimed in claim 14, further comprising:
receiving a user input associated with a movement of the touch;
identifying the user input area based on the user input associated with the movement;
determining a direction vector associated with the touch based on the user input area;
estimating a change in the angle associated with the direction vector; and
performing a function corresponding to the estimated change in the angle.
20. The method as claimed in claim 13, further comprising:
receiving a user input associated with a movement of the touch;
estimating a change in the touch area based on the user input associated with the movement; and
performing a function corresponding to the estimated change.
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR20080117358 | 2008-11-25 | ||
KR10-2008-0117358 | 2008-11-25 | ||
KR10-2009-0113076 | 2009-11-23 | ||
KR1020090113076A KR20100059698A (en) | 2008-11-25 | 2009-11-23 | Apparatus and method for providing user interface, and computer-readable recording medium recording the same |
CN200910225143A CN101739208A (en) | 2008-11-25 | 2009-11-25 | Device and method for providing a user interface |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN200910225143A Division CN101739208A (en) | 2008-11-25 | 2009-11-25 | Device and method for providing a user interface |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107066137A true CN107066137A (en) | 2017-08-18 |
CN107066137B CN107066137B (en) | 2021-04-27 |
Family
ID=42360946
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN200910225143A Pending CN101739208A (en) | 2008-11-25 | 2009-11-25 | Device and method for providing a user interface |
CN201611178513.7A Expired - Fee Related CN107066137B (en) | 2008-11-25 | 2009-11-25 | Apparatus and method for providing user interface |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN200910225143A Pending CN101739208A (en) | 2008-11-25 | 2009-11-25 | Device and method for providing a user interface |
Country Status (2)
Country | Link |
---|---|
KR (3) | KR20100059698A (en) |
CN (2) | CN101739208A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108109581A (en) * | 2018-01-16 | 2018-06-01 | 深圳鑫亿光科技有限公司 | Interactive LED display and its display methods |
Families Citing this family (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2203865A2 (en) | 2007-09-24 | 2010-07-07 | Apple Inc. | Embedded authentication systems in an electronic device |
KR101694787B1 (en) * | 2010-06-30 | 2017-01-10 | 엘지전자 주식회사 | Mobile terminal and control method for mobile terminal |
US20120096349A1 (en) * | 2010-10-19 | 2012-04-19 | Microsoft Corporation | Scrubbing Touch Infotip |
CN102479035A (en) * | 2010-11-23 | 2012-05-30 | 汉王科技股份有限公司 | Electronic device with touch screen, and method for displaying left or right hand control interface |
TWI456509B (en) * | 2010-11-26 | 2014-10-11 | Acer Inc | Finger recognition methods and systems, and computer program products thereof |
CN102566858B (en) * | 2010-12-09 | 2014-12-03 | 联想(北京)有限公司 | Touch control method and electronic equipment |
CN102810039A (en) * | 2011-05-31 | 2012-12-05 | 中兴通讯股份有限公司 | Left or right hand adapting virtual keyboard display method and terminal |
JP2013117890A (en) * | 2011-12-05 | 2013-06-13 | Sony Corp | Electronic apparatus and operation method of electronic apparatus |
TWI450150B (en) * | 2011-12-21 | 2014-08-21 | Wistron Corp | Touch method and touch system |
CN102707878A (en) * | 2012-04-06 | 2012-10-03 | 深圳创维数字技术股份有限公司 | User interface operation control method and device |
JP5966557B2 (en) | 2012-04-19 | 2016-08-10 | ソニー株式会社 | Information processing apparatus, information processing method, program, and information processing system |
BR112014028774B1 (en) | 2012-05-18 | 2022-05-10 | Apple Inc | Method, electronic device, computer readable storage medium and information processing apparatus |
KR102073601B1 (en) * | 2012-07-25 | 2020-02-06 | 삼성전자주식회사 | User terminal apparatus and control method thereof |
CN103576844B (en) * | 2012-08-01 | 2017-11-03 | 联想(北京)有限公司 | The method and electronic equipment of display data |
CN103403665B (en) * | 2012-08-29 | 2016-08-03 | 华为终端有限公司 | A kind of terminal unit obtains method and the terminal unit of instruction |
CN103679017B (en) * | 2012-09-05 | 2017-06-16 | 腾讯科技(深圳)有限公司 | Prevent the device and method that user interface is held as a hostage |
CN103838500A (en) * | 2012-11-20 | 2014-06-04 | 联想(北京)有限公司 | Operand set displaying method and electronic equipment |
CN103902206B (en) * | 2012-12-25 | 2017-11-28 | 广州三星通信技术研究有限公司 | The method and apparatus and mobile terminal of mobile terminal of the operation with touch-screen |
CN103927105A (en) * | 2013-01-11 | 2014-07-16 | 联想(北京)有限公司 | User interface display method and electronic device |
CN105446630B (en) * | 2014-06-16 | 2019-07-26 | 联想(北京)有限公司 | A kind of information processing method and device |
KR102255143B1 (en) * | 2014-09-02 | 2021-05-25 | 삼성전자주식회사 | Potable terminal device comprisings bended display and method for controlling thereof |
KR102344045B1 (en) * | 2015-04-21 | 2021-12-28 | 삼성전자주식회사 | Electronic apparatus for displaying screen and method for controlling thereof |
KR102461584B1 (en) | 2015-11-20 | 2022-11-02 | 삼성전자주식회사 | Input processing method and device |
KR102334521B1 (en) * | 2016-05-18 | 2021-12-03 | 삼성전자 주식회사 | Electronic apparatus and method for processing input thereof |
CN107589881A (en) * | 2016-07-06 | 2018-01-16 | 中兴通讯股份有限公司 | The method and apparatus that a kind of intelligence calls desktop layouts |
KR102123145B1 (en) * | 2018-02-21 | 2020-06-15 | 박종환 | Input method and input device based on position of finger on input device |
US11409410B2 (en) | 2020-09-14 | 2022-08-09 | Apple Inc. | User input interfaces |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040021643A1 (en) * | 2002-08-02 | 2004-02-05 | Takeshi Hoshino | Display unit with touch panel and information processing method |
US20040150668A1 (en) * | 2003-01-31 | 2004-08-05 | Xerox Corporation | Secondary touch contextual sub-menu navigation for touch screen interface |
CN1685301A (en) * | 2002-09-30 | 2005-10-19 | 三洋电机株式会社 | Mobile digital devices |
US20060010400A1 (en) * | 2004-06-28 | 2006-01-12 | Microsoft Corporation | Recognizing gestures and using gestures for interacting with software applications |
CN1867140A (en) * | 2005-05-16 | 2006-11-22 | Lg电子株式会社 | Mobile terminal having scrolling device and method implementing functions using the same |
WO2006126310A1 (en) * | 2005-05-27 | 2006-11-30 | Sharp Kabushiki Kaisha | Display device |
US20070097096A1 (en) * | 2006-03-25 | 2007-05-03 | Outland Research, Llc | Bimodal user interface paradigm for touch screen devices |
US20070262961A1 (en) * | 2006-05-10 | 2007-11-15 | E-Lead Electronic Co., Ltd. | Method for selecting functional tables through a touch-sensitive button key |
US20070300182A1 (en) * | 2006-06-22 | 2007-12-27 | Microsoft Corporation | Interface orientation using shadows |
CN101097496A (en) * | 2006-06-29 | 2008-01-02 | 株式会社Aki | Operation method for touch panel and character input method |
US20080168403A1 (en) * | 2007-01-06 | 2008-07-10 | Appl Inc. | Detecting and interpreting real-world and security gestures on touch and hover sensitive devices |
US20080189657A1 (en) * | 2007-02-03 | 2008-08-07 | Lg Electronics Inc. | Mobile communication device and method of controlling operation of the mobile communication device |
EP1980935A1 (en) * | 2006-02-03 | 2008-10-15 | Matsushita Electric Industrial Co., Ltd. | Information processing device |
US20080276203A1 (en) * | 2007-05-04 | 2008-11-06 | Whirlpool Corporation | User interface and cooking oven provided with such user interface |
-
2009
- 2009-11-23 KR KR1020090113076A patent/KR20100059698A/en active Application Filing
- 2009-11-25 CN CN200910225143A patent/CN101739208A/en active Pending
- 2009-11-25 CN CN201611178513.7A patent/CN107066137B/en not_active Expired - Fee Related
-
2016
- 2016-06-03 KR KR1020160069196A patent/KR20160073359A/en active Application Filing
-
2017
- 2017-04-19 KR KR1020170050473A patent/KR20170046624A/en active Search and Examination
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108109581A (en) * | 2018-01-16 | 2018-06-01 | 深圳鑫亿光科技有限公司 | Interactive LED display and its display methods |
CN108109581B (en) * | 2018-01-16 | 2018-12-25 | 深圳鑫亿光科技有限公司 | Interactive LED display and its display methods |
Also Published As
Publication number | Publication date |
---|---|
CN101739208A (en) | 2010-06-16 |
KR20160073359A (en) | 2016-06-24 |
CN107066137B (en) | 2021-04-27 |
KR20170046624A (en) | 2017-05-02 |
KR20100059698A (en) | 2010-06-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107066137A (en) | The apparatus and method of user interface are provided | |
EP3232315B1 (en) | Device and method for providing a user interface | |
TWI358028B (en) | Electronic device capable of transferring object b | |
CN201156246Y (en) | Multiple affair input system | |
CN102362243B (en) | Multi-telepointer, virtual object display device, and virtual object control method | |
US9146672B2 (en) | Multidirectional swipe key for virtual keyboard | |
US8816964B2 (en) | Sensor-augmented, gesture-enabled keyboard and associated apparatus and computer-readable storage medium | |
US9104308B2 (en) | Multi-touch finger registration and its applications | |
US9477396B2 (en) | Device and method for providing a user interface | |
TWI474227B (en) | Interpreting ambiguous inputs on a touch-screen | |
US20100302144A1 (en) | Creating a virtual mouse input device | |
US20120105367A1 (en) | Methods of using tactile force sensing for intuitive user interface | |
US20090102809A1 (en) | Coordinate Detecting Device and Operation Method Using a Touch Panel | |
US20120038496A1 (en) | Gesture-enabled keyboard and associated apparatus and computer-readable storage medium | |
TWI463355B (en) | Signal processing apparatus, signal processing method and selecting method of user-interface icon for multi-touch interface | |
US20110216015A1 (en) | Apparatus and method for directing operation of a software application via a touch-sensitive surface divided into regions associated with respective functions | |
TWI380201B (en) | Method for browsing a user interface for an electronic device and the software thereof | |
CN102084325A (en) | Extended touch-sensitive control area for electronic device | |
TWI361372B (en) | Touch-sensitive control systems and methods | |
EP3008575A1 (en) | Natural quick function gestures | |
TW201003468A (en) | Virtual touchpad | |
EP3100151B1 (en) | Virtual mouse for a touch screen device | |
TW201042515A (en) | Touch screen zoom displaying system and method thereof | |
TWI374374B (en) | Method for operating a user interface for an electronic device and the software thereof | |
TWI460647B (en) | Method for multi-selection for an electronic device and the software thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20210427 |