CN102591450B - Information processing apparatus and operation method thereof - Google Patents


Info

Publication number
CN102591450B
CN102591450B (application CN201110382170.7A; publication CN102591450A)
Authority
CN
China
Prior art keywords
pointer
speed
reference
movement
information processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201110382170.7A
Other languages
Chinese (zh)
Other versions
CN102591450A (en)
Inventor
Keiichi Yamamoto (山本圭一)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc
Publication of CN102591450A
Application granted
Publication of CN102591450B


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

There is provided an information processing apparatus having an interface that is highly convenient for the user. A reference speed is set according to the amount of movement or the movement time period of a pointer such as a stylus or a finger. Whether a flick operation with the pointer has occurred is determined based on the movement speed of the pointer and the reference speed.

Description

Information processing apparatus and operation method thereof
Technical field
The present invention relates to a technique for determining a user's input operation.
Background art
Generally, "dragging" refers to moving a mouse cursor while continuing to press a mouse button. "Dropping" refers to subsequently releasing the pressed mouse button.
Similarly, "dragging" also refers to a pointer, such as a stylus or a finger, touching an arbitrary position on a touch panel and then moving the touch position. "Dropping" also refers to the touching finger subsequently leaving the touch panel.
Further, a "flick" refers to an operation in which a pointer touches an arbitrary position on a touch panel and is then released with a sweeping motion. For example, United States Patent No. 7,761,814 discusses a technique for determining that a flick has occurred when the movement speed of the position touched by a finger on the touch panel satisfies a predetermined reference value.
A technique is also widely known in which, after the finger leaves the touch panel with a flick, a displayed image object moves inertially in the direction of the sweeping motion.
However, a touch panel that can accept both dragging and flicking may suffer from the following problem.
For example, suppose that a displayed object is moved by dragging to a desired position so as to be displayed at the drop position. Because the finger is released at the drop with a sweep-like motion, the drag may be determined to be a flick, and the object may then move on the screen in the detected direction of the sweeping motion.
Summary of the invention
An object of the present invention is to provide an interface that is highly convenient for the user.
According to an aspect of the present invention, there is provided an information processing apparatus configured to determine a flick operation of a pointer, the apparatus comprising: a setting unit configured to set a reference pointer speed according to the movement of the pointer; and a determination unit configured to determine, based on the movement speed of the pointer and the set reference pointer speed, whether a flick operation of the pointer has occurred.
Further features and aspects of the present invention will become apparent from the following detailed description of exemplary embodiments with reference to the attached drawings.
Accompanying drawing explanation
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate exemplary embodiments, features, and aspects of the invention and, together with the description, serve to explain the principles of the invention.
Figs. 1A to 1C illustrate examples of the hardware configuration and functional blocks of an information processing apparatus.
Figs. 2A and 2B illustrate an example of the external view of a digital camera and an example of the external view of a touch panel.
Figs. 3A to 3J illustrate examples of the state of movement of the touch position between the touch panel and a finger.
Figs. 4A and 4B are flowcharts illustrating examples of processing for determining a user operation.
Figs. 5A and 5B illustrate examples of the state of movement of the touch position between the touch panel and a finger.
Figs. 6A to 6E illustrate examples of the state of movement of the display position of an object.
Figs. 7A and 7B are flowcharts illustrating examples of processing for determining a user operation.
Fig. 8 is a flowchart illustrating an example of processing for determining a user operation.
Fig. 9 is a flowchart illustrating an example of processing for determining a user operation.
Fig. 10 is a flowchart illustrating an example of processing for determining a user operation.
Fig. 11 illustrates the relation between the detected direction of a sweeping operation and the moving direction of an object.
Figs. 12A to 12F illustrate examples of the state of movement of the display position of an object.
Figs. 13A and 13B illustrate examples of functional blocks of the information processing apparatus.
Figs. 14A and 14B are flowcharts illustrating examples of processing for determining a user operation.
Figs. 15A and 15B illustrate examples of the state of movement of each touch position between the touch panel and fingers.
Figs. 16A and 16B are flowcharts illustrating examples of processing for determining a user operation.
Fig. 17 is a flowchart illustrating an example of processing for determining a user operation.
Fig. 18 is a flowchart illustrating an example of processing for determining a user operation.
Embodiment
Various exemplary embodiments, features, and aspects of the invention will now be described in detail with reference to the drawings.
Fig. 1A illustrates an example of the hardware configuration of an information processing apparatus 100 according to a first exemplary embodiment of the present invention.
The information processing apparatus 100 includes a central processing unit (CPU) 101, a read-only memory (ROM) 102, a random access memory (RAM) 103, an input/output interface (I/F) 104, an input interface (I/F) 105, and an output interface (I/F) 106.
The components are connected to one another via a system bus 110. A storage unit 107, an input unit 108, and an output unit 109 are connected to the information processing apparatus 100. The components are described below.
The CPU 101 loads a program stored in the ROM 102 into the RAM 103 and executes it, thereby realizing the functional blocks described below. The ROM 102 stores the programs to be executed by the CPU 101 and various data used for executing the programs. The RAM 103 provides a work area for the programs loaded from the ROM 102.
The input/output I/F 104 outputs, to the storage unit 107, data output as the execution results of the various processes described later, and acquires data stored in the storage unit 107.
The input I/F 105 acquires signals output from the input unit 108. The output I/F 106 outputs, to the output unit 109, signals for controlling image output according to the execution results of the various processes.
The storage unit 107 is, for example, a hard disk drive, and stores the data output as the execution results of the various processes.
The input unit 108 is, for example, a mouse, trackball, touch panel, keyboard, or button; it detects a user's input operation and outputs a signal corresponding to the detected operation to the information processing apparatus 100.
The output unit 109 is, for example, a liquid crystal display (LCD), and presents images corresponding to the execution results of the various processes. If the input unit 108 is a liquid crystal touch panel, the input unit 108 also serves as the output unit 109.
Fig. 1B is a functional block diagram illustrating the configuration of the information processing apparatus 100 according to the first exemplary embodiment of the present invention.
The information processing apparatus 100 includes an acquisition unit 121, a specification unit 122, a setting unit 123, a determination unit 124, and a control unit 125. The input unit 108 and the output unit 109 are connected to the information processing apparatus 100.
The CPU 101 loads a program stored in the ROM 102 into the RAM 103 and executes it, thereby realizing the acquisition unit 121, the specification unit 122, the setting unit 123, the determination unit 124, and the control unit 125.
In this case, the CPU 101 can perform processing for reading data from, or writing data to, the storage unit 107. A plurality of storage units corresponding to the ROM 102 or the RAM 103 may be provided in the apparatus as necessary.
The components are described below. Components similar to those in Fig. 1A are given the same reference numerals, and their description is omitted.
The acquisition unit 121 includes the input I/F 105, the CPU 101, the ROM 102, and the RAM 103. The acquisition unit 121 acquires signals output from the input unit 108 and outputs information corresponding to the acquired signals to the specification unit 122.
The information output to the specification unit 122 includes, for example, a set of information indicating the detected position of a user operation (a pointing operation performed with a pointer such as a stylus or a finger), that is, the position indicated by the user, and information indicating the time at which the user operation was detected.
The specification unit 122 includes the CPU 101, the ROM 102, and the RAM 103. The specification unit 122 specifies the amount of movement of the target position based on the information output from the acquisition unit 121. If the input unit 108 is a liquid crystal touch panel, the target position is the position on the touch panel at which a touch by the pointer is detected.
That is, the target position is the position of the pointer within the operation area. The amount of movement of the target position represents the distance moved while the touch position moves with the pointer, such as a finger, in contact with the touch panel, until the pointer leaves the touch panel. In other words, the amount of movement of the target position represents the amount of movement of the pointer.
The amount of movement specified by the specification unit 122 may be, for example, the length of the trajectory along which the pointer moves on the touch panel.
Alternatively, the amount of movement specified by the specification unit 122 may be the straight-line distance from the position at which the pointer first touched the touch panel to the position at which the pointer left it. Alternatively, the amount of movement may be determined based on the number of notification signals indicating movement of the pointer position sent from the acquisition unit 121.
The setting unit 123 includes the CPU 101, and sets, according to the amount of movement of the target position, a reference value (reference speed) for determining that a user operation is a flick. The amount of movement serves as an index for determining whether the target position was moved on the assumption that the user would subsequently perform a flick operation.
In this exemplary embodiment, the amount of movement is described as a typical index because it is useful as such. However, another index, such as the movement time period, may be used instead.
The determination unit 124 includes the CPU 101, and determines that a user operation is a flick based on the amount of movement of the target position per unit time (the movement speed) and the reference value set by the setting unit 123. The movement speed is determined from the time at which the touch state was last detected before the finger left the touch panel and a time a predetermined interval (for example, 20 milliseconds) earlier.
The control unit 125 includes the CPU 101 and the output I/F 106, and controls the signal output to the output unit 109 based on the determination result of the determination unit 124.
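As an illustrative sketch only (not the patent's implementation), the movement speed used by the determination unit can be computed from the last two sampled touch positions, assuming the 20-millisecond sampling interval of this embodiment. All names here are hypothetical.

```python
# Hypothetical sketch: movement speed from the last two touch samples.
# The touch panel is assumed to report (x, y) positions every 20 ms.
import math

SAMPLE_INTERVAL_MS = 20  # predetermined interval from the embodiment

def movement_speed(prev_pos, last_pos, interval_ms=SAMPLE_INTERVAL_MS):
    """Pixels per millisecond between the two most recent detected positions."""
    dx = last_pos[0] - prev_pos[0]
    dy = last_pos[1] - prev_pos[1]
    return math.hypot(dx, dy) / interval_ms

# e.g. 50 pixels covered in the final 20 ms sample
print(movement_speed((0, 0), (50, 0)))  # 2.5
```

The speed obtained this way is what is compared against the reference speed set by the setting unit.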
Fig. 2A is an external view of a digital camera 200 as an example of the information processing apparatus 100.
The digital camera 200 includes a power button 201, a shutter button 202, a lens 203, and a touch panel 204. A finger 210 is the finger of the user operating the touch panel 204. The touch panel 204 corresponds to the input unit 108 and the output unit 109.
Fig. 2B illustrates an external view of the operation area (input area) on the touch panel 204. The operation area corresponds to the entire touch panel 204 or to a partial region in which touches on the touch panel 204 are recognized.
In this exemplary embodiment, the operation area is 500 pixels high and 900 pixels wide. In the operation area, the finger 210 starts touching the touch panel 204 at a position 211, moves to a position 212 while remaining in contact, and leaves the touch panel 204 near the position 212 with a sweeping motion.
Figs. 3A to 3J illustrate a series of operations in which the finger 210 touches the touch panel 204, moves on the touch panel 204 while remaining in contact, and then leaves the touch panel 204.
In this exemplary embodiment, when the power button 201 is operated, the power is turned on. The CPU 101 then outputs, at predetermined intervals (for example, 20 milliseconds), a signal for detecting whether an object is touching the touch panel 204.
If an object is touching the touch panel 204, the CPU 101 also outputs a signal for detecting the touched position (target position). In this exemplary embodiment, the time of power-on is set to 0 seconds.
The finger 210 touches the touch panel 204 at 10 milliseconds after power-on, and starts moving while remaining in contact with the touch panel 204. The position touched by the finger 210 is then detected as the target position at 20 milliseconds. The RAM 103 stores the target position (coordinates) detected at 20 milliseconds together with information indicating the detection time.
Subsequently, the RAM 103 similarly stores the target positions detected at 20-millisecond intervals together with their detection times. If the finger 210 leaves the touch panel 204 at 130 milliseconds, it is determined at 140 milliseconds that no object is touching the touch panel 204.
That is, the touch between the finger 210 and the touch panel 204 actually lasts from 10 milliseconds to 130 milliseconds, whereas the digital camera 200 determines that the touch lasted from 20 milliseconds to 120 milliseconds.
Depending on the circumstances, the detection accuracy of the amount of movement may thus vary to some degree. Therefore, a value corresponding to the movement of the positions at which the touch was detected may be used instead as the amount of movement of the target position.
In this exemplary embodiment, the amount of movement is the distance between the position at which the touch between the finger 210 and the touch panel 204 started and the position at which the touch ended. However, the present invention is not limited to this.
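The sampling behaviour described above can be illustrated numerically: with 20 ms polling, a touch lasting from 10 ms to 130 ms is only observed from the 20 ms sample to the 120 ms sample. A minimal sketch with hypothetical names:

```python
# Hypothetical sketch: which 20 ms polling instants observe a touch.
POLL_INTERVAL_MS = 20

def observed_samples(touch_start_ms, touch_end_ms, horizon_ms=200):
    """Polling instants at which an object is detected on the panel."""
    return [t for t in range(0, horizon_ms + 1, POLL_INTERVAL_MS)
            if touch_start_ms <= t <= touch_end_ms]

samples = observed_samples(10, 130)
print(samples[0], samples[-1])  # 20 120
```

The apparatus thus sees a 100 ms touch where the physical contact lasted 120 ms, which is why the specified amount of movement may differ somewhat from the true movement.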
Fig. 4A is a flowchart illustrating the flow of processing for determining a user operation.
In step S401, the acquisition unit 121 acquires a signal indicating release of the touch by the pointer. The signal indicating release is a signal indicating that the state has changed from a state in which the finger 210 or the like is determined to be touching the touch panel 204 to a state in which the touch has stopped.
In step S402, the specification unit 122 determines whether the target position at which the touch was detected (the position of the pointer) moved during the period from the start of the touch on the touch panel 204 to the end of the touch (the release). If the detected amount of movement of the target position (the movement of the pointer) is small, it may be determined that the target position did not move. For example, if the amount of movement is 10 pixels or less, it may be determined that the target position did not move.
If the specification unit 122 determines that the target position at which the touch was detected moved (YES in step S402), the processing in step S403 is performed. On the other hand, if the specification unit 122 determines that the target position did not move (NO in step S402), the series of processes ends.
In step S403, the specification unit 122 specifies the amount of movement of the target position at which the touch was detected (the amount of movement of the pointer). In step S404, the setting unit 123 determines whether the amount of movement is greater than a predetermined threshold (for example, 300 pixels).
If the setting unit 123 determines that the amount of movement is greater than the predetermined threshold (YES in step S404), the processing in step S405 is performed. If the setting unit 123 determines that the amount of movement is not greater than the predetermined threshold (NO in step S404), the processing in step S406 is performed.
In step S405, the setting unit 123 changes the reference value used for determining that a user operation (movement of the pointer) is a flick. In this exemplary embodiment, the reference value is preset to 0.5 pixels/millisecond as a first reference speed.
In step S405, the reference value is changed to a second reference speed, namely 2.7 pixels/millisecond, which is greater than 0.5 pixels/millisecond. In step S406, the determination unit 124 determines the amount of movement per unit time (partial amount of movement).
The amount of movement per unit time is acquired from the position at which the touch was last detected before the release and the position detected at a different, earlier time. In the example of Figs. 3A to 3J, the amount of movement per unit time can be acquired from the position of the touch detected at 120 milliseconds and the position of the touch detected at 100 milliseconds.
In step S407, the determination unit 124 determines whether the amount of movement per unit time (the movement speed of the pointer) satisfies the reference value (that is, whether it is greater than the reference value).
If the determination unit 124 determines that the movement speed satisfies the reference value (YES in step S407), the processing in step S408 is performed. If the determination unit 124 determines that the movement speed does not satisfy the reference value (NO in step S407), the series of processes ends. In step S408, the determination unit 124 outputs a signal indicating a flick to the control unit 125.
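The flow of steps S401 to S408 can be sketched as follows. This is a hypothetical reconstruction using the embodiment's example values (a 10-pixel movement floor, a 300-pixel threshold, and reference speeds of 0.5 and 2.7 pixels/millisecond); the function and variable names are assumptions, not the patent's.

```python
# Hypothetical sketch of the Fig. 4A determination flow.
MOVE_FLOOR_PX = 10         # at or below this, the target position "did not move" (S402)
AMOUNT_THRESHOLD_PX = 300  # predetermined threshold (S404)
FIRST_REF_SPEED = 0.5      # px/ms, preset reference value
SECOND_REF_SPEED = 2.7     # px/ms, raised reference value (S405)

def is_flick(total_movement_px, final_speed_px_per_ms):
    """Return True if the released touch is determined to be a flick."""
    if total_movement_px <= MOVE_FLOOR_PX:       # S402: no movement -> end
        return False
    if total_movement_px > AMOUNT_THRESHOLD_PX:  # S404/S405: after a long drag,
        reference = SECOND_REF_SPEED             # a flick is harder to trigger
    else:
        reference = FIRST_REF_SPEED
    return final_speed_px_per_ms > reference     # S407

# Fig. 5A case: 400 px drag released at 2.5 px/ms -> drag and drop
print(is_flick(400, 2.5))  # False
# Fig. 5B case: 200 px movement released at 0.7 px/ms -> flick
print(is_flick(200, 0.7))  # True
```

The two example calls correspond to the situations of Figs. 5A and 5B described next, and show how the same sweep-like release is classified differently depending on the preceding amount of movement.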
Figs. 5A and 5B illustrate the state of movement of the position at which a touch between the finger 210 and the touch panel 204 is detected.
Fig. 5A illustrates an example in which the amount of movement just before the sweep-like release is relatively large, and therefore no flick operation is determined. Referring to Fig. 5A, the touch between the finger 210 and the touch panel 204 starts at a position 501 and ends at a position 502. The amount of movement from the position 501 to the position 502 is 400 pixels.
Based on the position at which the touch was last detected before the release and the position detected 20 milliseconds earlier, the amount of movement per unit time is calculated to be 2.5 pixels/millisecond. Because the amount of movement exceeded the threshold, the reference value is 2.7 pixels/millisecond, so the determination unit 124 does not output a signal indicating a flick to the control unit 125. In this case, the series of operations in Fig. 5A is determined to be a drag and drop, not a flick.
Fig. 5B illustrates an example in which the amount of movement just before the release is relatively small, and therefore a flick operation is determined. Referring to Fig. 5B, the finger 210 starts touching the touch panel 204 at a position 511, and the touch ends at a position 512.
The amount of movement from the position 511 to the position 512 is 200 pixels. The amount of movement per unit time, calculated from the position at which the touch was last detected before the release and the position detected 20 milliseconds earlier, is 0.7 pixels/millisecond.
Because the amount of movement did not exceed the threshold, the reference value remains 0.5 pixels/millisecond, so the determination unit 124 outputs a signal indicating a flick to the control unit 125. That is, the series of operations in Fig. 5B is determined to be a flick.
Figs. 6A to 6E illustrate an example of processing performed by the control unit 125 according to this exemplary embodiment.
An image 600 is displayed on the touch panel 204 and contains a flower 601. Consider a case in which the image 600 is enlarged and only a part of it is displayed on the touch panel 204.
Fig. 6C illustrates the part of the image 600 displayed on the touch panel 204. In Fig. 6C, the displayed part of the image 600 corresponds to a region 610 in Fig. 6B.
Next, a drag is performed to move the flower 601 near the center of the touch panel 204. In the drag, the touching finger 210 moves by an amount of movement that is obvious to the user.
Referring to Fig. 6D, in various conventional apparatuses, the operation of the finger 210 leaving the touch panel 204 with a sweep-like motion is determined to be a flick. That is, in many cases it is simply determined that a flick occurred after the drag operation.
In that case, the display area moves inertially in the detected direction of the sweeping operation. That is, in many cases the touch panel 204 displays the image corresponding to a region 630, instead of the region 620 that the user intended.
However, according to this exemplary embodiment, even when the finger 210 leaves the touch panel 204 with a sweep-like motion, it is difficult for the operation to be determined as a flick if the amount of movement before the release was relatively large. Referring to Fig. 6E, the image corresponding to the region 620 is therefore displayed on the touch panel 204.
That is, according to this exemplary embodiment, compared with a flick after only a small drag, it is more difficult for a flick to be determined after a drag operation with a certain amount of movement.
Figs. 12A to 12F illustrate other examples of processing performed by the control unit 125 according to this exemplary embodiment. As illustrated in Fig. 12A, the touch panel 204 displays objects A to G. The objects A to G can be arranged at arbitrary positions.
To move the object D near the objects A to C, the finger 210 touches the display position of the object D.
Subsequently, as illustrated in Fig. 12B, the finger 210 may be moved so that the amount of movement does not exceed the threshold, and then leave the touch panel 204 so that the movement speed is faster than the first reference speed. This results in a flick operation. The object D slides across the screen after the finger 210 leaves it, and thus moves near the objects A to C.
As illustrated in Fig. 12C, the finger 210 may instead be moved so that the amount of movement exceeds the threshold, and then leave the touch panel 204 so that the movement speed does not exceed the second reference speed (which is faster than the first reference speed). In this case, by the drag operation alone, the object D slides across the screen during the drag and moves near the objects A to C.
In Figs. 12B and 12C, the object D moves to the same position. As described above according to this exemplary embodiment, by suitably setting the reference value for determining a flick, the user can use either operation with ease.
When the object D is not positioned near the objects A to C, the following operation can be performed. As illustrated in Fig. 12D, the finger 210 is moved so that the amount of movement exceeds the threshold, and then leaves the touch panel 204 so that the movement speed is faster than the second reference speed (which is faster than the first reference speed).
As a result, a flick operation is performed after the drag operation, so that the object D moves by an amount of movement greater than in Figs. 12B and 12C.
Fig. 4B is a flowchart illustrating the flow of processing for a user operation according to a modification of the first exemplary embodiment. Referring to Fig. 4B, the processing in step S414 is performed instead of the processing in step S404, and the processing in step S415 is performed instead of the processing in step S405. Similar processes are given the same reference numerals, and their description is omitted.
In step S414, the setting unit 123 determines whether the amount of movement is greater than a predetermined threshold (for example, 300 pixels). If the setting unit 123 determines that the amount of movement is not greater than the predetermined threshold (NO in step S414), the setting unit 123 performs the processing in step S415. If the setting unit 123 determines that the amount of movement is greater than the predetermined threshold (YES in step S414), the processing in step S406 is performed.
In step S415, the setting unit 123 changes the reference value for determining that a user operation is a flick. In this modification, the reference value is preset to 2.7 pixels/millisecond. In step S415, the reference value is changed to, for example, 0.5 pixels/millisecond, which is less than 2.7 pixels/millisecond.
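Under the same assumed example values, the Fig. 4B modification can be sketched as the mirror image of Fig. 4A: the reference value starts at the higher speed and is lowered only when the amount of movement is small. Only the changed steps (S414/S415) are shown; names are hypothetical.

```python
# Hypothetical sketch of the Fig. 4B modification (steps S414/S415 only).
AMOUNT_THRESHOLD_PX = 300  # predetermined threshold (S414)
HIGH_REF_SPEED = 2.7       # px/ms, preset reference value
LOW_REF_SPEED = 0.5        # px/ms, value set for small movements (S415)

def is_flick_variant(total_movement_px, final_speed_px_per_ms):
    if total_movement_px <= AMOUNT_THRESHOLD_PX:  # S414: "not greater"
        reference = LOW_REF_SPEED                 # S415: lower the reference
    else:
        reference = HIGH_REF_SPEED                # keep the preset value
    return final_speed_px_per_ms > reference

# The Fig. 5 examples are classified the same way as under Fig. 4A:
print(is_flick_variant(400, 2.5))  # False (reference stays 2.7 px/ms)
print(is_flick_variant(200, 0.7))  # True  (reference lowered to 0.5 px/ms)
```

Which preset to start from is a design choice; both flows end up applying the stricter reference speed to long drags.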
In the exemplary embodiment described above, the amount of movement of the target position represents the distance between the position at which the touch position starts moving while the touch between the finger 210 and the touch panel 204 is maintained and the position at which the finger 210 leaves the touch panel 204.
The present invention is not limited to this. The amount of movement of the target position may be the length of the trajectory along which the touch position moves while the finger 210 is touching the touch panel 204.
Alternatively, the touch panel 204 may be provided with a plurality of partial regions of equal size. In this case, the amount of movement of the target position may be the number of partial regions the finger 210 passes through while moving the touch position in contact with the touch panel 204.
Alternatively, a grid may be set on the touch panel 204. In this case, the amount of movement of the target position may be the number of grid cells the finger 210 passes through while moving the touch position in contact with the touch panel 204. The type of threshold may be changed according to the type of the amount of movement of the target position.
Alternatively, the moving touch position may change direction midway. In that case, the amount of movement of the target position may be the sum of the amounts of movement in each direction after the change.
The threshold in this exemplary embodiment need not be fixed. Alternatively, the larger the specified amount of movement, the higher the reference value that may be set.
According to the second exemplary embodiment of the present invention, provide explanation by another example of the process being used for determining user operation.According to this exemplary embodiment, the structure of signal conditioning package and the similar of the first exemplary embodiment.Therefore, the description thereof will be omitted.
Fig. 7 A is the process flow diagram of the process stream illustrated for determining user operation.
In step S701, the acquiring unit 121 obtains a signal indicating a touch on the touch panel 204. In step S702, the acquiring unit 121 determines whether a signal indicating release has been obtained.
If the acquiring unit 121 determines that a signal indicating release has been obtained (YES in step S702), the process in step S707 is performed. If the acquiring unit 121 has not obtained a signal indicating release (NO in step S702), the process in step S703 is performed.
In step S703, the designating unit 122 specifies the amount of movement of the target position (the position at which the touch is detected). The amount of movement of the target position corresponds to the distance over which the touch position moves while the finger 210 remains in contact with the touch panel 204.
In step S704, the setting unit 123 determines whether the amount of movement is greater than a threshold (for example, 300 pixels). If the setting unit 123 determines that the amount of movement is greater than the threshold (YES in step S704), the process in step S705 is performed. If the setting unit 123 determines that the amount of movement is not greater than the threshold (NO in step S704), the process returns to step S702.
In step S705, the setting unit 123 performs a process of changing the reference value used to determine whether the user operation is a slide operation. As in the first exemplary embodiment, the reference value is preset to 0.5 pixels/millisecond. In step S705, the reference value is changed to 2.7 pixels/millisecond, which is greater than 0.5 pixels/millisecond.
In step S706, the control unit 125 performs a process of issuing a notification that the reference value for determining whether the user operation is a slide operation has been changed. The touch panel 204 displays a notification indicating the change of the reference value.
If the output unit 109 includes a loudspeaker or a motor, the notification of the change of the reference value may instead be issued using sound or vibration.
Since the processes in steps S707 to S709 are similar to those in steps S406 to S408, descriptions thereof will be omitted.
Fig. 7B is a flowchart illustrating another example of the process for determining a user operation according to this exemplary embodiment.
Referring to Fig. 7B, the process in step S717 is performed instead of step S707, and the process in step S718 is performed instead of step S708. Similar processes are denoted by the same reference numerals, and descriptions thereof will be omitted.
In step S717, the determining unit 124 determines the amount of time required to travel a predetermined distance (for example, 80 pixels). The predetermined distance is obtained from one set of information indicating a first time at which the touch is last detected before release and the touch position detected at that first time, and another set of information indicating a second time different from the first time and the touch position detected at that second time.
In step S718, the determining unit 124 determines whether the amount of time required to travel the predetermined distance satisfies a predetermined reference value (for example, whether the amount of time is shorter than 40 milliseconds). If the determining unit 124 determines that the amount of time satisfies the predetermined reference value (YES in step S718), the process in step S709 is performed. If the determining unit 124 determines that the amount of time does not satisfy the predetermined reference value (NO in step S718), the series of processes ends.
For the process in step S717, the preset reference value and the reference value changed in step S705 are each set as time information.
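The time-per-distance variant of Fig. 7B can be sketched as follows; the two-sample speed helper and its names are illustrative assumptions, while the 40-millisecond reference comes from the example above:

```python
def speed_from_samples(t1_ms, p1, t2_ms, p2):
    """Per-unit-time movement (pixels/ms) from two touch samples:
    (t1_ms, p1) is the last sample before release, (t2_ms, p2) an
    earlier sample. Positions are (x, y) in pixels."""
    dist = ((p1[0] - p2[0]) ** 2 + (p1[1] - p2[1]) ** 2) ** 0.5
    return dist / (t1_ms - t2_ms)

def meets_time_reference(time_for_distance_ms: float,
                         reference_ms: float = 40) -> bool:
    """Step S718: the gesture qualifies when the predetermined distance
    (e.g. 80 pixels) was covered in less than the reference time."""
    return time_for_distance_ms < reference_ms
```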
According to a third exemplary embodiment of the present invention, another example of the process for determining a user operation will be described. The configuration of the information processing apparatus according to this exemplary embodiment is similar to that of the first exemplary embodiment, so a description thereof will be omitted.
Fig. 8 is a flowchart illustrating the flow of the process for determining a user operation. Processes similar to those in Figs. 4A and 4B are denoted by the same reference numerals, and descriptions thereof will be omitted.
In step S804, the setting unit 123 determines which of the following applies to the specified amount of movement: (1) it is less than a first threshold Th1, (2) it is greater than or equal to the first threshold Th1 and less than a second threshold Th2, or (3) it is greater than or equal to the second threshold Th2 (Th1 < Th2). The first threshold Th1 is, for example, 300 pixels, and the second threshold Th2 is, for example, 600 pixels.
In case (1), that is, when the amount of movement is less than the first threshold Th1, the process in step S805 is performed. In case (2), that is, when the amount of movement is greater than or equal to the first threshold Th1 and less than the second threshold Th2, the process in step S806 is performed. In case (3), that is, when the amount of movement is greater than or equal to the second threshold Th2, the process in step S807 is performed.
In step S805, the setting unit 123 sets the reference value for determining a slide operation to benchmark A (for example, 0.5 pixels/millisecond). In step S806, the setting unit 123 sets the reference value for determining a slide operation to benchmark B (for example, 1.5 pixels/millisecond).
In step S807, the setting unit 123 sets the reference value for determining a slide operation to benchmark C (for example, 2.5 pixels/millisecond). In steps S808, S818, and S828, the determining unit 124 determines the amount of movement per unit time. In step S809, the determining unit 124 determines whether the amount of movement per unit time satisfies benchmark A.
If the determining unit 124 determines that the amount of movement per unit time satisfies benchmark A (YES in step S809), the process in step S810 is performed. If the determining unit 124 determines that the amount of movement per unit time does not satisfy benchmark A (NO in step S809), the process in step S811 is performed.
In step S819, the determining unit 124 determines whether the amount of movement per unit time satisfies benchmark B. If the determining unit 124 determines that it satisfies benchmark B (YES in step S819), the process in step S820 is performed. If the determining unit 124 determines that it does not satisfy benchmark B (NO in step S819), the process in step S821 is performed.
In step S829, the determining unit 124 determines whether the amount of movement per unit time satisfies benchmark C. If the determining unit 124 determines that it satisfies benchmark C (YES in step S829), the process in step S830 is performed. If the determining unit 124 determines that it does not satisfy benchmark C (NO in step S829), the process in step S831 is performed.
In steps S810, S811, S820, S821, S830, and S831, the control unit 125 performs the respective control operations. For example, in steps S810 and S820, the control unit 125 performs the control operation corresponding to a slide.
In steps S811 and S830, the control unit 125 performs the control operation corresponding to drag-and-drop. In steps S821 and S831, the control unit 125 performs the control operation corresponding to handwriting recognition.
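The three-tier selection in steps S804 through S807 can be sketched as follows; this is an illustrative Python sketch using the example values above, and the function name is an assumption:

```python
# Steps S804-S807: the reference value (benchmark) for slide
# determination is chosen from three tiers according to the
# specified amount of movement.
TH1, TH2 = 300, 600                        # pixels
BENCH_A, BENCH_B, BENCH_C = 0.5, 1.5, 2.5  # pixels/millisecond

def select_benchmark(amount_of_movement: float) -> float:
    if amount_of_movement < TH1:   # case (1) -> step S805
        return BENCH_A
    if amount_of_movement < TH2:   # case (2) -> step S806
        return BENCH_B
    return BENCH_C                 # case (3) -> step S807
```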
If the above exemplary embodiments are applied to a device including a small touch panel 204 (operating area), the range over which the finger 210 can move is narrow. The screen size therefore affects the movement of the finger 210 in a slide operation.
A fourth exemplary embodiment of the present invention provides an improvement to prevent this effect. According to this exemplary embodiment, thresholds in the height and width directions are set in accordance with the height and width of the touch panel 204 (operating area). The hardware configuration of the information processing apparatus according to this exemplary embodiment is similar to that of Fig. 1A, so a description thereof will be omitted.
Fig. 1C is a functional block diagram illustrating the configuration of an information processing apparatus 140 according to this exemplary embodiment.
The information processing apparatus 140 includes the acquiring unit 121, the designating unit 122, a setting unit 143, the determining unit 124, the control unit 125, and a holding unit 146. Components similar to those in Fig. 1B are denoted by the same reference numerals, and descriptions thereof will be omitted.
The holding unit 146 includes the CPU and holds information indicating the size of the operating area. If the information processing apparatus 140 is the digital camera 200 in Fig. 2A, the operating area corresponds to the entire touch panel 204 or to a subregion of the touch panel 204 in which touches are recognized.
The information indicating the size of the operating area includes, for example, a height of 500 pixels and a width of 900 pixels. The setting unit 143 includes the CPU and sets the respective thresholds in the height and width directions based on the information indicating the size of the operating area.
Fig. 9 is a flowchart illustrating the flow of the process for determining a user operation. Processes similar to those in Fig. 4A are denoted by the same reference numerals, and descriptions thereof will be omitted.
In step S902, the designating unit 122 determines whether the position at which the touch is detected has moved. If the designating unit 122 determines that the position at which the touch is detected has moved (YES in step S902), the process in step S903 is performed. If the designating unit 122 determines that the position at which the touch is detected has not moved (NO in step S902), the series of processes ends.
In step S903, the designating unit 122 specifies, in the height and width directions, the amounts of movement of the position at which the touch is detected. In step S904, the setting unit 143 specifies the size of the operating area based on the information held in the holding unit 146.
In step S905, the setting unit 143 determines the thresholds in the height and width directions according to the specified size of the operating area. If, as illustrated in Fig. 2B, the specified size of the operating area is 500 pixels high and 900 pixels wide, the setting unit 143 sets the threshold in the height direction to 170 pixels and the threshold in the width direction to 300 pixels.
If the height is shorter than the width, the threshold in the height direction can be smaller than the threshold in the width direction. In step S906, the setting unit 143 determines whether the amount of movement in the height direction is greater than the threshold corresponding to movement in the height direction, and/or whether the amount of movement in the width direction is greater than the threshold corresponding to movement in the width direction.
If the setting unit 143 determines that the amount of movement in the height direction is greater than the threshold corresponding to movement in the height direction, and/or that the amount of movement in the width direction is greater than the threshold corresponding to movement in the width direction (YES in step S906), the process in step S405 is performed.
On the other hand, if the setting unit 143 determines that the amount of movement in the height direction is not greater than the threshold corresponding to movement in the height direction, and that the amount of movement in the width direction is not greater than the threshold corresponding to movement in the width direction (NO in step S906), the process in step S406 is performed.
In step S405, the setting unit 143 may also set respective reference values in the height and width directions for determining a slide operation, based on the size of the operating area determined in step S904.
In this way, the process for determining whether a user operation is a slide operation can be performed in accordance with the height and width of the touch panel 204 (operating area). If the size of the operating area changes according to a plurality of modes, a user operation can likewise be determined to be a slide operation according to the height and width in the respective mode. As a result, user operability is improved.
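One way to derive such size-dependent thresholds, sketched under the assumption that each directional threshold is a fixed fraction of the operating area's extent in that direction (the example values above, 170 and 300 pixels for a 500 by 900 pixel area, are close to but not exactly one third):

```python
def thresholds_for_area(height_px: int, width_px: int, ratio: float = 1 / 3):
    """Return (height threshold, width threshold) scaled to the area.
    The one-third ratio is an illustrative assumption."""
    return height_px * ratio, width_px * ratio

def exceeds_threshold(move_h: float, move_w: float,
                      height_px: int, width_px: int) -> bool:
    """Step S906: does either directional movement exceed its threshold?"""
    th_h, th_w = thresholds_for_area(height_px, width_px)
    return abs(move_h) > th_h or abs(move_w) > th_w
```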
According to a fifth exemplary embodiment of the present invention, another example of the process for determining a user operation will be described. The configuration of the information processing apparatus according to this exemplary embodiment is similar to that of the fourth exemplary embodiment, so a description thereof will be omitted. The setting unit 143 additionally sets a threshold in the diagonal direction based on the height and width of the operating area.
Fig. 10 is a flowchart illustrating the flow of the process for determining a user operation. Processes similar to those in Figs. 9 and 4A are denoted by the same reference numerals, and descriptions thereof will be omitted.
In step S1005, the setting unit 143 determines the thresholds in the height, width, and diagonal directions. In step S1006, the determining unit 124 detects the moving direction as one of the height, width, and diagonal directions.
As one method of detecting the moving direction, the moving direction is determined based on the angle formed by a line segment connecting the position at which the touch is last detected before release and the position at which the touch is detected at a different time before that.
As another method of detecting the moving direction, the moving direction is determined based on the angle formed by a line segment connecting the position at which the touch is first detected and the position at which the touch is last detected before the touch ends. As shown in Fig. 11, the moving direction can be determined according to the determined angle.
In the example illustrated in Fig. 11, if the angle falls within any of the ranges "greater than or equal to 0 degrees and less than 22.5 degrees", "greater than or equal to 337.5 degrees and less than 360 degrees", and "greater than or equal to 157.5 degrees and less than 202.5 degrees", the determining unit 124 determines that the movement is in the width direction. If the angle falls within any of the ranges "greater than or equal to 22.5 degrees and less than 67.5 degrees", "greater than or equal to 112.5 degrees and less than 157.5 degrees", "greater than or equal to 202.5 degrees and less than 247.5 degrees", and "greater than or equal to 292.5 degrees and less than 337.5 degrees", the determining unit 124 determines that the movement is in the diagonal direction. If the angle falls within either of the ranges "greater than or equal to 67.5 degrees and less than 112.5 degrees" and "greater than or equal to 247.5 degrees and less than 292.5 degrees", the determining unit 124 determines that the movement is in the height direction.
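The sector classification of Fig. 11 can be sketched directly from the angle ranges listed above (Python; the function name is a hypothetical choice):

```python
def classify_direction(angle_deg: float) -> str:
    """Map a movement angle in degrees to the width, diagonal, or
    height direction using the sector boundaries of Fig. 11."""
    a = angle_deg % 360
    if a < 22.5 or a >= 337.5 or 157.5 <= a < 202.5:
        return "width"
    if 67.5 <= a < 112.5 or 247.5 <= a < 292.5:
        return "height"
    return "diagonal"  # the four remaining 45-degree sectors
```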
In step S1007, the determining unit 124 determines whether the amount of movement in the detected direction is greater than the threshold corresponding to that direction. If the determining unit 124 determines that the amount of movement is greater than the threshold (YES in step S1007), the process in step S405 is performed. If the determining unit 124 determines that the amount of movement is not greater than the threshold (NO in step S1007), the process in step S406 is performed.
In step S405, the setting unit 143 may also set respective reference values in the height, width, and diagonal directions for determining a slide operation, based on the size of the operating area specified in step S904.
According to this exemplary embodiment, movements in the height, width, and diagonal directions can be distinguished from one another. This is effective when the process is changed according to the moving direction of the target position.
According to a sixth exemplary embodiment of the present invention, another example of the process for determining a user operation will be described. The hardware configuration of the information processing apparatus according to this exemplary embodiment is similar to that of the first exemplary embodiment, so a description thereof will be omitted.
Fig. 13A is a functional block diagram illustrating the configuration of the information processing apparatus 100 according to this exemplary embodiment. Blocks similar to those in Fig. 1B are denoted by the same reference numerals, and descriptions thereof will be omitted.
A designating unit 1301 includes the CPU 101 and specifies the traveling time period of the target position based on the information output from the acquiring unit 121. The traveling time period of the target position is the time spent moving the touch position while the finger 210 remains in contact with the touch panel 204, until the touching finger 210 leaves the touch panel 204.
In other words, the traveling time period corresponds to the traveling time period of the pointer. As one example of a method of calculating the traveling time period of the target position, the time from when the finger 210 touches the touch panel 204 to when the finger 210 leaves it can be measured.
In this case, time periods during which the finger 210 is not moving are also counted in the traveling time period. As another example of a method of calculating the traveling time period of the target position, events on the touch panel 204 can be monitored at fixed intervals (for example, 20 milliseconds), and the calculation can exclude intervals in which no event notifying movement of the finger 210 occurs.
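The second calculation method, which polls at fixed intervals and counts only the intervals in which a movement event was reported, might be sketched as follows (the boolean-list representation of the polled events is an assumption for illustration):

```python
def traveling_time_ms(move_flags, interval_ms: int = 20) -> int:
    """Sum only the polling intervals during which a movement event
    occurred; idle intervals are excluded from the traveling time.
    `move_flags` holds one boolean per fixed polling interval."""
    return sum(interval_ms for moved in move_flags if moved)
```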
Figs. 14A and 14B are flowcharts illustrating the flow of processes for determining a user operation. Processes similar to those in Figs. 4A and 4B are denoted by the same reference numerals, and descriptions thereof will be omitted.
Referring to Fig. 14A, in step S1401, the designating unit 1301 specifies the traveling time period during which the position at which the touch of the finger 210 is detected moves.
In step S1402, the setting unit 123 determines whether the traveling time period is longer than a predetermined threshold (for example, 200 milliseconds). If the setting unit 123 determines that the traveling time period is longer than the predetermined threshold (YES in step S1402), the process in step S405 is performed. If the setting unit 123 determines that the traveling time period is not longer than the predetermined threshold (NO in step S1402), the process in step S406 is performed.
According to this exemplary embodiment, the reference value is preset to 0.5 pixels/millisecond. In step S405, the reference value is changed to 2.7 pixels/millisecond, which is greater than 0.5 pixels/millisecond.
Referring to Fig. 14B, the process in step S1403 is performed instead of the process in step S1402 of Fig. 14A. In addition, the process in step S415 is performed instead of the process in step S405.
Similar processes are denoted by the same reference numerals, and descriptions thereof will be omitted. In step S1403, the setting unit 123 determines whether the traveling time period is longer than a predetermined threshold (for example, 200 milliseconds).
If the setting unit 123 determines that the traveling time period is longer than the predetermined threshold (YES in step S1403), the process in step S406 is performed. If the setting unit 123 determines that the traveling time period is not longer than the predetermined threshold (NO in step S1403), the process in step S415 is performed.
In this case, the reference value is preset to 2.7 pixels/millisecond. In step S415, the reference value is changed to 0.5 pixels/millisecond, which is less than 2.7 pixels/millisecond.
Figs. 15A and 15B illustrate states in which movement of the position of the touch between the moving finger 210 and the touch panel 204 is detected. Figs. 15A and 15B are obtained by replacing the use of the specified movement distance in Figs. 5A and 5B with the use of the specified traveling time period. Figs. 15A and 15B are described using the thresholds set in the example of Fig. 14A.
Referring to Fig. 15A, the traveling time period from position 501 to position 502 is 300 milliseconds. The amount of movement per unit time is calculated based on the position at which the touch is last detected before release and the touch position detected 20 milliseconds earlier. Here, the amount of movement per unit time is 2.5 pixels/millisecond.
In this case, the determining unit 124 does not output a signal indicating a slide operation to the control unit 125. That is, the determining unit 124 determines that the series of operations illustrated in Fig. 15A is not a slide operation but a drag-and-drop operation.
Referring to Fig. 15B, the traveling time period from position 511 to position 512 is 100 milliseconds. The amount of movement per unit time is calculated based on the position at which the touch is last detected before release and the touch position detected 20 milliseconds earlier. Here, the calculated amount of movement per unit time is 0.7 pixels/millisecond.
In this case, the determining unit 124 outputs a signal indicating a slide operation to the control unit 125. In other words, the determining unit 124 determines that the series of operations in Fig. 15B is a slide operation.
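The determinations of Figs. 15A and 15B follow from the Fig. 14A logic; a minimal sketch (names are assumptions, values are those given in the text):

```python
TIME_THRESHOLD_MS = 200
LOW_REF, HIGH_REF = 0.5, 2.7  # pixels/millisecond

def is_slide_by_time(traveling_time_ms: float, speed: float) -> bool:
    """A longer drag raises the speed a release must reach to be
    judged a slide rather than a drag (steps S1402 and S405)."""
    ref = HIGH_REF if traveling_time_ms > TIME_THRESHOLD_MS else LOW_REF
    return speed > ref
```

With the Fig. 15A values (300 milliseconds, 2.5 pixels/millisecond) this returns False, matching the drag-and-drop determination; with the Fig. 15B values (100 milliseconds, 0.7 pixels/millisecond) it returns True, matching the slide determination.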
Figs. 16A and 16B are flowcharts illustrating the flow of processes for determining a user operation. Processes similar to those in Figs. 7A and 7B are denoted by the same reference numerals, and descriptions thereof will be omitted.
Referring to Fig. 16A, in step S1601, the designating unit 1301 specifies the traveling time period of the movement of the detected touch position. In step S1602, the setting unit 123 determines whether the traveling time period is longer than a predetermined threshold (for example, 200 milliseconds).
If the setting unit 123 determines that the traveling time period is longer than the predetermined threshold (YES in step S1602), the process in step S705 is performed. If the setting unit 123 determines that the traveling time period is not longer than the predetermined threshold (NO in step S1602), the process in step S702 is performed.
In other words, in this flowchart, each time an event indicating movement of the finger 210 is notified from the touch panel 204, the traveling time period is measured, and it is determined whether to change the reference value for determining a slide operation. The process flow in Fig. 16B is similar to that of Fig. 7B, so a description thereof will be omitted.
Fig. 17 is a flowchart illustrating the flow of a process for determining a user operation. Processes similar to those in Fig. 8 are denoted by the same reference numerals, and descriptions thereof will be omitted.
The process in step S1401 is similar to step S1401 in Fig. 14A, and a description thereof will be omitted. In step S1701, the setting unit 123 determines which of the following three cases applies to the traveling time period (T) specified in step S1401: (1) it is shorter than a first threshold Th1, (2) it is longer than or equal to the first threshold Th1 and shorter than a second threshold Th2, or (3) it is longer than or equal to the second threshold Th2 (Th1 < Th2).
The first threshold Th1 is, for example, 200 milliseconds, and the second threshold Th2 is, for example, 300 milliseconds. In case (1), that is, when the traveling time period is shorter than the first threshold Th1, the process in step S805 is performed. In case (2), that is, when the traveling time period is longer than or equal to the first threshold Th1 and shorter than the second threshold Th2, the process in step S806 is performed. In case (3), that is, when the traveling time period is longer than or equal to the second threshold Th2, the process in step S807 is performed.
According to this exemplary embodiment, the reference value for determining a slide operation is changed according to the length of the traveling time period of the target position.
The traveling time period can also be calculated when the target position does not move. Therefore, the duration of a press-and-hold operation (for example, the finger 210 touches the touch panel 204, maintains the touch for a predetermined period, and then leaves the touch panel 204) can be calculated.
Therefore, when a user performs a press-and-hold operation without having decided to perform a touch operation, the movement is not determined to be a slide operation even if the finger 210 moves slightly before leaving the touch panel 204.
Fig. 13B is a functional block diagram illustrating the configuration of an information processing apparatus 140 according to a seventh exemplary embodiment. Referring to Fig. 13B, the holding unit 146 is added to the configuration illustrated in Fig. 13A to hold the information indicating the size of the operating area.
The setting unit 143 sets the threshold for the traveling time period, or the reference value for a slide operation, based on the information indicating the size of the operating area. These functional blocks have already been described, so descriptions thereof will be omitted.
Fig. 18 is a flowchart illustrating the flow of the process for determining a user operation. Processes similar to those in Fig. 10 are denoted by the same reference numerals, and descriptions thereof will be omitted.
In step S1801, the setting unit 143 determines whether the traveling time period specified in step S1401 is longer than the threshold corresponding to the moving direction detected in step S1006.
If the setting unit 143 determines that the traveling time period specified in step S1401 is longer than the threshold (YES in step S1801), the process in step S405 is performed. If the setting unit 143 determines that the traveling time period specified in step S1401 is not longer than the threshold (NO in step S1801), the process in step S406 is performed.
In step S405, the setting unit 143 may also set respective reference values in the height, width, and diagonal directions for determining a slide operation, based on the size of the operating area specified in step S904.
According to this exemplary embodiment, movements in the height, width, and diagonal directions are distinguished, and the determination process for the traveling time period is performed for each direction. This is effective when the process is changed according to the moving direction of the target position.
The present invention can also be realized by performing the following processing. That is, software (a program) realizing the functions of the exemplary embodiments is supplied to a system or an apparatus via a network or various storage media, and a computer (or a central processing unit (CPU) or micro processing unit (MPU)) of the system or apparatus reads and executes the program.
Although the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all modifications, equivalent structures, and functions.

Claims (11)

1. An information processing apparatus constructed to determine a slide operation of a pointer, the information processing apparatus comprising:
a setting unit constructed to set one of a first reference pointer speed and a second reference pointer speed according to an amount of movement of the pointer; and
a determining unit constructed to determine whether a slide operation of the pointer has occurred, based on a moving speed of the pointer and the set first or second reference pointer speed,
wherein, when the amount of movement of the pointer is not greater than a predetermined threshold, the setting unit sets the first reference pointer speed, and when the amount of movement is greater than the predetermined threshold, the setting unit sets the second reference pointer speed, which is greater than the first reference pointer speed.
2. The information processing apparatus according to claim 1, further comprising:
a determining means constructed to determine the predetermined threshold based on the size of an operating area in which operation with the pointer is allowed.
3. The information processing apparatus according to claim 1, wherein, when the moving speed is greater than the first or second reference pointer speed, the determining unit determines that a slide operation of the pointer has occurred, and when the moving speed is not greater than the first or second reference pointer speed, the determining unit determines that a slide operation of the pointer has not occurred.
4. The information processing apparatus according to claim 1, further comprising an acquiring unit constructed to detect a user operation of the pointer.
5. The information processing apparatus according to claim 4, further comprising a designating unit constructed to specify the amount of movement of the pointer based on output from the acquiring unit.
6. The information processing apparatus according to claim 5, wherein the designating unit specifies the amount of movement based on the number of times an event notifying movement is notified.
7. An operation method of an information processing apparatus that determines a slide operation of a pointer, the operation method comprising the following steps:
setting one of a first reference pointer speed and a second reference pointer speed according to an amount of movement of the pointer; and
determining whether a slide operation of the pointer has occurred, based on a moving speed of the pointer and the set first or second reference pointer speed,
wherein, when the amount of movement of the pointer is not greater than a predetermined threshold, the setting step sets the first reference pointer speed, and when the amount of movement is greater than the predetermined threshold, the setting step sets the second reference pointer speed, which is greater than the first reference pointer speed.
8. An information processing apparatus constructed to determine a slide operation of a pointer, the information processing apparatus comprising:
a setting unit constructed to set a reference pointer speed according to a traveling time period of the pointer; and
a determining unit constructed to determine whether a slide operation of the pointer has occurred, based on a moving speed of the pointer and the set reference pointer speed,
wherein, when the traveling time period does not exceed a predetermined threshold, the setting unit sets a first reference pointer speed, and when the traveling time period exceeds the predetermined threshold, the setting unit sets a second reference pointer speed that is greater than the first reference pointer speed.
9. The information processing apparatus according to claim 8, further comprising:
a determining means constructed to determine the predetermined threshold based on the size of an operating area in which operation with the pointer is allowed.
10. The information processing apparatus according to claim 8, wherein, when the moving speed is greater than the reference pointer speed, the determining unit determines that a slide operation of the pointer has occurred, and when the moving speed is not greater than the reference pointer speed, the determining unit determines that a slide operation of the pointer has not occurred.
11. An operation method of an information processing apparatus that determines a slide operation of a pointer, the operation method comprising the following steps:
setting a reference pointer speed according to a traveling time period of the pointer; and
determining whether a slide operation of the pointer has occurred, based on a moving speed of the pointer and the set reference pointer speed,
wherein, when the traveling time period does not exceed a predetermined threshold, the setting step sets a first reference pointer speed, and when the traveling time period exceeds the predetermined threshold, the setting step sets a second reference pointer speed that is greater than the first reference pointer speed.
CN201110382170.7A 2010-11-24 2011-11-24 Information processing apparatus and operation method thereof Active CN102591450B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2010261607 2010-11-24
JP2010-261607 2010-11-24
JP2011-164009 2011-07-27
JP2011164009A JP5479414B2 (en) 2010-11-24 2011-07-27 Information processing apparatus and control method thereof

Publications (2)

Publication Number Publication Date
CN102591450A CN102591450A (en) 2012-07-18
CN102591450B true CN102591450B (en) 2015-02-25

Family

ID=45318824

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110382170.7A Active CN102591450B (en) 2010-11-24 2011-11-24 Information processing apparatus and operation method thereof

Country Status (5)

Country Link
US (1) US9459789B2 (en)
EP (1) EP2458490B1 (en)
JP (1) JP5479414B2 (en)
KR (1) KR101466240B1 (en)
CN (1) CN102591450B (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9524093B2 (en) * 2012-06-29 2016-12-20 Google Inc. Systems and methods for handling scrolling actions for scrolling through content displayed on an electronic device
US20150301648A1 (en) * 2012-11-15 2015-10-22 Pioneer Corporation Information processing device, control method, program, and recording medium
JP5862587B2 (en) 2013-03-25 2016-02-16 コニカミノルタ株式会社 Gesture discrimination device, gesture discrimination method, and computer program
JP6253284B2 (en) * 2013-07-09 2017-12-27 キヤノン株式会社 Information processing apparatus, control method therefor, program, and recording medium
JP6368462B2 (en) * 2013-08-01 2018-08-01 シャープ株式会社 Information processing apparatus, information processing method, and program thereof
JP6171712B2 (en) * 2013-08-09 2017-08-02 富士ゼロックス株式会社 Information processing apparatus and program
WO2015052961A1 (en) * 2013-10-08 2015-04-16 株式会社ソニー・コンピュータエンタテインメント Information processing device
JP6221622B2 (en) * 2013-10-23 2017-11-01 富士ゼロックス株式会社 Touch panel device and image forming apparatus
JP6357787B2 (en) 2014-02-07 2018-07-18 日本電気株式会社 Data processing device
JP6305147B2 (en) * 2014-03-25 2018-04-04 キヤノン株式会社 Input device, operation determination method, computer program, and recording medium
JP6399834B2 (en) * 2014-07-10 2018-10-03 キヤノン株式会社 Information processing apparatus, information processing apparatus control method, and program
JP6410537B2 (en) 2014-09-16 2018-10-24 キヤノン株式会社 Information processing apparatus, control method therefor, program, and storage medium
JP6452456B2 (en) * 2015-01-09 2019-01-16 キヤノン株式会社 Information processing apparatus, control method therefor, program, and storage medium
CN104660911B (en) * 2015-03-16 2018-01-26 广东欧珀移动通信有限公司 A kind of snapshots method and apparatus
JP6676913B2 (en) 2015-09-30 2020-04-08 ブラザー工業株式会社 Information processing device and control program

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101727286A (en) * 2008-10-28 2010-06-09 索尼株式会社 Information processing apparatus, information processing method and program

Family Cites Families (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5880411A (en) * 1992-06-08 1999-03-09 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
JPH07114621A (en) * 1993-10-15 1995-05-02 Hitachi Ltd Gesture recognizing method and device using same
US20070177804A1 (en) * 2006-01-30 2007-08-02 Apple Computer, Inc. Multi-touch gesture dictionary
US20020015064A1 (en) * 2000-08-07 2002-02-07 Robotham John S. Gesture-based user interface to multi-level and multi-modal sets of bit-maps
JP4723799B2 (en) * 2003-07-08 2011-07-13 株式会社ソニー・コンピュータエンタテインメント Control system and control method
US8131026B2 (en) * 2004-04-16 2012-03-06 Validity Sensors, Inc. Method and apparatus for fingerprint image reconstruction
JP2008508601A (en) * 2004-07-30 2008-03-21 アップル インコーポレイテッド Gestures for touch-sensitive input devices
US7761814B2 (en) * 2004-09-13 2010-07-20 Microsoft Corporation Flick gesture
US8147248B2 (en) * 2005-03-21 2012-04-03 Microsoft Corporation Gesture training
JP4485991B2 (en) * 2005-05-13 2010-06-23 株式会社ソニー・コンピュータエンタテインメント Image processing apparatus, image processing method, and program
JP4882319B2 (en) * 2005-09-08 2012-02-22 パナソニック株式会社 Information display device
US20070064004A1 (en) * 2005-09-21 2007-03-22 Hewlett-Packard Development Company, L.P. Moving a graphic element
JP4613142B2 (en) * 2006-03-31 2011-01-12 日本システムウエア株式会社 Gesture recognition apparatus, online operation system using the same, gesture recognition method, and computer-readable medium
JP2007334691A (en) * 2006-06-15 2007-12-27 Canon Inc Information processor and display method
TWM325544U (en) * 2007-05-15 2008-01-11 High Tech Comp Corp Electronic device with switchable user interface and electronic device with accessable touch operation
US20090107737A1 (en) * 2007-10-28 2009-04-30 Joesph K Reynolds Multiple-sensor-electrode capacitive button
JP4670860B2 (en) * 2007-11-22 2011-04-13 ソニー株式会社 Recording / playback device
KR100900295B1 (en) 2008-04-17 2009-05-29 엘지전자 주식회사 User interface method for mobile device and mobile communication system
US8526767B2 (en) 2008-05-01 2013-09-03 Atmel Corporation Gesture recognition
CN102099766B (en) * 2008-07-15 2015-01-14 意美森公司 Systems and methods for shifting haptic feedback function between passive and active modes
US8212794B2 (en) * 2008-09-30 2012-07-03 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Optical finger navigation utilizing quantized movement information
JP4752897B2 (en) * 2008-10-31 2011-08-17 ソニー株式会社 Image processing apparatus, image display method, and image display program
JP4752900B2 (en) * 2008-11-19 2011-08-17 ソニー株式会社 Image processing apparatus, image display method, and image display program
JP2010134755A (en) * 2008-12-05 2010-06-17 Toshiba Corp Communication device
JP2010176332A (en) * 2009-01-28 2010-08-12 Sony Corp Information processing apparatus, information processing method, and program
JP5282617B2 (en) * 2009-03-23 2013-09-04 ソニー株式会社 Information processing apparatus, information processing method, and information processing program
JP5487679B2 (en) * 2009-03-31 2014-05-07 ソニー株式会社 Information processing apparatus, information processing method, and information processing program
JP5418187B2 (en) * 2009-12-02 2014-02-19 ソニー株式会社 Contact operation determination device, contact operation determination method, and program

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101727286A (en) * 2008-10-28 2010-06-09 索尼株式会社 Information processing apparatus, information processing method and program

Also Published As

Publication number Publication date
KR20120056211A (en) 2012-06-01
EP2458490A2 (en) 2012-05-30
EP2458490A3 (en) 2013-05-29
US20120131517A1 (en) 2012-05-24
JP5479414B2 (en) 2014-04-23
EP2458490B1 (en) 2016-09-28
KR101466240B1 (en) 2014-11-27
CN102591450A (en) 2012-07-18
JP2012128830A (en) 2012-07-05
US9459789B2 (en) 2016-10-04

Similar Documents

Publication Publication Date Title
CN102591450B (en) Information processing apparatus and operation method thereof
US9678606B2 (en) Method and device for determining a touch gesture
US10656821B2 (en) Moving an object by drag operation on a touch panel
US20220404917A1 (en) Cursor Mode Switching
JP5783828B2 (en) Information processing apparatus and control method thereof
CN108073334B (en) Vector operation-based suspension touch method and device
CN103616972B (en) Touch screen control method and terminal device
KR20160077122A (en) Method and apparatus for processing suspended or distance operation
CN102955568A (en) Input unit recognizing user's motion
CN103677240A (en) Virtual touch interaction method and equipment
US20140282279A1 (en) Input interaction on a touch sensor combining touch and hover actions
US8896561B1 (en) Method for making precise gestures with touch devices
US10564762B2 (en) Electronic apparatus and control method thereof
CN103186280A (en) Input device and electronic apparatus
US20170060324A1 (en) User interface for electronic device, input processing method, and electronic device
JP2010272036A (en) Image processing apparatus
JP2003280813A (en) Pointing device, pointer controller, pointer control method and recording medium with the method recorded thereon
US20240160294A1 (en) Detection processing device, detection processing method, information processing system
EP3059664A1 (en) A method for controlling a device by gestures and a system for controlling a device by gestures
EP4339746B1 (en) Touchless user-interface control method including time-controlled fading
JP6998775B2 (en) Image measuring machine and program

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant