CN102109922B - Information processing apparatus and control method therefor - Google Patents


Info

Publication number
CN102109922B
CN102109922B (application CN201010566083.2A)
Authority
CN
China
Prior art keywords
operating
touch panel
touch
judged
stored
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201010566083.2A
Other languages
Chinese (zh)
Other versions
CN102109922A (en)
Inventor
Koichi Nakagawa (中川浩一)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Publication of CN102109922A publication Critical patent/CN102109922A/en
Application granted granted Critical
Publication of CN102109922B publication Critical patent/CN102109922B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06F3/0487 — Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 — Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/0412 — Digitisers structurally integrated in a display
    • G06F3/0482 — Interaction with lists of selectable items, e.g. menus
    • G06F3/04845 — Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • H04N23/50 — Cameras or camera modules comprising electronic image sensors; constructional details
    • H04N23/57 — Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention provides an information processing apparatus and a control method therefor. The information processing apparatus acquires a plurality of touch positions on the locus of a touch operation, determines the operation direction of the operation by using a determination condition set based on the plurality of acquired touch positions, and performs an action predetermined in association with the determined operation direction. In addition, the information processing apparatus stores the determined operation direction on a memory and changes the determination condition based on the stored operation direction, in order to determine the operation direction of a touch operation executed after the operation for which the operation direction has been determined and stored.

Description

Information processing apparatus and control method therefor
Technical field
The present invention relates to an information processing apparatus capable of performing control according to operations performed on a touch panel.
Background art
In recent years, digital devices that include a touch panel on their display unit and allow users to operate them intuitively have appeared on the market. In such digital devices, button icons are conventionally provided on a display screen that includes the touch panel. A user can execute the function assigned to a button icon by performing an operation of touching the touch panel on the button icon. In addition, a conventional method performs control according to the locus formed by the touch positions when the user touches the touch panel with a finger or a stylus and moves the finger or stylus across the panel while keeping it in contact.
As an example of control according to the locus of touch positions, Japanese Patent Application Laid-Open No. 63-174125 describes the following method. More specifically, in this method, an image displayed on a display unit is scrolled along with the movement of a finger touching the touch panel. When the user stops touching the touch panel with the finger, the scrolling speed gradually decreases and the scrolling finally stops.
In addition, Japanese Patent Application Laid-Open No. 2005-44036 describes the following method. More specifically, this conventional technique detects the angle of the locus of touched positions relative to the horizontal edge of the display device. If the detected angle is within a predetermined angle, the corresponding image is scrolled in the horizontal direction. The method also detects the angle of the locus of touched positions relative to the vertical edge of the display device; if that angle is within a predetermined angle, the image is scrolled in the vertical direction.
However, when control is performed according to an operation direction derived from the angle of the locus of touch positions, the above conventional methods, in executing the control assigned to each of a plurality of candidate operation directions, determine the operation direction using conditions that are set equally for all of the operation directions. Therefore, if the user's touch operation is performed at an erroneous angle, control assigned to a direction the user did not intend may be executed. In this case, the device appears to malfunction.
As illustrated in Fig. 7, suppose that the user performs an operation of touching a touch panel 702 with a finger 701 in an arc that starts at a start point 703, follows a locus 705, and ends at an end point 704. In this case, the user considers that a downward operation, from top to bottom, has been performed.
However, around the start point 703, the left-to-right component of the locus of touch positions is larger than its top-to-bottom component. The device therefore determines that the user's operation is a left-to-right operation, and may consequently execute the control assigned to the rightward direction.
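The misjudgment described above can be illustrated with a minimal sketch (the coordinates, thresholds, and function name are illustrative assumptions, not from the patent): a naive classifier that compares only the raw horizontal and vertical displacements between two sampled points reads the early part of the arc in Fig. 7 as a rightward operation.

```python
def naive_direction(x1, y1, x2, y2):
    """Classify a swipe by comparing raw horizontal vs. vertical displacement.

    The coordinate system follows the description: X grows to the right,
    Y grows downward."""
    dx, dy = x2 - x1, y2 - y1
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

# Sampled near the start point of an arc-shaped downward swipe, the
# horizontal component still dominates, so the intended "down" is
# misread as "right".
print(naive_direction(100, 100, 130, 110))  # prints: right
```

This is exactly the failure mode the invention addresses by biasing the determination condition toward the previously stored operation direction.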
Summary of the invention
The present invention relates to an information processing apparatus capable of identifying, with high precision, the operation the user intends to perform when executing control according to the locus of positions touched on a touch panel.
According to an aspect of the present invention, an information processing apparatus is provided, comprising: an acquisition unit configured to acquire a plurality of touch positions on the locus of an operation of moving the touched position on a touch panel without interrupting the touch; a determination unit configured to determine the operation direction of the operation by using a determination condition based on the plurality of touch positions acquired by the acquisition unit; a control unit configured to perform control according to the operation direction determined by the determination unit, so as to execute an action predetermined in association with that operation direction; a storage control unit configured to perform control to store the operation direction determined by the determination unit on a memory; and a changing unit configured to change the determination condition based on the operation direction stored on the memory, so as to determine the operation direction of a touch operation executed after the operation for which the operation direction has been determined and stored.
According to another aspect of the present invention, a method for controlling an information processing apparatus that performs control according to input executed via a touch panel is provided. The method comprises: acquiring a plurality of touch positions on the locus of an operation of moving the touched position on the touch panel without interrupting the touch; determining the operation direction of the operation by using a determination condition based on the acquired plurality of touch positions; performing control according to the determined operation direction so as to execute an action predetermined in association with that operation direction; performing control to store the determined operation direction on a memory; and changing the determination condition based on the operation direction stored on the memory, so as to determine the operation direction of a touch operation executed after the operation for which the operation direction has been determined and stored.
Further features and aspects of the present invention will become apparent from the following detailed description of exemplary embodiments with reference to the attached drawings.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate exemplary embodiments, features, and aspects of the invention and, together with the description, serve to explain the principles of the invention.
Fig. 1 is a block diagram illustrating an exemplary configuration of an information processing apparatus according to an exemplary embodiment of the present invention;
Figs. 2A to 2C illustrate examples of display on a graphical user interface (GUI) screen according to a first exemplary embodiment of the present invention;
Figs. 3A and 3B are flowcharts illustrating an example of processing executed by a central processing unit (CPU) according to the first exemplary embodiment;
Fig. 4 illustrates an example of display on a GUI screen according to a second exemplary embodiment of the present invention;
Figs. 5A and 5B are flowcharts illustrating an example of processing executed by the CPU according to the second exemplary embodiment;
Figs. 6A to 6C are external views of a camera to which the present invention is applied;
Fig. 7 illustrates an example of a touch operation whose operation direction may be erroneously identified.
Embodiment
Various exemplary embodiments, features, and aspects of the present invention will be described in detail below with reference to the drawings.
Fig. 1 illustrates an exemplary configuration of an information processing apparatus 100 to which each exemplary embodiment of the present invention can be applied. Referring to Fig. 1, a central processing unit (CPU) 101, a flash memory 102, a random access memory (RAM) 103, a display control unit 104, a display 105, an input unit 107 including a touch panel 106, a storage medium drive 108, and a communication interface (I/F) 110 communicate with one another via an internal bus 111. Each component connected to the internal bus 111 can exchange data via the internal bus 111.
The flash memory 102 is a nonvolatile memory. The flash memory 102 temporarily stores image data and other data, and stores various programs used for the operation of the CPU 101. The RAM 103 is a volatile work area. Using the RAM 103 as a work memory, the CPU 101 controls each component of the information processing apparatus 100 according to the programs stored on the flash memory 102. Alternatively, the programs for operating the CPU 101 may be stored in advance on a read-only memory (ROM) (not illustrated) or a hard disk (not illustrated) instead of on the flash memory 102.
The input unit 107 receives user operations, generates control signals according to the received operations, and supplies the generated control signals to the CPU 101. More specifically, the input unit 107 includes, as input devices for receiving user operations, a text-input device such as a keyboard and a pointing device such as a mouse or the touch panel 106. The touch panel 106 is a flat-plate input device that outputs coordinate information corresponding to the position touched on it.
The CPU 101 controls each component of the information processing apparatus 100 according to the corresponding program and according to the control signal generated and supplied by the input unit 107 in response to a user operation performed via an input device. In this way, the CPU 101 can cause the information processing apparatus 100 to operate according to the user's input.
The display control unit 104 outputs a display signal for controlling the display 105 to display an image. More specifically, a display control signal generated by the CPU 101 according to a program is supplied to the display control unit 104. Based on this display control signal, the display control unit 104 performs control to display, on the display 105, a GUI screen constituting a GUI.
The touch panel 106 is provided integrally with the display 105. More specifically, the touch panel 106 is configured so that its light transmittance does not interfere with the display of the display 105, and it is attached to the upper layer of the display surface of the display 105.
In addition, input coordinates on the touch panel 106 are associated with display coordinates on the display 105. With this structure, a GUI can be configured that lets the user feel as if he or she were directly operating the screen displayed on the display 105.
An external storage medium 109, such as a compact disc (CD), a digital versatile disc (DVD), or a memory card, can be attached to the storage medium drive 108. Under the control of the CPU 101, the storage medium drive 108 reads data from, and writes data to, the attached external storage medium 109. The communication I/F 110 is an interface for communicating, under the control of the CPU 101, with a network 120 such as a local area network (LAN) or the Internet.
The CPU 101 can detect the following user operations on the touch panel. More specifically, the CPU 101 can detect an operation of touching the touch panel with a finger or a stylus (hereinafter simply referred to as a "touch (operation)"); a state in which the user keeps the finger or stylus in contact with the touch panel (hereinafter referred to as a "touch persistent (state)"); an operation of moving the finger or stylus on the touch panel without interrupting the touch (hereinafter referred to as a "move operation"); an operation of releasing the finger or stylus from the touch panel (hereinafter referred to as a "touch stop operation"); and a state in which the user touches nothing on the touch panel (hereinafter referred to as a "non-touch state").
These operations, together with the coordinates of the position at which the finger or stylus touches the touch panel, are notified to the CPU 101 via the internal bus 111. Based on the notified information, the CPU 101 determines which user operation has been performed on the touch panel.
For a move operation, the moving direction of the user's finger or stylus on the touch panel can be determined from the change in the position coordinates, separately for the vertical component and the horizontal component on the touch panel surface.
Here, suppose that the user performs a touch operation, a certain move operation, and a touch stop operation in this order on the touch panel, as if quickly drawing a stroke. In this exemplary embodiment, this stroke-drawing operation is referred to as a "flick".
A flick is an operation in which the user touches the touch panel with a finger or stylus, quickly moves the finger or stylus a certain distance across the touch panel without interrupting the touch, and then releases it. In other words, a flick is an operation of quickly sweeping across the surface of the touch panel.
If the CPU 101 detects a move operation performed on the surface of the touch panel over a predetermined distance or longer at a predetermined operation speed or higher, followed by a touch stop operation in that state, the CPU 101 determines that a flick has been performed. On the other hand, if the CPU 101 detects a move operation performed over a predetermined distance or longer at a speed lower than the predetermined operation speed, it determines that a drag operation has been performed.
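The flick/drag distinction above can be sketched as a small classifier. The numeric thresholds and the function name are illustrative assumptions; the patent speaks only of a "predetermined distance" and a "predetermined operation speed".

```python
# Illustrative thresholds (assumptions, not values from the patent).
FLICK_MIN_DISTANCE = 30.0   # pixels
FLICK_MIN_SPEED = 0.5       # pixels per millisecond

def classify_gesture(distance, speed, touch_stopped):
    """Classify a move operation of `distance` pixels at `speed` px/ms.

    A sufficiently long, fast move followed by a touch stop operation is
    a flick; a sufficiently long move at a lower speed is a drag."""
    if distance < FLICK_MIN_DISTANCE:
        return "none"
    if speed >= FLICK_MIN_SPEED:
        # A fast move only becomes a flick once the touch stops.
        return "flick" if touch_stopped else "none"
    return "drag"

print(classify_gesture(50.0, 1.0, True))   # prints: flick
print(classify_gesture(50.0, 0.2, True))   # prints: drag
```

Note the asymmetry in the description: the drag determination depends only on distance and low speed, while the flick determination additionally requires the touch stop operation.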
Any of various types of touch panels can be used as the touch panel 106. More specifically, a resistive-film touch panel, a capacitive touch panel, a surface-acoustic-wave touch panel, an infrared touch panel, an electromagnetic-induction touch panel, an image-recognition touch panel, or an optical-sensor touch panel can be used.
In the first exemplary embodiment of the present invention, when a scrolling operation capable of scrolling images in different directions, horizontal and vertical, is performed, a continuously executed flick is determined, with high precision, to be an operation performed in the same direction as the operation direction of the preceding operation.
Figs. 2A to 2C illustrate examples of GUI display on the display 105 according to the first exemplary embodiment. In each example illustrated in Figs. 2A to 2C, as indicated by the coordinate system in the upper-left corner of the drawings, the left-to-right direction is taken as the positive X direction, the right-to-left direction as the negative X direction, the top-to-bottom direction as the positive Y direction, and the bottom-to-top direction as the negative Y direction. The positive and negative X directions are collectively referred to as the horizontal direction, and the positive and negative Y directions as the vertical direction.
In the example illustrated in Fig. 2A, the images stored on the flash memory 102 are arranged in order of image-capture date. Images captured on the same date are arranged in the vertical direction in the following order: an image with a later capture time is placed farther toward the rear of the virtual space of the screen than an image with an earlier capture time (that is, in the negative Y direction, from front to rear in the virtual space of the screen, the capture time of the images becomes later). Each set of images arranged on the screen in this way is referred to as an "image array". In the horizontal direction, the image arrays are arranged so that an array of images with a later capture date is placed to the right of an array with an earlier capture date (that is, the image arrays are arranged from left to right in order of capture date).
A date indication 204 represents the capture date of the images in the center image array. The date indication 204 indicates that the image array 201, located at the center, includes images captured on November 7, 2009. Accordingly, the image array 202 includes images captured on a certain day before November 7, 2009, and the image array 203 includes images captured on a certain day after November 7, 2009.
When the user performs a flick on the screen via the touch panel 106, the CPU 101 determines the direction of the flick and performs control as follows. More specifically, if the user flicks upward, the CPU 101 scrolls the images displayed in the center image array in the depth direction of the screen (that is, toward the rear of the virtual space of the screen). On the other hand, if the user flicks downward, the CPU 101 scrolls the images displayed in the center image array toward the front of the virtual space of the screen.
In other words, an upward flick serves as an operation for displaying, among the images included in the center image array, images with a later capture time, and a downward flick serves as an operation for displaying images with an earlier capture time.
In addition, if the user flicks to the left, the CPU 101 scrolls the image arrays to the left; if the user flicks to the right, the CPU 101 scrolls the image arrays to the right. In other words, if the user flicks to the left, the CPU 101 displays, at the center of the screen as the new center image array, an image array whose capture date is later than that of the images included in the array displayed as the center image array before the leftward flick. If the user flicks to the right, the CPU 101 displays, at the center of the screen, an image array whose capture date is earlier than that of the images included in the array displayed as the center image array before the rightward flick.
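The mapping from a determined flick direction to a scroll action can be sketched as a small dispatcher. The `ImageBrowser` class is a recording stand-in for the real GUI, and its method names are assumptions for illustration, not names from the patent.

```python
class ImageBrowser:
    """Records which scroll action a flick direction triggers (a stand-in
    for the real GUI of the first exemplary embodiment)."""
    def __init__(self):
        self.actions = []

    def scroll_center_array(self, toward):
        self.actions.append(("scroll-center", toward))

    def shift_arrays(self, date):
        self.actions.append(("shift-arrays", date))

def on_flick(direction, ui):
    """Dispatch a determined flick direction to the action the first
    exemplary embodiment assigns to it."""
    if direction == "up":        # show images with a later capture time
        ui.scroll_center_array(toward="later")
    elif direction == "down":    # show images with an earlier capture time
        ui.scroll_center_array(toward="earlier")
    elif direction == "left":    # bring a later capture date to the center
        ui.shift_arrays(date="later")
    elif direction == "right":   # bring an earlier capture date to the center
        ui.shift_arrays(date="earlier")

ui = ImageBrowser()
on_flick("left", ui)
print(ui.actions)  # prints: [('shift-arrays', 'later')]
```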
Fig. 2B illustrates an example of the screen displayed on the display 105 when the user performs a leftward flick in the state of the screen illustrated in Fig. 2A. In the example illustrated in Fig. 2B, the image array 201, which includes the images captured on November 7, 2009 and was displayed as the center image array in the example of Fig. 2A, is moved to the left by an amount equal to one image array. The image array 203 is now displayed as the center image array in place of the image array 201.
If the user performs a downward flick in the state of the screen illustrated in Fig. 2A, the screen illustrated in Fig. 2C is displayed on the display 105. More specifically, in the example illustrated in Fig. 2C, among the images included in the center image array 201, the image displayed at the lowest position in Fig. 2A disappears from the screen. The image displayed at the middle position of the array 201 in Fig. 2A is now displayed at the lowest position of the array 201, and the image displayed at the uppermost position of the array 201 in Fig. 2A is now displayed at the middle position.
With the conventional method described above, if the operation to be executed by the device is determined according to the direction of a flick performed on the touch panel, the direction of the actual locus of the flick may differ from the direction of the operation the user intends, as illustrated in Fig. 7. In this case, an operation the user did not intend may be executed. In particular, when operations in the same direction are repeated continuously and quickly, the user may consciously perform the first flick in the intended direction, but may perform the second and subsequent flicks in a direction that deviates from the intended one.
In this exemplary embodiment, by contrast, when the user performs flicks continuously, a continuously executed flick is identified, with high precision, as a flick performed in the same direction as the operation direction of the preceding flick. Accordingly, this exemplary embodiment performs control for identifying, with high precision, the direction of the operation the user intends.
In this exemplary embodiment, the "direction of a flick" refers to the operation direction that the information processing apparatus 100 determines from the user's flick. The information processing apparatus 100 then executes the action assigned to the flick in that direction. The processing that implements this method is described in detail below with reference to the flowcharts of Fig. 3.
Fig. 3 is a flowchart illustrating exemplary processing for determining the direction of a flick according to this exemplary embodiment. The processing of each step of the flowchart of Fig. 3 is implemented by the CPU 101 loading a program from the flash memory 102 onto the RAM 103 and executing the program on the RAM 103.
The processing of the flowchart of Fig. 3 starts when the user operates the operation unit included in the input unit 107 to instruct a change to the display mode illustrated in Fig. 2A. Referring to Fig. 3, in step S301, the CPU 101 performs initial display. More specifically, the CPU 101 reads the images to be displayed from the flash memory 102 and displays the read images on the display 105 in the image-array layout illustrated in Fig. 2A.
In step S302, the CPU 101 sets variables IX and IY, which are stored on the RAM 103 and used to accumulate the speed of flicks, to a value of "0". The variables IX and IY are described in detail below.
In step S303, the CPU 101 determines whether the touch panel 106 is being touched with a finger or stylus. More specifically, in step S303, the CPU 101 determines whether a touch persistent state has been detected. The determination in step S303 is a determination as to whether the touch state has changed from the non-touch state to the touch persistent state; in essence, it is a determination as to whether a touch operation has been performed.
If the CPU 101 determines that a touch persistent state has been detected (YES in step S303), the processing advances to step S304. On the other hand, if no touch persistent state has been detected (NO in step S303), the processing advances to step S326.
In step S304, the CPU 101 acquires the coordinates of the touched position on the touch panel 106 corresponding to the touch operation, and records the acquired coordinates on the RAM 103 as coordinates (X1, Y1).
In step S305, the CPU 101 waits until a predetermined time elapses. More specifically, the CPU 101 waits so as to calculate the speed of the move operation from the change in the coordinates input via the touch panel, using the waiting time from the moment the touch persistent state was detected until the moment it is detected next.
In this exemplary embodiment, the predetermined time is assumed to be several tens to several hundreds of milliseconds. After the predetermined time has elapsed, the processing advances to step S306.
In step S306, similarly to step S303, the CPU 101 determines whether a touch persistent state has been detected. If so (YES in step S306), the processing advances to step S307. On the other hand, if no touch persistent state has been detected (NO in step S306), the CPU 101 recognizes that neither a flick nor a drag operation has been performed. In this case, the processing advances to step S326.
In step S307, the CPU 101 acquires the coordinates of the touched position on the touch panel 106 corresponding to the touch persistent state, and records the acquired coordinates on the RAM 103 as coordinates (X2, Y2). Through the above processing, the CPU 101 acquires a plurality of positions (that is, the positions corresponding to the coordinates (X1, Y1) and (X2, Y2)) for the operation of touching the touch panel 106 and moving the touched position while maintaining the touch state.
At step S308, the CPU 101 calculates the moving speeds, in the horizontal direction and in the vertical direction, of the touch position moved in the touch-continued state, based on the coordinate information acquired at steps S304 and S307. More specifically, the moving speed in the horizontal direction can be calculated with the formula "X2 - X1", and the moving speed in the vertical direction can be calculated with the formula "Y2 - Y1".
The CPU 101 stores the calculated speed values on the RAM 103 as variables VX and VY. An exact speed could be calculated by dividing the results of the formulas "X2 - X1" and "Y2 - Y1" by the predetermined waiting time of step S305, and the speed could indeed be calculated in this way.
However, because the waiting time at step S305 is constant, performing the above division affects neither the comparison between the speed in the X-axis direction and the speed in the Y-axis direction, nor the comparison between the speed at a particular moment and the speed at another moment. Therefore, in the present exemplary embodiment, the division is omitted. The processing then proceeds to step S309.
At step S309, the CPU 101 determines whether the timer started at step S325 has timed out, that is, whether its count value is "0". This timer is used to measure the time elapsed since the image array was last scrolled in the left-right direction or the up-down direction. If the value of the timer is "0" (YES at step S309), the CPU 101 determines that the predetermined time or longer has elapsed since the moment of the last scroll operation. Therefore, in this case, the processing proceeds to step S310. On the other hand, if it is determined that the value of the timer is not "0" (NO at step S309), the processing proceeds to step S311.
At step S310, the CPU 101 stores the value "1" into each of the variables A and B, which are used in the determination at step S314.
The variables A and B are weights applied to the speeds in the horizontal direction and the vertical direction, respectively. More specifically, the larger either variable is relative to "1", the more readily the direction of the flick operation is determined to be the horizontal direction or the vertical direction, respectively. The processing then proceeds to step S314.
At step S311, the CPU 101 reads the direction of the last flick operation (the last determined direction) from the RAM 103, and determines whether the last flick operation was performed in the horizontal direction.
If the last flick operation was performed in the horizontal direction (YES at step S311), the processing proceeds to step S312. On the other hand, if it is determined that the last flick operation was not performed in the horizontal direction (NO at step S311), based on the determination that the last flick operation was performed in the vertical direction, the processing proceeds to step S313.
At step S312, the CPU 101 stores the value "2" into the variable A and the value "1" into the variable B, both of which are used in the determination at step S314. As a result, the flick operation is more readily determined at step S314 to have been performed in the horizontal direction. The processing then proceeds to step S314.
At step S313, the CPU 101 stores the value "1" into the variable A and the value "2" into the variable B, both of which are used in the determination at step S314. As a result, the flick operation is more readily determined at step S314 to have been performed in the vertical direction. The processing then proceeds to step S314.
At step S314, the CPU 101 compares the magnitude of the value calculated by the formula "|VX| * A", obtained by multiplying the absolute value of the speed component in the X-axis direction by the weight variable A, with the magnitude of the value calculated by the formula "|VY| * B", obtained by multiplying the absolute value of the speed component in the Y-axis direction by the weight variable B, and determines which value is larger.
The absolute values |VX| and |VY| of the speed components are values calculated based on a plurality of touch positions of an operation in which the touch panel 106 is touched and the touch position is moved while the touch state is maintained. Therefore, the condition "|VX| * A >= |VY| * B" for determining the direction of the flick operation is a condition based on a plurality of touch positions of an operation in which the touch position on the touch panel 106 is moved while the touch on the touch panel 106 is maintained.
If it is determined that "|VX| * A >= |VY| * B" holds (YES at step S314), the processing proceeds to step S315. On the other hand, if it is determined that the condition "|VX| * A >= |VY| * B" is not satisfied (NO at step S314), the processing proceeds to step S320.
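The weighted direction determination of steps S309 through S314 can be sketched roughly as follows (a simplified illustration with invented names, not the patented implementation; the timer handling of steps S309 and S325 is reduced to a boolean supplied by the caller):

```python
# Weighted axis decision: the axis of the last flick receives weight 2
# only while the timer started after the last scroll has not yet expired.
def flick_axis(vx, vy, last_axis=None, timer_expired=True):
    if timer_expired or last_axis is None:   # step S310: equal weights
        a, b = 1, 1
    elif last_axis == "horizontal":          # step S312: favour horizontal
        a, b = 2, 1
    else:                                    # step S313: favour vertical
        a, b = 1, 2
    # Step S314: compare the weighted absolute speed components.
    return "horizontal" if abs(vx) * a >= abs(vy) * b else "vertical"

# A slightly vertical-leaning move still counts as horizontal when the
# last flick was horizontal and happened recently:
print(flick_axis(8, 10, "horizontal", timer_expired=False))  # horizontal
print(flick_axis(8, 10, "horizontal", timer_expired=True))   # vertical
```

The usage example shows the point of the weighting: the same speed pair is classified differently depending on the recent operation history.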
At step S315, the CPU 101 stores the value calculated by the formula "IX * 0.9 + VX" into the variable IX stored on the RAM 103. In the present exemplary embodiment, IX is a variable for accumulating the speed of the flick operation in the horizontal direction. More specifically, in the present exemplary embodiment, the accumulated value is multiplied by 0.9 so that the accumulated value decreases when the speed is low, that is, so that an operation performed at low speed is not determined to be a flick operation. The processing then proceeds to step S316.
At step S316, the CPU 101 determines whether the absolute value |IX| is greater than a predetermined flick threshold. If it is determined that the absolute value |IX| is greater than the predetermined flick threshold (YES at step S316), based on the determination that a flick operation has been performed in the horizontal direction, the processing proceeds to step S317. On the other hand, if the absolute value |IX| is not greater than the predetermined flick threshold (NO at step S316), the processing proceeds to step S327. In the present exemplary embodiment, |IX| denotes the absolute value of IX.
At step S317, the CPU 101 determines whether the current operation state of the information processing apparatus 100 is the touch-continued state. If it is (YES at step S317), the CPU 101 repeats the processing of step S317 and waits until the touch-continued state ends (that is, waits until a touch-up operation is performed). On the other hand, if the current operation state of the information processing apparatus 100 is not the touch-continued state (NO at step S317), the processing proceeds to step S318. The processing of step S317 is performed so that the images included in the image array are not scrolled before the touch-up operation is performed.
At step S318, the CPU 101 sends a control signal to the display control unit 104, the content of which differs according to the direction of the horizontal speed component accumulated in the variable IX. More specifically, the content of the control signal sent at step S318 is determined according to whether the sign of the speed component indicates the rightward direction (the rightward direction corresponds to a positive value) or the leftward direction (the leftward direction corresponds to a negative value).
That is, if the horizontal speed component accumulated in the variable IX is rightward, the CPU 101 sends a control signal for controlling the display control unit 104 to scroll the image array rightward by one image. On the other hand, if the horizontal speed component accumulated in the variable IX is leftward, the CPU 101 sends a control signal for controlling the display control unit 104 to scroll the image array leftward by one image.
As a result, the display control unit 104 generates video in which the image array is scrolled rightward or leftward by one image, and outputs the generated video to the display 105. The processing then proceeds to step S319. At step S319, the CPU 101 writes and stores, on the RAM 103, information indicating that the image array has been scrolled in the horizontal direction. The processing then proceeds to step S325.
At step S320, the CPU 101 stores the value "IY * 0.9 + VY" into the variable IY. In the present exemplary embodiment, IY is a variable for accumulating the speed of a flick operation performed in the vertical direction. Similarly to the processing of step S315, the CPU 101 multiplies the accumulated value by 0.9 so that an operation performed at low speed is not determined to be a flick operation. The processing then proceeds to step S321.
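The decaying accumulation used at steps S315 and S320 can be sketched as follows (the function name and the threshold value are invented for this illustration; only the 0.9 decay factor is taken from the description above):

```python
# Decaying speed accumulator: each sample multiplies the running total by
# 0.9 before adding the new speed, so slow movements never build up past
# the flick threshold, while fast sustained movements quickly exceed it.
FLICK_THRESHOLD = 50  # illustrative value, not from the patent

def accumulate(acc, v):
    return acc * 0.9 + v

acc = 0.0
for v in [30, 30, 30]:             # fast, sustained movement
    acc = accumulate(acc, v)
print(abs(acc) > FLICK_THRESHOLD)  # True -> treated as a flick

acc = 0.0
for v in [5, 5, 5]:                # slow movement decays away
    acc = accumulate(acc, v)
print(abs(acc) > FLICK_THRESHOLD)  # False -> not a flick
```

The geometric decay means the accumulator can never exceed ten times a constant per-sample speed, which is what filters out slow drags.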
At step S321, the CPU 101 determines whether the absolute value |IY| is greater than the predetermined flick threshold. If it is determined that the absolute value |IY| is greater than the predetermined flick threshold (YES at step S321), based on the determination that a flick operation has been performed in the vertical direction, the processing proceeds to step S322. On the other hand, if the absolute value |IY| is not greater than the predetermined flick threshold (NO at step S321), the processing proceeds to step S327. In the present exemplary embodiment, |IY| denotes the absolute value of IY.
At step S322, the CPU 101 determines whether the current operation state of the information processing apparatus 100 is the touch-continued state. If it is (YES at step S322), the CPU 101 repeats the processing of step S322 and waits until the touch-continued state ends (that is, waits until a touch-up operation is performed). On the other hand, if the current operation state of the information processing apparatus 100 is not the touch-continued state (NO at step S322), the processing proceeds to step S323. The processing of step S322 is performed so that the images included in the image array are not scrolled before the touch-up operation is performed.
At step S323, the CPU 101 sends a control signal to the display control unit 104, the content of which differs according to the direction of the vertical speed component accumulated in the variable IY. More specifically, the content of the control signal sent at step S323 is determined according to whether the direction of the speed component is upward (the upward direction corresponds to a negative value) or downward (the downward direction corresponds to a positive value).
That is, if the vertical speed component accumulated in the variable IY is upward, the CPU 101 sends a control signal for controlling the display control unit 104 to scroll the images included in the center image array in the depth direction of the screen. On the other hand, if the vertical speed component accumulated in the variable IY is downward, the CPU 101 sends a control signal for controlling the display control unit 104 to scroll the images included in the center image array toward the front side of the screen.
As a result, the display control unit 104 generates video in which the images included in the center image array are scrolled in the depth direction or toward the front side of the screen, and outputs the generated video to the display 105. The processing then proceeds to step S324. At step S324, the CPU 101 writes and stores, on the RAM 103, information indicating that the images included in the center image array have been scrolled in the vertical direction. The processing then proceeds to step S325.
At step S325, the CPU 101 starts a timer for determining whether a predetermined time T has elapsed since the image array was last scrolled in the left-right (horizontal) direction or the up-down (vertical) direction. When the time T has elapsed, the value of the timer becomes "0". The processing then proceeds to step S326.
At step S326, the CPU 101 waits until a predetermined time elapses. This predetermined time is the period at which the state of the touch panel 106 is sampled, and differs from the time T described above. After this predetermined time has elapsed, the processing returns to step S302. At step S302, the CPU 101 initializes the variables IX and IY so that the determination of whether a flick operation has been performed can be re-executed from the next touch-continued state, and the processing from step S302 is repeated.
If the touch-continued state continues without the amount of movement of the touch position becoming large enough for a flick operation to be determined, the processing of step S327 is performed. More specifically, at step S327, in order to continue measuring the moving speed of the touch position, the CPU 101 stores the current variable X2 into a new variable X1 and the current variable Y2 into a new variable Y1. The processing then returns to step S305.
By executing the processing in the flowchart of Fig. 3, the present exemplary embodiment stores the last determined direction of the flick operation at step S319 or S324. In addition, if a flick operation is performed within the predetermined time since the start of the last flick operation (NO at step S309), the CPU 101 changes the condition for determining the direction of the flick operation at step S312 or S313, so that the flick operation is more readily determined to have been performed along the same axis as (that is, parallel to) the stored direction.
The above structure is employed for the following reason. More specifically, if the user performs flick operations continuously along the same axis, the user can perform the operations very quickly because the same motion is merely repeated. In addition, if the interval between flick operations is short, it is highly probable that the operation has been performed along the same axis as the direction of the last flick operation.
On the other hand, if the direction of the flick operation is changed to a direction along a different axis, the interval between the operations becomes longer than the interval between flick operations performed continuously along the same axis. This is because the user must change the motion of the finger in this case, and because the user is conscious of the change in the finger motion.
Therefore, if it is determined at step S309 that the time T has elapsed since the start of the last flick operation, the CPU 101 sets equal values as the weight variables. In other words, the present exemplary embodiment uses the same condition for the X-axis direction and the Y-axis direction in determining the direction of the current flick operation, regardless of the direction in which the last flick operation was performed.
More specifically, in the present exemplary embodiment, the condition for determining the flick operation is changed according to the direction of the last flick operation only when a flick operation is performed again before the time T has elapsed after the end of the touch of the last flick operation. In addition, if a flick operation is performed after the time T has elapsed since the end of the touch of the last flick operation, the CPU 101 does not assign a weight based on the direction of the last flick operation.
According to the first exemplary embodiment described above, even if the direction of the user's operation changes continuously and rapidly during flick operations, the possibility of performing an operation different from the one the user intends can be effectively reduced.
The second exemplary embodiment of the present invention is described in detail below. In the first exemplary embodiment described above, if the user performs flick operations continuously in the positive or negative direction along an axis, the direction of the flick operation is more readily determined to lie along the same axis as the preceding continuous flick operations.
In the present exemplary embodiment, if the user performs flick operations continuously, the direction of the current operation is more readily determined to be only the same positive (or negative) direction along the axis as the direction of the last flick operation.
Fig. 4 illustrates an example of the GUI screen displayed on the display 105 according to the present exemplary embodiment. In the present exemplary embodiment, an image can be scrolled by performing a flick operation in the horizontal direction. In addition, an image can be deleted by performing a flick operation in the upward direction, and an image can be protected by performing a flick operation in the downward direction.
In the display example illustrated in Fig. 4, images are arranged and displayed in the horizontal direction. By performing a rightward flick operation on the touch panel 106, the images can be scrolled rightward. In addition, by performing a leftward flick operation on the touch panel 106, the images can be scrolled leftward.
In the upper and lower portions of the screen illustrated in Fig. 4, the operations corresponding to the flick operations in the respective directions are displayed. More specifically, if the user performs a flick operation in the upward direction, the image displayed at the center of the screen is deleted. On the other hand, if it is determined that the user has performed a flick operation in the downward direction, the image displayed at the center of the screen is protected.
In the GUI according to the present exemplary embodiment, the flick operations performed in the rightward and leftward directions both correspond to scrolling of the images. More specifically, although the images are scrolled in different directions, the characteristics of the flick operations performed in the rightward and leftward directions are the same.
On the other hand, for the flick operations performed in the upward and downward directions, since the flick operation performed in the upward direction corresponds to the function for deleting an image while the flick operation performed in the downward direction corresponds to the function for protecting an image, the characteristics of the operations are obviously different from each other.
Therefore, in the present exemplary embodiment, if the user performs flick operations continuously in the vertical direction within a short time, the user is unlikely to switch to a flick operation in the opposite direction. Accordingly, the present exemplary embodiment differs from the first exemplary embodiment in that, if the user performed the last flick operation in the upward direction, the direction of the flick operation is more readily determined to be only the upward direction. In other words, the possibility that the current flick operation is determined to be in the upward direction increases. Similarly, if the user performed the last flick operation in the downward direction, the direction of the flick operation is more readily determined to be only the downward direction.
An exemplary process that implements the above structure is described in detail below with reference to the flowchart of Fig. 5. Fig. 5 is a flowchart illustrating an example of processing for determining the direction of a flick operation according to the present exemplary embodiment. The processing of each step of the flowchart of Fig. 5 is implemented by the CPU 101 loading a program from the flash memory 102 into the RAM 103 and executing the program on the RAM 103.
In the flowchart of Fig. 5, the processing of most steps is the same as that illustrated in Fig. 3. More specifically, the processing of steps S501 through S522 illustrated in Fig. 5 is the same as the processing of steps S301 through S322 illustrated in Fig. 3. In addition, the processing of steps S525 through S527 illustrated in Fig. 5 is the same as the processing of steps S325 through S327 illustrated in Fig. 3. Therefore, a detailed description thereof is not repeated here.
The processing of the flowchart of Fig. 5 differs from the processing illustrated in Fig. 3 only in the following points. That is, if the determination at step S511 is "NO", the CPU 101 performs the processing of steps S531 through S533. In addition, if the determination at step S522 is "NO", the CPU 101 performs the processing of steps S534 through S538. Therefore, only the differences are described in detail below.
Referring to Fig. 5, at step S511, the CPU 101 reads the direction of the last flick operation from the RAM 103 to determine whether the last flick operation was performed in the horizontal direction. If it is determined that the last flick operation was performed in the horizontal direction (YES at step S511), the processing proceeds to step S512. On the other hand, if it is determined that the last flick operation was not performed in the horizontal direction (NO at step S511), based on the recognition that the last flick operation was performed in the positive vertical direction (that is, the downward direction) or the negative vertical direction (that is, the upward direction), the processing proceeds to step S531.
At step S531, the CPU 101 determines whether the direction of the last flick operation stored on the RAM 103 is upward. In the present exemplary embodiment, if the processing of step S536 was performed during the processing of the last flick operation, information indicating that the last flick operation was performed in the upward direction has been recorded.
If the direction of the last flick operation is the upward direction (YES at step S531), the processing proceeds to step S532. On the other hand, if the direction of the last flick operation is not the upward direction (that is, if the direction of the last flick operation is the downward direction) (NO at step S531), the processing proceeds to step S533.
At step S532, the CPU 101 determines whether the vertical speed component VY calculated at step S508 corresponds to the speed of a flick operation performed in the upward direction. In other words, at step S532, the CPU 101 determines whether the speed component VY has a negative value.
If it is determined that the vertical speed component VY corresponds to the speed of a flick operation performed in the upward direction (YES at step S532), the processing proceeds to step S513, so that the direction of the current flick operation can be more readily determined to be the upward direction, which is the same as the direction of the last flick operation. This is because, in this case, if the speed component VY corresponds to the speed of a flick operation performed in the upward direction, the direction of the vertical speed component is the same as the direction of the last flick operation.
On the other hand, if it is determined that the vertical speed component VY does not correspond to the speed of a flick operation performed in the upward direction (NO at step S532), the speed component VY corresponds to the speed of a flick operation performed in the downward direction, which is the direction opposite to the last flick operation along the same axis. In this case, since it is not necessarily appropriate to readily determine the direction of the current flick operation to be the same upward direction as the last flick operation, the processing proceeds to step S510. At step S510, the CPU 101 does not assign a weight to the determination condition.
At step S533, the CPU 101 determines whether the vertical speed component VY calculated at step S508 corresponds to the speed of a flick operation performed in the downward direction. In other words, at step S533, the CPU 101 determines whether the vertical speed component has a positive value.
If it is determined that the vertical speed component VY corresponds to the speed of a flick operation performed in the downward direction (YES at step S533), the processing proceeds to step S513, so that the direction of the current flick operation can be more readily determined to be the downward direction, which is the same as the direction of the last flick operation. This is because, in this case, if the speed component VY corresponds to the speed of a flick operation performed in the downward direction, the direction of the vertical speed component is the same as the direction of the last flick operation.
On the other hand, if it is determined that the vertical speed component VY does not correspond to the speed of a flick operation performed in the downward direction (NO at step S533), the speed component VY corresponds to the speed of a flick operation performed in the upward direction, which is the direction opposite to the last flick operation along the same axis. In this case, since it is not necessarily appropriate to readily determine the direction of the current flick operation to be the same downward direction as the last flick operation, the processing proceeds to step S510. At step S510, the CPU 101 does not assign a weight to the determination condition.
At step S513, since the Y-axis speed component of the current flick operation corresponds to a flick operation performed in the same direction as the last flick operation, the CPU 101 stores the value "1" into the variable A and the value "2" into the variable B, both of which are used in the determination at step S514. By performing the processing of step S513, even if the speed of the flick operation in the X-axis direction is slightly higher than the speed of the flick operation in the Y-axis direction, the present exemplary embodiment increases the possibility of determining that the current flick operation has been performed in the same direction as the last flick operation.
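The direction-sensitive weighting of steps S531 through S533 and S513 can be sketched as follows (a minimal illustration with invented names; as in the flowchart, VY is negative for upward movement and positive for downward movement):

```python
# Second embodiment: the vertical axis is favoured only when the current
# vertical speed points the same way as the last vertical flick; a flick
# in the opposite vertical direction gets no weight (step S510).
def vertical_weights(vy, last_vertical):  # last_vertical: "up" or "down"
    same_sign = (vy < 0 and last_vertical == "up") or \
                (vy > 0 and last_vertical == "down")
    if same_sign:
        return 1, 2   # step S513: favour the vertical axis (A=1, B=2)
    return 1, 1       # step S510: no weighting for the opposite direction

print(vertical_weights(-7, "up"))  # (1, 2): same direction as last flick
print(vertical_weights(7, "up"))   # (1, 1): opposite direction, no weight
```

This captures why, in this embodiment, a downward flick after an upward "delete" flick is judged on equal terms rather than being pulled toward the vertical axis.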
On the other hand, if it is determined that the information processing apparatus 100 is not in the touch-continued state (NO at step S522), the processing proceeds to step S534. At step S534, the CPU 101 determines whether the vertical speed component accumulated in the variable IY corresponds to a flick operation performed in the upward direction (that is, whether the vertical speed component has a negative value). If so (YES at step S534), the processing proceeds to step S535. On the other hand, if the vertical speed component accumulated in the variable IY does not correspond to a flick operation performed in the upward direction (NO at step S534), since the vertical speed component corresponds to a flick operation performed in the downward direction, the processing proceeds to step S537.
At step S535, the CPU 101 deletes the image displayed at the center of the display 105 by changing the file management information stored on the flash memory 102. The processing then proceeds to step S536. At step S536, the CPU 101 writes and stores, on the RAM 103, information indicating that a flick operation performed in the upward direction has been received (that is, information indicating that the direction of the flick operation has been determined to be the upward direction). The processing then proceeds to step S525.
At step S537, the CPU 101 protects the image displayed at the center of the display 105. More specifically, the CPU 101 changes the file management information stored on the flash memory 102 to set write protection for the image displayed at the center of the display 105.
The processing then proceeds to step S538. At step S538, the CPU 101 writes and stores, on the RAM 103, information indicating that a flick operation performed in the downward direction has been received (that is, information indicating that the direction of the flick operation has been determined to be the downward direction). The processing then proceeds to step S525.
According to the second exemplary embodiment described above, mutually unrelated different functions are assigned to the flick operations in the two opposite directions along the same axis. In addition, in the present exemplary embodiment, if the user performs flick operations continuously, the direction of the performed operation is more readily determined to be only the same positive (or negative) direction along the axis as the direction of the last flick operation. According to the present exemplary embodiment having the above structure, the execution of an operation the user does not intend can be effectively prevented.
In the second exemplary embodiment, in the X-axis direction (that is, in the horizontal direction), similarly to the first exemplary embodiment, if flick operations are performed continuously, either the positive or the negative direction along the same axis is more readily determined to be the direction of the flick operation. This structure is used for the following reason.
More specifically, in the present exemplary embodiment, the leftward scroll function and the rightward scroll function, which are mutually related or have similar characteristics, are assigned to the flick operations in the two opposite directions along the same X-axis. In addition, these operations can be performed continuously while frequently switching back and forth between the positive and negative directions.
Here, "mutually related operations (functions) or operations having similar characteristics" include a plurality of operations for the same function that have an increase/decrease relationship or a raise/lower relationship, for example, increasing or decreasing the volume of output sound, fast-forwarding or rewinding the playback position of moving image or audio data, and adjusting various settings or various processing parameter values.
According to each of the exemplary embodiments described above, when control is performed based on the trajectory of the touch position on the touch panel, the operation the user intends can be identified highly precisely according to the content of the last operation.
In each exemplary embodiment of the present invention, a weight can be assigned to the condition for determining the direction of a flick operation according to the operation mode of the information processing apparatus 100. More specifically, if mutually different operations are assigned to the flick operations in the X-axis direction and the Y-axis direction, as in the first and second exemplary embodiments described above, it is useful to change the weight assigned to the condition for determining the direction of the flick operation according to the direction of the last flick operation.
On the other hand, if the information processing apparatus 100 is in an enlarged display mode in which an image is displayed in a magnified state, and if the flick operations performed in the upward, downward, leftward, and rightward directions are assigned scroll functions for moving the magnified region upward, downward, leftward, and rightward, the CPU 101 may change, in the manner described above, whether the weight is assigned to the condition for determining the direction of the flick operation.
In other words, if the information processing apparatus 100 is in a first operation mode (first mode), the CPU 101 assigns a weight to the operation-direction determination condition. On the other hand, if the information processing apparatus 100 is in a second operation mode (second mode), the CPU 101 does not assign a weight to the operation-direction determination condition.
If the user is allowed to scroll in four mutually different directions by performing flick operations in those directions, the user may frequently change the intended operation direction. In other words, a flick operation is likely not to be in the same direction as the last flick operation. Therefore, if the operation-direction determination condition were changed according to the direction of the last flick operation, the user's operation would be adversely interfered with. Accordingly, if the information processing apparatus 100 is in such an operation mode, the CPU 101 does not assign a weight to the operation-direction determination condition, regardless of the direction of the last flick operation.
As an operation mode in which the user is allowed to scroll an image in four mutually different directions by performing flick operations in those directions, a mode in which a map image is displayed and the displayed portion can be changed may be used. Each of the above modes is an operation mode for changing the displayed portion of the same display target image.
In addition, for an operation mode in which the flick operations performed in the respective directions have no increase/decrease relationship or raise/lower relationship, since the user can be considered likely to frequently change the intended operation direction, it is useful not to assign a weight to the operation-direction determination condition.
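The mode-dependent weighting described above can be summarized in a small sketch (the mode names and the weight value are invented for illustration; the patent distinguishes only a first and a second operation mode):

```python
# Modes in which no last-direction weight is applied (second operation
# mode), e.g. four-way scrolling of a magnified image or of a map view.
UNWEIGHTED_MODES = {"enlarged_display", "map_view"}  # names invented

def direction_weight(mode, favoured):
    """Weight for a speed component: 2 when this direction is favoured by
    the last flick and the mode permits weighting, otherwise 1."""
    if mode in UNWEIGHTED_MODES:
        return 1  # second mode: every direction is judged equally
    return 2 if favoured else 1  # first mode: favour the last direction

print(direction_weight("image_browse", True))      # 2 (first mode)
print(direction_weight("enlarged_display", True))  # 1 (second mode)
```

The point of the toggle is that history-based weighting only helps when the flick directions carry dissimilar functions; where all four directions scroll the same target, equal weights avoid interfering with rapid direction changes.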
In the examples illustrated in Figs. 3 and 5, the CPU 101 changes the condition for determining the direction of the flick operation so that the direction of the current flick operation is more readily determined to be the same as the direction of the last flick operation. However, it is also useful if the CPU 101 changes the condition for determining the direction of the current flick operation so that the direction of the current flick operation is less readily determined to be the same as the direction of the last flick operation.
This structure is useful for the following reason. More specifically, if the user performs flick operations in the same direction, the user may become accustomed, through repetition, to moving the finger or pen in that direction. If a user who has become accustomed to moving the finger or pen in one direction then performs a flick operation in a direction different from the accustomed one, the user may be unable to perform the flick operation accurately in the different direction, despite intending to move the finger or pen in that direction, because the user has become so used to the flick operation in the initial direction.
In order to address the above problem, can also use structure below.More specifically, if easily the direction different from last direction of flicking operation is judged as to current direction of flicking operation, even user with the last diverse direction of direction of flicking operation on carry out and currently flicked operation failure, also current direction of flicking operation can be judged as YES be different from last or early flick operation direction flick direction of operating.
In order easily the direction of operating different from last direction of flicking operation to be judged as to current direction of flicking operation, by will be worth " 1 " at step S312 (Fig. 3 A), arrange to variables A and will be worth " 2 " setting to variable B, and at step S313 (Fig. 3 A), will be worth " 2 " arranges to variables A and will be worth " 1 " setting to variable B, to revise above-mentioned exemplary embodiments, this is useful.For shown in Fig. 5, process, in order easily the direction of operating different from last direction of flicking operation to be judged as to current direction of flicking operation, by will be worth " 1 " at step S512 (Fig. 5 A), arrange to variables A and will be worth " 2 " setting to variable B, and at step S513 (Fig. 5 A), will be worth " 2 " setting arranges " 1 " to variable B to variables A value, to revise above-mentioned exemplary embodiments, this is useful.Utilize this structure, can realize equally the effect for this exemplary embodiments of the operation that accurately identification user wants.
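One way to see how swapping the values of variables A and B changes the outcome is the sketch below. The actual comparison performed at steps S312/S313 and S512/S513 is defined in the figures, which are not reproduced here; the formula below, comparing the weighted axis components of the stroke's net displacement, is only an assumed stand-in.

```python
# Hypothetical sketch: judge a flick as horizontal or vertical from its
# net displacement (dx, dy), with per-axis coefficients a and b.  The
# comparison formula is an assumption, not the one defined in the figures.

def judge_flick_direction(dx, dy, a=1.0, b=1.0):
    """Return the judged flick direction.

    The horizontal component is scaled by `a` and the vertical component
    by `b` before comparison.  Raising `a` relative to `b` makes a
    horizontal judgment more likely; swapping the values (as in the
    modified embodiment) makes the direction *different* from the
    previous flick easier to obtain.
    """
    if a * abs(dx) >= b * abs(dy):
        return "right" if dx >= 0 else "left"
    return "down" if dy >= 0 else "up"
```

For instance, a slightly diagonal stroke with dx=10, dy=12 is judged vertical ("down") with the unweighted coefficients, but horizontal ("right") once the horizontal coefficient is doubled, which is the sense in which the weighting makes one judgment "easier" than another.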
In each of the exemplary embodiments of the present invention described above, the present invention is applied to the information processing apparatus 100. However, the present invention is not limited to this. More specifically, the present invention can be applied to any information processing apparatus that can be controlled using a touch panel, such as a personal computer (PC), a personal digital assistant (PDA), a portable telephone terminal, a portable image viewer, or a display device provided to a printer apparatus or a digital photo frame. In addition, the present invention can be applied to a digital still camera or a digital video camera using a touch panel.
If the present invention is applied to a device, such as a digital camera or a portable telephone terminal, in which a display unit is movably or pivotally attached to the device body, whether to assign a weight to the flick direction determination condition can be switched according to the position of the display unit.
Figs. 6A to 6C illustrate the appearance of a digital camera as an example of the above-described device, i.e., the information processing apparatus 100 according to an exemplary embodiment of the present invention. In the example shown in Figs. 6A to 6C, a display unit 601 is pivotally attached to a camera body 602 via a connecting portion 604. The camera body 602 includes an imaging unit, which includes a lens and an image sensor, and a camera holding portion 603.
In the examples shown in Figs. 6B and 6C (although not visible in Fig. 6A), a character "F" 605 is printed on the surface of a specific part of the display unit 601 to simply indicate the orientation of the display unit 601. Fig. 6A shows a closed state in which the display unit 601 is kept folded against the camera. In this state, the display 105, which faces the camera body 602, is not exposed.
In the example shown in Fig. 6B, the display unit 601 is opened by 90 degrees in a pivot direction 606 from the closed state shown in Fig. 6A. More specifically, in the example shown in Fig. 6B, the display 105, which is integrated with the touch panel 106, is exposed. In this state, the user can visually recognize and operate the display 105.
In the example shown in Fig. 6C, the display unit 601 is pivoted by 180 degrees from the state shown in Fig. 6B in a pivot direction 607 whose rotation axis is perpendicular to that of the pivot direction 606, and is further pivoted by 90 degrees in the pivot direction 606 to fold the display unit 601. In the state shown in Fig. 6C, the display 105, including the integrally mounted touch panel 106, is likewise exposed. Therefore, the user can view and operate the display 105.
In addition, the display unit 601 can be set to the following position: a position reached by further pivoting the display unit 601 by 90 degrees in the opening direction of the pivot direction 606 from the state shown in Fig. 6B (in other words, a position in which the display unit 601 is opened by 180 degrees from the state shown in Fig. 6A in the opening direction of the pivot direction 606). This position is hereinafter referred to as "position D".
The CPU 101 can identify which of the above positions the display unit 601 is in. In the example shown in Fig. 6B, the user holds the camera body 602 at the holding portion 603 with one hand. Therefore, the user performs operations on the touch panel 106 with the other hand.
In this case, the user most probably holds the camera at the side of the display unit 601 opposite the connecting portion 604 and touches the touch panel 106 with a finger of the other hand. Therefore, even if the user thinks that he has performed a flick operation in the desired direction, the motion trajectory of the user's finger is very likely to form an erroneous arc shape, as described above with reference to Fig. 7.
Therefore, if the CPU 101 determines that the display unit 601 is at the position shown in Fig. 6B, this exemplary embodiment executes the processing shown in Fig. 3 or Fig. 5. In this case, the CPU 101 assigns a weight to the flick direction determination condition according to the direction of the immediately preceding flick operation. The same applies if the display unit 601 is at position D.
On the other hand, if the display unit 601 is at the position shown in Fig. 6C, the user is unlikely to hold the camera, with the hand operating the touch panel 106, at the part of the display unit 601 opposite the connecting portion 604. Therefore, if it is determined that the display unit 601 is at the position shown in Fig. 6C, the CPU 101 determines that the user can accurately perform a flick operation in the desired direction. In this case, the CPU 101 omits the weighting performed in the processing shown in Fig. 3 or Fig. 5. In addition, in this case, the CPU 101 can set the value "1" to both variables A and B, regardless of the direction of the previous flick operation.
When a weight is assigned, the magnitude of the weight to be assigned can be changed according to the position of the display unit 601. More specifically, if the display unit 601 is at the position shown in Fig. 6B, the CPU 101 can, according to the direction of the previous flick operation, assign to the flick direction determination condition a weight whose coefficient change amount is large. In this case, the change amount of the coefficient corresponds to the difference from the coefficient used when no weight is assigned.
On the other hand, if the display unit 601 is at the position shown in Fig. 6C, the CPU 101 can likewise assign a weight to the flick direction determination condition according to the direction of the previous flick operation; however, the change amount of the weight coefficient is set smaller than the change amount set when the display unit 601 is at the position shown in Fig. 6B.
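The position-dependent choice of weight magnitude could be sketched as follows. The position labels and the numeric change amounts are assumptions; the patent specifies only that the coefficient change amount is larger in the Fig. 6B position (and position D) than in the Fig. 6C position, where weighting may also be omitted entirely.

```python
# Hypothetical sketch: pick the coefficient change amount for the
# flick-direction determination condition from the display unit position.
# Position names and values are assumed for illustration.

def weight_change_amount(display_position):
    """Return how far the weighted coefficient deviates from the
    unweighted coefficient, given the display unit position.

    Positions where the user operates the panel while also holding the
    display one-handed (Fig. 6B, position D) get a large change amount;
    the Fig. 6C position, where accurate flicks are expected, gets a
    small one (or zero, if weighting is omitted entirely).
    """
    if display_position in ("fig6b", "positionD"):
        return 1.0     # large deviation from the unweighted coefficient
    if display_position == "fig6c":
        return 0.25    # small deviation (or 0.0 to omit weighting)
    return 0.0         # closed or unrecognized position: no weighting
```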
The example described above with reference to Figs. 6A to 6C is not limited to digital cameras. More specifically, the present invention can be applied to any device including a display unit attached to the device body, regardless of whether the display unit is moved by a slide mechanism or a pivot mechanism. In other words, the present invention can be applied to foldable portable telephone terminals, digital cameras, game machines, portable music players, e-book readers, and PDAs.
In each of the exemplary embodiments of the present invention described above, the present invention is applied to a digital camera. However, the present invention is not limited to this. More specifically, the present invention can be applied to any device having a touch panel, such as a PC, a PDA, a portable telephone terminal, an image viewer, a display provided to a printer apparatus, a digital photo frame, a game machine, a music player, an e-book reader, a vending machine, or a car navigation device.
The present invention can also be realized by providing to a system or apparatus a storage medium that stores program code of software implementing the functions of the embodiments, and by causing a computer (a CPU or a micro processing unit (MPU)) of the system or apparatus to read and execute the program code stored in the storage medium. In this case, the program code read from the storage medium itself implements the functions of the above-described embodiments, and the storage medium storing the program code therefore constitutes the present invention.
Aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiments, and by a method the steps of which are performed by the computer of the system or apparatus by, for example, reading out and executing the program recorded on the memory device to perform the functions of the above-described embodiments. For this purpose, the program can be provided to the computer, for example, via a network or from a recording medium of various types serving as the memory device (for example, a computer-readable medium).
Although the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all modifications, equivalent structures, and functions.

Claims (9)

1. An information processing apparatus, comprising:
an obtaining unit configured to obtain a plurality of touch positions on a trajectory of an operation of moving a touch position on a touch panel without interrupting the touch on the touch panel;
a determination unit configured to determine an operation direction of the operation by using a determination condition based on the plurality of touch positions obtained by the obtaining unit;
a control unit configured to perform control according to the operation direction determined by the determination unit, so as to execute an action predetermined in association with the operation direction;
a storage control unit configured to perform control so that the operation direction determined by the determination unit is stored in a memory; and
a changing unit configured to change the determination condition based on the operation direction stored in the memory, so as to determine the operation direction of a touch operation performed after the operation whose operation direction has been determined and stored,
wherein, if an operation of moving a touch position on the touch panel without interrupting the touch on the touch panel is performed within a specific time after the end of the previous touch operation on the touch panel, the determination unit determines the operation direction based on the stored operation direction by using the determination condition changed by the changing unit, and
if an operation of moving a touch position on the touch panel without interrupting the touch on the touch panel is performed after the specific time has elapsed since the end of the previous touch operation on the touch panel, the determination unit determines the operation direction by using the determination condition not changed by the changing unit.
2. The information processing apparatus according to claim 1, wherein the changing unit changes the determination condition so that, compared with other directions, the determination unit is more likely to determine the operation direction to be a direction parallel to the stored operation direction.
3. The information processing apparatus according to claim 1, wherein the changing unit changes the determination condition so that, compared with other directions, the determination unit is less likely to determine the operation direction to be a direction parallel to the stored operation direction.
4. The information processing apparatus according to claim 1, wherein the changing unit changes the determination condition so that, compared with other directions, the determination unit is more likely to determine the operation direction to be a direction identical to the stored operation direction.
5. The information processing apparatus according to claim 1, wherein the changing unit changes the determination condition so that, compared with other directions, the determination unit is less likely to determine the operation direction to be a direction identical to the stored operation direction.
6. The information processing apparatus according to claim 1, wherein, if an action corresponding to an operation performed in a direction opposite to the stored operation direction and another action corresponding to an operation performed in the same direction as the stored operation direction have an increase/decrease relationship or an up/down relationship with respect to the same function, the changing unit changes the determination condition so that, compared with other directions, the determination unit is more likely to determine the operation direction to be a direction parallel to the stored operation direction, and
if the action corresponding to an operation performed in a direction opposite to the stored operation direction and the other action corresponding to an operation performed in the same direction as the stored operation direction do not have an increase/decrease relationship or an up/down relationship with respect to the same function, the changing unit changes the determination condition so that, compared with other directions, the determination unit is more likely to determine the operation direction to be a direction identical to the stored operation direction.
7. The information processing apparatus according to claim 1, wherein, if the current mode of the information processing apparatus is a first mode, the changing unit changes the determination condition, and
if the current mode of the information processing apparatus is a second mode, the changing unit does not change the determination condition.
8. The information processing apparatus according to claim 7, wherein the second mode includes at least one of the following modes: a mode in which the displayed portion of a display object is changed according to an operation performed in any operation direction, and a mode in which the actions corresponding to operations in any operation direction have neither an increase/decrease relationship nor an up/down relationship.
9. A method for controlling an information processing apparatus that performs control according to input performed via a touch panel, the method comprising:
obtaining a plurality of touch positions on a trajectory of an operation of moving a touch position on the touch panel without interrupting the touch on the touch panel;
determining an operation direction of the operation by using a determination condition based on the obtained plurality of touch positions;
performing control according to the determined operation direction, so as to execute an action predetermined in association with the operation direction;
performing control so that the determined operation direction is stored in a memory; and
changing the determination condition based on the operation direction stored in the memory, so as to determine the operation direction of a touch operation performed after the operation whose operation direction has been determined and stored,
wherein, if an operation of moving a touch position on the touch panel without interrupting the touch on the touch panel is performed within a specific time after the end of the previous touch operation on the touch panel, the operation direction is determined based on the stored operation direction by using the changed determination condition, and
if an operation of moving a touch position on the touch panel without interrupting the touch on the touch panel is performed after the specific time has elapsed since the end of the previous touch operation on the touch panel, the operation direction is determined by using the unchanged determination condition.
CN201010566083.2A 2009-12-25 2010-11-25 Information processing apparatus and control method therefor Active CN102109922B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009295436A JP5506375B2 (en) 2009-12-25 2009-12-25 Information processing apparatus and control method thereof
JP2009-295436 2009-12-25

Publications (2)

Publication Number Publication Date
CN102109922A CN102109922A (en) 2011-06-29
CN102109922B true CN102109922B (en) 2014-04-09

Family

ID=43798439

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201010566083.2A Active CN102109922B (en) 2009-12-25 2010-11-25 Information processing apparatus and control method therefor

Country Status (5)

Country Link
US (1) US8810527B2 (en)
EP (1) EP2339439B1 (en)
JP (1) JP5506375B2 (en)
KR (1) KR101547174B1 (en)
CN (1) CN102109922B (en)

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010156772A (en) * 2008-12-26 2010-07-15 Fuji Xerox Co Ltd Liquid crystal-containing composition and liquid crystal display device
GB201110156D0 (en) * 2011-06-16 2011-07-27 Light Blue Optics Ltd Touch-sensitive display devices
US9547369B1 (en) * 2011-06-19 2017-01-17 Mr. Buzz, Inc. Dynamic sorting and inference using gesture based machine learning
JP5814057B2 (en) * 2011-09-28 2015-11-17 京セラ株式会社 Electronic device, electronic device control method, and electronic device application program
JP5451944B2 (en) 2011-10-07 2014-03-26 パナソニック株式会社 Imaging apparatus and imaging method
JP6202777B2 (en) * 2011-12-05 2017-09-27 カシオ計算機株式会社 Display data control apparatus, display data control method, and program
JP6011937B2 (en) * 2012-01-05 2016-10-25 パナソニックIpマネジメント株式会社 Input device via touchpad
KR101710547B1 (en) * 2012-01-10 2017-02-27 엘지전자 주식회사 Mobile termianl and method for controlling of the same
JP5699959B2 (en) * 2012-02-13 2015-04-15 コニカミノルタ株式会社 Portable terminal, print instruction program, and print instruction method
US8451246B1 (en) 2012-05-11 2013-05-28 Google Inc. Swipe gesture classification
CN102868797B (en) * 2012-08-30 2015-08-12 东莞宇龙通信科技有限公司 Mobile terminal and method of operation thereof
US10032303B2 (en) * 2012-12-14 2018-07-24 Facebook, Inc. Scrolling 3D presentation of images
JP2014164630A (en) * 2013-02-27 2014-09-08 Sony Corp Information processing apparatus, information processing method, and program
JP2015035136A (en) * 2013-08-09 2015-02-19 本田技研工業株式会社 Input device
JP5850895B2 (en) * 2013-09-20 2016-02-03 ヤフー株式会社 SEARCH SYSTEM, SEARCH METHOD, TERMINAL DEVICE, AND SEARCH PROGRAM
JP6683605B2 (en) 2013-10-07 2020-04-22 アップル インコーポレイテッドApple Inc. Method and system for providing position or motion information for controlling at least one function of a vehicle
US10937187B2 (en) 2013-10-07 2021-03-02 Apple Inc. Method and system for providing position or movement information for controlling at least one function of an environment
JP2015095760A (en) * 2013-11-12 2015-05-18 オリンパス株式会社 Microscopic image display control method, microscopic image display control program, and microscopic image display device
US10091367B2 (en) * 2013-11-29 2018-10-02 Kyocera Document Solutions Inc. Information processing device, image forming apparatus and information processing method
JP6399834B2 (en) * 2014-07-10 2018-10-03 キヤノン株式会社 Information processing apparatus, information processing apparatus control method, and program
CN104253905B (en) * 2014-09-24 2017-09-08 努比亚技术有限公司 The call-establishing method and device of mobile terminal
CN106168864A (en) * 2015-05-18 2016-11-30 佳能株式会社 Display control unit and display control method
US10564762B2 (en) 2015-09-17 2020-02-18 Canon Kabushiki Kaisha Electronic apparatus and control method thereof
JP6751310B2 (en) * 2016-05-26 2020-09-02 オリンパス株式会社 Microscope image display device
KR20180098021A (en) 2017-02-24 2018-09-03 삼성전자주식회사 Electronic apparatus and control method thereof

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1494673A (en) * 2001-07-31 2004-05-05 ���µ�����ҵ��ʽ���� Mobile information terminal
CN1945513A (en) * 2005-10-06 2007-04-11 鸿富锦精密工业(深圳)有限公司 Cursor controlling device and method
CN101013350A (en) * 2006-02-02 2007-08-08 三星电子株式会社 Apparatus and method for controlling speed of moving between menu list items
JP2007242035A (en) * 1998-01-26 2007-09-20 Wayne Westerman Multi-touch surface device
US7355620B2 (en) * 2002-09-11 2008-04-08 Kabushiki Kaisha Toshiba Digital still camera and user instruction input method
CN101246413A (en) * 2007-02-12 2008-08-20 三星电子株式会社 Method of displaying information by using touch input in mobile terminal
CN101539839A (en) * 2008-03-19 2009-09-23 捷讯研究有限公司 Electronic device including touch sensitive input surface and method of determining user-selected input

Family Cites Families (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0693241B2 (en) 1987-01-14 1994-11-16 富士通株式会社 File search device
US7345675B1 (en) * 1991-10-07 2008-03-18 Fujitsu Limited Apparatus for manipulating an object displayed on a display device by using a touch screen
US7084859B1 (en) * 1992-09-18 2006-08-01 Pryor Timothy R Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics
JPH06187098A (en) * 1992-12-21 1994-07-08 Hitachi Ltd Mouse control method
US6535897B1 (en) * 1993-05-20 2003-03-18 Microsoft Corporation System and methods for spacing, storing and recognizing electronic representations of handwriting printing and drawings
JP3968477B2 (en) * 1997-07-07 2007-08-29 ソニー株式会社 Information input device and information input method
US6249606B1 (en) * 1998-02-19 2001-06-19 Mindmaker, Inc. Method and system for gesture category recognition and training using a feature vector
US6614422B1 (en) * 1999-11-04 2003-09-02 Canesta, Inc. Method and apparatus for entering data using a virtual input device
AU5238100A (en) * 1999-06-09 2000-12-28 Malvern Scientific Solutions Limited Communication system and method
US7176896B1 (en) * 1999-08-30 2007-02-13 Anoto Ab Position code bearing notepad employing activation icons
JP3905670B2 (en) * 1999-09-10 2007-04-18 株式会社リコー Coordinate input detection apparatus, information storage medium, and coordinate input detection method
JP3950624B2 (en) * 2000-11-22 2007-08-01 日本電気株式会社 Medical support system, display method thereof, and recording medium recording the program
JP4127982B2 (en) * 2001-05-28 2008-07-30 富士フイルム株式会社 Portable electronic devices
US7162087B2 (en) * 2001-12-28 2007-01-09 Anoto Ab Method and apparatus for recording of electronic handwriting
US6690387B2 (en) * 2001-12-28 2004-02-10 Koninklijke Philips Electronics N.V. Touch-screen image scrolling system and method
US7193609B2 (en) * 2002-03-19 2007-03-20 America Online, Inc. Constraining display motion in display navigation
TWI234105B (en) * 2002-08-30 2005-06-11 Ren-Guang King Pointing device, and scanner, robot, mobile communication device and electronic dictionary using the same
JP2005044036A (en) 2003-07-24 2005-02-17 Ricoh Co Ltd Scroll control method and program making computer execute the method
US7411575B2 (en) * 2003-09-16 2008-08-12 Smart Technologies Ulc Gesture recognition method and touch system incorporating the same
WO2005029269A2 (en) * 2003-09-19 2005-03-31 Stanislaw Lewak Manual user data entry method and system
US7519223B2 (en) * 2004-06-28 2009-04-14 Microsoft Corporation Recognizing gestures and using gestures for interacting with software applications
US7574672B2 (en) * 2006-01-05 2009-08-11 Apple Inc. Text entry interface for a portable communication device
KR100781706B1 (en) 2006-08-16 2007-12-03 삼성전자주식회사 Device and method for scrolling list in mobile terminal
US8232973B2 (en) * 2008-01-09 2012-07-31 Apple Inc. Method, device, and graphical user interface providing word recommendations for text input
KR101467766B1 (en) 2008-03-21 2014-12-10 엘지전자 주식회사 Mobile terminal and screen displaying method thereof
JP2009251817A (en) * 2008-04-03 2009-10-29 Olympus Imaging Corp Image display device
JP2009288882A (en) * 2008-05-27 2009-12-10 Ntt Docomo Inc Mobile terminal and information display method

Also Published As

Publication number Publication date
EP2339439B1 (en) 2018-06-27
JP5506375B2 (en) 2014-05-28
US8810527B2 (en) 2014-08-19
KR101547174B1 (en) 2015-08-25
KR20110074663A (en) 2011-07-01
EP2339439A3 (en) 2015-11-04
US20110157047A1 (en) 2011-06-30
EP2339439A2 (en) 2011-06-29
JP2011134260A (en) 2011-07-07
CN102109922A (en) 2011-06-29


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant