WO2014034181A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2014034181A1
Authority
WO
WIPO (PCT)
Prior art keywords
locus
control unit
information processing
correction amount
display
Prior art date
Application number
PCT/JP2013/062175
Other languages
French (fr)
Japanese (ja)
Inventor
晃一 寺尾
Original Assignee
Necカシオモバイルコミュニケーションズ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Necカシオモバイルコミュニケーションズ株式会社
Publication of WO2014034181A1

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • the present invention relates to an information processing apparatus, an information processing method, and a program.
  • Such an information processing apparatus usually includes a touch sensor superimposed on a display panel and recognizes an input operation based on the position at which the user brings a finger, serving as an operating body, into contact with or close to the operation surface of the touch sensor. Hereinafter, bringing the operating body into contact with or close to the surface is referred to as a touch, and the touched position is referred to as a touch position.
  • For example, when such an information processing apparatus recognizes an input operation in which the locus of the touch position (hereinafter referred to as an operation locus) draws a predetermined pattern, it executes a predetermined process corresponding to that input operation.
  • Examples of such input operations include a drag operation, in which the finger is moved while touching the operation surface, and a flick operation, in which the finger is quickly slid across the operation surface after touching it and then flicked away.
  • The operation trajectory drawn when a user performs such an input operation has characteristics specific to each user. For example, when performing a horizontal drag operation, a particular user may consistently draw a trajectory that slants diagonally up and to the right. In this case, the actual trajectory deviates from the ideal trajectory of a horizontal straight line, and the user's input operation may not be recognized as intended.
  • Techniques for improving the recognition accuracy of input operations are described in Patent Documents 1 and 2.
  • Patent Document 1 discloses a calibration device that collects feature data on an operation locus, that is, the locus of a pointing position corresponding to an input operation using a stick pointer, while an application for calibrating input operations is running, and that corrects an input pointing position based on the collected feature data.
  • Patent Document 2 discloses a correction device that, when the user performs an input operation that rotates a trackball while a calibration application is running, collects feature data on the operation locus of the pointing position that changes according to the input operation and corrects an input position based on the characteristics of that operation locus.
  • Patent Document 1: JP 2000-347800 A; Patent Document 2: JP H10-097380 A
  • In both of these techniques, however, the user must perform input operations solely to collect the characteristics of the operation locus while a dedicated calibration application is running, which places a considerable burden on the user. An object of the present invention is therefore to provide an information processing apparatus, an information processing method, and a program capable of improving the recognition accuracy of input operations while reducing the burden on the user.
  • An information processing apparatus according to the present invention includes: a display control unit that displays a predetermined display screen on a display unit; a detection unit that detects a touch position, which is a position in contact with or close to an operation surface; a storage unit that stores a correction amount for correcting the locus of the touch position; a locus output unit that calculates the locus of the touch position detected by the detection unit and outputs, as a recognition locus, a corrected locus obtained by correcting the locus based on the correction amount; and a control unit that, in a state in which the display screen is displayed, executes a specific process when the recognition locus draws a predetermined pattern and, in that state, adjusts the correction amount based on the locus and the predetermined pattern.
  • An information processing method according to the present invention includes: displaying a predetermined display screen on a display unit; detecting a touch position, which is a position in contact with or close to an operation surface; calculating the locus of the touch position; acquiring from a storage unit a correction amount for correcting the locus; generating a corrected locus by correcting the locus based on the correction amount; recognizing an input operation based on the corrected locus; executing a specific process when, in a state in which the display screen is displayed, a specific operation in which the corrected locus draws a predetermined pattern is recognized as the input operation; and, in that state, adjusting the correction amount based on the locus and the pattern.
  • A program according to the present invention causes a computer to execute: a procedure for displaying a predetermined display screen on a display unit; a procedure for detecting a touch position, which is a position in contact with or close to an operation surface; a procedure for calculating the locus of the touch position; a procedure for acquiring from a storage unit a correction amount for correcting the locus; a procedure for generating a corrected locus by correcting the locus based on the correction amount; a procedure for recognizing an input operation based on the corrected locus; a procedure for executing a specific process when, in a state in which the display screen is displayed, a specific operation in which the corrected locus draws a predetermined pattern is recognized as the input operation; and a procedure for adjusting the correction amount based on the locus and the pattern in that state.
  • FIG. 1 is an external view of an information processing apparatus according to the first embodiment of the present invention.
  • FIG. 2 is a block diagram showing the configuration of the information processing apparatus according to the first embodiment.
  • FIG. 3 is an explanatory diagram showing an example of an unlock screen, displayed by the information processing apparatus according to the first embodiment, that accepts a horizontal input operation.
  • FIG. 4 is an explanatory diagram showing an example of an unlock screen, displayed by the information processing apparatus according to the first embodiment, that accepts a vertical input operation.
  • FIG. 5 is an explanatory diagram for explaining the determination of the application condition performed by the information processing apparatus according to the first embodiment while the unlock screen is displayed.
  • FIG. 6 is an explanatory diagram for explaining the correction of the operation trajectory and the recognition of the input operation performed while the unlock screen is displayed.
  • FIG. 7 is an explanatory diagram for explaining the adjustment of the correction amount performed while the unlock screen is displayed.
  • FIG. 8 is an explanatory diagram for explaining the adjustment of the correction amount performed while the vertical unlock screen is displayed.
  • FIG. 9 is an explanatory diagram showing an example of a standby screen displayed by the information processing apparatus according to the first embodiment.
  • FIG. 10 is an explanatory diagram showing another example of the standby screen.
  • FIG. 11 is an explanatory diagram for explaining the adjustment of the correction amount performed while the standby screen is displayed.
  • FIG. 12 is an explanatory diagram showing an example of an application launcher screen displayed by the information processing apparatus according to the first embodiment.
  • FIG. 13 is an explanatory diagram for explaining the adjustment of the correction amount performed while the application launcher screen is displayed.
  • FIG. 14 is a flowchart for explaining an operation example of the information processing apparatus according to the first embodiment.
  • FIG. 15 is a flowchart for explaining an operation example of an information processing apparatus according to the second embodiment of the present invention.
  • FIG. 16 is a block diagram showing the configuration of an information processing apparatus according to the third embodiment of the present invention.
  • FIG. 17 is a flowchart showing an operation example of the information processing apparatus according to the third embodiment.
  • FIG. 1 is an external view of an information processing apparatus according to the first embodiment of the present invention.
  • An information processing apparatus 100 illustrated in FIG. 1 includes a display device 11 and a touch sensor 12 superimposed on the display device 11.
  • The surface of the panel on which the display device 11 and the touch sensor 12 are superimposed is protected by a protective member (not shown) such as glass. When the user brings an operating body into contact with or close to the protective member, the touch sensor 12 detects the position of the operating body.
  • In the following description, the operating body is assumed to be the user's finger, but it may be an object other than a finger, such as a part of the user's body other than a finger or a stylus pen.
  • The information processing apparatus 100 is, for example, a mobile phone, a smartphone, a game machine, a PC (Personal Computer), a PDA (Personal Digital Assistant), a navigation apparatus, a music playback apparatus, or a video processing apparatus.
  • the PC includes a tablet PC and a notebook PC.
  • the video processing device also includes a camera, a recorder, a player, and the like.
  • FIG. 2 is a block diagram showing the configuration of the information processing apparatus according to the present embodiment.
  • the information processing apparatus 100 includes a display device 11, a touch sensor 12, a control device 14, and a memory 13.
  • Display device 11 displays a display screen.
  • the display device 11 is, for example, a liquid crystal display device or an organic EL (Electro Luminescence) display device.
  • the touch sensor 12 detects the touch position and inputs an input signal indicating the detected touch position to the control device 14.
  • Since the touch sensor 12 is provided so as to overlap the display device 11, the display surface of the display device 11 also serves as the operation surface of the touch sensor 12.
  • the touch sensor 12 is a capacitive touch sensor, for example.
  • the touch sensor 12 may be a resistive film type or other type touch sensor.
  • the memory 13 is an example of a storage unit, and is a recording medium that can be read by the control device 14.
  • the memory 13 stores a program that defines the operation of the control device 14, for example.
  • the memory 13 stores a correction amount for correcting the locus of the touch position.
  • the control device 14 controls the information processing device 100 and is, for example, a CPU (Central Processing Unit).
  • the control device 14 reads the program stored in the memory 13 and executes the program, thereby realizing the display control unit 21, the locus output unit 22, and the overall control unit 23.
  • the display control unit 21 generates a display screen and displays the generated display screen on the display device 11.
  • Examples of the display screen include a standby screen displayed in the standby state, an application launcher screen for selecting an application to start, and an unlock screen for releasing a locked state in which the functions of the information processing apparatus 100 are restricted.
  • the locus output unit 22 calculates an operation locus that is a locus of the touch position indicated by the input signal input from the touch sensor 12, and corrects the operation locus based on the correction amount in the memory 13.
  • a locus obtained by correcting the operation locus is referred to as a corrected locus.
  • The trajectory output unit 22 may correct every calculated operation trajectory, or may correct an operation trajectory only when it satisfies a predetermined application condition. In the latter case, when the operation trajectory satisfies the application condition, the trajectory output unit 22 outputs the corrected trajectory as the recognition trajectory, which is the trajectory used to recognize the input operation; when the operation trajectory does not satisfy the application condition, it outputs the operation trajectory itself as the recognition trajectory.
  • The application condition is, for example, that the difference between the operation trajectory and a predetermined trajectory is within a predetermined range. There may be a plurality of predetermined trajectories; in that case, a correction amount and an application condition are provided for each predetermined trajectory.
  • the trajectory output unit 22 determines whether or not the operation trajectory satisfies the application condition for each of the predetermined trajectories. The trajectory output unit 22 outputs the correction trajectory as a recognition trajectory when the operation trajectory satisfies at least one application condition, and outputs the operation trajectory as a recognition trajectory when the operation trajectory does not satisfy any of the application conditions.
  • When the operation trajectory satisfies a plurality of application conditions, the trajectory output unit 22 selects the predetermined trajectory whose difference from the operation trajectory is smallest and corrects the operation trajectory based on the correction amount associated with the selected trajectory, as sketched below.
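  • The following is a minimal sketch of this application-condition check and closest-pattern selection. It assumes, purely for illustration, that each trajectory is summarized as a 2-D vector from its start point to its end point and that the "difference" is the Euclidean distance between such vectors; the threshold value and all names are hypothetical and not taken from the patent.

```python
import math

# Hypothetical illustration: a trajectory is summarized as a 2-D vector
# (dx, dy) from its start point to its end point, and the "difference"
# between two trajectories is the Euclidean distance between their vectors.
APPLICATION_THRESHOLD = 40.0  # assumed tolerance defining the application range


def difference(op_vec, ref_vec):
    """Difference between an operation vector and a reference vector."""
    return math.hypot(op_vec[0] - ref_vec[0], op_vec[1] - ref_vec[1])


def select_reference(op_vec, ref_vecs):
    """Return the predetermined (reference) vector closest to the operation
    vector among those whose application condition is satisfied, or None if
    no application condition is satisfied."""
    candidates = [(difference(op_vec, ref), ref) for ref in ref_vecs]
    in_range = [(d, ref) for d, ref in candidates if d <= APPLICATION_THRESHOLD]
    if not in_range:
        return None  # no application condition satisfied
    return min(in_range)[1]  # smallest difference wins
```

  • For example, with reference vectors (200, 0) and (0, -200), a slightly slanted drag summarized as (190, 35) selects the horizontal reference, while a gesture far from both satisfies no application condition and is passed through uncorrected.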
  • the overall control unit 23 is an example of a control unit that controls the operation of the information processing apparatus 100.
  • the overall control unit 23 uses the display control unit 21 to display various display screens on the display device 11.
  • the overall control unit 23 recognizes the input operation based on the recognition trajectory output from the trajectory output unit 22, and executes a process according to the recognized input operation. For example, when the recognition trajectory draws a predetermined pattern in a state where a predetermined display screen is displayed, the overall control unit 23 executes specific processing according to the input operation indicated by the recognition trajectory. Examples of the predetermined display screen include the lock release screen, the standby screen, and the application launcher screen. For example, when a drag operation or a flick operation is recognized as an input operation in a state where the application launcher screen is displayed, the overall control unit 23 changes the display area of the application launcher screen according to the direction of the operation. For example, when the unlocking operation is recognized as an input operation in a state where the unlocking screen is displayed, the overall control unit 23 releases the locking state. Further, the predetermined pattern may be different for each predetermined display screen.
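  • As an illustration of how a recognized operation might be mapped to the specific process for the screen being displayed, the hypothetical dispatch table below pairs a screen name with a recognized operation name; the screen names, operation names, and device methods are all assumptions made for this sketch and do not appear in the patent.

```python
# Hypothetical dispatch table pairing a display screen with a recognized
# operation; the screen names, operation names, and device methods are all
# illustrative and do not come from the patent.
SPECIFIC_PROCESSES = {
    ("unlock_screen", "slide_right"): lambda device: device.release_lock(),
    ("unlock_screen", "slide_up"): lambda device: device.release_lock(),
    ("launcher_screen", "slide_left"): lambda device: device.scroll_launcher(-1),
    ("launcher_screen", "slide_right"): lambda device: device.scroll_launcher(+1),
}


def execute_specific_process(screen, operation, device):
    """Run the specific process registered for this screen/operation pair,
    if any; otherwise the recognized operation is simply ignored."""
    handler = SPECIFIC_PROCESSES.get((screen, operation))
    if handler is not None:
        handler(device)
```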
  • The predetermined trajectory is a trajectory that draws the predetermined pattern. Even when the predetermined pattern differs for each predetermined display screen, the trajectory output unit 22 determines, for all of the predetermined trajectories, whether the operation trajectory satisfies the application condition, regardless of which display screen was displayed when the operation trajectory was calculated.
  • the overall control unit 23 adjusts the correction amount stored in the memory 13 based on the operation locus calculated by the locus output unit 22 and the predetermined pattern. More specifically, the overall control unit 23 adjusts the correction amount based on the difference between the operation locus and the predetermined pattern.
  • the predetermined display screen is desirably a display screen that is displayed after the information processing apparatus 100 is started and before a selection operation for selecting an application to be started is recognized.
  • FIG. 3 is an explanatory diagram illustrating an example of an unlock screen that is displayed by the information processing apparatus according to the present embodiment and that accepts an input operation in the horizontal direction.
  • FIG. 4 is an explanatory diagram illustrating an example of an unlock screen that is displayed by the information processing apparatus according to the present embodiment and that accepts an input operation in the vertical direction.
  • the display control unit 21 displays the unlock screen 102 shown in FIG. 3 or 4 in the locked state in which the function of the information processing apparatus 100 is limited.
  • the display control unit 21 displays the unlock screen 102 when, for example, a predetermined operation (touch on the operation surface or button press) is detected while the information processing apparatus 100 is in the locked state.
  • the unlock screen 102 includes an unlock icon 201 and a slide guide 202.
  • When the overall control unit 23 detects a touch position on the unlock icon 201 and then recognizes a drag operation or a flick operation that slides the unlock icon 201 in the direction along the slide guide 202, it releases the locked state.
  • For example, when the overall control unit 23 detects a touch position on the unlock icon 201S and then recognizes a drag operation or a flick operation that slides the unlock icon 201S to the right along the slide guide 202S, it releases the locked state.
  • Similarly, when the overall control unit 23 detects a touch position on the unlock icon 201H and then recognizes a drag operation or a flick operation that slides the icon upward along the slide guide 202L, it releases the locked state.
  • FIG. 5 is an explanatory diagram for explaining determination of application conditions performed by the information processing apparatus according to the present embodiment in a state where the unlock screen is displayed.
  • FIG. 6 is an explanatory diagram for explaining the correction of the operation trajectory and the recognition of the input operation performed by the information processing apparatus according to the present embodiment in a state where the unlock screen is displayed.
  • In the following, the vector indicating the predetermined pattern is referred to as the reference vector V1, and the vector indicating the operation trajectory is referred to as the operation vector V2.
  • the application condition for the reference vector V1 is that the difference between the operation vector V2 and the reference vector V1 is within a predetermined range.
  • The trajectory output unit 22 determines whether the application condition is satisfied based on whether the operation vector V2 falls within the application range 203, that is, the range in which the difference between the operation vector V2 and the reference vector V1 is within the predetermined range. When the operation vector V2 is within the application range 203, the trajectory output unit 22 corrects the operation vector V2 using the correction amount stored in the memory 13.
  • the locus output unit 22 outputs the obtained corrected vector V4 as a recognition vector indicating a recognition locus.
  • the overall control unit 23 recognizes the input operation based on the corrected vector V4.
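  • A minimal sketch of this correction and recognition step is given below. It assumes that the correction amount is applied by subtracting the stored average difference vector V3m from the operation vector V2 (consistent with V3 being defined as the difference between V2 and V1), and the recognition rule shown is a toy classifier, not the patent's actual recognition logic.

```python
def correct_operation_vector(op_vec, correction_vec):
    """Apply the stored correction amount (the average difference vector V3m)
    to the operation vector V2, yielding the corrected vector V4. Since V3 is
    defined as the difference between V2 and V1, subtracting V3m moves V2
    back toward the reference vector V1."""
    return (op_vec[0] - correction_vec[0], op_vec[1] - correction_vec[1])


def recognize(corrected_vec):
    """Toy recognition rule: classify the corrected vector as a horizontal or
    vertical slide from its dominant component (illustrative only)."""
    dx, dy = corrected_vec
    if abs(dx) >= abs(dy):
        return "slide_right" if dx > 0 else "slide_left"
    return "slide_down" if dy > 0 else "slide_up"
```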
  • FIG. 7 is an explanatory diagram for explaining adjustment of the correction amount performed in a state in which the information processing apparatus according to the present embodiment displays the unlock screen.
  • Every time the trajectory output unit 22 calculates the operation vector V2, the overall control unit 23 obtains the difference vector V3 between the operation vector V2 and the reference vector V1 and stores it in the memory 13.
  • The difference vectors V3 are referred to as difference vector V3_1, difference vector V3_2, ..., difference vector V3_n in the order in which they are stored.
  • the overall control unit 23 determines whether or not the difference vector V3 is already stored in the memory 13.
  • When no difference vector is stored yet, the overall control unit 23 stores the difference vector V3_1 in the memory 13 as the correction amount.
  • When difference vectors are already stored, the overall control unit 23 obtains an average vector V3m, which is the average of the plurality of difference vectors consisting of the already stored difference vectors V3_1 to V3_(n-1) and the newly obtained difference vector V3_n.
  • The overall control unit 23 obtains the average vector V3m using the N most recently obtained difference vectors V3.
  • the overall control unit 23 stores the obtained average vector V3m in the memory 13 as a correction amount.
  • the average vector V3m is a correction amount vector indicating a correction amount.
  • the operation locus can be corrected using the average vector V3m, and the input operation can be recognized using the corrected vector V4.
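  • Below is a minimal sketch of this correction-amount adjustment, keeping a rolling window of the most recent difference vectors and recomputing their average as the new correction amount. The window size N_RECENT and the class and method names are assumptions made for the sketch; the patent does not specify a value of N or any particular data structure.

```python
from collections import deque

N_RECENT = 10  # assumed window size; the patent only says "N difference vectors"


class CorrectionStore:
    """Keeps the stored difference vectors V3 and the current correction amount V3m."""

    def __init__(self):
        self.diff_vectors = deque(maxlen=N_RECENT)  # V3_1 ... V3_n (most recent N)
        self.correction = (0.0, 0.0)                # average vector V3m

    def update(self, op_vec, ref_vec):
        # Difference vector V3 between the operation vector V2 and the reference vector V1.
        v3 = (op_vec[0] - ref_vec[0], op_vec[1] - ref_vec[1])
        self.diff_vectors.append(v3)
        n = len(self.diff_vectors)
        # The average vector V3m over the stored difference vectors becomes
        # the new correction amount.
        self.correction = (sum(v[0] for v in self.diff_vectors) / n,
                           sum(v[1] for v in self.diff_vectors) / n)
        return self.correction
```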
  • FIG. 8 is an explanatory diagram for explaining the adjustment of the correction amount that is executed in a state where the information processing apparatus according to the present embodiment displays the lock release screen in the vertical direction.
  • FIG. 9 is an explanatory diagram illustrating an example of a standby screen displayed by the information processing apparatus according to the present embodiment.
  • FIG. 10 is an explanatory diagram illustrating another example of a standby screen displayed by the information processing apparatus according to the present embodiment.
  • the display control unit 21 displays one of the plurality of standby screens 104 as a display screen.
  • the display control unit 21 can display either the standby screen 104-1 shown in FIG. 9 or the standby screen 104-2 shown in FIG. 10 as a display screen.
  • the standby screen 104 has an indicator display area 211, a desktop area 212, and a dock area 213.
  • In the indicator display area 211, an indicator is displayed that indicates the position, within the entire display screen of the desktop area 212, of the display area currently being displayed.
  • For example, the indicator may show that the rightmost of the four display areas is currently displayed, or that the second display area from the right is currently displayed.
  • In the desktop area 212, icons for starting applications and widgets for adding specific functions to the desktop are displayed.
  • When a drag operation or a flick operation is recognized, the display control unit 21 switches the display area displayed in the desktop area 212.
  • the dock area 213 is an area independent of the desktop area 212. For this reason, even if the desktop area 212 is moved left and right, the dock area 213 does not move.
  • While the unlock screen 102 is displayed, the overall control unit 23 adjusts the correction amount based on the operation trajectory of the drag or flick operation serving as the unlock operation; while the standby screen 104 is displayed, it adjusts the correction amount based on the operation trajectory of the drag or flick operation serving as the display-area switching operation.
  • FIG. 11 is an explanatory diagram for explaining the adjustment of the correction amount that is executed by the information processing apparatus according to the present embodiment while the standby screen is displayed.
  • The overall control unit 23 obtains the difference vector V3 as the difference between the operation vector V2 and the reference vector V1. It then adjusts the correction amount based on the newly obtained difference vector V3 and the difference vectors V3 stored in the memory 13, in the same manner as the adjustment performed while the unlock screen 102 is displayed.
  • FIG. 12 is an explanatory diagram illustrating an example of an application launcher screen displayed by the information processing apparatus according to the present embodiment.
  • the application launcher screen 106 has an icon display area 221, a slide bar 222, and a dock area 223.
  • In the icon display area 221, a plurality of icons are displayed side by side. If not all of the icons can be displayed in the icon display area 221, the display control unit 21 displays only some of them, and when the overall control unit 23 recognizes a drag operation or a flick operation, the icons displayed in the icon display area 221 are changed according to that operation.
  • FIG. 13 is an explanatory diagram for explaining the adjustment of the correction amount executed by the information processing apparatus according to the present embodiment in a state where the application launcher screen is displayed.
  • Here too, the overall control unit 23 obtains the difference vector V3 as the difference between the operation vector V2 and the reference vector V1, and adjusts the correction amount based on the newly obtained difference vector V3 and the difference vectors V3 stored in the memory 13, in the same manner as the adjustment performed while the unlock screen 102 is displayed.
  • Although FIG. 9 and FIG. 10 show the standby screen 104 as accepting a horizontal input operation, the standby screen may instead accept a vertical input operation.
  • a display screen that can be scrolled in the vertical direction as in the application launcher screen 106 shown in FIG. 12 may be used as the standby screen.
  • the application launcher screen 106 is a display screen that can be scrolled in the vertical direction, but the application launcher screen may be a display screen that can be scrolled in the horizontal direction.
  • each component described above can be replaced with another component having a similar function.
  • the configuration to be used can be changed as appropriate according to the technical level at the time of carrying out the present embodiment.
  • a computer program for realizing each function of the information processing apparatus 100 according to the present embodiment as described above, and a computer-readable recording medium storing such a computer program can be provided.
  • the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like.
  • the above computer program may be distributed via a network, for example, without using a recording medium.
  • FIG. 14 is a flowchart for explaining an operation example of the information processing apparatus according to the present embodiment.
  • the display control unit 21 displays a predetermined display screen according to an instruction from the overall control unit 23 (step S100).
  • The display screen displayed by the display control unit 21 here is a screen that is displayed after the information processing apparatus 100 is started and before an application is started, such as the unlock screen 102, the standby screen 104, or the application launcher screen 106.
  • The trajectory output unit 22 determines whether a touch operation has been performed (step S105).
  • When a touch operation has been performed, the trajectory output unit 22 calculates an operation trajectory based on the input signal input from the touch sensor 12 (step S110).
  • the trajectory output unit 22 determines whether or not the calculated operation trajectory satisfies a predetermined application condition (step S115). When the operation locus satisfies the application condition, the locus output unit 22 corrects the operation locus using the correction amount stored in the memory 13 (step S120). At this time, the locus output unit 22 outputs a corrected locus obtained by correcting the operation locus as a recognition locus. On the other hand, when the operation trajectory does not satisfy the application condition, the trajectory output unit 22 can omit the process of step S120. At this time, the trajectory output unit 22 outputs the operation trajectory as a recognition trajectory.
  • the overall control unit 23 receives an input operation based on the recognition locus output by the locus output unit 22, and recognizes the input operation based on the recognition locus (step S125).
  • the overall control unit 23 calculates a difference vector between the operation trajectory and a reference trajectory indicating a predetermined pattern for the input operation (step S130). Then, the overall control unit 23 stores the calculated difference vector in the memory 13 (step S135).
  • The overall control unit 23 then calculates the average vector of the difference vector calculated in step S130 this time and the plurality of difference vectors calculated in previous executions of step S130 and stored in the memory 13 (step S140). The overall control unit 23 stores this average vector in the memory 13 as the correction amount (step S145). The correction amount stored here is used for the recognition of subsequent input operations. A sketch of this overall flow is given below.
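  • The sketch below strings the pieces from the earlier sketches (select_reference, correct_operation_vector, recognize, and CorrectionStore) into one pass through steps S110 to S145. It assumes that the correction amount is adjusted only when a matching reference pattern was found; the patent's flowchart does not spell out that detail, so treat it as an illustrative choice rather than the specified behavior.

```python
def handle_touch_gesture(touch_points, ref_vecs, store):
    """One pass through steps S110 to S145, using the helpers sketched above
    (select_reference, correct_operation_vector, recognize, CorrectionStore)."""
    # S110: summarize the operation trajectory as a start-to-end vector V2.
    (x0, y0), (x1, y1) = touch_points[0], touch_points[-1]
    op_vec = (x1 - x0, y1 - y0)

    # S115: check the application condition against every predetermined trajectory.
    ref_vec = select_reference(op_vec, ref_vecs)
    if ref_vec is None:
        recognition_vec = op_vec  # no condition satisfied: output the trajectory uncorrected
    else:
        # S120: correct the trajectory with the stored correction amount.
        recognition_vec = correct_operation_vector(op_vec, store.correction)

    # S125: recognize the input operation from the recognition trajectory.
    operation = recognize(recognition_vec)

    # S130-S145: store the difference vector and refresh the correction amount
    # (here only when a matching reference pattern was found).
    if ref_vec is not None:
        store.update(op_vec, ref_vec)
    return operation
```

  • Called with refs = [(200.0, 0.0), (0.0, -200.0)] and touch points [(10, 300), (205, 268)], this sketch selects the horizontal reference, returns "slide_right", and records the difference vector so that later gestures are corrected.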
  • The steps described in this flowchart include not only processing performed in time series in the described order but also processing that may be executed in parallel rather than strictly in time series. Even for the steps processed in time series, the order can be changed as appropriate depending on the circumstances.
  • For example, the recognition of the input operation described in steps S115 and S120 and the adjustment of the correction amount described in steps S125 to S140 may be performed in parallel, or the adjustment of the correction amount may be executed before the recognition of the input operation.
  • In this way, the correction amount can be adjusted before an application is started, and the recognition accuracy of input operations after the application is started can be reliably improved.
  • As examples of display screens that are displayed before a selection operation for selecting an application to start is recognized, the unlock screen, the standby screen, and the application launcher screen have been described.
  • These display screens are used frequently by the user in normal use, and to have the specific process executed while one of them is displayed, the user reliably performs a frequently used input operation on the information processing apparatus, such as a drag operation or a flick operation. For this reason, the characteristics of the operation trajectories input to perform the specific process can be reliably collected and used to adjust the correction amount.
  • In the second embodiment of the present invention, the overall control unit 23 determines one of a plurality of patterns prepared in advance as the predetermined pattern, and the display control unit 21 displays the predetermined display screen corresponding to the determined pattern.
  • FIG. 15 is a flowchart for explaining an operation example of the information processing apparatus according to the present embodiment.
  • the overall control unit 23 determines any of a plurality of patterns prepared in advance as a predetermined pattern (step S200).
  • the display control unit 21 displays a display screen according to a predetermined pattern determined by the overall control unit 23.
  • the overall control unit 23 determines a predetermined pattern randomly or according to a predetermined order.
  • For example, the overall control unit 23 determines, as the predetermined pattern, one of a plurality of linear patterns having different extending directions.
  • The extending direction is, for example, a direction substantially parallel to either the vertical or the horizontal side of the rectangular operation surface.
  • For example, the overall control unit 23 determines the predetermined pattern as either a vertical or a horizontal linear pattern, and the display control unit 21 displays either the unlock screen 102S of FIG. 3 or the unlock screen 102L of FIG. 4 on the display device 11 according to the determined pattern.
  • Likewise, the overall control unit 23 determines the predetermined pattern as either a vertical or a horizontal linear pattern, and the display control unit 21 displays, according to the determined pattern, the standby screen 104 of FIG. 9 or FIG. 10 on the display device 11 in the case of a horizontal linear pattern, or a vertically scrollable standby screen (not shown) in the case of a vertical linear pattern, as sketched below.
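  • The sketch below illustrates step S200 under the assumption that the prepared patterns are a horizontal and a vertical linear pattern, each tied to the unlock screen that elicits it (102S or 102L); the dictionary layout, names, and the choice between random and round-robin selection are illustrative only.

```python
import itertools
import random

# Hypothetical mapping from each prepared pattern to the unlock screen that
# elicits it; the keys, reference vectors, and screen labels are illustrative.
PATTERNS = {
    "horizontal": {"ref_vec": (200.0, 0.0), "unlock_screen": "102S"},
    "vertical": {"ref_vec": (0.0, -200.0), "unlock_screen": "102L"},
}
_round_robin = itertools.cycle(PATTERNS)


def choose_pattern(randomly=True):
    """Step S200: pick the predetermined pattern randomly or in a fixed order."""
    return random.choice(list(PATTERNS)) if randomly else next(_round_robin)


def screen_for(pattern_name):
    """Display screen corresponding to the chosen pattern (e.g. 102S or 102L)."""
    return PATTERNS[pattern_name]["unlock_screen"]
```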
  • The recognition of the input operation and the adjustment of the correction amount in step S205 are the same as steps S100 to S145 in FIG. 14.
  • In this embodiment, the predetermined pattern is determined from among a plurality of patterns prepared in advance, so the characteristics of the operation trajectory are learned for a plurality of patterns. The operation trajectory can therefore be learned for a plurality of patterns on display screens having the same function, which improves learning efficiency.
  • Further, the predetermined pattern changes randomly or according to a predetermined order, which makes it possible to learn each of the plurality of patterns evenly.
  • FIG. 16 is a block diagram showing the configuration of the information processing apparatus according to the third embodiment of the present invention.
  • The information processing apparatus shown in FIG. 16 includes a display control unit 31, a detection unit 32, a storage unit 33, a trajectory output unit 34, and a control unit 35.
  • the display control unit 31 displays a predetermined display screen on the display unit.
  • the detection unit 32 detects a touch position that is a position in contact with or close to the operation surface.
  • the storage unit 33 stores a correction amount for correcting the locus of the touch position.
  • the locus output unit 34 calculates the locus of the touch position detected by the detection unit, and outputs a correction locus obtained by correcting the locus based on the correction amount as a recognition locus.
  • The control unit 35 executes a specific process when the recognition trajectory draws a predetermined pattern while the display screen is displayed. In addition, in that state, the control unit 35 adjusts the correction amount based on the trajectory and the predetermined pattern.
  • FIG. 17 is a flowchart illustrating the operation of the information processing apparatus according to the present embodiment.
  • the display control unit 31 displays a predetermined display screen on a display unit (not shown) (step S300).
  • the detection unit 32 detects the touch position (step S305).
  • the locus output unit 34 calculates the locus of the touch position (Step S310).
  • the locus output unit 34 outputs a corrected locus obtained by correcting the locus based on the correction amount stored in the storage unit 33 as a recognition locus (step S315).
  • the control unit 35 determines whether or not the recognition locus draws a predetermined pattern (step S320). When the recognition locus draws a predetermined pattern, the control unit 35 executes a specific process (step S325). On the other hand, when the recognition locus does not draw a predetermined pattern, the control unit 35 omits the process of step S325. Then, the control unit 35 adjusts the correction amount based on the locus calculated in step S310 and the predetermined pattern (step S330).
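  • A minimal sketch of steps S320 to S330 follows, reusing the difference helper and CorrectionStore from the earlier sketches. The pattern-match test compares start-to-end vectors within a tolerance, which is an assumed criterion; the patent only states that the control unit judges whether the recognition trajectory draws the predetermined pattern.

```python
def draws_pattern(recognition_vec, pattern_vec, tolerance=40.0):
    """Step S320: decide whether the recognition trajectory draws the
    predetermined pattern, here by comparing start-to-end vectors within an
    assumed tolerance."""
    return difference(recognition_vec, pattern_vec) <= tolerance


def control_step(recognition_vec, raw_vec, pattern_vec, store, specific_process):
    """Steps S320 to S330 for the control unit 35: run the specific process
    when the pattern is drawn, then adjust the correction amount from the
    uncorrected trajectory and the pattern."""
    if draws_pattern(recognition_vec, pattern_vec):
        specific_process()  # S325: e.g. release the locked state
    store.update(raw_vec, pattern_vec)  # S330: adjust the correction amount
```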
  • the predetermined pattern is a linear pattern in the vertical or horizontal direction, but the present invention is not limited to such an example.
  • the predetermined pattern may be a linear pattern in a direction other than the vertical and horizontal directions, or may be a pattern having a shape other than a linear shape such as an arc shape or a wave shape.
  • the overall control unit 23 uses the average vector V3m of the difference vector V3 as the correction amount, but the present invention is not limited to such an example.
  • the correction amount may be an amount that brings the operation trajectory closer to the predetermined pattern based on the difference between the operation trajectory and the predetermined pattern.
  • Reference signs: 100, 200, 300 Information processing device; 11 Display device; 12 Touch sensor; 13 Memory; 14 Control device; 21 Display control unit; 22 Trajectory output unit; 23 Overall control unit; 31 Display control unit; 32 Detection unit; 33 Storage unit; 34 Trajectory output unit; 35 Control unit

Abstract

According to the present invention, a display control unit (31) displays a prescribed display screen on a display unit. A detection unit (32) detects a touch position, which is a position either in contact with or in proximity to an operation surface. A storage unit (33) stores a correction amount for correcting the locus of the touch position. A locus output unit (34) calculates the locus of the touch position detected by the detection unit and outputs, as a recognition locus, a corrected locus in which the locus has been corrected on the basis of the correction amount. A control unit (35) executes a specific process when, in a state in which the display screen is being displayed, the recognition locus draws a predetermined pattern, and, in that state, adjusts the correction amount on the basis of the locus and the predetermined pattern.

Description

Information processing apparatus, information processing method, and program

The present invention relates to an information processing apparatus, an information processing method, and a program.

In recent years, information processing apparatuses such as smartphones and tablet terminals have become widespread. Such an information processing apparatus usually includes a touch sensor superimposed on a display panel and recognizes an input operation based on the position at which the user brings a finger, serving as an operating body, into contact with or close to the operation surface of the touch sensor. Hereinafter, bringing the operating body into contact with or close to the surface is referred to as a touch, and the touched position is referred to as a touch position.

For example, when such an information processing apparatus recognizes an input operation in which the locus of the touch position (hereinafter referred to as an operation locus) draws a predetermined pattern, it executes a predetermined process corresponding to that input operation. Examples of such input operations include a drag operation, in which the finger is moved while touching the operation surface, and a flick operation, in which the finger is quickly slid across the operation surface after touching it and then flicked away. The operation locus drawn when a user performs such an input operation has characteristics specific to each user. For example, when performing a horizontal drag operation, a particular user may consistently draw a locus that slants diagonally up and to the right. In this case, the actual locus deviates from the ideal locus of a horizontal straight line, and the user's input operation may not be recognized as intended.
Techniques for improving the recognition accuracy of input operations are described in Patent Documents 1 and 2.

Patent Document 1 discloses a calibration device that collects feature data on an operation locus, that is, the locus of a pointing position corresponding to an input operation using a stick pointer, while an application for calibrating input operations is running, and that corrects an input pointing position based on the collected feature data.

Patent Document 2 discloses a correction device that, when the user performs an input operation that rotates a trackball while a calibration application is running, collects feature data on the operation locus of the pointing position that changes according to the input operation and corrects an input position based on the characteristics of that operation locus.

Patent Document 1: JP 2000-347800 A
Patent Document 2: JP H10-097380 A
However, in both the calibration device described in Patent Document 1 and the correction device described in Patent Document 2, the user must perform input operations to collect the characteristics of the operation locus while a specific application for collecting those characteristics is running. The user therefore has to perform input operations solely to collect the characteristics of the operation locus, which places a considerable burden on the user.

An object of the present invention is therefore to provide an information processing apparatus, an information processing method, and a program capable of improving the recognition accuracy of input operations while reducing the burden on the user.

An information processing apparatus according to the present invention includes: a display control unit that displays a predetermined display screen on a display unit; a detection unit that detects a touch position, which is a position in contact with or close to an operation surface; a storage unit that stores a correction amount for correcting the locus of the touch position; a locus output unit that calculates the locus of the touch position detected by the detection unit and outputs, as a recognition locus, a corrected locus obtained by correcting the locus based on the correction amount; and a control unit that, in a state in which the display screen is displayed, executes a specific process when the recognition locus draws a predetermined pattern and, in that state, adjusts the correction amount based on the locus and the predetermined pattern.

An information processing method according to the present invention includes: displaying a predetermined display screen on a display unit; detecting a touch position, which is a position in contact with or close to an operation surface; calculating the locus of the touch position; acquiring from a storage unit a correction amount for correcting the locus; generating a corrected locus by correcting the locus based on the correction amount; recognizing an input operation based on the corrected locus; executing a specific process when, in a state in which the display screen is displayed, a specific operation in which the corrected locus draws a predetermined pattern is recognized as the input operation; and, in that state, adjusting the correction amount based on the locus and the pattern.

A program according to the present invention causes a computer to execute: a procedure for displaying a predetermined display screen on a display unit; a procedure for detecting a touch position, which is a position in contact with or close to an operation surface; a procedure for calculating the locus of the touch position; a procedure for acquiring from a storage unit a correction amount for correcting the locus; a procedure for generating a corrected locus by correcting the locus based on the correction amount; a procedure for recognizing an input operation based on the corrected locus; a procedure for executing a specific process when, in a state in which the display screen is displayed, a specific operation in which the corrected locus draws a predetermined pattern is recognized as the input operation; and a procedure for adjusting the correction amount based on the locus and the pattern in that state.

According to the present invention, it is possible to improve the recognition accuracy of input operations while reducing the burden on the user.
FIG. 1 is an external view of an information processing apparatus according to a first embodiment of the present invention.
FIG. 2 is a block diagram showing the configuration of the information processing apparatus according to the first embodiment.
FIG. 3 is an explanatory diagram showing an example of an unlock screen, displayed by the information processing apparatus according to the first embodiment, that accepts a horizontal input operation.
FIG. 4 is an explanatory diagram showing an example of an unlock screen, displayed by the information processing apparatus according to the first embodiment, that accepts a vertical input operation.
FIG. 5 is an explanatory diagram for explaining the determination of the application condition performed by the information processing apparatus according to the first embodiment while the unlock screen is displayed.
FIG. 6 is an explanatory diagram for explaining the correction of the operation locus and the recognition of the input operation performed while the unlock screen is displayed.
FIG. 7 is an explanatory diagram for explaining the adjustment of the correction amount performed while the unlock screen is displayed.
FIG. 8 is an explanatory diagram for explaining the adjustment of the correction amount performed while the vertical unlock screen is displayed.
FIG. 9 is an explanatory diagram showing an example of a standby screen displayed by the information processing apparatus according to the first embodiment.
FIG. 10 is an explanatory diagram showing another example of the standby screen.
FIG. 11 is an explanatory diagram for explaining the adjustment of the correction amount performed while the standby screen is displayed.
FIG. 12 is an explanatory diagram showing an example of an application launcher screen displayed by the information processing apparatus according to the first embodiment.
FIG. 13 is an explanatory diagram for explaining the adjustment of the correction amount performed while the application launcher screen is displayed.
FIG. 14 is a flowchart for explaining an operation example of the information processing apparatus according to the first embodiment.
FIG. 15 is a flowchart for explaining an operation example of an information processing apparatus according to a second embodiment of the present invention.
FIG. 16 is a block diagram showing the configuration of an information processing apparatus according to a third embodiment of the present invention.
FIG. 17 is a flowchart showing an operation example of the information processing apparatus according to the third embodiment.
Hereinafter, embodiments of the present invention will be described with reference to the drawings. In the following description, components having the same function are denoted by the same reference numerals, and their description may be omitted.
(First embodiment)
First, a first embodiment of the present invention will be described. FIG. 1 is an external view of an information processing apparatus according to the first embodiment of the present invention. The information processing apparatus 100 shown in FIG. 1 includes a display device 11 and a touch sensor 12 superimposed on the display device 11. The surface of the panel on which the display device 11 and the touch sensor 12 are superimposed is protected by a protective member (not shown) such as glass. When the user brings an operating body into contact with or close to the protective member, the touch sensor 12 detects the position of the operating body.

In the following description, the operating body is assumed to be the user's finger, but it may be an object other than a finger, such as a part of the user's body other than a finger or a stylus pen.

The information processing apparatus 100 is, for example, a mobile phone, a smartphone, a game machine, a PC (Personal Computer), a PDA (Personal Digital Assistant), a navigation apparatus, a music playback apparatus, or a video processing apparatus. PCs include tablet PCs and notebook PCs, and video processing apparatuses include cameras, recorders, players, and the like.
FIG. 2 is a block diagram showing the configuration of the information processing apparatus according to the present embodiment. As shown in FIG. 2, the information processing apparatus 100 includes a display device 11, a touch sensor 12, a control device 14, and a memory 13.

The display device 11 displays a display screen and is, for example, a liquid crystal display device or an organic EL (Electro Luminescence) display device.

The touch sensor 12 detects a touch position and inputs an input signal indicating the detected touch position to the control device 14. In the present embodiment, since the touch sensor 12 is provided so as to overlap the display device 11, the display surface of the display device 11 also serves as the operation surface of the touch sensor 12. The touch sensor 12 is, for example, a capacitive touch sensor, but may be a resistive or other type of touch sensor.

The memory 13 is an example of a storage unit and is a recording medium readable by the control device 14. The memory 13 stores, for example, a program that defines the operation of the control device 14, and also stores a correction amount for correcting the locus of the touch position.

The control device 14 controls the information processing apparatus 100 and is, for example, a CPU (Central Processing Unit). The control device 14 reads the program stored in the memory 13 and executes it, thereby realizing the display control unit 21, the locus output unit 22, and the overall control unit 23.

The display control unit 21 generates a display screen and displays the generated display screen on the display device 11. Examples of the display screen include a standby screen displayed in the standby state, an application launcher screen for selecting an application to start, and an unlock screen for releasing a locked state in which the functions of the information processing apparatus 100 are restricted.
 軌跡出力部22は、タッチセンサ12から入力された入力信号の示すタッチ位置の軌跡である操作軌跡を算出し、当該操作軌跡をメモリ13内の補正量に基づいて補正する。以下、操作軌跡を補正した軌跡を補正軌跡と称する。このとき軌跡出力部22は、算出した全ての操作軌跡を補正してもよいし、操作軌跡が予め定められた適用条件を満たす場合に、当該操作軌跡を補正してもよい。後者の場合、軌跡出力部22は、操作軌跡が適用条件を満たす場合、補正軌跡を入力操作の認識に用いる軌跡である認識用軌跡として出力し、操作軌跡が適用条件を満たさない場合、操作軌跡を認識用軌跡として出力する。 The locus output unit 22 calculates an operation locus that is a locus of the touch position indicated by the input signal input from the touch sensor 12, and corrects the operation locus based on the correction amount in the memory 13. Hereinafter, a locus obtained by correcting the operation locus is referred to as a corrected locus. At this time, the trajectory output unit 22 may correct all the calculated operation trajectories, or may correct the operation trajectory when the operation trajectory satisfies a predetermined application condition. In the latter case, the trajectory output unit 22 outputs the correction trajectory as a recognition trajectory that is a trajectory used for recognition of the input operation when the operation trajectory satisfies the application condition, and the operation trajectory when the operation trajectory does not satisfy the application condition. Is output as a recognition trajectory.
 The application condition is, for example, that the difference between the operation locus and a predetermined locus is within a predetermined range. There may be a plurality of predetermined loci. In this case, a correction amount and an application condition are provided for each predetermined locus, and the locus output unit 22 determines, for each predetermined locus, whether the operation locus satisfies the corresponding application condition. The locus output unit 22 outputs the corrected locus as the recognition locus when the operation locus satisfies at least one application condition, and outputs the operation locus as the recognition locus when the operation locus satisfies none of the application conditions. When the operation locus satisfies a plurality of application conditions, the locus output unit 22 selects the predetermined locus whose difference from the operation locus is smallest and corrects the operation locus based on the correction amount of the selected predetermined locus.
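 The following Python sketch illustrates one way this application-condition check and the selection of the closest predetermined locus could be implemented. The function names, the representation of a locus as a 2-D vector, and the Euclidean distance metric are assumptions for illustration; the specification does not prescribe a concrete data structure or metric.

```python
import math

def vector_difference(v_a, v_b):
    """Euclidean distance between two 2-D locus vectors (dx, dy)."""
    return math.hypot(v_a[0] - v_b[0], v_a[1] - v_b[1])

def select_reference(operation_vec, references):
    """Pick the predetermined locus whose application condition is met and whose
    difference from the operation locus is smallest; return None when no condition
    is met (the raw operation locus is then used as the recognition locus).
    `references` is a list of dicts with keys 'vec', 'correction', 'threshold'."""
    best = None
    best_diff = None
    for ref in references:
        diff = vector_difference(operation_vec, ref["vec"])
        if diff <= ref["threshold"] and (best is None or diff < best_diff):
            best, best_diff = ref, diff
    return best
```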
 The overall control unit 23 is an example of a control unit that controls the operation of the information processing apparatus 100.
 The overall control unit 23 uses the display control unit 21 to display various display screens on the display device 11.
 The overall control unit 23 also recognizes an input operation based on the recognition locus output from the locus output unit 22 and executes processing corresponding to the recognized input operation. For example, when the recognition locus draws a predetermined pattern while a predetermined display screen is displayed, the overall control unit 23 executes the specific processing corresponding to the input operation indicated by that recognition locus. Examples of the predetermined display screen include the unlock screen, the standby screen, and the application launcher screen described above. For example, when a drag operation or a flick operation is recognized as the input operation while the application launcher screen is displayed, the overall control unit 23 changes the display area of the application launcher screen according to the direction of the operation. As another example, when an unlock operation is recognized as the input operation while the unlock screen is displayed, the overall control unit 23 releases the lock state. The predetermined pattern may differ for each predetermined display screen.
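 A minimal sketch of how the recognized operation could be dispatched per screen is shown below. The screen identifiers, the direction classifier, and the callback names are hypothetical and are not taken from the specification; only the pairing of screens with unlock and area-switching processing follows the description above.

```python
def classify_direction(vec):
    """Classify a 2-D locus vector (dx, dy) into one of four directions.
    Screen coordinates are assumed: y grows downward."""
    dx, dy = vec
    if abs(dx) >= abs(dy):
        return "right" if dx >= 0 else "left"
    return "down" if dy >= 0 else "up"

def handle_recognized_locus(screen, recognition_vec, actions):
    """Dispatch the recognized gesture to the processing for the current screen.
    `actions` is a dict of callbacks such as {'unlock': fn, 'switch_area': fn}."""
    direction = classify_direction(recognition_vec)
    if screen == "unlock_horizontal" and direction == "right":
        actions["unlock"]()
    elif screen == "unlock_vertical" and direction == "up":
        actions["unlock"]()
    elif screen in ("standby", "launcher"):
        actions["switch_area"](direction)
```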
 Note that a predetermined locus is a locus that draws a predetermined pattern. Even when the predetermined pattern differs for each predetermined display screen, the locus output unit 22 determines, for every predetermined locus, whether the operation locus satisfies the application condition, regardless of the display screen that was displayed when the operation locus was calculated.
 When a predetermined display screen is displayed, the overall control unit 23 also adjusts the correction amount stored in the memory 13 based on the operation locus calculated by the locus output unit 22 and the predetermined pattern. More specifically, the overall control unit 23 adjusts the correction amount based on the difference between the operation locus and the predetermined pattern. The predetermined display screen is desirably a display screen that is displayed after the information processing apparatus 100 is started and before a selection operation for selecting an application to be started is recognized.
 Next, the correction of the operation locus performed by the locus output unit 22, and the recognition of the input operation and the adjustment of the correction amount performed by the overall control unit 23, are described in more detail with specific examples.
 First, the unlock screens displayed by the information processing apparatus according to the present embodiment are described. FIG. 3 is an explanatory diagram showing an example of an unlock screen, displayed by the information processing apparatus according to the present embodiment, that accepts an input operation in the horizontal direction. FIG. 4 is an explanatory diagram showing an example of an unlock screen, displayed by the information processing apparatus according to the present embodiment, that accepts an input operation in the vertical direction.
 The display control unit 21 displays the unlock screen 102 shown in FIG. 3 or FIG. 4 in the locked state in which the functions of the information processing apparatus 100 are restricted. The display control unit 21 displays the unlock screen 102 when, for example, a predetermined operation (a touch on the operation surface or a button press) is detected while the information processing apparatus 100 is in the locked state.
 The unlock screen 102 includes an unlock icon 201 and a slide guide 202. When the overall control unit 23 detects a touch position on the unlock icon 201 and then recognizes a drag operation or a flick operation that slides the unlock icon 201 in the direction along the slide guide 202, it releases the locked state.
 For example, in the example of FIG. 3, when the overall control unit 23 detects a touch position on the unlock icon 201S and then recognizes a drag operation or a flick operation that slides the unlock icon 201S to the right along the slide guide 202S, it releases the locked state. Similarly, in the example of FIG. 4, when the overall control unit 23 detects a touch position on the unlock icon 201H and then recognizes a drag operation or a flick operation that slides it upward along the slide guide 202L, it releases the locked state.
 The correction of the operation locus and the recognition of the input operation performed by the information processing apparatus 100 while the unlock screen 102S of FIG. 3 is displayed are described next. FIG. 5 is an explanatory diagram for explaining the determination of the application condition performed by the information processing apparatus according to the present embodiment while the unlock screen is displayed. FIG. 6 is an explanatory diagram for explaining the correction of the operation locus and the recognition of the input operation performed by the information processing apparatus according to the present embodiment while the unlock screen is displayed.
 Here, the vector indicating the predetermined pattern is referred to as a reference vector V1, and the vector indicating the operation locus is referred to as an operation vector V2. The application condition for the reference vector V1 is that the difference between the operation vector V2 and the reference vector V1 is within a predetermined range. The locus output unit 22 judges the application condition based on whether the operation vector V2 falls within an application range 203, in which the difference between the operation vector V2 and the reference vector V1 is within the predetermined range. When the operation vector V2 is within the application range 203, the locus output unit 22 corrects the operation vector V2 using the correction amount stored in the memory 13. Specifically, the locus output unit 22 obtains a corrected vector V4 by taking the difference between the operation vector V2 and the correction amount vector V3m indicating the correction amount, that is, V4 = V2 - V3m. The locus output unit 22 outputs the obtained corrected vector V4 as the recognition vector indicating the recognition locus, and the overall control unit 23 recognizes the input operation based on this corrected vector V4.
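 A minimal Python sketch of this correction step follows. The tuple representation of the vectors and the helper names are assumptions; only the relation V4 = V2 - V3m and the application-range check come from the description above.

```python
def within_application_range(operation_vec, reference_vec, threshold):
    """Application condition: the difference between V2 and V1 is within a
    predetermined range (measured here as Euclidean distance)."""
    dx = operation_vec[0] - reference_vec[0]
    dy = operation_vec[1] - reference_vec[1]
    return (dx * dx + dy * dy) ** 0.5 <= threshold

def corrected_vector(operation_vec, correction_vec):
    """V4 = V2 - V3m: subtract the correction amount vector from the operation vector."""
    return (operation_vec[0] - correction_vec[0],
            operation_vec[1] - correction_vec[1])

def recognition_vector(operation_vec, reference_vec, correction_vec, threshold):
    """Return the vector used for recognition: the corrected vector V4 when the
    application condition is satisfied, otherwise the raw operation vector V2."""
    if within_application_range(operation_vec, reference_vec, threshold):
        return corrected_vector(operation_vec, correction_vec)
    return operation_vec
```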
 The adjustment of the correction amount vector V3m indicating the correction amount is described here. FIG. 7 is an explanatory diagram for explaining the adjustment of the correction amount performed by the information processing apparatus according to the present embodiment while the unlock screen is displayed. The overall control unit 23 obtains the difference between the operation vector V2 and the reference vector V1 as a difference vector V3 indicating the difference between the reference locus and the operation locus, that is, V3 = V2 - V1.
 Each time the locus output unit 22 calculates an operation vector V2, the overall control unit 23 obtains the difference vector V3 and stores it in the memory 13. The difference vectors are referred to, in the order in which they are stored, as difference vector V3_1, difference vector V3_2, ..., difference vector V3_n. When storing a difference vector V3 in the memory 13, the overall control unit 23 judges whether a difference vector V3 is already stored in the memory 13.
 When no difference vector V3 is stored, the memory 13 holds 0 as the initial value of the correction amount. In this case, the overall control unit 23 stores the difference vector V3_1 in the memory 13 as the correction amount.
 When difference vectors V3 are already stored, the overall control unit 23 obtains an average vector V3m, which is the average of the plurality of difference vectors consisting of the already stored difference vectors V3_1 to V3_(n-1) and the currently stored difference vector V3_n. After a predetermined number N or more of difference vectors V3 have been obtained, the overall control unit 23 obtains the average vector V3m using the N most recently obtained difference vectors V3. The overall control unit 23 stores the obtained average vector V3m in the memory 13 as the correction amount. The average vector V3m is the correction amount vector indicating the correction amount.
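 This correction-amount update can be sketched as the following Python class. The class name, the use of a deque, and the two-dimensional tuple representation are assumptions made for illustration; the averaging over the N most recent difference vectors follows the description above.

```python
from collections import deque

class CorrectionAmount:
    """Keeps the correction amount V3m as the average of the N most recent
    difference vectors V3 = V2 - V1; the initial correction amount is 0."""

    def __init__(self, max_samples):
        self.samples = deque(maxlen=max_samples)  # keeps only the newest N vectors
        self.value = (0.0, 0.0)                   # initial correction amount

    def update(self, operation_vec, reference_vec):
        # V3 = V2 - V1
        diff = (operation_vec[0] - reference_vec[0],
                operation_vec[1] - reference_vec[1])
        self.samples.append(diff)
        # V3m = average of the stored difference vectors
        n = len(self.samples)
        self.value = (sum(v[0] for v in self.samples) / n,
                      sum(v[1] for v in self.samples) / n)
        return self.value
```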
 Although the correction of the operation locus on the unlock screen 102S has been described here, the present invention is not limited to this example. On other display screens as well, the operation locus can be corrected using the average vector V3m, and the input operation can be recognized using the corrected vector V4.
 The adjustment of the correction amount performed by the overall control unit 23 while the unlock screen 102L that accepts a vertical input operation is displayed is the same as the adjustment of the correction amount performed by the overall control unit 23 while the unlock screen 102S that accepts a horizontal input operation is displayed. FIG. 8 is an explanatory diagram for explaining the adjustment of the correction amount executed by the information processing apparatus according to the present embodiment while the vertical unlock screen is displayed. The overall control unit 23 obtains the difference vector V3 by taking the difference between the operation vector V2 and the reference vector V1, that is, V3 = V2 - V1.
 The overall control unit 23 may also adjust the correction amount while a standby screen is displayed. FIG. 9 is an explanatory diagram showing an example of a standby screen displayed by the information processing apparatus according to the present embodiment. FIG. 10 is an explanatory diagram showing another example of a standby screen displayed by the information processing apparatus according to the present embodiment.
 The display control unit 21 displays one of a plurality of standby screens 104 as the display screen. For example, the display control unit 21 can display either the standby screen 104-1 shown in FIG. 9 or the standby screen 104-2 shown in FIG. 10 as the display screen.
 The standby screen 104 has an indicator display area 211, a desktop area 212, and a dock area 213. The indicator display area 211 displays an indicator showing the position of the currently displayed display area within the entire display screen shown in the desktop area 212. In the case of FIG. 9, the rightmost of the four display areas is currently displayed; in the case of FIG. 10, the second display area from the right of the four display areas is currently displayed. The desktop area displays icons for starting applications and widgets that add specific functions to the desktop. When the overall control unit 23 recognizes a left or right drag operation or flick operation, the display control unit 21 switches the display area shown in the desktop area 212. The dock area 213 is an area independent of the desktop area 212; even when the desktop area 212 is moved left or right, the dock area 213 does not move.
 While the unlock screen 102 is displayed, the overall control unit 23 adjusts the correction amount based on the operation locus of the drag operation or flick operation that is the unlock operation; while the standby screen 104 is displayed, it adjusts the correction amount based on the drag operation or flick operation that is the display-area switching operation. FIG. 11 is an explanatory diagram for explaining the adjustment of the correction amount executed by the information processing apparatus according to the present embodiment while the standby screen is displayed. The overall control unit 23 obtains the difference vector V3 by taking the difference between the operation vector V2 and the reference vector V1. As with the adjustment of the correction amount performed by the overall control unit 23 while the unlock screen 102 is displayed, the overall control unit 23 adjusts the correction amount based on the obtained difference vector V3 and the difference vectors V3 stored in the memory 13.
 The overall control unit 23 may also adjust the correction amount while the application launcher screen is displayed. FIG. 12 is an explanatory diagram showing an example of an application launcher screen displayed by the information processing apparatus according to the present embodiment. The application launcher screen 106 has an icon display area 221, a slide bar 222, and a dock area 223. A plurality of icons is displayed side by side in the icon display area 221. When not all icons can be displayed in the icon display area 221, the display control unit 21 displays some of the icons in the icon display area 221, and when the overall control unit 23 recognizes a drag operation or a flick operation, the icons displayed in the icon display area 221 are changed according to the drag operation or flick operation.
 FIG. 13 is an explanatory diagram for explaining the adjustment of the correction amount executed by the information processing apparatus according to the present embodiment while the application launcher screen is displayed. The overall control unit 23 obtains the difference vector V3 by taking the difference between the operation vector V2 and the reference vector V1. As with the adjustment of the correction amount performed while the unlock screen 102 is displayed, the overall control unit 23 adjusts the correction amount based on the obtained difference vector V3 and the difference vectors V3 stored in the memory 13.
 Although FIGS. 9 and 10 show the standby screen 104 that accepts a horizontal input operation, the standby screen may accept a vertical input operation. For example, a display screen that can be scrolled in the vertical direction, like the application launcher screen 106 shown in FIG. 12, may be used as the standby screen. In FIG. 12, the application launcher screen 106 is a display screen that can be scrolled in the vertical direction, but the application launcher screen may instead be a display screen that can be scrolled in the horizontal direction.
 An example of the configuration of the information processing apparatus 100 according to the present embodiment has been described above. Each of the above components can be replaced with another component having a similar function, and the configuration used can be changed as appropriate according to the technical level at the time the present embodiment is carried out. It is also possible to provide a computer program for realizing each function of the information processing apparatus 100 according to the present embodiment as described above, and a computer-readable recording medium storing such a computer program. The recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, or a flash memory. The computer program may also be distributed, for example, via a network, without using a recording medium.
 Next, the operation of the information processing apparatus 100 according to the present embodiment is described. FIG. 14 is a flowchart for explaining an operation example of the information processing apparatus according to the present embodiment.
 First, the display control unit 21 displays a predetermined display screen in accordance with an instruction from the overall control unit 23 (step S100). The display screen displayed here by the display control unit 21 is, for example, a display screen such as the unlock screen 102, the standby screen 104, or the application launcher screen 106 that is displayed between the start-up of the information processing apparatus 100 and the start-up of an application.
 The locus output unit 22 then judges whether a touch operation has been performed (step S105). When a touch operation has been performed, the locus output unit 22 calculates the operation locus based on the input signal input from the touch sensor 12 (step S110).
 The locus output unit 22 then judges whether the calculated operation locus satisfies a predetermined application condition (step S115). When the operation locus satisfies the application condition, the locus output unit 22 corrects the operation locus using the correction amount stored in the memory 13 (step S120) and outputs the corrected locus as the recognition locus. When the operation locus does not satisfy the application condition, the locus output unit 22 can omit the processing of step S120; in that case, it outputs the operation locus as the recognition locus.
 The overall control unit 23 then accepts an input operation based on the recognition locus output by the locus output unit 22 and recognizes the input operation based on that recognition locus (step S125).
 The overall control unit 23 also calculates the difference vector between the operation locus and the reference locus indicating the pattern predetermined for the input operation (step S130), and stores the calculated difference vector in the memory 13 (step S135).
 Next, the overall control unit 23 calculates the average vector of the difference vector calculated by the processing of step S130 this time and the plurality of difference vectors calculated by the processing of step S130 up to the previous time and stored in the memory 13 (step S140). The overall control unit 23 then stores the average vector as the correction amount (step S145). The correction amount stored here is used in subsequent recognition of input operations.
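 Taken together, steps S105 to S145 can be sketched as a single handler, shown below in Python. It reuses the recognition_vector helper and the CorrectionAmount class sketched earlier; the representation of a gesture as its start and end touch positions and the recognize callback are assumptions and are not part of the flowchart itself.

```python
def on_touch_gesture(touch_points, reference_vec, correction, threshold, recognize):
    """One pass of the flow of FIG. 14 for a completed touch gesture.

    touch_points : list of (x, y) touch positions (input to S110)
    reference_vec: reference vector V1 of the predetermined pattern
    correction   : CorrectionAmount instance holding V3m
    threshold    : predetermined range of the application condition
    recognize    : callback receiving the recognition vector (S125)
    """
    # S110: operation vector V2 from the first to the last touch position
    v2 = (touch_points[-1][0] - touch_points[0][0],
          touch_points[-1][1] - touch_points[0][1])

    # S115/S120: correct only when the application condition is satisfied
    v_recog = recognition_vector(v2, reference_vec, correction.value, threshold)

    # S125: recognize the input operation from the recognition vector
    recognize(v_recog)

    # S130-S145: adjust the correction amount from the raw operation vector
    correction.update(v2, reference_vec)
```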
 The steps described in this flowchart include not only processing performed chronologically in the described order but also processing executed in parallel, even if it is not necessarily processed chronologically. Even for steps processed chronologically, the order can be changed as appropriate depending on circumstances.
 For example, the recognition of the input operation described in steps S115 and S120 and the adjustment of the correction amount described in steps S125 to S140 may be executed in parallel, or the adjustment of the correction amount may be executed before the recognition of the input operation.
 As described above, according to the present embodiment, while a predetermined display screen is displayed, specific processing corresponding to the operation locus, which is the locus of the touch positions, is performed, and the correction amount is adjusted based on that operation locus. It is therefore no longer necessary to perform input operations solely to adjust the correction amount, which reduces the burden on the user. Furthermore, by recognizing input operations using the adjusted correction amount, the recognition accuracy of input operations can be improved. The recognition accuracy of input operations can thus be improved while reducing the burden on the user.
 In the present embodiment, the specific processing corresponding to the operation locus is performed and the correction amount is adjusted while a display screen that is displayed after the information processing apparatus is started and before a selection operation for selecting an application to be started is recognized is being displayed. The correction amount can thereby be adjusted before an application is started, and the recognition accuracy of input operations after the application is started can be reliably improved.
 In the present embodiment, the unlock screen, the standby screen, and the application launcher screen are given as examples of the display screen displayed after the information processing apparatus is started and before a selection operation for selecting an application to be started is recognized. These display screens are used frequently by the user in ordinary use, and while they are displayed, input operations frequently used on the information processing apparatus, such as drag operations and flick operations, are reliably performed in order to carry out specific processing. It is therefore possible to reliably collect the characteristics of the operation loci input to perform the specific processing and to adjust the correction amount.
(Second Embodiment)
 Next, a second embodiment of the present invention is described. The appearance and configuration of the information processing apparatus 100 according to the second embodiment of the present invention are the same as those of the first embodiment.
 In the present embodiment, when displaying a predetermined display screen, the overall control unit 23 determines one of a plurality of patterns prepared in advance as the predetermined pattern, and the display control unit 21 displays a predetermined display screen corresponding to the predetermined pattern determined by the overall control unit 23.
 An operation example of the information processing apparatus 100 according to the present embodiment is described here. FIG. 15 is a flowchart for explaining an operation example of the information processing apparatus according to the present embodiment.
 First, the overall control unit 23 determines one of a plurality of patterns prepared in advance as the predetermined pattern (step S200), and the display control unit 21 displays a display screen corresponding to the predetermined pattern determined by the overall control unit 23. At this time, the overall control unit 23 determines the predetermined pattern randomly or according to a predetermined order.
 For example, the overall control unit 23 determines one of a plurality of linear patterns with mutually different extending directions as the predetermined pattern. The extending direction is, for example, a direction substantially parallel to a vertical or horizontal side of the operation surface, which is a rectangular region.
 Specifically, when the display control unit 21 displays the unlock screen 102 as the display screen, for example, the overall control unit 23 determines the predetermined pattern as either a vertical or a horizontal linear pattern, and the display control unit 21 displays either the unlock screen 102S of FIG. 3 or the unlock screen 102L of FIG. 4 on the display device 11 according to the predetermined pattern determined by the overall control unit 23.
 Likewise, when the display control unit 21 displays the standby screen 104 as the display screen, the overall control unit 23 determines the predetermined pattern as either a vertical or a horizontal linear pattern. According to the predetermined pattern determined by the overall control unit 23, the display control unit 21 displays the standby screen 104 of FIG. 9 or FIG. 10 on the display device 11 in the case of the horizontal linear pattern, and displays a standby screen (not shown) that can be scrolled in the vertical direction on the display device 11 in the case of the vertical linear pattern.
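 The pattern selection of step S200 could be sketched as follows. The two-pattern set and the screen-selection mapping are assumptions used only to illustrate the random and ordered selection described above.

```python
import itertools
import random

PATTERNS = ("horizontal", "vertical")   # linear patterns with different extending directions
_ordered = itertools.cycle(PATTERNS)    # predetermined order: alternate between the patterns

def choose_pattern(random_choice=True):
    """Step S200: decide the predetermined pattern randomly or in a fixed order."""
    return random.choice(PATTERNS) if random_choice else next(_ordered)

def unlock_screen_for(pattern):
    """Pick the unlock screen that matches the decided pattern (102S or 102L)."""
    return "102S" if pattern == "horizontal" else "102L"
```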
 The recognition of the input operation and the adjustment of the correction amount in step S205 are the same as steps S100 to S145 of FIG. 14, and their description is therefore omitted here.
 As described above, according to the present embodiment, the predetermined pattern is determined from among a plurality of patterns prepared in advance. The characteristics of the operation loci are therefore learned for a plurality of patterns. Accordingly, the operation loci can be learned for a plurality of patterns on display screens having the same function, and the learning efficiency can be improved.
 In the present embodiment, the predetermined pattern also changes randomly or according to a predetermined order, which makes it possible to learn the plurality of patterns evenly.
(Third Embodiment)
 Next, a third embodiment of the present invention is described. FIG. 16 is a block diagram showing the configuration of the information processing apparatus according to the third embodiment of the present invention.
 The information processing apparatus 300 shown in FIG. 16 has a display control unit 31, a detection unit 32, a storage unit 33, a locus output unit 34, and a control unit 35.
 The display control unit 31 displays a predetermined display screen on a display unit.
 The detection unit 32 detects a touch position, which is a position in contact with or in proximity to the operation surface.
 The storage unit 33 stores a correction amount for correcting the locus of the touch position.
 The locus output unit 34 calculates the locus of the touch positions detected by the detection unit and outputs a corrected locus, obtained by correcting the locus based on the correction amount, as the recognition locus.
 The control unit 35 executes specific processing when the recognition locus draws a predetermined pattern while the display screen is displayed, and, in that state, adjusts the correction amount based on the locus and the predetermined pattern.
 Next, the operation of the information processing apparatus 300 according to the present embodiment is described. FIG. 17 is a flowchart showing the operation of the information processing apparatus according to the present embodiment.
 The display control unit 31 displays a predetermined display screen on a display unit (not shown) (step S300).
 The detection unit 32 detects a touch position (step S305).
 The locus output unit 34 calculates the locus of the touch position (step S310).
 The locus output unit 34 outputs a corrected locus, obtained by correcting the locus based on the correction amount stored in the storage unit 33, as the recognition locus (step S315).
 The control unit 35 judges whether the recognition locus draws a predetermined pattern (step S320). When the recognition locus draws the predetermined pattern, the control unit 35 executes specific processing (step S325); when it does not, the control unit 35 omits the processing of step S325. The control unit 35 then adjusts the correction amount based on the locus calculated in step S310 and the predetermined pattern (step S330).
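 A compact Python sketch of this flow (steps S305 to S330) is given below. It reuses the CorrectionAmount sketch from the first embodiment; the pattern-matching callback, the specific-process callback, and the representation of the locus as a single vector are assumptions, and only the ordering of the steps follows the flowchart of FIG. 17.

```python
def process_touch(touch_points, reference_vec, correction, threshold,
                  matches_pattern, specific_process):
    """Steps S305-S330: correct the locus, run the specific processing when the
    recognition locus draws the predetermined pattern, then adjust the correction amount."""
    # S310: locus of the touch positions, represented here as a single vector
    locus = (touch_points[-1][0] - touch_points[0][0],
             touch_points[-1][1] - touch_points[0][1])

    # S315: corrected locus output as the recognition locus
    recognition = (locus[0] - correction.value[0],
                   locus[1] - correction.value[1])

    # S320/S325: execute the specific processing only when the pattern is drawn
    if matches_pattern(recognition, reference_vec, threshold):
        specific_process()

    # S330: adjust the correction amount from the uncorrected locus and the pattern
    correction.update(locus, reference_vec)
```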
 As described above, in this embodiment as well, while a predetermined display screen is displayed, specific processing corresponding to the operation locus, which is the locus of the touch positions, is performed, and the correction amount is adjusted based on that operation locus. It is therefore no longer necessary to perform input operations solely to adjust the correction amount, which reduces the burden on the user. Furthermore, by recognizing input operations using the adjusted correction amount, the recognition accuracy of input operations can be improved. The recognition accuracy of input operations can thus be improved while reducing the burden on the user.
 Preferred embodiments of the present invention have been described above in detail with reference to the accompanying drawings, but the technical scope of the present invention is not limited to these examples. It is obvious that a person having ordinary knowledge in the technical field of the present invention can conceive various changes or modifications within the scope of the technical idea described in the claims, and these naturally belong to the technical scope of the present invention.
 For example, the above embodiments illustrate the case where the predetermined pattern is a vertical or horizontal linear pattern, but the present invention is not limited to these examples. The predetermined pattern may be a linear pattern in a direction other than vertical and horizontal, or a non-linear pattern such as an arc or a wave.
 In the above embodiments, the overall control unit 23 uses the average vector V3m of the difference vectors V3 as the correction amount, but the present invention is not limited to this example. The correction amount may be any amount that brings the operation locus closer to the predetermined pattern based on the difference between the operation locus and the predetermined pattern.
 This application claims priority based on Japanese Patent Application No. 2012-188361 filed on August 29, 2012, the entire disclosure of which is incorporated herein.
100, 200, 300  Information processing apparatus
11   Display device
12   Touch sensor
13   Memory
14   Control device
21   Display control unit
22   Locus output unit
23   Overall control unit
31   Display control unit
32   Detection unit
33   Storage unit
34   Locus output unit
35   Control unit

Claims (10)

  1.  An information processing apparatus comprising:
     a display control unit that displays a predetermined display screen on a display unit;
     a detection unit that detects a touch position, which is a position in contact with or in proximity to an operation surface;
     a storage unit that stores a correction amount for correcting a locus of the touch position;
     a locus output unit that calculates the locus of the touch positions detected by the detection unit and outputs a corrected locus, obtained by correcting the locus based on the correction amount, as a recognition locus; and
     a control unit that executes specific processing when the recognition locus draws a predetermined pattern while the display screen is displayed, and that, in that state, adjusts the correction amount based on the locus and the predetermined pattern.
  2.  The information processing apparatus according to claim 1, wherein the locus output unit outputs the corrected locus as the recognition locus when the locus satisfies a predetermined condition, and outputs the locus as the recognition locus when the locus does not satisfy the condition.
  3.  The information processing apparatus according to claim 1 or 2, wherein the display control unit displays the display screen after the information processing apparatus is started and before an operation for selecting an application to be started is recognized.
  4.  The information processing apparatus according to any one of claims 1 to 3, wherein
     the display control unit displays, as the display screen, an unlock screen for releasing a lock state in which the functions of the information processing apparatus are restricted, and
     the control unit executes, as the specific processing, processing for releasing the lock state.
  5.  The information processing apparatus according to any one of claims 1 to 4, wherein
     the display control unit displays one of a plurality of standby screens as the display screen, and
     the control unit executes, as the specific processing, processing for changing the standby screen displayed by the display control unit.
  6.  The information processing apparatus according to any one of claims 1 to 5, wherein
     the control unit determines one of a plurality of patterns prepared in advance as the predetermined pattern, and
     the display control unit displays the display screen corresponding to the predetermined pattern determined by the control unit.
  7.  The information processing apparatus according to claim 6, wherein the control unit determines one of the plurality of patterns as the predetermined pattern randomly or according to a predetermined order.
  8.  The information processing apparatus according to claim 6 or 7, wherein the control unit determines one of a plurality of linear patterns with mutually different extending directions as the predetermined pattern.
  9.  An information processing method comprising:
     displaying a predetermined display screen on a display unit;
     detecting a touch position, which is a position in contact with or in proximity to an operation surface;
     calculating a locus of the touch position;
     acquiring a correction amount for correcting the locus from a storage unit;
     generating a corrected locus by correcting the locus based on the correction amount;
     recognizing an input operation based on the corrected locus;
     executing specific processing when a specific operation in which the corrected locus draws a predetermined pattern is recognized as the input operation while the display screen is displayed; and
     adjusting, in that state, the correction amount based on the locus and the pattern.
  10.  A program for causing a computer to execute:
     a procedure of displaying a predetermined display screen on a display unit;
     a procedure of detecting a touch position, which is a position in contact with or in proximity to an operation surface;
     a procedure of calculating a locus of the touch position;
     a procedure of acquiring a correction amount for correcting the locus from a storage unit;
     a procedure of generating a corrected locus by correcting the locus based on the correction amount;
     a procedure of recognizing an input operation based on the corrected locus;
     a procedure of executing specific processing when a specific operation in which the corrected locus draws a predetermined pattern is recognized as the input operation while the display screen is displayed; and
     a procedure of adjusting, in that state, the correction amount based on the locus and the pattern.
PCT/JP2013/062175 2012-08-29 2013-04-25 Information processing device, information processing method, and program WO2014034181A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012188361 2012-08-29
JP2012-188361 2012-08-29

Publications (1)

Publication Number Publication Date
WO2014034181A1 true WO2014034181A1 (en) 2014-03-06

Family

ID=50183009

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/062175 WO2014034181A1 (en) 2012-08-29 2013-04-25 Information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2014034181A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000347800A (en) * 1999-06-03 2000-12-15 Alps Electric Co Ltd Device and method for proofreading pointing cursor
JP2004054413A (en) * 2002-07-17 2004-02-19 Casio Comput Co Ltd Input position adjusting device and input position adjusting program
JP2008181325A (en) * 2007-01-24 2008-08-07 Fuji Xerox Co Ltd Touch panel input device

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023206544A1 (en) * 2022-04-29 2023-11-02 北京小米移动软件有限公司 Input method and apparatus, terminal device and readable storage medium

Similar Documents

Publication Publication Date Title
JP6618122B2 (en) Input device and touch panel control method
JP5798103B2 (en) Terminal device, screen display method, program
US10198163B2 (en) Electronic device and controlling method and program therefor
EP2652580B1 (en) Using movement of a computing device to enhance interpretation of input events produced when interacting with the computing device
US9060068B2 (en) Apparatus and method for controlling mobile terminal user interface execution
WO2013183208A1 (en) Input device, input support method, and program
US20100177121A1 (en) Information processing apparatus, information processing method, and program
JP6022137B1 (en) Coordinate correction apparatus, coordinate correction method, and coordinate correction program
US9430089B2 (en) Information processing apparatus and method for controlling the same
CN103838456A (en) Method and system for controlling display positions of desktop icons
US20140333549A1 (en) Input device, input method, and program
US20170131882A1 (en) Information processing apparatus, information processing method, and computer program product
JP2012141895A (en) Display control device, display control method, and program
WO2012086133A1 (en) Touch panel device
JP5628991B2 (en) Display device, display method, and display program
JP2010287121A (en) Information processor, program, recording medium and display controller
US20150355819A1 (en) Information processing apparatus, input method, and recording medium
WO2013047023A1 (en) Display apparatus, display method, and program
JP5721602B2 (en) Portable terminal device and program
WO2014034181A1 (en) Information processing device, information processing method, and program
JP5769234B2 (en) Display device, display method, and program
US20130201159A1 (en) Information processing apparatus, information processing method, and program
WO2013190857A1 (en) Processing device, sensitivity adjustment method and program
JP2016115042A (en) Electronic apparatus
KR101678570B1 (en) Method and apparatus for controlling of operation of device having touch screen

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13832625

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13832625

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP