WO2011077525A1 - Electronic device, operation detection method and operation detection program - Google Patents


Info

Publication number
WO2011077525A1
Authority
WO
WIPO (PCT)
Prior art keywords
pattern
unit
touch input
registered
determination
Prior art date
Application number
PCT/JP2009/071399
Other languages
French (fr)
Japanese (ja)
Inventor
Asako Nishida (西田 亜佐子)
Satoshi Kikuchi (菊地 智)
Original Assignee
Fujitsu Limited (富士通株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujitsu Limited (富士通株式会社)
Priority to JP2011547140A priority Critical patent/JPWO2011077525A1/en
Priority to PCT/JP2009/071399 priority patent/WO2011077525A1/en
Publication of WO2011077525A1 publication Critical patent/WO2011077525A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/04162Control or interface arrangements specially adapted for digitisers for exchanging data with external devices, e.g. smart pens, via the digitiser sensing hardware
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/66Substation equipment, e.g. for use by subscribers with means for preventing unauthorised or fraudulent calling
    • H04M1/667Preventing unauthorised calls from a telephone set
    • H04M1/67Preventing unauthorised calls from a telephone set by electronic means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/22Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Definitions

  • the present invention relates to operation detection in an electronic device equipped with a touch panel, such as a mobile terminal device, and in particular to an electronic device, an operation detection method, and an operation detection program that specify an operation pattern from a plurality of physical elements obtained from touch input.
  • various inputs such as input of characters and figures, selection input, etc. can be performed by touching the touch panel with a finger or a touch pen.
  • Patent Document 1 discloses a control that switches the key input mode of the touch panel to an erroneous-operation prevention mode and invalidates the call-ending operation by key input.
  • an object of the electronic device, the operation detection method, and the operation detection program of the present disclosure is to improve the discrimination of touch input by using at least two types of physical information as determination elements, and thereby to improve the convenience of touch operation.
  • the electronic device of the present disclosure includes a touch input unit, a pattern recognition unit, and a pattern determination unit.
  • the touch input unit is means for inputting by touch or stroke with a finger, a touch pen, or the like.
  • the pattern recognition unit recognizes a pattern using at least two types of physical information obtained from touch input as determination elements.
  • the pattern determination unit generates an output indicating a match when the pattern recognized by the pattern recognition unit matches a registered pattern registered in advance in the registration unit, or falls within the approximate range of the registered pattern.
  • the operation detection method of the present disclosure is an operation detection method executed by an electronic device equipped with a touch input unit, and includes a pattern recognition step and a pattern determination step.
  • the pattern recognition step recognizes a pattern using at least two types of physical information obtained from touch input as determination elements.
  • in the pattern determination step, if the recognized pattern matches a registered pattern registered in advance in the registration unit, or falls within the approximate range of the registered pattern, an output indicating the match is generated.
  • an operation detection program of the present disclosure is an operation detection program executed by an electronic device equipped with a touch input unit, and includes a pattern recognition function and a pattern determination function.
  • the pattern recognition function recognizes a pattern using at least two types of physical information obtained from touch input as determination elements.
  • the pattern determination function generates an output indicating a match when the recognized pattern matches the registered pattern registered in advance in the registration unit, or falls within the approximate range of the registered pattern.
  • according to the electronic device, the operation detection method, or the operation detection program of the present disclosure, the following effects can be obtained.
  • since the touch input pattern is specified using at least two types of physical information obtained from the touch input as determination elements, the touch input can be identified more easily and the convenience of the touch operation can be improved.
  • the first embodiment recognizes a touch input pattern using at least two types of physical information obtained from touch input as a determination element, and determines whether or not the input pattern matches a registered pattern or a neighborhood range of the registered pattern.
  • the touch input includes a touch on the touch input unit or an input by a stroke thereof.
  • the physical information that is a determination element is any one of the number of touch points, position, velocity, acceleration, angle, pressure, and angular velocity.
  • the pattern includes, for example, a point, a line, or a figure specified by a touch input or its stroke as described above; the line may be continuous or discontinuous, and may be a curve.
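As a purely illustrative sketch of such a pattern (not part of the disclosure), a touch input can be modeled as a sequence of samples that each carry two physical elements, position and pressure; the class, field, and threshold names below are hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TouchSample:
    """One detected touch point, carrying two physical elements:
    position (x, y) and pressure, plus its detection time."""
    x: float         # X coordinate on the panel
    y: float         # Y coordinate on the panel
    pressure: float  # pressing force reported by the pressure sensor
    t_ms: int        # detection time in milliseconds

def make_pattern(samples, min_pressure=0.1):
    """Build a pattern: the ordered samples at or above the pressure
    threshold (cf. 'a touch operation with a predetermined pressure or
    more'); the threshold value is an assumption."""
    return tuple(s for s in samples if s.pressure >= min_pressure)
```

A discontinuous line is then simply a sample sequence with gaps in time or position; nothing in the representation requires continuity.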
  • FIG. 1 is a diagram illustrating an electronic apparatus according to the first embodiment.
  • the configuration of FIG. 1 is an example, and the present invention is not limited to such a configuration.
  • the electronic device 2 is an example of the electronic device, the operation detection method, or the operation detection program of the present disclosure. As illustrated in FIG. 1, the electronic device 2 includes, as functional units, a touch input unit 4, a pattern recognition unit 6, a pattern registration unit 8, and a pattern determination unit 10.
  • the touch input unit 4 accepts a touch with a finger or a touch pen (for example, a stylus pen) as a touch input unit, and converts the input of the point and the stroke into electronic information.
  • the touch point is the touch position of a finger or a touch pen, and is a recognizable point set in the touch input unit 4, or a line thereof (which may be a set of points).
  • the stroke is, for example, a touch operation with a predetermined pressure or more, and is a movement (trajectory) of successive touch points.
  • the pattern recognition unit 6 is a functional unit that recognizes the touch input pattern using the above-described physical information obtained from the touch input to the touch input unit 4 as a determination element.
  • the physical information of the determination element is as described above.
  • the number of points described above is, for example, the number of touch points that are equal to or higher than a predetermined pressure constituting a stroke.
  • the position is, for example, the position of the touch point on coordinates, or the locus position of the touch point.
  • the angle is, for example, the angle of the locus on the coordinates.
  • the speed is, for example, the moving speed of the touch point on the coordinates.
  • the acceleration is, for example, the acceleration in moving the touch point on the coordinates.
  • the angular velocity is, for example, the angular velocity in the locus of the touch point on the coordinates.
  • the pressure is, for example, a pressing force equal to or greater than a predetermined value (threshold value).
  • the pattern registration unit 8 is a functional unit that registers a pattern recognized by the pattern recognition unit 6. Since a registered pattern uses at least two types of the physical information described above as determination elements, there are various combinations, for example the number of points and position, point position and angle, or point position and speed.
  • the registered pattern is a pattern specified from the touch input; it may be the touch input itself, or it may be a pattern within the approximate range of the specified pattern, obtained by giving a width to the pattern specified from the touch input.
  • the pattern determination unit 10 is a functional unit that determines whether or not the input pattern matches the registered pattern. That is, the pattern determination unit 10 determines whether or not the pattern recognized by the pattern recognition unit 6 matches the registered pattern in the pattern registration unit 8. The determination may also cover whether the input pattern falls within the approximate range of the registered pattern. In either case, if the input pattern matches the registered pattern or its approximate range, an output representing the match is generated.
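A minimal sketch of this determination, assuming patterns are stored as ordered grid coordinates and the approximate range is a per-point tolerance (both assumptions, not fixed by the disclosure):

```python
def matches(input_pts, registered_pts, tolerance=0):
    """Return True if every input point lies within `tolerance` grid
    cells (Chebyshev distance) of the corresponding registered point.
    tolerance=0 demands an exact match; tolerance>0 accepts the
    approximate range of the registered pattern."""
    if len(input_pts) != len(registered_pts):
        return False
    return all(
        max(abs(ix - rx), abs(iy - ry)) <= tolerance
        for (ix, iy), (rx, ry) in zip(input_pts, registered_pts)
    )
```

With tolerance set to zero this reduces to the plain match against the registered pattern; a positive tolerance realizes the "approximate range" variant in one comparison.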
  • the pattern recognition unit 6 recognizes a pattern from the touch input of the touch input unit 4 using the above-described determination elements, and the pattern recognized by the pattern recognition unit 6 is registered in the pattern registration unit 8.
  • when an input pattern is recognized from the touch input of the touch input unit 4, it is determined whether the input pattern matches the registered pattern in the pattern registration unit 8 or its approximate range, and a determination output indicating the determination result is obtained.
  • FIG. 2 is a flowchart showing a pattern processing procedure.
  • This pattern processing procedure is an example of the operation detection method or the operation detection program of the present disclosure. As shown in FIG. 2, it includes touch input (step S1), pattern recognition (step S2), determination of whether a pattern is registered (step S3), pattern registration (step S4), pattern determination (step S5), and generation of a determination output (step S6).
  • the touch input unit 4 detects a touch input (step S1), and the pattern recognition unit 6 recognizes the pattern using the above-described physical information obtained from the touch input as a determination element (step S2).
  • the pattern to be registered may be a pattern obtained from touch input, and a pattern in an approximate range that approximates the pattern may be registered.
  • if there is a registered pattern (YES in step S3), it is determined whether the input pattern recognized from the touch input matches the registered pattern (step S5). In this case, it may be determined whether or not the input pattern matches including the approximate range of the registered pattern.
  • the pattern determination unit 10 generates and outputs a determination output indicating whether or not the input pattern matches the registered pattern or its approximate range (step S6).
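The flow of steps S1–S6 can be sketched as a single driver routine; `recognize`, `matches`, and `registry` are hypothetical stand-ins for the pattern recognition unit 6, the pattern determination unit 10, and the pattern registration unit 8:

```python
def handle_touch_input(samples, registry, recognize, matches):
    """Flow of FIG. 2: recognize a pattern (S2); if no pattern is
    registered yet, register this one (S3 NO -> S4); otherwise compare
    it with the registered pattern (S5) and return the determination
    output (S6)."""
    pattern = recognize(samples)                 # S2
    if registry.get("pattern") is None:          # S3
        registry["pattern"] = pattern            # S4
        return {"registered": True}
    ok = matches(pattern, registry["pattern"])   # S5
    return {"registered": False, "match": ok}    # S6
```

The first call thus plays the role of registration, and every later call produces the match/no-match determination output.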
  • a touch input pattern is generated using at least two types of physical information obtained from touch input as determination elements, and the registered pattern and the input pattern are compared. As a result, the identifiability of touch input is improved and the convenience of touch operation can be improved.
  • the pattern is recognized and generated using the above-described physical information obtained from touch input as a determination element. Thereby, the influence of the touch input by a user's unconscious operation can be reduced.
  • a pattern generated with the above-described physical information as determination elements is used. It can therefore be applied to unlocking, call-termination processing, application activation, login to WWW (World Wide Web) sites, and the like, improving processing accuracy and security.
  • the second embodiment is a mobile terminal device equipped with a touch panel unit.
  • This portable terminal device recognizes a touch input pattern using the above-described physical information obtained from touch input as determination elements, and determines whether or not the input pattern matches a registered pattern or the neighborhood range of the registered pattern.
  • FIG. 3 is a diagram illustrating an example of a functional unit of the mobile terminal device.
  • the configuration shown in FIG. 3 is an example, and the present invention is not limited to such a configuration.
  • in FIG. 3, the same parts as those in FIG. 1 are denoted by the same reference numerals.
  • the mobile terminal device 20 is an example of the electronic device, the operation detection method, or the operation detection program of the present disclosure. As shown in FIG. 3, the mobile terminal device 20 includes a pattern determination unit 10, a pattern recognition unit 16, a touch detection unit 22, a position detection unit 24, an acceleration calculation unit 26, a pressure calculation unit 28, an angular velocity calculation unit 30, a pattern registration control unit 32, a communication control unit 34, a voice input/output control unit 36, a display control unit 38, and a lock control unit 40, which are functional units realized by computer processing (calculation, determination, and various controls).
  • the pattern determination unit 10 is as described above.
  • the pattern recognition unit 16 recognizes a touch input pattern using the above-described physical information obtained from the touch input detected by the touch detection unit 22 as a determination element.
  • the determination elements of physical information are obtained as follows.
  • the touch detection unit 22 detects, for example, a touch or stroke with a finger or a touch pen as touch input to the touch panel unit 56 (FIG. 4), and obtains touch information as physical information.
  • the position detection unit 24 detects the touch position, stroke position, angle, etc. of a finger or a touch pen with respect to the touch panel unit 56 (FIG. 4), and obtains physical information (position information).
  • the acceleration calculation unit 26 calculates acceleration as physical information from position information of a touch input of a finger or a touch pen with respect to the touch panel unit 56 (FIG. 4).
  • the pressure calculation unit 28 calculates the pressure at the touch position of the finger or the touch pen with respect to the touch panel unit 56 (FIG. 4). That is, the pressure at the touch position received by the touch panel unit 56 is calculated, and the pressure is obtained as physical information.
  • the angular velocity calculation unit 30 calculates an angular velocity of the mobile terminal device 20 and outputs angular velocity information that is physical information.
  • the pattern registration control unit 32 controls registration of patterns recognized by the pattern recognition unit 16 described above.
  • the pattern to be registered may be a pattern within the approximate range of the pattern recognized from the touch input.
  • the pattern recognition unit 16 described above recognizes a touch input pattern using at least two types of physical information obtained from the touch detection unit 22, the position detection unit 24, the acceleration calculation unit 26, the pressure calculation unit 28, and the angular velocity calculation unit 30.
  • the recognized pattern is registered or used for comparison with the registered pattern.
  • the communication control unit 34 controls wireless communication and voice communication. If the input pattern matches the registered pattern, the communication control unit 34 performs call control and call termination control according to the determination output.
  • the voice input / output control unit 36 controls output of voice signals and voice input.
  • the display control unit 38 controls the display of characters and images on the display unit 48 (FIG. 4).
  • the lock control unit 40 is a functional unit that controls locking of the functions of the mobile terminal device 20 and unlocking them when the touch input is appropriate.
  • FIG. 4 is a diagram illustrating a hardware configuration example of the mobile terminal device,
  • FIG. 5 is a diagram illustrating the mobile terminal device viewed from the touch panel unit side, and
  • FIG. 6 is a diagram illustrating the mobile terminal device including the pressure sensor unit.
  • the configurations shown in FIGS. 4, 5, and 6 are examples, and the present invention is not limited to such configurations. In FIGS. 4, 5, and 6, the same parts as those in the preceding figures are denoted by the same reference numerals.
  • the mobile terminal device 20 includes hardware that implements the above-described functional units (FIG. 3). As shown in FIG. 4, it includes a processor 42, a sensor unit 44, an input unit 46, a display unit 48, a voice input/output unit 50, a communication unit 52, and a storage unit 54.
  • the processor 42 executes an OS (Operating System) and application programs stored in the storage unit 54, and performs input control such as capturing detection information from the sensor unit 44, as well as detection of various information, calculation, display control, communication control, and the like.
  • the above-described functional unit (FIG. 3) is realized by the processor 42.
  • the processor 42 constitutes an application activation unit that activates, by pattern determination, an application in the program storage unit 72.
  • the sensor unit 44 includes a touch panel unit 56, a pressure sensor unit 58, an acceleration sensor unit 60, a position sensor unit 62, and an angular velocity sensor unit 64.
  • the touch panel unit 56 is installed on the front surface of the display unit 48 and detects a touch input by a finger or a touch pen.
  • the pressure sensor unit 58 detects pressure applied to the touch panel unit 56 by a finger or a touch pen.
  • the acceleration sensor unit 60 outputs acceleration information detected from a touch input of a finger or a touch pen.
  • the position sensor unit 62 outputs position information (coordinate information) detected from touch input of a finger or a touch pen.
  • the angular velocity sensor unit 64 detects the angular velocity of the mobile terminal device 20 and outputs the angular velocity information.
  • the input unit 46 inputs sensor information of the sensor unit 44 to the processor 42 under the control of the processor 42.
  • the display unit 48 is installed behind the touch panel unit 56, and displays input buttons assigned to the touch panel unit 56 and output information under the control of the processor 42.
  • the voice input / output unit 50 includes a microphone 66 as voice input means and a receiver 68 as voice output means.
  • the voice input / output unit 50 takes in voice from the microphone 66 and reproduces voice from the receiver 68 under the control of the processor 42.
  • the communication unit 52 includes an antenna 70, and executes communication with the radio base station under the control of the processor 42.
  • the communication unit 52, together with the processor 42, constitutes a call-termination function unit for terminating a call and a login function unit for logging in to a website based on pattern determination.
  • the storage unit 54 includes a program storage unit 72, a data storage unit 74, and a RAM (Random-Access Memory) 76.
  • the program storage unit 72 stores the OS and application programs described above, and examples of the application program include an operation detection program, a lock program, and an unlock program.
  • the data storage unit 74 includes a pattern registration unit, and a pattern to be compared with the input pattern is registered in the pattern registration unit.
  • the RAM 76 constitutes a work area for various processes such as registration and determination of patterns recognized from input strokes.
  • a touch panel unit 56 is installed together with a display unit 48 on the front surface of a casing 78.
  • a microphone 66 and a receiver 68 are installed in the housing 78 with the display unit 48 interposed therebetween.
  • a pressure sensor unit 58 is installed on the back side of the touch panel unit 56.
  • the pressure P is detected by the pressure sensor unit 58.
  • This detection output is output as an electrical signal representing pressure information, and the level represents the pressure level.
  • FIG. 7 is a diagram showing the touch panel unit
  • FIG. 8 is a diagram showing the touch panel unit and point movement
  • FIG. 9 is a diagram showing detection points where the point movement is detected.
  • a plurality of rows and a plurality of columns of addresses are set on the panel surface 82 of the touch panel unit 56.
  • detection points 84 of 12 rows and 10 columns are set. That is, the detection points 84 in 12 rows and 10 columns constitute an example of the position sensor 62.
  • the letters a, b, ..., l represent rows (X coordinates), and the numerals 1, 2, ..., 10 represent columns (Y coordinates).
  • a spiral stroke 86 is drawn on the panel surface 82 of the touch panel unit 56 as an example of touch input using a finger or a touch pen, thereby generating a point locus 88.
  • the stroke 86 is an example of an input operation that the user would not perform unconsciously.
  • the stroke 86 starts from the upper right portion of the touch panel unit 56, moves in a curve from the start point 90 toward the center portion, and forms a circulating portion 92 and a cross portion 94 at the center portion.
  • the curve then moves from the cross portion 94 to the lower left and forms an end point 96 at the lower left.
  • This point locus 88 starts at the point 9b and passes through the points 8b-7b-7c-6c-5c-5d-4d-3e.
  • the point 4e is an inflection point, and the locus continues through the points 4e-4f-4g-5g-6g-7g-7f; with the point 7f as an inflection point, it continues through the points 7e-6e-5e-5f-4f, and the point locus 88 crosses itself at this point 4f. From the point 4f it continues through 4g-3g-3h-2i-2j-2k-2l, and the point 2l is the end point 96. That is, the point locus 88 is drawn along a diagonal starting from the upper right corner portion of the panel surface 82 toward the lower left corner portion, and forms the cross portion 94 together with the circulating portion 92 in its intermediate portion.
  • the point locus 88 corresponding to the stroke 86 is detected by a plurality of continuous detection point rows on the panel surface 82, as shown in FIG. 9. That is, the point locus 88 corresponds to the detection points 9b-8b-7b-7c-6c-5c-5d-4d-3e-4e-3f-4f-5g-6g-7g-7f-7e-6e-5e-5f-4f-4g-3g-3h-2h-2i-2j-2k-2l.
  • the detection point 4f is detected twice with a time difference Δt, and from this detection it can be seen that the cross portion 94 exists on the point locus 88 from the start point 90 to the end point 96.
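The crossing test described here, i.e. the same detection point reported twice with a time difference Δt, can be sketched as follows; the function name is hypothetical, and consecutive duplicate samples are assumed to have already been collapsed:

```python
def find_crossings(locus):
    """Given a point locus as an ordered list of detection-point
    labels, return the labels that are detected more than once
    (with a time difference), i.e. the points where the stroke
    crosses itself."""
    seen, crossings = set(), []
    for label in locus:
        if label in seen and label not in crossings:
            crossings.append(label)
        seen.add(label)
    return crossings
```

Applied to the locus of FIG. 9, the only repeated detection point is 4f, matching the cross portion 94.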
  • the point locus 88 corresponding to the stroke 86 constitutes the pattern 98. That is, the pattern 98 is recognized as the stroke 86 of the touch input as an aggregate of the point locus 88.
  • an approximate range pattern 100 is formed around the pattern 98.
  • the pattern 98 is represented by diagonal lines
  • the pattern 100 is represented by an encircling line consisting of a thick solid line.
  • the pattern 98 is recognized by using the touch positions and the touch points, which are physical information serving as determination elements of the touch input, for the stroke 86, which is not an unconscious operation.
  • the pattern 100, which is the neighborhood range, can be recognized as the range of the above-described detection points 6a to 3l surrounding the pattern 98, together with the detection points inside the circulating portion 92 (in this case including the detection point 6f).
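On a grid of detection points such as the 12-row, 10-column array of FIG. 7, a neighborhood-range pattern like the pattern 100 could be formed by dilating the recognized pattern by one cell in every direction; the one-cell radius is an assumption, not a value given in the disclosure:

```python
def approximate_range(pattern_cells, rows=12, cols=10):
    """Expand a set of (row, col) detection points by one cell in
    every direction (8-neighbourhood), clipped to the panel, yielding
    a neighborhood-range pattern surrounding the registered one."""
    expanded = set()
    for r, c in pattern_cells:
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    expanded.add((nr, nc))
    return expanded
```

Registering the dilated set instead of (or alongside) the exact point set is one way to realize the "pattern within the approximate range" mentioned earlier.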
  • These patterns 98 or 100 may be registered.
  • FIG. 10 is a flowchart showing the position detection processing procedure,
  • FIG. 11 is a flowchart showing the acceleration calculation processing procedure,
  • FIG. 12 is a flowchart showing the pressure calculation processing procedure, and
  • FIG. 13 is a flowchart showing the angular velocity calculation processing procedure.
  • the touch input position detection processing procedure is a process of detecting a touch position or locus on the touch panel unit 56 and outputting position information representing the position or locus.
  • the touch panel unit 56 is touched with a finger or a touch pen (step S111).
  • the processing procedure for calculating touch input acceleration is processing for calculating acceleration from position information representing the position and locus of touch on the touch panel unit 56.
  • in step S121, the touch of a finger or a touch pen on the touch panel unit 56 and its position are monitored. If a finger or a touch pen is touching the touch panel unit 56 (YES in step S121), X-axis and Y-axis position information representing the touch position on the touch panel unit 56 is detected (step S122). Whether the position information has changed is then monitored (step S123). If the position information has changed (YES in step S123), X-axis and Y-axis position information representing the new touch position is detected (step S124), acceleration is calculated from the position information (step S125), and the acceleration, which is a determination element for touch input, is output (step S126).
  • if the X-axis and Y-axis position information representing the touch position on the touch panel unit 56 has not changed (NO in step S123), the acceleration is not calculated, and this process ends.
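The calculation of steps S122–S125, acceleration derived from successive X-axis and Y-axis positions, might look like the following sketch; uniform sampling every `dt` seconds is an assumption:

```python
def acceleration_from_positions(positions, dt):
    """Estimate acceleration (ax, ay) by double-differencing three or
    more successive (x, y) touch positions sampled every `dt` seconds.
    Returns None when there are too few position changes (cf. the
    NO branch of step S123)."""
    if len(positions) < 3 or dt <= 0:
        return None
    (x0, y0), (x1, y1), (x2, y2) = positions[-3:]
    # velocities over the two most recent sampling intervals
    vx1, vy1 = (x1 - x0) / dt, (y1 - y0) / dt
    vx2, vy2 = (x2 - x1) / dt, (y2 - y1) / dt
    return ((vx2 - vx1) / dt, (vy2 - vy1) / dt)
```

The same double-difference also yields the movement speed as an intermediate value, matching the position-change monitoring of step S123.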
  • the processing procedure for calculating the pressure of the touch input is a process of detecting the position and locus of the touch on the touch panel unit 56 and outputting the pressure at that position.
  • in step S131, the touch of a finger or a touch pen on the touch panel unit 56 and its position are monitored.
  • the pressure of the part in contact with the touch panel unit 56 is calculated (step S132), and the pressure, which is a touch input determination element, is output (step S133).
  • the processing procedure for calculating the angular velocity of the touch input is a process of monitoring the movement of the mobile terminal device and calculating the angular velocity.
  • a touch position, acceleration, pressure, and angular velocity are obtained as physical information that is a determination element of touch input, and a touch input pattern can be recognized from the physical information.
  • FIG. 14 is a flowchart illustrating a lock setting process.
  • This lock setting processing procedure recognizes the touch input pattern using the above-described physical information obtained from the touch input to the touch panel unit 56 for the lock setting, and sets the registered pattern as the release information.
  • in step S201, it is determined whether or not a lock is to be set. If no lock is to be set (NO in step S201), the process ends.
  • if a lock is to be set (YES in step S201), the pattern registration mode is started and pattern registration is performed (step S202). It is then determined whether or not the pattern has been registered (step S203). If the pattern is not registered (NO in step S203), this process ends. If the pattern is registered (YES in step S203), lock setting is executed (step S204). Since this lock setting is tied to the pattern registered by touch input, the registered pattern and the input pattern must coincide in order to cancel the lock setting.
  • FIG. 15 is a flowchart showing a processing procedure during registration.
  • This pattern registration processing procedure is a process of registering a touch input pattern for the touch panel unit 56 when setting a function lock or the like.
  • the registered pattern is used for user authentication processing for unlocking.
• In step S211, position information on the X axis and Y axis representing the point where the finger is placed on the touch panel unit 56 is detected. It is determined to which coordinate range the position of the point corresponds (step S212). The pressure from the finger or touch pen is calculated (step S213). The movement of the point is monitored (step S214). If the point moves (YES in step S214), the acceleration or the movement speed of the point is calculated from the change in its position (step S215).
• A pattern is generated from the position information, pressure, and acceleration of the touch input (step S216), this pattern is registered in the pattern registration unit (step S217), and the process returns to step S202 of the main routine (FIG. 14). Lock setting is then executed.
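The registration steps above (position detection, pressure calculation, acceleration calculation, pattern generation, and registration) can be sketched as follows. The `TouchPattern` structure and `register_pattern` helper are illustrative names introduced here, not part of the patent.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TouchPattern:
    cell: str                      # coordinate-range label on the panel grid, e.g. "9b"
    pressure: float                # contact pressure [N] at registration
    acceleration: Optional[float]  # movement acceleration [cm/s^2]; None if the point did not move

def register_pattern(cell: str, pressure: float,
                     acceleration: Optional[float],
                     registry: List[TouchPattern]) -> TouchPattern:
    """Steps S216-S217 (sketch): generate a pattern from the determination
    elements and store it in the pattern registration unit."""
    pattern = TouchPattern(cell, pressure, acceleration)
    registry.append(pattern)
    return pattern

# Example: register a touch at cell 9b with 2.0 N pressure and 14.5 cm/s^2 acceleration
registry: List[TouchPattern] = []
register_pattern("9b", 2.0, 14.5, registry)
```

The stored pattern then serves as the reference for the unlock determinations described next.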
  • FIG. 16 is a flowchart showing the unlocking procedure.
  • This unlock process is a process for unlocking when there is a registered pattern as unlock information.
• In step S221, it is determined whether the lock is to be released. If not (NO in step S221), this process ends. If the lock is to be released (YES in step S221), the above-described touch input is performed (step S222), and the touch input pattern is recognized using the physical information obtained from the touch input as determination elements.
• It is determined whether the input pattern obtained by touch input matches the registered pattern (step S223). If it matches (YES in step S223), the lock is released (step S224); if not (NO in step S223), the lock cannot be released (step S225). In this case, as described above, the match between the input pattern and the registered pattern may be determined by including an approximate range around the registered pattern.
  • FIG. 17 is a flowchart illustrating an example of a specific processing procedure for unlocking.
• In step S231, the position (X, Y) of the point where the finger or touch pen is placed on the touch panel unit 56 is calculated.
• In step S232, it is determined to which part of the coordinate range (1a to 10l) the position of the point corresponds. That is, it is determined whether the position of the point falls within the front, rear, left, or right vicinity of the position at the time of registration (step S233). For example, when the point 9b is registered, it is determined whether the input falls on points 8a to 10a, 8b to 10b, or 8c to 10c. If the position does not fall within this coordinate range (NO in step S233), the lock cannot be released (step S234).
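The neighborhood determination of steps S232 to S233 can be sketched as follows, assuming (as the 1a-to-10l notation suggests) that a cell label combines a column number 1 to 10 with a row letter a to l. The function name is hypothetical.

```python
import string

def neighbor_cells(cell: str) -> set:
    """Sketch of steps S232-S233: enumerate the registered cell and its
    front/rear/left/right neighbors on the 1a-10l grid."""
    num = int(cell[:-1])                 # column number, 1..10
    row = cell[-1]                       # row letter, 'a'..'l'
    rows = string.ascii_lowercase[:12]   # 'a' .. 'l'
    r = rows.index(row)
    cells = set()
    for n in range(max(1, num - 1), min(10, num + 1) + 1):
        for ri in range(max(0, r - 1), min(len(rows) - 1, r + 1) + 1):
            cells.add(f"{n}{rows[ri]}")
    return cells

# A touch registered at 9b matches any of 8a-10a, 8b-10b, 8c-10c:
matches = neighbor_cells("9b")
```

An input point is accepted in the first determination when its cell is contained in this neighbor set.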
• If the position falls within the coordinate range (YES in step S233), the pressure is compared with the registered pattern as a second determination of whether the input pattern matches (step S235). For example, if the pressure at the time of registration is XX [N], it is determined whether the input pressure falls within a predetermined range, for example XX [N] × 0.8 to XX [N] × 1.2 (step S236). If the pressure does not fall within this range (NG in step S236), the lock cannot be released (step S237).
• If the pressure determination is within the predetermined range (OK in step S236), the movement of the point is monitored (step S238). If the point moves (YES in step S238), the acceleration is calculated from the change in the point's position and compared with the registered pattern (step S239). For example, if the acceleration at the time of registration is YY [cm/s²], it is determined whether the input acceleration falls within a predetermined range, for example YY [cm/s²] × 0.8 to YY [cm/s²] × 1.2 (step S240). If the acceleration is outside this range (NG in step S240), the lock cannot be released (step S241).
• If the acceleration is within the predetermined range (OK in step S240), the lock is released (step S242) and this process ends.
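The three chained determinations of FIG. 17 reduce to simple range checks. A minimal sketch, assuming the ×0.8 to ×1.2 tolerance given in the text (function names are illustrative):

```python
def within_tolerance(measured: float, registered: float,
                     lo: float = 0.8, hi: float = 1.2) -> bool:
    """Steps S236/S240 (sketch): accept a measured value when it falls
    within registered*0.8 .. registered*1.2."""
    return registered * lo <= measured <= registered * hi

def try_unlock(pos_ok: bool, pressure: float, reg_pressure: float,
               accel: float, reg_accel: float) -> bool:
    """Chain the three determinations of FIG. 17: position range,
    then pressure, then acceleration. All must pass to release the lock."""
    if not pos_ok:
        return False                  # step S234: lock not released
    if not within_tolerance(pressure, reg_pressure):
        return False                  # step S237
    if not within_tolerance(accel, reg_accel):
        return False                  # step S241
    return True                       # step S242: lock released

# 2.1 N against a registered 2.0 N and 13.0 cm/s^2 against 14.0 cm/s^2 both pass:
unlocked = try_unlock(True, 2.1, 2.0, 13.0, 14.0)
```

Because each check short-circuits, an input failing any single determination element cannot release the lock, mirroring the NG branches of the flowchart.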
• FIG. 18 is a flowchart showing the end-of-call setting process.
• This end-of-call setting procedure recognizes the touch input pattern, using the above-described physical information obtained from touch input to the touch panel unit 56 as determination elements, and registers the resulting pattern as end-of-call information.
• In step S301, it is determined whether the end-of-call function is to be set. If not (NO in step S301), this process ends.
• If the end-of-call function is to be set (YES in step S301), the pattern registration mode is started and pattern registration is performed (step S302). It is then determined whether the pattern has been registered (step S303). If not (NO in step S303), this process ends. If the pattern has been registered (YES in step S303), the end-of-call setting process is executed (step S304). Since this end-of-call setting is associated with pattern registration by touch input, the registered pattern and the input pattern must match in order to end a call.
  • FIG. 19 is a flowchart showing a pattern registration processing procedure.
  • This pattern registration processing procedure is a process of registering a touch input pattern for the touch panel unit 56 when setting a function lock or the like.
• The registered pattern is used for user authentication processing for the end-of-call operation.
• In step S311, position information on the X axis and Y axis representing the point where the finger is placed on the touch panel unit 56 is detected. It is determined to which coordinate range the position of the point corresponds (step S312). The pressure from the finger or touch pen is calculated (step S313). The movement of the point is monitored (step S314). If the point moves (YES in step S314), the acceleration or the movement speed of the point is calculated from the change in its position (step S315).
• A pattern is generated from the position information, pressure, and acceleration of the touch input (step S316), this pattern is registered in the pattern registration unit (step S317), and the process returns to step S302 of the main routine (FIG. 18). The end-of-call setting is then executed.
• FIG. 20 is a flowchart showing the end-of-call processing procedure.
• This end-of-call process is executed when a registered pattern exists as end-of-call information.
• In step S321, it is determined whether a call is in progress. If not (NO in step S321), this process ends. If a call is in progress (YES in step S321), touch input is performed to end the call (step S322), and the touch input pattern is recognized using the above-described physical information obtained from the touch input as determination elements.
• It is determined whether the input pattern obtained by touch input matches the registered pattern (step S323). If it matches (YES in step S323), the end-of-call lock is released (step S324) and the end-of-call process is executed (step S325). If it does not match (NO in step S323), the end-of-call lock cannot be released (step S326). In this case, as described above, the match between the input pattern and the registered pattern may be determined by including an approximate range around the registered pattern.
• FIG. 21 is a flowchart illustrating an example of a specific end-of-call processing procedure.
• In step S331, the position (X, Y) of the point where the finger or touch pen is placed on the touch panel unit 56 is calculated.
• In step S332, it is determined to which part of the coordinate range the position of the point corresponds. That is, it is determined whether the position of the point falls within the front, rear, left, or right vicinity of the position at the time of registration (step S333). For example, when the point 9b is registered, it is determined whether the input falls on points 8a to 10a, 8b to 10b, or 8c to 10c. If the position does not fall within this coordinate range (NO in step S333), the call cannot be ended (step S334).
• As a second determination, the pressure is compared with the registered pattern (step S335). For example, if the pressure at the time of registration is XX [N], it is determined whether the input pressure falls within a predetermined range, for example XX [N] × 0.8 to XX [N] × 1.2. If the pressure does not fall within this range (NG in step S336), the call cannot be ended (step S337).
• If the pressure determination is within the predetermined range (OK in step S336), the movement of the point is monitored (step S338). The acceleration is calculated from the movement and compared with the registered pattern (step S339). For example, if the acceleration at the time of registration is YY [cm/s²], it is determined whether the input acceleration falls within a predetermined range, for example YY [cm/s²] × 0.8 to YY [cm/s²] × 1.2 (step S340). If the acceleration is outside this range (NG in step S340), the call cannot be ended (step S341).
• If the acceleration is within the predetermined range (OK in step S340), the end-of-call lock is released, the end-of-call process is executed (step S342), and this process ends.
• The third embodiment has a function of recognizing the touch input pattern using five types of physical information obtained from touch input as determination elements and determining whether the input pattern matches the registered pattern or falls within its vicinity.
  • FIG. 22 is referred to for the third embodiment.
  • FIG. 22 is a flowchart illustrating a lock setting process.
  • the configuration shown in FIG. 22 is an example, and the present invention is not limited to such a configuration.
  • This processing procedure is a lock setting for the mobile terminal device as in the second embodiment, and is an example of the electronic device, the operation detection method, or the operation detection program of the present disclosure. Also in this embodiment, the functional unit shown in FIG. 3 and the hardware configuration shown in FIG. 4 are used.
• This processing procedure includes determination of lock setting (step S401), touch input (steps S402 to S406), pattern registration (step S407), and lock setting (step S408).
• In step S401, it is determined whether the lock is to be set. If not (NO in step S401), this process ends. If the lock is to be set (YES in step S401), the process proceeds to touch input. As the first touch input, the number of points where the fingers are first placed on the touch panel unit 56 (FIGS. 4 and 5) is detected, and the number of points is obtained as a determination element of the touch input (step S402).
• As the second touch input, the position of a point on the touch panel unit 56 (FIG. 4) is detected, and the position information (X, Y) is obtained as a determination element of the touch input (step S403).
• As the third touch input, the acceleration at which the point moves is detected, and the acceleration is obtained as a determination element of the touch input (step S404).
• As the fourth touch input, the angle at which the point moves is detected, and the angle is obtained as a determination element of the touch input (step S405).
• As the fifth touch input, a change in the pressure with which the finger presses the touch panel unit 56 (FIG. 6) is detected, and the pressure change is obtained as a determination element of the touch input (step S406).
• In step S407, a pattern is generated using the five types of physical information (number of points, position, acceleration, angle, and pressure change) as determination elements, and this pattern is set in the data storage unit 74 (FIG. 4).
  • lock setting is executed (step S408).
  • the lock setting executed together with pattern registration cannot be unlocked unless there is an input pattern that matches the registered pattern.
  • FIG. 23 is a diagram illustrating an input operation
  • FIG. 24 is a diagram illustrating a touch input.
• The tip of the index finger 102 is brought into contact with the upper left side of the touch panel unit 56, and the tip of the thumb 104 with the lower right side. While maintaining the contact state, the index finger 102 is moved in the direction of arrow a and the thumb 104 in the direction of arrow b. Such touch input yields two separated points and determination elements based on their movement.
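One way to quantify the two-finger input of FIGS. 23 and 24 is the rotation of the segment joining the two contact points. This is only a sketch of one possible measure: the patent itself requires no more than the two separated points and their movements as determination elements, and wrap-around of the angle is ignored here for simplicity.

```python
import math

def rotation_angle(a, a2, b, b2):
    """Rotation, in degrees, of the segment joining contacts A and B
    as A moves to A' and B moves to B' (points as (x, y) tuples).
    No wrap-around handling; illustrative only."""
    before = math.atan2(b[1] - a[1], b[0] - a[0])
    after = math.atan2(b2[1] - a2[1], b2[0] - a2[0])
    return math.degrees(after - before)

# A quarter-turn of the two contacts about their midpoint:
ang = rotation_angle((0, 0), (10, 0), (10, 10), (0, 10))
```

A registered two-finger gesture could then be matched on this rotation in the same tolerance style as the other determination elements.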
  • 0 to 60 detection points are set in the X-axis direction and 0 to 90 detection points are set in the Y-axis direction on the touch panel unit 56 of the mobile terminal device 20.
  • Point A, point A ′, and arrow a indicate the pressing point of the index finger 102 and its movement locus
  • point B, point B ′, and arrow b indicate the pressing point of the thumb 104 and its movement locus.
• The X-axis and Y-axis coordinates of these points (for example, Y A′ = 25 for point A′) are thereby obtained.
  • FIG. 25 is a diagram illustrating another touch input
  • FIG. 26 is a diagram illustrating another input operation.
• The tip of the index finger 102 is brought into contact with the upper right side of the touch panel unit 56 and, while maintaining the contact state, is moved so as to draw the Japanese kana character "の" ("no"). Such touch input yields a touch input determination element in the form of a single stroke 86 (FIG. 8).
  • FIG. 27 is a diagram showing a change in acceleration of a touch input
  • FIG. 28 is a diagram showing an angle range of the touch input
  • FIG. 29 is a diagram showing a pressure change in the touch input.
  • acceleration changing with time is detected between the start point 106 and the end point 108 of the touch input.
  • This acceleration is one piece of physical information and can be used as a touch input determination element.
• The touch input angle θ can be used as a touch input determination element, for example as shown in FIG. 28. In this case, using the intersection of the X axis and the Y axis set on the touch panel unit 56 as a reference point, the angle θ formed by the aforementioned touch points A and A′ is obtained. Here, the angle θ formed by point A and point A′ is 35°.
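Assuming the angle θ of FIG. 28 is measured between the vectors from the X/Y-axis reference point to points A and A′, it could be computed as follows. The coordinates are illustrative, chosen so that θ comes out as the 35° of the example.

```python
import math

def touch_angle(a, a_prime):
    """Angle formed by touch points A and A' as seen from the reference
    point (the intersection of the X and Y axes), in degrees."""
    return abs(math.degrees(math.atan2(a_prime[1], a_prime[0])
                            - math.atan2(a[1], a[0])))

# Illustrative coordinates: A on the X axis, A' rotated 35 degrees about the origin
a = (40.0, 0.0)
a_prime = (40.0 * math.cos(math.radians(35)), 40.0 * math.sin(math.radians(35)))
theta = touch_angle(a, a_prime)
```

The resulting θ can then be compared against the registered angle with the same kind of tolerance band used for pressure and acceleration.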
  • a pressure that changes with time is detected between the start point 106 and the end point 108 of the touch input as shown in FIG.
  • This pressure change is one piece of physical information and can be used as a determination element for touch input.
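The time-varying pressure of FIG. 29 could be matched as a sampled profile. The elementwise ±20% rule below is an assumption for illustration, consistent with the scalar tolerances elsewhere in the text but not spelled out in the patent.

```python
def pressure_profile_matches(measured, registered, tol=0.2):
    """Sketch: compare equal-length pressure sample sequences taken
    between the start point and end point of the touch, accepting the
    input when every sample is within +/-20% of the registered sample."""
    if len(measured) != len(registered):
        return False
    return all(r * (1 - tol) <= m <= r * (1 + tol)
               for m, r in zip(measured, registered))

# The middle sample deviates by 5% here, so the profile matches:
ok = pressure_profile_matches([1.0, 2.1, 1.5], [1.0, 2.0, 1.5])
```

In practice the two sequences would first be resampled to a common length, since the input gesture rarely lasts exactly as long as the registered one.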
  • FIG. 30 is a diagram illustrating another input operation
  • FIG. 31 is a diagram illustrating another touch input.
  • the touch input may be a character input.
• For example, the alphabetic characters "ON" may be drawn by bringing the index finger 102 into contact with the touch panel unit 56, and these characters "ON" are used as a touch input determination element, as shown in FIG. 31.
  • Position information on the X axis and the Y axis of the touch panel unit 56 is obtained.
  • FIG. 32 is a flowchart showing the unlocking processing procedure.
  • This processing procedure is the unlocking process corresponding to the above-described lock setting (FIG. 22).
  • the lock setting is as described above.
• For unlocking, the same five types of determination elements are used: number of points, point position, acceleration, movement angle, and pressure change. Therefore, as shown in FIG. 32, this processing procedure includes determination of unlocking (step S411), touch input (steps S412 to S416), pattern determination (step S417), and unlock or unlock failure (steps S418 and S419).
• In step S411, it is determined whether the lock is to be released. If not (NO in step S411), the process ends. If the lock is to be released (YES in step S411), the process proceeds to touch input. As the first touch input, the number of points where the fingers are first placed on the touch panel unit 56 (FIGS. 4 and 5) is detected, and the number of points is obtained as a determination element of the touch input (step S412).
• As the second touch input, the position of the point on the touch panel unit 56 (FIG. 4) is detected, and the position information (X, Y) is obtained as a determination element of the touch input (step S413).
• As the third touch input, the acceleration at which the point moves is detected, and the acceleration is obtained as a determination element of the touch input (step S414).
• As the fourth touch input, the angle at which the point moves is detected, and the angle is obtained as a determination element of the touch input (step S415). As the fifth touch input, a change in the pressure with which the finger presses the touch panel unit 56 (FIG. 6) is detected, and the pressure change is obtained as a determination element of the touch input (step S416).
• A pattern is generated using the five types of physical information (number of points, position, acceleration, angle, and pressure change) as determination elements, and it is determined whether this input pattern matches the registered pattern in the data storage unit 74 (FIG. 4) (step S417).
• If the input pattern matches the registered pattern (YES in step S417), unlocking is executed (step S418). If it does not match (NO in step S417), the lock cannot be released (step S419).
• In this way, the number of touch input points, position, acceleration, movement angle, and pressure change are detected as physical information, and the touch input pattern is recognized as the logical product (AND) of these elements and registered.
• Each is a characteristic piece of physical information, and a pattern is formed as the logical product of several of them. The determination accuracy between the registered pattern and the input pattern is therefore improved, unintentional touch input by the user can be excluded, and malfunction can be prevented.
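The logical-product determination described above can be sketched as an AND of the five element checks. The dictionary keys and the ×0.8 to ×1.2 scalar tolerance are illustrative assumptions, not names from the patent.

```python
def pattern_matches(inp: dict, reg: dict) -> bool:
    """Third embodiment (sketch): the pattern determination is the logical
    product (AND) of five determination elements - number of points,
    position cell, acceleration, movement angle, and pressure."""
    scalar_ok = lambda m, r: r * 0.8 <= m <= r * 1.2
    return (inp["points"] == reg["points"]
            and inp["cell"] == reg["cell"]
            and scalar_ok(inp["accel"], reg["accel"])
            and scalar_ok(inp["angle"], reg["angle"])
            and scalar_ok(inp["pressure"], reg["pressure"]))

reg = {"points": 2, "cell": "9b", "accel": 14.0, "angle": 35.0, "pressure": 2.0}
inp = dict(reg, accel=13.0)   # acceleration deviates, but stays within +/-20%
matched = pattern_matches(inp, reg)
```

Because every element must pass, an accidental touch that happens to match one or two elements still fails the overall determination, which is the source of the improved accuracy claimed above.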
• The fourth embodiment is configured to prevent the user from unintentionally ending a call: after the end-of-call lock is released, the call is ended only through an end-of-call operation on the display screen.
• FIG. 33 is a flowchart showing the end-of-call processing procedure
• FIG. 34 is a view showing a display example of the end-of-call inquiry and selection buttons
• FIG. 35 is a view showing a display example of the end-of-call notification.
• This end-of-call processing procedure is an example of the electronic device, the operation detection method, or the operation detection program of the present disclosure. Also in this embodiment, the functional unit shown in FIG. 3 and the hardware configuration shown in FIG. 4 are used.
• This processing procedure includes locking at the start of a call (step S501), lock release (steps S502 to S505), and end-of-call processing (steps S506 and S507).
  • the call is locked when the call starts (step S501).
  • This lock setting may be performed, for example, in the lock setting (FIGS. 14 and 15) of the second embodiment.
• Until lock release is requested, the monitoring state is maintained (NO in step S502). If the lock is to be released (YES in step S502), the user performs touch input (step S503). This touch input is an operation that the user would not perform unconsciously. As described above, the input pattern is recognized from this touch input, and it is determined whether the input pattern matches the registered pattern (step S504). If it does not match (NO in step S504), the lock is maintained and touch input is required again (step S503).
• If the input pattern matches the registered pattern (YES in step S504), the lock is released (step S505). When the lock is released, an end-of-call operation is requested (step S506), the call is ended by this operation (step S507), and this process ends.
• An inquiry message 110 for the end-of-call operation and selection buttons 112 and 114 are displayed on the display unit 48 by display control.
• The message 110 displays, for example, "Do you want to end the call?".
• The selection button 112 is a button for approving the end of the call ("YES")
• and the selection button 114 is a button for rejecting the end of the call ("NO").
• If the selection button 112 is touched, the end-of-call process is executed by that touch input. If the selection button 114 is touched, the end of the call is rejected and the call is maintained.
• After the call ends, a message 116 notifying the end of the call is displayed on the display unit 48 by display control.
• This message 116 displays, for example, "Call ended". The user can thereby confirm that the call has ended.
  • the fifth embodiment is a configuration in which touch input pattern determination is used to start an application program (application) in the program storage unit 72 (FIG. 4).
• FIG. 36 is a flowchart showing the application activation setting procedure
• FIG. 37 is a flowchart showing the pattern registration procedure
• FIG. 38 is a flowchart showing the application activation procedure
• FIG. 39 is a flowchart showing a specific application activation procedure. Also in this embodiment, the functional unit shown in FIG. 3 and the hardware configuration shown in FIG. 4 are used.
  • the processing procedure of this application activation setting is an example of the electronic device, the operation detection method, or the operation detection program of the present disclosure.
  • a touch input pattern is recognized by using the above-described physical information obtained from touch input to the touch panel unit 56 for application activation setting, and the registered pattern is set as application activation information.
• In step S601, it is determined whether application activation is to be set. If not (NO in step S601), the process ends.
• If application activation is to be set (YES in step S601), the pattern registration mode is started and pattern registration is performed (step S602). It is then determined whether the pattern has been registered (step S603). If not (NO in step S603), this process ends. If the pattern has been registered (YES in step S603), the application activation setting is executed (step S604). Since this application activation setting is associated with pattern registration by touch input, the registered pattern and the input pattern must match in order to activate the application.
  • the pattern registration process used for starting the application is a process of registering a touch input pattern for the touch panel unit 56 in setting the application activation.
  • the registered pattern is used for user authentication processing for starting an application.
• In step S611, position information on the X axis and Y axis representing the point where the finger is placed on the touch panel unit 56 is detected. It is determined to which coordinate range the position of the point corresponds (step S612). The pressure from the finger or touch pen is calculated (step S613). The movement of the point is monitored (step S614). If the point moves (YES in step S614), the acceleration or the movement speed of the point is calculated from the change in its position (step S615).
• A pattern is generated from the position information, pressure, and acceleration of the touch input (step S616), this pattern is registered in the pattern registration unit (step S617), and the process returns to step S602 of the main routine (FIG. 36). The application activation setting is then executed.
• This application activation procedure requires, through pattern recognition of the touch input, that the input pattern match the registered pattern in order to activate the application.
• In step S621, it is determined whether the application is to be activated. If not (NO in step S621), the process ends. If the application is to be activated (YES in step S621), the above-described touch input is performed (step S622), and the touch input pattern is recognized using the physical information obtained from the touch input as determination elements.
• It is determined whether the input pattern obtained by touch input matches the registered pattern (step S623). If it matches (YES in step S623), the application is activated (step S624); if not (NO in step S623), the application cannot be activated (step S625). In this case, as described above, the match between the input pattern and the registered pattern may be determined by including an approximate range around the registered pattern.
• In step S631, it is monitored whether application activation is requested; if so, touch input is performed.
• The position (X, Y) of the point where the finger or touch pen is placed on the touch panel unit 56 is calculated (step S632).
• In step S633, as a first determination of whether the input pattern matches the registered pattern, it is determined to which part of the coordinate range the position of the point corresponds. That is, it is determined whether the position of the point falls within the front, rear, left, or right vicinity of the position at the time of registration (step S634).
• For example, when the point 9b is registered, it is determined whether the input falls on points 8a to 10a, 8b to 10b, or 8c to 10c. If the position does not fall within this coordinate range (NO in step S634), the application cannot be activated (step S635).
• As a second determination, the pressure is compared with the registered pattern (step S636). For example, if the pressure at the time of registration is XX [N], it is determined whether the input pressure falls within a predetermined range, for example XX [N] × 0.8 to XX [N] × 1.2 (step S637). If the pressure does not fall within this range (NG in step S637), the application cannot be activated (step S638).
• If the pressure determination is within the predetermined range (OK in step S637), the movement of the point is monitored (step S639). If the point moves (YES in step S639), the acceleration is calculated from the change in the point's position and compared with the registered pattern (step S640). For example, if the acceleration at the time of registration is YY [cm/s²], it is determined whether the input acceleration falls within a predetermined range, for example YY [cm/s²] × 0.8 to YY [cm/s²] × 1.2 (step S641). If the acceleration is outside this range (NG in step S641), the application cannot be activated (step S642).
• If the acceleration is within the predetermined range (OK in step S641), the application is activated (step S643).
• Since application activation is permitted only when an input pattern recognized from at least two types of physical information obtained from touch input matches the registered pattern, activation by a non-matching input pattern is prevented.
• The application is thus protected from unrestricted activation, and security is increased.
  • touch input pattern determination is used for logging in to a website using a network communication function.
• FIG. 40 is a flowchart showing the login setting procedure for a website
• FIG. 41 is a flowchart showing the pattern registration procedure
• FIG. 42 is a flowchart showing the login procedure for the website
• FIG. 43 is a flowchart showing a specific login procedure for the website.
  • the processing procedure for setting login to this website is an example of the electronic device, the operation detection method, or the operation detection program of the present disclosure.
  • the touch input pattern is recognized using the above-described physical information obtained from the touch input to the touch panel unit 56 for the login setting, and the registered pattern is set as the login information.
• In step S701, it is determined whether login is to be set. If not (NO in step S701), the process ends.
• If login is to be set (YES in step S701), the pattern registration mode is started and pattern registration is performed (step S702). It is then determined whether the pattern has been registered (step S703). If not (NO in step S703), this process ends. If the pattern has been registered (YES in step S703), the login setting is executed (step S704). Since this login setting is associated with pattern registration by touch input, the registered pattern and the input pattern must match in order to log in to the website.
  • the pattern registration process used for login is a process of registering a touch input pattern for the touch panel unit 56 at the time of login setting.
  • the registered pattern is used for user authentication processing for login.
• In step S711, position information on the X axis and Y axis representing the point where the finger is placed on the touch panel unit 56 is detected. It is determined to which coordinate range the position of the point corresponds (step S712). The pressure from the finger or touch pen is calculated (step S713). The movement of the point is monitored (step S714). If the point moves (YES in step S714), the acceleration or the movement speed of the point is calculated from the change in its position (step S715).
• A pattern is generated from the position information, pressure, and acceleration of the touch input (step S716), this pattern is registered in the pattern registration unit (step S717), and the process returns to step S702 of the main routine (FIG. 40). The login setting is then executed.
  • the login processing procedure requires that the input pattern matches the registered pattern by pattern recognition by touch input for login.
  • step S721 it is determined whether or not the user is logged in (step S721). If not logged in (NO in step S721), the process is terminated. If it is login (YES in step S721), the touch input described above is performed (step S722). The touch input pattern is recognized using the above-described physical information obtained from the touch input as a determination element.
  • step S723 It is determined whether the input pattern obtained by touch input matches the registered pattern (step S723). If the input pattern matches the registration pattern (YES in step S723), login is executed (step S724). If the input pattern does not match the registration pattern (NO in step S723), login is disabled (step S723). S725). In this case, as described above, the match determination between the input pattern and the registered pattern may be determined by including the approximate range in the registered pattern.
•   step S731 it is monitored whether a login is requested (step S731), and if so, touch input is performed.
  • the position (X, Y) of the point where the finger or the touch pen is placed on the touch panel unit 56 is calculated (step S732).
•   step S733 as a first determination of whether the input pattern matches the registered pattern, it is determined to which part of the coordinate range the position of the point corresponds (step S733). That is, it is determined whether the position of the point falls within the front/rear/left/right range of the point registered at the time of registration (step S734).
•   step S735 if the position does not fall within the coordinate range (NO in step S734), login to the site is disabled (step S735).
•   step S734 if the position falls within the coordinate range of the registered point (YES in step S734), as a second determination of whether the input pattern matches the registered pattern, the pressure is compared with the registered pattern (step S736). For example, if the pressure at the point at the time of registration is XX [N], it is determined whether the pressure falls within a predetermined pressure range, for example XX [N] × 0.8 to XX [N] × 1.2 (step S737). If the pressure does not fall within the predetermined pressure range (NG in step S737), login is disabled (step S738).
•   step S737 If the pressure determination is within the predetermined pressure range (OK in step S737), the movement of the point is monitored (step S739). If the point moves (YES in step S739), the acceleration is calculated from the movement of the point's position, and this acceleration is compared with the registered pattern (step S740). For example, if the acceleration at the point at the time of registration is α [cm/s²], it is determined whether the acceleration falls within a predetermined range, for example α [cm/s²] × 0.8 to α [cm/s²] × 1.2 (step S741). If the acceleration is outside the predetermined range (NG in step S741), login is disabled (step S742).
  • step S743 login is executed (step S743).
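The staged determination of steps S733 to S743 can be sketched as below. The ±20 % windows follow the ×0.8 to ×1.2 ranges given in the text, while the function names and the region representation are illustrative assumptions.

```python
def within_tolerance(measured: float, registered: float,
                     low: float = 0.8, high: float = 1.2) -> bool:
    """Check a measured value against registered x 0.8 .. registered x 1.2."""
    return registered * low <= measured <= registered * high

def login_allowed(input_region: str, input_pressure: float, input_accel: float,
                  reg_region: str, reg_pressure: float, reg_accel: float) -> bool:
    # First determination: coordinate region of the point (steps S733-S734).
    if input_region != reg_region:
        return False  # login disabled (step S735)
    # Second determination: pressure within the predetermined range (steps S736-S737).
    if not within_tolerance(input_pressure, reg_pressure):
        return False  # login disabled (step S738)
    # Third determination: acceleration within the predetermined range (steps S740-S741).
    if not within_tolerance(input_accel, reg_accel):
        return False  # login disabled (step S742)
    return True  # login executed (step S743)

print(login_allowed("left-top", 2.1, 55.0, "left-top", 2.0, 50.0))  # True
print(login_allowed("left-top", 2.5, 55.0, "left-top", 2.0, 50.0))  # False
```

Because the checks are sequential, login fails at the first element that falls outside its range, mirroring the NG branches of the flowchart.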
•   by permitting login only with a pattern that uses at least two types of physical information obtained from touch input as determination elements, login with an input pattern that does not match the registered pattern can be prevented. Moreover, unauthorized login can be prevented and security can be improved.
•   the seventh embodiment is a process of selecting whether an existing registered pattern is reused or a new pattern is registered.
  • FIG. 44 is a flowchart illustrating a lock setting process.
  • step S801 it is determined whether or not the lock is set, and if the lock is not set, this processing ends.
•   step S802 If the lock is being set, it is determined whether an already registered pattern is used for unlocking (step S802). If an existing registered pattern is used for unlocking (YES in step S802), lock setting is executed (step S803), and this process ends.
  • step S804 If an existing registered pattern is not used (NO in step S802), it is determined whether a new pattern is set (step S804). If no new pattern is set (NO in step S804), this process ends.
  • step S804 When a new pattern is set (YES in step S804), pattern registration is executed (step S805). Since the pattern registration is as described in the above embodiment, its description is omitted.
•   the comparative example is a portable terminal device having a malfunction prevention lock on the touch panel unit, in which the call-end button is operated to end a call.
•   FIG. 45 is a diagram illustrating a comparative example of the mobile terminal device.
  • FIG. 46 is a flowchart illustrating a processing procedure of the end call.
  • the portable terminal device 220 has a touch panel unit 256 installed together with a display unit 248 on the front surface of a housing 278.
•   a microphone 266, a lock key 118, and a receiver 268 are installed on the housing 278 with the display unit 248 interposed between them, and an end button 120 is installed on the touch panel unit 256.
  • the end button 120 is enabled after the touch panel lock is released by depressing the lock key 118, and the end operation can be performed.
•   it is determined whether a call is in progress (step S901). If no call is in progress (NO in step S901), the process ends.
•   step S901 if a call is in progress (YES in step S901), the malfunction prevention operation of the touch panel unit 256 is started. That is, when the user places the portable terminal device 220 against the ear, the touch panel unit 256 detects this and sets the touch panel lock (step S902).
•   This touch panel lock is a process that blocks touch input to the touch panel unit 256 to prevent the call from being ended by careless touch input. In this state, the operation of the call-end button 120 is disabled.
  • step S903 the lock key 118 is pressed (step S903), and the lock is released (step S904). With this unlocking, the touch input on the touch panel unit 256 and the operation of the call end button 120 become effective.
•   step S905 when the call-end button 120 is selected and pressed (step S905), the call is ended (step S906).
  • the touch input pattern is recognized using at least two types of physical information obtained from the touch input as determination elements.
•   examples of the physical information include the number of touch input points, position, velocity, acceleration, angle, pressure, and angular velocity, but the physical information is not limited to these; any of these, or any combination of these, may be used.
  • the pattern recognition may be based on the logical product of two or more types of physical information.
•   examples include the logical product of the touch position and acceleration (second embodiment: FIG. 11), the logical product of the number of touch points, their positions, acceleration, angle, and pressure (third embodiment: FIG. 22), and the fourth embodiment (FIG. 32).
•   the logical product may combine different types of physical information; at least two types are used, and three or more types may be combined to generate a pattern by logical product.
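A minimal sketch of pattern determination as a logical product of per-element checks, under the assumption that each kind of physical information is compared independently and the pattern matches only if every check passes. The element names and tolerances below are illustrative, not taken from the disclosure.

```python
from typing import Callable, Dict

# Each checker compares one kind of physical information against its registered value.
Checker = Callable[[float, float], bool]

def logical_product_match(input_info: Dict[str, float],
                          registered_info: Dict[str, float],
                          checkers: Dict[str, Checker]) -> bool:
    """The input pattern matches only if every selected element matches (logical AND)."""
    return all(checkers[name](input_info[name], registered_info[name])
               for name in checkers)

# Example: combine position and acceleration, as in the second embodiment (FIG. 11).
checkers = {
    "position": lambda a, b: abs(a - b) <= 5.0,            # within 5 coordinate units
    "acceleration": lambda a, b: 0.8 * b <= a <= 1.2 * b,  # within +/-20 %
}
print(logical_product_match({"position": 12.0, "acceleration": 48.0},
                            {"position": 10.0, "acceleration": 50.0},
                            checkers))  # True
```

Adding a third element (for example pressure or angular velocity) is just another entry in `checkers`, matching the text's statement that three or more types may be combined.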
  • the acceleration sensor unit 60 is provided, but the present invention is not limited to this.
  • a configuration may be used in which the velocity is obtained from the movement of the touch position per unit time, and the acceleration is calculated by differentiating the velocity.
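The sensorless alternative described above — obtaining velocity from the movement of the touch position per unit time and calculating acceleration by differentiating that velocity — can be approximated with finite differences over sampled positions. The sampling interval and data below are illustrative assumptions.

```python
import math

def accelerations(samples, dt):
    """Finite-difference estimate of acceleration from sampled touch positions.

    samples: list of (x, y) touch positions taken every dt seconds.
    Returns a list of acceleration estimates (change in speed per second).
    """
    speeds = []
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        # Velocity magnitude: distance moved per unit time.
        speeds.append(math.hypot(x1 - x0, y1 - y0) / dt)
    # Acceleration: differentiate (difference) the velocity over time.
    return [(v1 - v0) / dt for v0, v1 in zip(speeds, speeds[1:])]

# A point moving 1, then 2, then 4 units per 1-second sample:
print(accelerations([(0, 0), (1, 0), (3, 0), (7, 0)], dt=1.0))  # [1.0, 2.0]
```

On real hardware `dt` would be the touch panel's sampling period, and some smoothing would typically be applied before differencing to suppress sensor noise.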
  • the angular velocity sensor unit 64 detects the angular velocity of the mobile terminal device 20, but is not limited to this.
•   a sensor that detects the movement of the mobile terminal device 20 may be provided, and the angular velocity may be calculated from the sensor output. Alternatively, instead of the angular velocity of the mobile terminal device 20, the angular velocity of the touch position with respect to the touch panel unit 56 may be calculated and used as physical information.
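As a sketch of the last alternative, the angular velocity of the touch position with respect to the touch panel could be estimated from successive touch samples about a pivot point. The pivot and sample values are assumptions, and angle wrap-around at ±π is not handled in this sketch.

```python
import math

def touch_angular_velocity(p0, p1, pivot, dt):
    """Angular velocity (rad/s) of the touch position about a pivot on the panel."""
    a0 = math.atan2(p0[1] - pivot[1], p0[0] - pivot[0])
    a1 = math.atan2(p1[1] - pivot[1], p1[0] - pivot[0])
    return (a1 - a0) / dt

# A touch moving a quarter turn around the panel centre in one second:
w = touch_angular_velocity((1.0, 0.0), (0.0, 1.0), pivot=(0.0, 0.0), dt=1.0)
print(round(w, 4))  # 1.5708  (about pi/2 rad/s)
```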
  • the mobile terminal device 20 is illustrated as an example of the electronic device, but the electronic device of the present disclosure is not limited to this. Any electronic device capable of touch input may be used.
•   as long as it is an electronic device capable of touch input, it may be a personal computer (PC) 320 (FIG. 47), a foldable mobile phone 420 (FIG. 48), a remote controller 520 (FIG. 49), a game device, or an automobile engine starting device.
•   the PC 320 is configured such that the keyboard-side casing 322 and the display-side casing 324 can be folded via a hinge 326, and the display unit 48 and the touch panel unit 56 are set in the display-side casing 324.
•   the touch panel unit 56 is configured according to the above-described embodiment, and can perform pattern recognition and pattern registration from touch input by the above-described processing. For this reason, this PC 320 can also realize the same functions as the above embodiment and obtain the same effects.
  • the mobile phone 420 is an example of an electronic device or the mobile terminal device 20 described above.
•   the operation-side housing unit 422 and the display-side housing unit 424 are connected by a hinge unit 426.
  • the touch panel unit 56 is set in the display-side housing unit 424 together with the display unit 48.
  • the touch panel unit 56 is configured according to the above-described embodiment, and can perform pattern recognition and pattern registration by the above-described processing from touch input. For this reason, this mobile phone 420 can realize the same functions as those in the above-described embodiment and obtain the effects.
  • the keypad 428 is included in the input unit 46 (FIG. 4) described above.
•   the remote controller 520 is a means for remotely operating various electronic devices 522, such as a television receiver, using radio waves or sound waves, as shown in FIG.
  • a remote controller 520 can be equipped with the functions described for the electronic device or the mobile terminal device of the present disclosure, and the same functions and effects can be obtained.
  • the engine can be started with an input pattern that matches the pattern registered only by the user, and security can be improved.
•   the force [N] at the point is exemplified (FIGS. 17, 21, 39, and 43), but the determination is not limited to this.
•   a pressure [Pa] obtained by dividing the force acting on the point by the contact area may be used instead.
•   the electronic device, operation detection method, or operation detection program according to the present disclosure recognizes a pattern using at least two types of physical information obtained from touch input as determination elements, and performs pattern registration and input pattern determination. It therefore improves the convenience of devices including a touch input unit and is useful.
•   10 Pattern determination unit; 40 Lock control unit; 42 Processor (application activation unit); 44 Sensor unit; 52 Communication unit (end call function unit, login function unit); 54 Storage unit; 56 Touch panel unit; 58 Pressure sensor unit; 60 Acceleration sensor unit; 62 Position sensor unit; 64 Angular velocity sensor unit; 72 Program storage unit; 74 Data storage unit

Abstract

An electronic device is provided with a touch input unit (4), a pattern recognition unit (6), a registration unit (pattern registration unit (8)), and a pattern determination unit (10). The pattern recognition unit recognizes a pattern using at least two kinds of physical information obtained from a touch input as elements of determination. The pattern registration unit registers the recognized pattern. When the recognized pattern matches a registration pattern in the registration unit or matches the approximate range of the registration pattern, the pattern determination unit generates an output indicating the matching.

Description

Electronic device, operation detection method, and operation detection program
The present invention relates to operation detection for electronic devices equipped with a touch panel, such as mobile terminal devices, and relates, for example, to an electronic device, an operation detection method, and an operation detection program that identify an operation pattern from a plurality of physical elements obtained from touch input.
In a mobile terminal device equipped with a touch panel, various inputs, such as entering characters and figures or making selections, can be performed by touching the touch panel with a finger or a touch pen.
For this type of portable terminal device, a control is known that switches the key input mode of the touch panel to an erroneous-operation prevention mode and invalidates the call-ending operation by key input (Patent Document 1).
JP 2005-269567 A
In a mobile terminal device equipped with a touch panel, the device can recognize during a call that the ear is touching the touch panel, lock the call, and prohibit the call from being ended. This locked state can be released by pressing an unlock button on the mobile terminal device, and the operation of the call-end button is invalidated during the call. That is, the call-end button is disabled to prevent the call from being ended inadvertently. For this reason, ending a call requires operating the call-end button after unlocking, which is inconvenient.
Furthermore, if an unlock button is provided for unlocking, its operation and control are required. If, instead, a simple touch operation replacing the unlock button is assigned to the touch panel together with the lock function, there is the inconvenience that the lock may be released by an inadvertent touch.
Accordingly, an object of the electronic device, the operation detection method, and the operation detection program of the present disclosure is to improve the discriminability of touch input by using at least two types of physical information as determination elements, thereby improving the convenience of touch operation.
To achieve the above object, the electronic device of the present disclosure includes a touch input unit, a pattern recognition unit, and a pattern determination unit. The touch input unit is a means for input by touches or strokes with a finger, a touch pen, or the like. The pattern recognition unit recognizes a pattern using at least two types of physical information obtained from touch input as determination elements. When the pattern recognized by the pattern recognition unit matches a registered pattern registered in advance in a registration unit, or matches an approximate range of the registered pattern, the pattern determination unit generates an output indicating the match.
To achieve the above object, the operation detection method of the present disclosure is an operation detection method executed by an electronic device equipped with a touch input unit, and includes a pattern recognition step and a pattern determination step. The pattern recognition step recognizes a pattern using at least two types of physical information obtained from touch input as determination elements. When the recognized pattern matches a registered pattern registered in advance in a registration unit, or matches an approximate range of the registered pattern, the pattern determination step generates an output indicating the match.
To achieve the above object, the operation detection program of the present disclosure is an operation detection program executed by an electronic device equipped with a touch input unit, and includes a pattern recognition function and a pattern determination function. The pattern recognition function recognizes a pattern using at least two types of physical information obtained from touch input as determination elements. When the recognized pattern matches a registered pattern registered in advance in a registration unit, or matches an approximate range of the registered pattern, the pattern determination function generates an output indicating the match.
According to the electronic device, the operation detection method, or the operation detection program of the present disclosure, the following effects can be obtained.
(1) Since the touch input pattern is identified using at least two types of physical information obtained from touch input as determination elements, the discriminability of touch input is improved and the convenience of touch operation can be improved.
(2) Since a pattern is recognized and generated using at least two types of physical information obtained from touch input as determination elements, the influence of touch input caused by a user's unconscious operation can be eliminated or reduced.
(3) Since a pattern generated using at least two types of physical information as determination elements is used, the processing accuracy and security of unlocking, call-end processing, application activation, website login, and the like are improved.
Other objects, features, and advantages of the present invention will become clearer with reference to the accompanying drawings and the embodiments.
A diagram showing an example of the electronic device according to the first embodiment.
A flowchart showing the processing procedure of pattern registration and pattern determination of touch input.
A diagram showing an example of the functional units of the mobile terminal device according to the second embodiment.
A diagram showing an example of the hardware configuration of the mobile terminal device according to the second embodiment.
A diagram showing the mobile terminal device as seen from the touch panel unit side.
A diagram showing a mobile terminal device provided with a pressure sensor unit.
A diagram showing a configuration example of the touch panel unit.
A diagram showing touch input to the touch panel unit.
A diagram showing touch detection by the touch panel unit.
A flowchart showing the processing procedure of touch input position detection.
A flowchart showing the processing procedure of touch input acceleration calculation.
A flowchart showing the processing procedure of touch input pressure calculation.
A flowchart showing the processing procedure of angular velocity calculation.
A flowchart showing the processing procedure of lock setting.
A flowchart showing the processing procedure of pattern registration.
A flowchart showing the processing procedure of lock release.
A flowchart showing the processing procedure of lock release.
A flowchart showing the processing procedure of call-end setting.
A flowchart showing the processing procedure of pattern registration.
A flowchart showing the processing procedure of ending a call.
A flowchart showing the processing procedure of ending a call.
A flowchart showing the processing procedure of lock setting according to the third embodiment.
A diagram showing an input operation of touch input.
A diagram showing a touch input.
A diagram showing another input operation of touch input.
A diagram showing another touch input.
A diagram showing the acceleration change of a touch input.
A diagram showing the angle range of a touch input.
A diagram showing the pressure change of a touch input.
A diagram showing another input operation of touch input.
A diagram showing another touch input.
A flowchart showing the processing procedure of lock release.
A flowchart showing the processing procedure of ending a call according to the fourth embodiment.
A diagram showing a display example of the inquiry display and selection buttons at the end of a call.
A diagram showing a display example of ending a call.
A flowchart showing the processing procedure of application activation setting according to the fifth embodiment.
A flowchart showing the processing procedure of pattern registration.
A flowchart showing the processing procedure of application activation.
A flowchart showing a specific processing procedure of application activation.
A flowchart showing the processing procedure of website login setting according to the sixth embodiment.
A flowchart showing the processing procedure of pattern registration.
A flowchart showing the processing procedure of website login.
A flowchart showing a specific processing procedure of website login.
A flowchart showing the processing procedure of lock setting according to the seventh embodiment.
A diagram showing a comparative example of a mobile terminal device.
A flowchart showing the processing procedure of ending a call as a comparative example.
A diagram showing an example of a personal computer according to another embodiment.
A diagram showing an example of a mobile phone according to another embodiment.
A diagram showing an example of a remote controller according to another embodiment.
[First Embodiment]
The first embodiment recognizes a touch input pattern using at least two types of physical information obtained from touch input as determination elements, and determines whether the input pattern matches a registered pattern or a neighborhood range of the registered pattern. Here, touch input includes a touch on the touch input unit or an input by its stroke. The physical information used as determination elements is any of the number of touch points, position, velocity, acceleration, angle, pressure, and angular velocity. A pattern includes, for example, a point, line, or figure identified from a touch or its stroke using the above determination elements; a line may be continuous or discontinuous, and may be a curve. By using such physical information obtained from touch input as determination elements, touch input caused by a user's unconscious operation (stroke) is excluded, and malfunctions due to unconscious operation are reduced or prevented.
For this first embodiment, refer to FIG. 1. FIG. 1 is a diagram illustrating an electronic device according to the first embodiment. The configuration of FIG. 1 is an example, and the present invention is not limited to this configuration.
This electronic device 2 is an example of the electronic device, operation detection method, or operation detection program of the present disclosure. As illustrated in FIG. 1, it includes, as functional units, a touch input unit 4, a pattern recognition unit 6, a pattern registration unit 8, and a pattern determination unit 10.
The touch input unit 4 accepts a touch with a finger or a touch pen (for example, a stylus pen) as touch input means, and converts the input of points and strokes into electronic information. A touch point is a touch position of the finger or touch pen, and is a recognizable point, or a line of such points (which may be a set of points), set on the touch input unit 4. A stroke is, for example, a touch operation at or above a predetermined pressure, and is a movement (trajectory) of successive touch points.
The pattern recognition unit 6 is a functional unit that recognizes the touch input pattern using the above-described physical information obtained from touch input to the touch input unit 4 as determination elements. The physical information of the determination elements is as described above. The number of points is, for example, the number of touch points at or above a predetermined pressure constituting a stroke. The position is, for example, the position of a touch point on the coordinates, or the trajectory position of the touch point. The angle is, for example, the angle of the trajectory on the coordinates. The velocity is, for example, the moving speed of the touch point on the coordinates. The acceleration is, for example, the acceleration of the touch point's movement on the coordinates. The angular velocity is, for example, the angular velocity along the trajectory of the touch point on the coordinates. The pressure is, for example, a pressing force at or above a predetermined value (threshold).
The pattern registration unit 8 is a functional unit that registers the pattern recognized by the pattern recognition unit 6. Since a registered pattern uses at least two types of the physical information described above as determination elements, there are various combinations, for example, the number of points and position, point position and angle, or point position and velocity. The registered pattern is a pattern identified from touch input; it may be the touch input itself, or the identified pattern may be given a width so that patterns within an approximate range of the identified pattern are also registered.
The pattern determination unit 10 is a functional unit that determines whether an input pattern matches the registered pattern. That is, the pattern determination unit 10 determines whether the pattern recognized by the pattern recognition unit 6 matches the registered pattern in the pattern registration unit 8. In this case, it may also determine whether the input pattern falls within the approximate range of the registered pattern. If the input pattern matches the registered pattern, or matches its approximate range, an output indicating the match is generated.
With this configuration, from a touch input on the touch input unit 4, the pattern recognition unit 6 recognizes a pattern using the above-described determination elements, and the pattern recognized by the pattern recognition unit 6 is registered in the pattern registration unit 8.
Thus, an input pattern is recognized from a touch input on the touch input unit 4, it is determined whether the input pattern matches the registered pattern in the pattern registration unit 8 or its approximate range, and a determination output indicating the result is obtained.
For the registration and determination of this pattern, refer to FIG. 2. FIG. 2 is a flowchart showing the pattern processing procedure.
This pattern processing procedure is an example of the operation detection method or operation detection program of the present disclosure. As shown in FIG. 2, it includes touch input (step S1), pattern recognition (step S2), determination of whether a pattern is registered (step S3), pattern registration (step S4), pattern determination (step S5), and generation of a determination output (step S6).
In this processing procedure, the touch input unit 4 detects a touch input (step S1), and the pattern recognition unit 6 recognizes a pattern using the above-described physical information obtained from the touch input as determination elements (step S2).
After this pattern recognition, it is determined whether a pattern is registered in the pattern registration unit 8 (step S3). If no pattern is registered (NO in step S3), the recognized pattern is registered in the pattern registration unit 8 (step S4), and this processing ends. The pattern to be registered may be the pattern obtained from the touch input, or a pattern within an approximate range of that pattern may be registered.
If a pattern is registered (YES in step S3), it is determined whether the input pattern recognized from the touch input matches the registered pattern (step S5). This determination may include the approximate range of the registered pattern. The pattern determination unit 10 generates and outputs a determination output representing whether or not the input pattern matches the registered pattern or its approximate range (step S6).
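As one way to picture the flow of steps S1 through S6, the following Python sketch recognizes a pattern from a touch input and either registers it or judges it against the stored one. It is an illustration only, not the patent's implementation: the pattern is reduced to a tuple of points, `registry` stands in for the pattern registration unit 8, and an exact-match comparison stands in for the approximate-range determination.

```python
def process_touch_input(touch_points, registry):
    """Sketch of FIG. 2: recognize a pattern from a touch input and either
    register it (no pattern stored yet) or judge it against the stored one."""
    # Step S2: recognize a pattern from the touch input (here: a tuple of points)
    pattern = tuple(touch_points)

    # Step S3: is a pattern already registered?
    if registry.get("pattern") is None:
        # Step S4: register the recognized pattern and finish
        registry["pattern"] = pattern
        return "registered"

    # Steps S5-S6: compare the input pattern with the registered pattern and
    # produce a determination output (exact match stands in for the
    # approximate-range judgment described in the text).
    return "match" if pattern == registry["pattern"] else "mismatch"


registry = {"pattern": None}
print(process_touch_input([(9, 1), (8, 1), (7, 1)], registry))  # first input is registered
print(process_touch_input([(9, 1), (8, 1), (7, 1)], registry))  # same stroke matches
print(process_touch_input([(1, 1), (2, 2)], registry))          # different stroke does not
```

The determination output here is just a string; in the embodiments that follow, it drives unlocking, call termination, and similar controls.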
According to the above configuration, a touch input pattern is generated using at least two types of physical information obtained from the touch input as determination elements, and the registered pattern is compared with the input pattern. As a result, the distinctiveness of touch inputs is increased, and the convenience of touch operation is improved.
Because the pattern is recognized and generated using the above-described physical information obtained from the touch input as determination elements, the influence of touch inputs caused by the user's unconscious operations can be reduced.
Furthermore, because the pattern is generated with the above-described physical information as determination elements, it can be used for unlocking, end-of-call processing, application activation, login to WWW (World Wide Web) sites, and the like, improving processing accuracy and security.
[Second Embodiment]
The second embodiment is a mobile terminal device equipped with a touch panel unit. This mobile terminal device recognizes a touch input pattern using the above-described physical information obtained from a touch input as determination elements, and determines whether the input pattern matches a registered pattern or a range near the registered pattern.
The second embodiment will be described with reference to FIG. 3. FIG. 3 is a diagram illustrating an example of the functional units of the mobile terminal device. The configuration shown in FIG. 3 is an example, and the present invention is not limited to this configuration. In FIG. 3, the same parts as in FIG. 1 are given the same reference numerals.
The mobile terminal device 20 is an example of the electronic device, operation detection method, or operation detection program of the present disclosure. As shown in FIG. 3, the mobile terminal device 20 includes a pattern determination unit 10, a pattern recognition unit 16, a touch detection unit 22, a position detection unit 24, an acceleration calculation unit 26, a pressure calculation unit 28, an angular velocity calculation unit 30, a pattern registration control unit 32, a communication control unit 34, a voice input/output control unit 36, a display control unit 38, and a lock control unit 40. These are functional units realized by computer processing (calculation, determination, and various controls).
The pattern determination unit 10 is as described above. The pattern recognition unit 16 recognizes a touch input pattern using the above-described physical information obtained from the touch input detected by the touch detection unit 22 as determination elements. In this embodiment, the determination elements of the physical information are as follows.
(1) The number of points touched on the touch panel unit 56 (FIG. 4) (the number of points is 1 in the examples of FIGS. 8, 9, 25, and 26, and 2 in the examples of FIGS. 23, 24, and 28)
(2) The position of each point (coordinate position X, Y)
(3) The speed, acceleration, and angle with which a point is moved
(4) The pressure with which a finger or touch pen presses the touch panel unit 56 (FIG. 4)
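The determination elements (1) through (4) above can be pictured as a small record type. The sketch below is illustrative only; the class and field names (`TouchSample`, `TouchInput`, and so on) are not from the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class TouchSample:
    """One sample of the physical information used as determination elements:
    element (2) position, element (3) speed/acceleration/angle, and
    element (4) pressure. Field names are illustrative."""
    position: Tuple[float, float]  # (X, Y) coordinate position
    speed: float                   # point movement speed
    acceleration: float            # point movement acceleration
    angle: float                   # movement angle
    pressure: float                # pressure on the touch panel unit


@dataclass
class TouchInput:
    """Element (1) is the number of simultaneously touched points:
    one touched point yields one sample track, two points yield two."""
    tracks: List[List[TouchSample]]

    @property
    def point_count(self) -> int:
        return len(self.tracks)


single = TouchInput(tracks=[[TouchSample((9, 2), 1.0, 0.1, 45.0, 0.8)]])
print(single.point_count)  # 1, as in the examples of FIGS. 8 and 9
```

A two-finger input as in FIGS. 23 and 24 would simply carry two tracks, giving a point count of 2.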
The touch detection unit 22 detects, for example, a touch or stroke by a finger or touch pen as a touch input to the touch panel unit 56 (FIG. 4), and obtains touch information as physical information.
The position detection unit 24 detects the touch position, stroke position, angle, and the like of a finger or touch pen on the touch panel unit 56 (FIG. 4), and obtains physical information (position information).
The acceleration calculation unit 26 calculates acceleration as physical information from the position information of a touch input by a finger or touch pen on the touch panel unit 56 (FIG. 4).
The pressure calculation unit 28 calculates the pressure at the touch position of a finger or touch pen on the touch panel unit 56 (FIG. 4); that is, it calculates the pressure received by the touch panel unit 56 at the touch position and obtains the pressure as physical information.
The angular velocity calculation unit 30 calculates the angular velocity of the mobile terminal device 20 and outputs angular velocity information as physical information.
The pattern registration control unit 32 controls the registration of the patterns recognized by the pattern recognition unit 16. The pattern to be registered may be a pattern within the approximate range of the pattern recognized from the touch input. The pattern recognition unit 16 recognizes a touch input pattern using at least two types of physical information obtained from the touch detection unit 22, the position detection unit 24, the acceleration calculation unit 26, the pressure calculation unit 28, or the angular velocity calculation unit 30. The recognized pattern is registered, or is used for comparison with a registered pattern.
The communication control unit 34 controls wireless communication and voice calls; if the input pattern matches the registered pattern, it performs call origination control or call termination control according to the determination output.
The voice input/output control unit 36 controls the output of voice signals and the capture of voice input.
The display control unit 38 controls the display of characters and images on the display unit 48 (FIG. 4).
The lock control unit 40 constitutes a functional unit that controls the locking of the functions of the mobile terminal device 20 and their unlocking when a touch input is appropriate.
Next, the hardware of the mobile terminal device will be described with reference to FIGS. 4, 5, and 6. FIG. 4 is a diagram illustrating a hardware configuration example of the mobile terminal device, FIG. 5 is a diagram illustrating the mobile terminal device viewed from the touch panel unit side, and FIG. 6 is a diagram illustrating the mobile terminal device including the pressure sensor unit. The configurations shown in FIGS. 4, 5, and 6 are examples, and the present invention is not limited to these configurations. In FIGS. 4, 5, and 6, the same parts as in FIG. 3 are given the same reference numerals.
The mobile terminal device 20 includes hardware that implements the above-described functional units (FIG. 3). As shown in FIG. 4, it includes a processor 42, a sensor unit 44, an input unit 46, a display unit 48, a voice input/output unit 50, a communication unit 52, and a storage unit 54.
The processor 42 executes the OS (Operating System) and application programs stored in the storage unit 54, and performs input control such as capturing detection information from the sensor unit 44, as well as detection of various information, calculation, display control, communication control, and the like. The above-described functional units (FIG. 3) are realized by the processor 42. The processor 42 also constitutes an application activation unit that activates applications in the program storage unit 72 according to the pattern determination.
The sensor unit 44 includes a touch panel unit 56, a pressure sensor unit 58, an acceleration sensor unit 60, a position sensor unit 62, and an angular velocity sensor unit 64. The touch panel unit 56 is installed on the front of the display unit 48 and detects touch inputs by a finger or touch pen. The pressure sensor unit 58 detects the pressure applied to the touch panel unit 56 by a finger or touch pen. The acceleration sensor unit 60 outputs acceleration information detected from the touch input of a finger or touch pen. The position sensor unit 62 outputs position information (coordinate information) detected from the touch input of a finger or touch pen. The angular velocity sensor unit 64 detects the angular velocity of the mobile terminal device 20 and outputs its angular velocity information.
Under the control of the processor 42, the input unit 46 inputs the sensor information from the sensor unit 44 to the processor 42.
The display unit 48 is installed behind the touch panel unit 56 and, under the control of the processor 42, displays the input buttons assigned to the touch panel unit 56 and output information.
The voice input/output unit 50 includes a microphone 66 as voice input means and a receiver 68 as voice output means; under the control of the processor 42, it captures voice from the microphone 66 and reproduces voice through the receiver 68.
The communication unit 52 includes an antenna 70 and executes communication with a radio base station under the control of the processor 42. Together with the processor 42, the communication unit 52 constitutes an end-of-call function unit that terminates a call and a login function unit that logs in to a website, based on the pattern determination.
The storage unit 54 includes a program storage unit 72, a data storage unit 74, and a RAM (Random-Access Memory) 76. The program storage unit 72 stores the above-described OS and application programs, which include, for example, an operation detection program, a lock program, and an unlock program. The data storage unit 74 includes a pattern registration unit in which the patterns to be compared with input patterns are registered. The RAM 76 constitutes a work area for various processes such as the registration and determination of patterns recognized from input strokes.
As shown in FIG. 5, the touch panel unit 56 is installed together with the display unit 48 on the front of the housing 78 of the mobile terminal device 20. A microphone 66 and a receiver 68 are installed in the housing 78 on either side of the display unit 48.
As shown in FIG. 6, a pressure sensor unit 58 is installed on the back side of the touch panel unit 56. When the pad of a finger 80 touches the upper surface of the touch panel unit 56 and a pressure P is applied through the touch panel unit 56, the pressure P is detected by the pressure sensor unit 58. The detection output is an electrical signal representing pressure information, and its level represents the pressure level.
Next, the touch panel unit and touch input will be described with reference to FIGS. 7, 8, and 9. FIG. 7 is a diagram showing the touch panel unit, FIG. 8 is a diagram showing the touch panel unit and a point movement, and FIG. 9 is a diagram showing the detection points that detected the point movement.
As shown in FIG. 7, addresses in a plurality of rows and columns are set on the panel surface 82 of the touch panel unit 56; in this embodiment, detection points 84 are set in 12 rows and 10 columns. That is, the detection points 84 in 12 rows and 10 columns constitute an example of the position sensor unit 62. The letters a, b, ..., l represent the rows (X coordinates), and the numerals 1, 2, ..., 10 represent the columns (Y coordinates).
As shown in FIG. 8, a spiral stroke 86 is drawn on the panel surface 82 of the touch panel unit 56 as an example of a touch input using a finger or touch pen, thereby generating a point locus 88. The stroke 86 is an example of an input operation that the user does not perform unconsciously.
The stroke 86 has its start point 90 in the upper right of the touch panel unit 56, moves in a curve from the start point 90 toward the center, forms a loop portion 92 together with a cross portion 94 at the center, moves in a curve from the cross portion 94 toward the lower left, and forms an end point 96 at the lower left. The point locus 88 starts at point 9b (the start point 90) and runs to points 8b-7b-7c-6c-5c-5d-4d-3e; with point 3e as an inflection point, it runs to points 4e-4f-4g-5g-6g-7g-7f; with point 7f as an inflection point, it runs to points 7e-6e-5e-5f-4f, and the point locus 88 crosses itself at point 4f; from point 4f it runs to 4g-3g-3h-2i-2j-2k-2l, and point 2l is the end point 96. That is, the point locus 88 runs along a diagonal from the upper right corner of the panel surface 82 toward the lower left corner, forming the loop portion 92 and the cross portion 94 in its middle portion.
As shown in FIG. 9, the point locus 88 corresponding to the stroke 86 is detected by a continuous sequence of detection points on the panel surface 82. That is, the point locus 88 is detected at detection points 9b-8b-7b-7c-6c-5c-5d-4d-3e-4e-3f-4f-5g-6g-7g-7f-7e-6e-5e-5f-4f-4g-3g-3h-2h-2i-2j-2k-2l. On this point locus 88, detection point 4f is detected twice with a time difference Δt, and from this it can be seen that the cross portion 94 exists on the point locus 88 running from the start point 90 to the end point 96.
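The detection of the cross portion from a repeated detection point can be sketched as follows. This is a hypothetical illustration, not the patent's implementation: detections are modeled as (time, point-label) pairs, and a point reported more than once is taken to indicate a crossing, with the time difference Δt between its detections.

```python
def find_cross_points(detections):
    """Return detection points reported more than once, with the time
    difference between their first and last detection. A repeated point
    (4f in FIG. 9) indicates a cross portion in the point locus."""
    first_seen = {}
    crosses = {}
    for t, point in detections:
        if point in first_seen:
            crosses[point] = t - first_seen[point]  # time difference Δt
        else:
            first_seen[point] = t
    return crosses


# Abbreviated locus from FIG. 9: point "4f" is hit twice.
locus = [(0, "9b"), (1, "8b"), (2, "4f"), (3, "5g"), (4, "7f"), (5, "4f"), (6, "2l")]
print(find_cross_points(locus))  # {'4f': 3}
```

A locus with no repeated detection point yields an empty result, i.e. no cross portion.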
The point locus 88 corresponding to the stroke 86 constitutes a pattern 98. That is, the touch input stroke 86 is recognized as the pattern 98, an aggregate of the point locus 88. A pattern 100, the approximate range of the pattern 98, is formed from the plurality of detection points surrounding the pattern 98: 6a, 7a, 8a, 9a, 10a, 4b, 5b, 6b, 10b, 3c, 4c, 8c, 9c, 10c, 2d, 3d, 6d, 7d, 8d, 2e, 8e, 2f, 6f, 8f, 1g, 2g, 8g, 1h, 4h, 5h, 6h, 7h, 8h, 1i, 3i, 4i, 1j, 3j, 1k, 3k, 1l, and 3l. In the figure, the pattern 98 is hatched, and the pattern 100 is indicated by a thick solid enclosing line.
With this configuration, the pattern 98 can be recognized from the stroke 86, an input not performed unconsciously, using the touch positions and touch points that are the physical-information determination elements of the touch input. In addition, the pattern 100, the neighborhood range, can be recognized from the detection points 6a through 3l surrounding the pattern 98 together with the detection points inside the loop portion 92 (in this case including detection point 6f). Either the pattern 98 or the pattern 100 may then be registered.
Next, position detection, acceleration calculation, pressure calculation, and angular velocity calculation for touch input will be described with reference to FIGS. 10, 11, 12, and 13. FIG. 10 is a flowchart showing the position detection procedure, FIG. 11 is a flowchart showing the acceleration calculation procedure, FIG. 12 is a flowchart showing the pressure calculation procedure, and FIG. 13 is a flowchart showing the angular velocity calculation procedure.
The touch input position detection procedure detects the position and locus of a touch on the touch panel unit 56 and outputs position information representing that position and locus.
In this procedure, as shown in FIG. 10, the touch panel unit 56 is touched with a finger or touch pen (step S111), position information on the X and Y axes, the coordinate axes representing the touch position on the touch panel unit 56, is detected (step S112), and the position information, a determination element of the touch input, is output (step S113).
Next, the touch input acceleration calculation procedure calculates acceleration from the position information representing the position and locus of a touch on the touch panel unit 56.
In this procedure, as shown in FIG. 11, the contact of a finger or touch pen with the touch panel unit 56 and its position are monitored (step S121). If a finger or touch pen is touching the touch panel unit 56 (YES in step S121), the X-axis and Y-axis position information representing the touch position on the touch panel unit 56 is detected (step S122), and it is monitored whether the position information changes (step S123). If the position information has changed (YES in step S123), the X-axis and Y-axis position information of the new touch position is detected (step S124), the acceleration is calculated from this position information (step S125), and the acceleration, a determination element of the touch input, is output (step S126).
If the X-axis and Y-axis position information representing the touch position on the touch panel unit 56 has not changed (NO in step S123), no acceleration is calculated and this processing ends.
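The calculation of steps S122 through S125 amounts to differentiating the sampled positions twice. The following sketch assumes timestamped (t, x, y) samples at arbitrary intervals; it is an illustration, not the patent's algorithm.

```python
import math


def acceleration_from_positions(samples):
    """Estimate acceleration from three timestamped positions (t, x, y),
    as in steps S122-S125: speed from successive position changes,
    acceleration from the change in speed. Returns None if the position
    never changed (NO in step S123)."""
    if len(samples) < 3:
        return None
    (t0, x0, y0), (t1, x1, y1), (t2, x2, y2) = samples[:3]
    if (x0, y0) == (x1, y1) == (x2, y2):
        return None  # position unchanged: no acceleration is calculated
    v1 = math.hypot(x1 - x0, y1 - y0) / (t1 - t0)
    v2 = math.hypot(x2 - x1, y2 - y1) / (t2 - t1)
    return (v2 - v1) / (t2 - t1)


# The point moves 1 unit in the first interval, 3 units in the second (dt = 1 each):
print(acceleration_from_positions([(0, 0, 0), (1, 1, 0), (2, 4, 0)]))  # 2.0
print(acceleration_from_positions([(0, 5, 5), (1, 5, 5), (2, 5, 5)]))  # None
```

A real implementation would take samples continuously from the position sensor unit 62 and smooth the estimate over more than three points.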
Next, the touch input pressure calculation procedure detects the position and locus of a touch on the touch panel unit 56 and outputs the pressure at that position and locus.
In this procedure, as shown in FIG. 12, the contact of a finger or touch pen with the touch panel unit 56 and its position are monitored (step S131), the pressure at the portion in contact with the touch panel unit 56 is calculated (step S132), and the pressure, a determination element of the touch input, is output (step S133). In this embodiment, whether to output the calculated pressure is determined in step S133, and if it is not output, the processing returns to step S131; however, the pressure may be output without making this determination.
Next, the angular velocity calculation procedure monitors the movement of the mobile terminal device and calculates its angular velocity.
In this procedure, as shown in FIG. 13, it is monitored whether the mobile terminal device 20 is stationary (step S141). If the mobile terminal device 20 is not stationary (NO in step S141), its angular velocity is calculated from its movement (step S142). If the mobile terminal device 20 is stationary (YES in step S141), the angular velocity Vθ of the mobile terminal device 20 is zero (Vθ = 0) (step S143).
The angular velocity Vθ, or Vθ = 0, is then output as physical information serving as a determination element (step S144).
According to this configuration, the touch position, acceleration, pressure, and angular velocity are obtained as the physical information serving as determination elements of the touch input, and a touch input pattern can be recognized from this physical information.
Next, the lock setting of the mobile terminal device will be described with reference to FIG. 14. FIG. 14 is a flowchart showing the lock setting procedure.
In this lock setting procedure, a touch input pattern is recognized using the above-described physical information obtained from a touch input to the touch panel unit 56 as determination elements, and the registered pattern is set as the release information for the lock.
In this procedure, as shown in FIG. 14, it is determined whether a lock is to be set (step S201); if not (NO in step S201), the processing ends.
If a lock is to be set (YES in step S201), the pattern registration mode is activated and pattern registration is performed (step S202). It is determined whether a pattern has been registered (step S203); if not (NO in step S203), the processing ends. If a pattern has been registered (YES in step S203), the lock setting is executed (step S204). Since this lock setting is associated with the pattern registered by touch input, releasing it requires that the input pattern match the registered pattern.
Next, the pattern registration processing will be described with reference to FIG. 15. FIG. 15 is a flowchart showing the registration procedure.
This pattern registration procedure registers a touch input pattern on the touch panel unit 56 when a function lock or the like is set. The registered pattern is used in the user authentication processing for unlocking.
In this procedure, as shown in FIG. 15, the X-axis and Y-axis position information representing the point where a finger is placed on the touch panel unit 56 is detected (step S211), and it is determined to which range the point position corresponds (step S212). The pressure from the finger or touch pen is calculated (step S213), and the movement of the point is monitored (step S214). If the point moves (YES in step S214), the acceleration of the point movement, or alternatively the movement speed of the point, is calculated from the positional movement of the point (step S215).
A pattern is then generated from the position information, pressure, and acceleration of the touch input (step S216), the pattern is registered in the pattern registration unit (step S217), and the processing returns to step S202 of the main routine (FIG. 14). The lock setting is then executed.
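Steps S211 through S217 can be sketched as follows. The pattern layout (a dict of position, pressure, and acceleration) and the unit sample interval are illustrative assumptions, not the patent's data format.

```python
def register_pattern(position, pressure, positions_over_time, registry):
    """Sketch of FIG. 15: build a pattern from the touch position, the
    pressure, and the acceleration derived from point movement
    (steps S211-S216), then store it (step S217)."""
    acceleration = 0.0
    if len(positions_over_time) >= 3:           # the point moved (step S214)
        (x0, y0), (x1, y1), (x2, y2) = positions_over_time[:3]
        v1 = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        v2 = ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        acceleration = v2 - v1                  # unit sample interval assumed
    pattern = {
        "position": position,          # X, Y of the initial point (step S211)
        "pressure": pressure,          # pressure from finger or pen (step S213)
        "acceleration": acceleration,  # from point movement (step S215)
    }
    registry["pattern"] = pattern      # step S217: register the pattern
    return pattern


registry = {}
p = register_pattern((9, 2), 0.6, [(9, 2), (8, 2), (6, 2)], registry)
print(p["acceleration"])  # 1.0: speed grows from 1 to 2 cells per sample
```

The registered dict is what the unlocking procedure of FIG. 17 later compares against, element by element.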
Next, the unlocking processing will be described with reference to FIG. 16. FIG. 16 is a flowchart showing the unlocking procedure.
This unlocking procedure is the processing performed when a registered pattern exists as the unlock information.
In this procedure, as shown in FIG. 16, it is determined whether the lock is to be released (step S221); if not (NO in step S221), the processing ends. If the lock is to be released (YES in step S221), the above-described touch input is performed (step S222), and a touch input pattern is recognized using the above-described physical information obtained from the touch input as determination elements.
It is then determined whether the input pattern obtained from the touch input matches the registered pattern (step S223). If the input pattern matches the registered pattern (YES in step S223), the lock is released (step S224); if it does not match (NO in step S223), the lock cannot be released (step S225). As already described, this match determination may include the approximate range of the registered pattern.
This unlocking processing will now be described in more detail with reference to FIG. 17. FIG. 17 is a flowchart showing an example of a specific unlocking procedure.
In this procedure, as shown in FIG. 17, the position (X, Y) of the point where a finger or touch pen is placed on the touch panel unit 56 is calculated (step S231). As a first determination of whether the input pattern matches the registered pattern, it is determined to which part of the coordinate range (1a to 10l) the point position corresponds (step S232); that is, it is determined whether the point falls within the range in front of, behind, to the left of, and to the right of the registered point position (step S233). For example, if the registered point is 9b, it is determined whether the input point falls within points 8a to 10a, 8b to 10b, or 8c to 10c. If the point position does not fall within this coordinate range (NO in step S233), the lock cannot be released (step S234).
 この判定において、ポイントの位置の座標範囲に該当すれば(ステップS233のYES)、入力パターンが登録パターンに合致するかの第2の判断として、圧力について、登録パターンと比較し、判定する(ステップS235)。例えば、登録時のポイントの圧力を○○〔N〕とすると、その圧力が所定の圧力範囲として例えば、○○〔N〕×0.8~○○〔N〕×1.2の範囲にあるかを判定する(ステップS236)。この圧力判定において、所定の圧力範囲に該当しなければ(ステップS236のNG)、ロック解除不可となる(ステップS237)。 In this determination, if the coordinate range of the position of the point falls (YES in step S233), as a second determination whether the input pattern matches the registered pattern, the pressure is compared with the registered pattern for determination (step (step S233). S235). For example, if the pressure at the point of registration is XX [N], the pressure is within a predetermined pressure range, for example, XX [N] × 0.8 to XX [N] × 1.2. Is determined (step S236). In this pressure determination, if the pressure does not fall within the predetermined pressure range (NG in step S236), the lock cannot be released (step S237).
 このような圧力判定が所定の圧力範囲にあれば(ステップS236のOK)、ポイントの移動を監視する(ステップS238)。ポイントの移動があれば(ステップS238のYES)、ポイントの位置距離移動から加速度を算出し、この加速度について、登録パターンと比較し、判定する(ステップS239)。例えば、登録時のポイントの加速度が△△〔cm/s〕であれば、その加速度が所定の範囲として例えば、△△〔N〕×0.8~△△〔N〕×1.2の範囲に該当するかを判定する(ステップS240)。この加速度が所定の範囲外であれば(ステップS240のNG)、ロック解除不可となる(ステップS241)。 If such pressure determination is within a predetermined pressure range (OK in step S236), the movement of the point is monitored (step S238). If there is a movement of the point (YES in step S238), an acceleration is calculated from the movement of the position distance of the point, and this acceleration is compared with the registered pattern for determination (step S239). For example, if the acceleration at the point at the time of registration is ΔΔ [cm / s 2 ], the acceleration is within a predetermined range, for example, ΔΔ [N] × 0.8 to ΔΔ [N] × 1.2. It is determined whether it falls within the range (step S240). If this acceleration is outside the predetermined range (NG in step S240), the lock cannot be released (step S241).
 また、この加速度が所定の範囲内であれば(ステップS240のOK)、ロック解除となり(ステップS242)、この処理を終了する。 If this acceleration is within a predetermined range (OK in step S240), the lock is released (step S242), and this process is terminated.
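 Read as pseudocode, the determinations of steps S231 to S242 amount to nested tolerance checks. A minimal sketch, assuming the ±20% bands of the examples above and simplifying the grid neighborhood of step S233 to cell adjacency (all names are illustrative, not from the patent):

```python
def within_band(value, registered, low=0.8, high=1.2):
    # Tolerance band of +/-20% around the registered value (steps S236/S240).
    return registered * low <= value <= registered * high

def neighbors(cell):
    # Cells adjacent (including diagonals) to the registered grid cell,
    # e.g. registered 9b -> 8a..10a, 8b..10b, 8c..10c (step S233).
    col, row = cell  # e.g. (9, 'b') -> column 9, row 'b'
    rows = 'abcdefghijkl'
    r = rows.index(row)
    return {(c, rows[rr])
            for c in range(col - 1, col + 2)
            for rr in range(max(r - 1, 0), min(r + 2, len(rows)))}

def try_unlock(input_cell, input_pressure, input_accel, registered):
    # First determination: coordinate range (steps S232-S233)
    if input_cell not in neighbors(registered['cell']):
        return False  # lock cannot be released (step S234)
    # Second determination: pressure band (steps S235-S236)
    if not within_band(input_pressure, registered['pressure']):
        return False  # (step S237)
    # Third determination: acceleration band (steps S239-S240)
    if not within_band(input_accel, registered['accel']):
        return False  # (step S241)
    return True  # lock released (step S242)

registered = {'cell': (9, 'b'), 'pressure': 1.0, 'accel': 5.0}
unlock_ok = try_unlock((8, 'a'), 0.9, 5.5, registered)
unlock_bad = try_unlock((9, 'b'), 1.5, 5.0, registered)
```

 The checks are ordered as in FIG. 17: a failure at any stage leaves the lock held, and only passing all three releases it.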
 By setting and releasing the lock using a pattern recognized from the physical information obtained from the touch input as determination elements in this way, unlocking by an unconscious operation of the user can be prevented, and security can be improved.
 Next, the end-of-call setting process is described with reference to FIG. 18. FIG. 18 is a flowchart showing the processing procedure for the end-of-call setting.
 In this end-of-call setting procedure, a touch input pattern is recognized for the end-of-call process using the physical information obtained from touch input to the touch panel unit 56 as determination elements, and the registered pattern is set as end-of-call information.
 In this procedure, as shown in FIG. 18, it is determined whether the end-of-call setting is to be performed (step S301). If not (NO in step S301), the process ends.
 If the end-of-call setting is to be performed (YES in step S301), the pattern registration mode is activated and pattern registration is performed (step S302). It is then determined whether a pattern has been registered (step S303). If no pattern has been registered (NO in step S303), the process ends. If a pattern has been registered (YES in step S303), the end-of-call setting process is executed (step S304). Since this end-of-call setting is associated with the pattern registered by touch input, ending a call subsequently requires that the input pattern match the registered pattern.
 Next, the pattern registration process for the end-of-call setting is described with reference to FIG. 19. FIG. 19 is a flowchart showing the pattern registration procedure.
 This pattern registration procedure registers a touch input pattern for the touch panel unit 56 when setting a function lock or the like. The registered pattern is used for user authentication when unlocking.
 In this procedure, as shown in FIG. 19, X-axis and Y-axis position information representing the point where a finger is placed on the touch panel unit 56 is detected (step S311). It is determined to which range the position of the point corresponds (step S312). The pressure from the finger or touch pen is calculated (step S313). The movement of the point is monitored (step S314). If the point moves (YES in step S314), the acceleration of the movement, or alternatively the movement speed, is calculated from the distance moved by the point (step S315).
 In this case, a pattern is generated from the position information, pressure, and acceleration of the touch input (step S316), this pattern is registered in the pattern registration unit (step S317), and the process returns to step S302 of the main routine (FIG. 18). The end-of-call setting is then executed.
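 The detection steps S311 to S316 can be sketched as follows, assuming touch samples arrive as (time, x, y, pressure) tuples — a hypothetical delivery format, since the patent does not specify one:

```python
def register_pattern(samples):
    """Build a registered pattern from touch samples (steps S311-S317).

    samples: list of (t, x, y, pressure) tuples from the touch panel.
    Acceleration is approximated from successive positions (step S315).
    """
    t0, x0, y0, p0 = samples[0]
    pattern = {'position': (x0, y0), 'pressure': p0, 'accel': 0.0}
    if len(samples) >= 3:
        # Two successive velocities give one acceleration estimate.
        (t1, x1, y1, _), (t2, x2, y2, _) = samples[1], samples[2]
        d1 = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        d2 = ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        v1, v2 = d1 / (t1 - t0), d2 / (t2 - t1)
        pattern['accel'] = (v2 - v1) / (t2 - t1)
    return pattern  # stored in the pattern registration unit (step S317)

samples = [(0.0, 10, 25, 1.0), (0.1, 14, 25, 1.1), (0.2, 22, 25, 1.0)]
p = register_pattern(samples)
```

 The resulting dictionary plays the role of the registered pattern against which later input patterns are compared.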
 Next, the end-of-call process is described with reference to FIG. 20. FIG. 20 is a flowchart showing the end-of-call processing procedure.
 This procedure handles ending a call when a registered pattern exists as end-of-call information.
 In this procedure, as shown in FIG. 20, it is determined whether a call is in progress (step S321). If not (NO in step S321), the process ends. If a call is in progress (YES in step S321), a touch input is made to end the call (step S322). A pattern is then recognized from the touch input; that is, the touch input pattern is recognized using the physical information obtained from the touch input as determination elements.
 It is determined whether the input pattern obtained by the touch input matches the registered pattern (step S323). If the input pattern matches the registered pattern (YES in step S323), the end-of-call lock is released (step S324) and the end-of-call process is executed (step S325). If it does not match (NO in step S323), the end-of-call lock cannot be released (step S326). In this case, as already described, the match determination may allow an approximate range around the registered pattern.
 This end-of-call process is described with reference to FIG. 21. FIG. 21 is a flowchart illustrating an example of a specific processing procedure for releasing the end-of-call lock.
 In this procedure, for the end-of-call process, as shown in FIG. 21, the position (X, Y) of the point where a finger or touch pen is placed on the touch panel unit 56 is calculated (step S331). As a first determination between the input pattern and the registered pattern, it is determined to which part of the coordinate range the position of the point corresponds (step S332). That is, it is determined whether the point falls within the range surrounding the position of the point at the time of registration (step S333). For example, if the registered point is 9b, it is determined whether the input point falls within points 8a to 10a, 8b to 10b, or 8c to 10c. If the point does not fall within this coordinate range (NO in step S333), the call cannot be ended (step S334).
 If the point falls within the coordinate range (YES in step S333), then as a second determination between the input pattern and the registered pattern, the pressure is compared with that of the registered pattern (step S335). For example, if the pressure of the point at the time of registration is XX [N], it is determined whether the input pressure falls within a predetermined range, for example XX [N] × 0.8 to XX [N] × 1.2. If the pressure does not fall within this range (NG in step S336), the call cannot be ended (step S337).
 If the pressure is within the predetermined range (OK in step S336), the movement of the point is monitored (step S338). If the point moves (YES in step S338), the acceleration is calculated from the distance moved by the point and compared with that of the registered pattern (step S339). For example, if the acceleration of the point at the time of registration is ΔΔ [cm/s²], it is determined whether the input acceleration falls within a predetermined range, for example ΔΔ [cm/s²] × 0.8 to ΔΔ [cm/s²] × 1.2 (step S340). If the acceleration is outside this range (NG in step S340), the call cannot be ended (step S341).
 If the acceleration is within the predetermined range (OK in step S340), the end-of-call lock is released, the end-of-call process is executed (step S342), and this process ends.
 By using a pattern recognized from at least two types of physical information obtained from the touch input as determination elements in this way, the call is either ended or kept active, so an end of call caused by an unconscious operation of the user can be prevented.
[Third Embodiment]
 The third embodiment has a function of recognizing a touch input pattern using five types of physical information obtained from the touch input as determination elements, and determining whether the input pattern matches the registered pattern or falls within an approximate range around it.
 This third embodiment is described with reference to FIG. 22. FIG. 22 is a flowchart showing the lock setting procedure. The configuration shown in FIG. 22 is an example, and the present invention is not limited to this configuration.
 As in the second embodiment, this procedure is a lock setting for a mobile terminal device and is an example of the electronic device, operation detection method, or operation detection program of the present disclosure. This embodiment also uses the functional units shown in FIG. 3 and the hardware configuration shown in FIG. 4.
 In this procedure, five determination elements are used as the physical information obtained from the touch input: the number of touch input points, the position of the points, the acceleration of the points, the angle of movement, and the pressure. That is, as shown in FIG. 22, the procedure includes a lock setting determination (step S401), touch input (steps S402, S403, S404, S405, S406), pattern registration (step S407), and lock setting (step S408).
 First, it is determined whether the lock is to be set (step S401). If not (NO in step S401), the process ends. If the lock is to be set (YES in step S401), the process proceeds to touch input. As the first touch input element, the number of points where fingers are first placed on the touch panel unit 56 (FIGS. 4 and 5) is detected, and this point count is obtained as a determination element (step S402).
 Next, as the second element, the positions of the points on the touch panel unit 56 (FIG. 4) are detected, and the position information (X, Y) is obtained as a determination element (step S403).
 Next, as the third element, the acceleration at which the points move is detected and obtained as a determination element (step S404).
 Next, as the fourth element, the angle at which the points move is detected and obtained as a determination element (step S405).
 Next, as the fifth element, the change in the pressure with which the fingers press the touch panel unit 56 (FIG. 6) is detected, and this pressure change is obtained as a determination element (step S406).
 In this way, a pattern is generated using the five types of physical information (point count, position, acceleration, angle, and pressure change) as determination elements, and this pattern is registered in the pattern registration unit set in the data storage unit 74 (FIG. 4) (step S407).
 After this pattern registration, the lock setting is executed (step S408). A lock set together with pattern registration cannot be released unless an input pattern matching the registered pattern is provided.
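 Collected together, the five determination elements of steps S402 to S406 form one registered pattern. A minimal sketch, with illustrative field names (not from the patent) and the two-finger gesture of FIG. 24 as sample data:

```python
from dataclasses import dataclass

@dataclass
class TouchPattern:
    # Five determination elements of the third embodiment (steps S402-S406).
    point_count: int       # number of simultaneous touch points
    positions: tuple       # (start, end) coordinates per point
    acceleration: float    # acceleration of the moving point
    angle: float           # angle of movement in degrees
    pressure_peak: float   # representative value of the pressure change

# Example: the two-finger rotation gesture of FIG. 24,
# A(10, 25) -> A'(50, 25) and B(50, 65) -> B'(10, 65).
registered = TouchPattern(
    point_count=2,
    positions=(((10, 25), (50, 25)), ((50, 65), (10, 65))),
    acceleration=5.0,
    angle=35.0,
    pressure_peak=1.0,
)
```

 The acceleration, angle, and pressure values here are placeholders; in the patent they would come from the detections of steps S404 to S406.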
 The touch input used for this lock setting is described with reference to FIGS. 23 and 24. FIG. 23 illustrates the input operation, and FIG. 24 illustrates the touch input.
 On the mobile terminal device 20, as a touch input that a person does not perform unconsciously, for example, as shown in FIG. 23, the tip of the index finger 102 is placed on the upper left of the touch panel unit 56 and the tip of the thumb 104 on the lower right. While maintaining contact, the index finger 102 is rotated in the direction of arrow a and the thumb 104 in the direction of arrow b. Such a touch input yields determination elements based on two separated points and their movement.
 As shown in FIG. 24, detection points 0 to 60 are set in the X-axis direction and 0 to 90 in the Y-axis direction on the touch panel unit 56 of the mobile terminal device 20. Point A, point A', and arrow a indicate the pressing point of the index finger 102 and its movement trajectory; point B, point B', and arrow b indicate the pressing point of the thumb 104 and its movement trajectory. In this case, the position information for A → A' is obtained as A → A' = {X_A (= 10), Y_A (= 25)} → {X_A' (= 50), Y_A' (= 25)}, and for B → B' as B → B' = {X_B (= 50), Y_B (= 65)} → {X_B' (= 10), Y_B' (= 65)}.
 Another touch input used for lock setting is described with reference to FIGS. 25 and 26. FIG. 25 illustrates the other touch input, and FIG. 26 illustrates the other input operation.
 On the mobile terminal device 20, as a touch input that a person does not perform unconsciously, for example, as shown in FIG. 25, the tip of the index finger 102 is placed on the upper right of the touch panel unit 56 and, while maintaining contact, is moved so as to draw the Japanese kana character "の". Such a touch input yields a determination element based on a single-stroke trace 86 (FIG. 8).
 For this touch input, as shown in FIG. 26, detection points 0 to 60 are set in the X-axis direction and 0 to 90 in the Y-axis direction on the touch panel unit 56, and position information is obtained from these detection points. In this case, if the start point is C and the end point is D, the position information for C → D is obtained as C → D = {X_C (= 50), Y_C (= 25)} → {X_D (= 10), Y_D (= 65)}.
 The touch input determination elements are described with reference to FIGS. 27, 28, and 29. FIG. 27 shows the change in acceleration of a touch input, FIG. 28 shows the angle range of a touch input, and FIG. 29 shows the change in pressure of a touch input.
 Between the start point 106 and the end point 108 of a touch input, an acceleration that changes with time is detected, for example as shown in FIG. 27. This acceleration is one piece of physical information and can be used as a touch input determination element.
 The angle θ of a touch input can also be obtained as a determination element, for example as shown in FIG. 28. In this case, with the intersection of the X and Y axes set on the touch panel unit 56 as a reference point, the angle θ formed by the aforementioned points A and A' of the touch input is obtained. In this case, the angle θ formed by points A and A' is 35°.
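 The patent does not give a formula for θ; one plausible reading of FIG. 28 is the angle subtended at the axis origin by the vectors to A and A' (under this reading the example coordinates of FIG. 24 give roughly 42°, so the 35° above evidently follows some other convention). A sketch under that assumption:

```python
import math

def touch_angle(a, a_prime, origin=(0, 0)):
    # Angle at the reference point between the vectors origin->A and
    # origin->A' -- an assumed convention, not stated in the patent.
    ax, ay = a[0] - origin[0], a[1] - origin[1]
    bx, by = a_prime[0] - origin[0], a_prime[1] - origin[1]
    dot = ax * bx + ay * by
    na = math.hypot(ax, ay)
    nb = math.hypot(bx, by)
    return math.degrees(math.acos(dot / (na * nb)))

# A(10, 25) and A'(50, 25) from FIG. 24:
theta = touch_angle((10, 25), (50, 25))
```

 Whatever the exact convention, the point is that a scalar angle is derived from the start and end positions and compared against the registered value with a tolerance, like the other elements.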
 Similarly, between the start point 106 and the end point 108 of a touch input, a pressure that changes with time is detected, for example as shown in FIG. 29. This pressure change is one piece of physical information and can be used as a touch input determination element.
 Other touch inputs are described with reference to FIGS. 30 and 31. FIG. 30 illustrates another input operation, and FIG. 31 illustrates another touch input.
 Point movement (FIGS. 23 and 24) and a stroke (FIGS. 25 and 26) have been given as examples of touch input, but touch input is not limited to these. Touch input may also be character input. As shown in FIG. 30, for example, the letters "ON" may be drawn by moving the index finger 102 in contact with the touch panel unit 56; for the characters "ON", position information on the X and Y axes of the touch panel unit 56 is obtained as a touch input determination element, as shown in FIG. 31.
 With the above configuration, a plurality of pieces of physical information, such as the number of points, position, acceleration, angle of movement, and change in pressure, can be acquired from a touch input made by a single operation such as one stroke. As a result, a combination of five types of physical information is obtained, and a touch input pattern can be generated from this combination, for example by taking their logical product (AND). If a plurality of pieces of physical information are acquired, any two or more of them may also be selected to generate a pattern.
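 The logical product described above can be sketched as follows; the ±20% tolerance bands reuse the second embodiment's example ranges, and the exact-position check is a simplification (the patent allows a neighborhood around the registered position):

```python
def elements_match(inp, reg, tol=0.2):
    # Each element is checked individually; the pattern matches only if
    # ALL checks pass (logical product of the five determination elements).
    def band(v, r):
        return r * (1 - tol) <= v <= r * (1 + tol)

    checks = {
        'point_count': inp['point_count'] == reg['point_count'],
        'position': inp['position'] == reg['position'],  # simplified
        'acceleration': band(inp['acceleration'], reg['acceleration']),
        'angle': band(inp['angle'], reg['angle']),
        'pressure': band(inp['pressure'], reg['pressure']),
    }
    return all(checks.values()), checks

reg = {'point_count': 2, 'position': (10, 25),
       'acceleration': 5.0, 'angle': 35.0, 'pressure': 1.0}
inp = dict(reg, acceleration=5.5, pressure=0.9)
ok, detail = elements_match(inp, reg)
```

 Selecting a subset of elements, as the text allows, would simply mean taking `all()` over the chosen subset of `checks`.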
 Next, regarding unlocking, FIG. 32 is a flowchart showing the unlocking procedure.
 This procedure is the unlocking process corresponding to the lock setting described above (FIG. 22). As described, the lock setting uses five determination elements as the physical information obtained from the touch input: the number of touch input points, the position of the points, the acceleration of the points, the angle of movement, and the pressure. Accordingly, as shown in FIG. 32, this procedure includes an unlocking determination (step S411), touch input (steps S412, S413, S414, S415, S416), pattern determination (step S417), and the decision on whether unlocking is allowed (steps S418 and S419).
 First, it is determined whether the lock is to be released (step S411). If not (NO in step S411), the process ends. If the lock is to be released (YES in step S411), the process proceeds to touch input. As the first touch input element, the number of points where fingers are first placed on the touch panel unit 56 (FIGS. 4 and 5) is detected, and this point count is obtained as a determination element (step S412).
 Next, as the second element, the positions of the points on the touch panel unit 56 (FIG. 4) are detected, and the position information (X, Y) is obtained as a determination element (step S413).
 Next, as the third element, the acceleration at which the points move is detected and obtained as a determination element (step S414).
 Next, as the fourth element, the angle at which the points move is detected and obtained as a determination element (step S415).
 Next, as the fifth element, the change in the pressure with which the fingers press the touch panel unit 56 (FIG. 6) is detected, and this pressure change is obtained as a determination element (step S416).
 In this way, a pattern is generated using the five types of physical information (point count, position, acceleration, angle, and pressure change) as determination elements, and it is determined whether this input pattern matches the registered pattern stored in the data storage unit 74 (FIG. 4) (step S417).
 After this pattern determination, if the input pattern matches the registered pattern (YES in step S417), unlocking is executed (step S418). If the input pattern does not match (NO in step S417), the lock cannot be released (step S419).
 With the above configuration, the number of touch input points, position, acceleration, angle of movement, and change in pressure are detected as physical information, and the touch input pattern is recognized and registered as their logical product. Each is a characteristic piece of physical information, and the pattern is generated from the logical product of several of them. This improves the accuracy of the determination between the registered pattern and the input pattern, eliminates unconscious touch input by the user, and prevents malfunction.
[Fourth Embodiment]
 The fourth embodiment prevents a call from being ended unintentionally by the user during a call by requiring, after the end-of-call lock is released, that the end of call be selected on the display screen.
 This fourth embodiment is described with reference to FIGS. 33, 34, and 35. FIG. 33 is a flowchart showing the end-of-call procedure, FIG. 34 shows a display example of the end-of-call inquiry and selection buttons, and FIG. 35 shows a display example of the end of call.
 This end-of-call procedure is an example of the electronic device, operation detection method, or operation detection program of the present disclosure. This embodiment also uses the functional units shown in FIG. 3 and the hardware configuration shown in FIG. 4.
 The procedure includes locking the call when the call starts (step S501), releasing the call lock (steps S502, S503, S504, S505), and the end-of-call process (steps S506, S507).
 In this procedure, as shown in FIG. 33, the call is locked when it starts (step S501). This lock setting may be performed, for example, by the lock setting of the second embodiment (FIGS. 14 and 15).
 During the call, a monitoring state for unlocking is maintained (step S502, NO in step S502). If unlocking is requested (YES in step S502), the user performs a touch input (step S503). This touch input is an operation the user does not perform unconsciously; as described above, an input pattern is recognized from this touch input, and it is determined whether the input pattern matches the registered pattern (step S504). If the input pattern does not match the registered pattern (NO in step S504), the lock is maintained and another touch input is required (step S503).
 If the input pattern matches the registered pattern (YES in step S504), the lock is released (step S505). Once the lock is released, an end-of-call operation is requested (step S506); by this operation the call is ended (step S507), and the process ends.
 In the end-of-call operation (step S506), as shown in FIG. 34, display control shows an end-of-call inquiry message 110 and selection buttons 112 and 114 on the display unit 48. The message 110 reads, for example, "End the call?". The selection button 112 instructs the end of call ("YES"), and the selection button 114 rejects it ("NO").
 If the user touches the selection button 112, the end-of-call process is executed by that touch input. If the user touches the selection button 114, the end of call is rejected by that touch input and the call is maintained.
 When the end-of-call process is executed, as shown in FIG. 35, display control shows a message 116 announcing the end of call on the display unit 48. This message 116 reads, for example, "Call ended". As a result, the user knows that the call has been ended.
 With this configuration, since an end-of-call inquiry is made after the lock is released, the user is aware of the unlocking and ends the call by a deliberate operation after recognizing it. An unintended end of call caused by an unconscious or careless operation of the user can be prevented.
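 The overall flow of FIG. 33 — pattern match, unlock, then an explicit confirmation — can be sketched as a small function; the match check is simplified to equality and the dialog is a hypothetical callable stub:

```python
def end_call_flow(input_pattern, registered, confirm):
    """Fourth embodiment: unlock (steps S502-S505), then require an
    explicit YES on the confirmation dialog (steps S506-S507).

    confirm: callable returning True if the user touches the YES
    button 112, False for the NO button 114 (hypothetical stub).
    """
    if input_pattern != registered:     # step S504 (simplified match)
        return 'call maintained (lock held)'
    # Lock released (step S505); show "End the call?" (message 110)
    if confirm():
        return 'call ended'             # step S507 / message 116
    return 'call maintained'            # end of call rejected

registered = ('stroke-no', 2)
r1 = end_call_flow(('stroke-no', 2), registered, confirm=lambda: True)
r2 = end_call_flow(('stroke-no', 2), registered, confirm=lambda: False)
r3 = end_call_flow(('tap', 1), registered, confirm=lambda: True)
```

 The two-stage design means a matching gesture alone never ends the call; only the combination of unlock and explicit confirmation does.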
[Fifth Embodiment]
 第5の実施の形態は、プログラム記憶部72(図4)にあるアプリケーションプログラム(アプリ)の起動にタッチ入力のパターン判定を用いる構成である。 The fifth embodiment is a configuration in which touch input pattern determination is used to start an application program (application) in the program storage unit 72 (FIG. 4).
 この第5の実施の形態について、図36、図37、図38及び図39を参照する。図36は、アプリの起動設定の処理手順を示すフローチャート、図37は、パターン登録の処理手順を示すフローチャート、図38は、アプリケーションの起動の処理手順を示すフローチャート、図39は、アプリケーションの起動の具体的な処理手順を示すフローチャートである。この実施の形態においても、既述の図3に示す機能部、図4に示すハードウェア構成が用いられる。 Reference is made to FIG. 36, FIG. 37, FIG. 38, and FIG. 39 for the fifth embodiment: FIG. 36 is a flowchart showing the processing procedure of application activation setting, FIG. 37 is a flowchart showing the processing procedure of pattern registration, FIG. 38 is a flowchart showing the processing procedure of application activation, and FIG. 39 is a flowchart showing a specific processing procedure of application activation. This embodiment also uses the functional units shown in FIG. 3 and the hardware configuration shown in FIG. 4 described above.
 このアプリの起動設定の処理手順は、本開示の電子機器、操作検出方法又は操作検出プログラムの一例である。この処理手順では、アプリ起動設定にタッチパネル部56に対するタッチ入力から得られる既述の物理情報を判断要素として用いることにより、タッチ入力のパターンを認識し、その登録パターンをアプリ起動情報として設定する。 The processing procedure of this application activation setting is an example of the electronic device, the operation detection method, or the operation detection program of the present disclosure. In this procedure, a touch input pattern is recognized using the above-described physical information obtained from touch input to the touch panel unit 56 as determination elements, and the registered pattern is set as application activation information.
 そこで、この処理手順では、図36に示すように、アプリ起動設定かを判定し(ステップS601)、アプリ起動設定でなければ(ステップS601のNO)、この処理を終了する。 Therefore, in this processing procedure, as shown in FIG. 36, it is determined whether the application activation setting is set (step S601), and if it is not the application activation setting (NO in step S601), the process is terminated.
 アプリ起動設定であれば(ステップS601のYES)、パターン登録モードを起動し、パターン登録を行う(ステップS602)。パターン登録をしたかを判定し(ステップS603)、パターン登録をしなければ(ステップS603のNO)、この処理を終了する。また、パターン登録をすれば(ステップS603のYES)、アプリ起動設定を実行する(ステップS604)。このアプリ起動設定は、タッチ入力によるパターン登録が関係付けられているので、そのアプリ起動には登録パターンと入力パターンとが合致することが必要である。 If it is an application activation setting (YES in step S601), the pattern registration mode is activated and pattern registration is performed (step S602). It is determined whether pattern registration has been performed (step S603). If pattern registration is not performed (NO in step S603), this process ends. If the pattern is registered (YES in step S603), the application activation setting is executed (step S604). Since this application activation setting is associated with pattern registration by touch input, it is necessary that the registration pattern and the input pattern match for the application activation.
 このアプリ起動に用いるパターン登録の処理は、アプリ起動設定に際し、タッチパネル部56に対するタッチ入力のパターンを登録する処理である。登録されたパターンは、アプリ起動のためのユーザ認証の処理に用いられる。 The pattern registration process used for starting the application is a process of registering a touch input pattern for the touch panel unit 56 in setting the application activation. The registered pattern is used for user authentication processing for starting an application.
 そこで、この処理手順では、図37に示すように、タッチパネル部56上の指が置かれたポイントを表すX軸及びY軸の位置情報を検出する(ステップS611)。ポイントの位置がどの範囲に該当するかを判断する(ステップS612)。指やタッチペンからの圧力を算出する(ステップS613)。ポイントの移動を監視する(ステップS614)。ポイントの移動があれば(ステップS614のYES)、ポイントの位置距離移動より、そのポイント移動の加速度を算出する、又は、ポイントの移動速度を算出する(ステップS615)。 Therefore, in this processing procedure, as shown in FIG. 37, position information on the X axis and the Y axis representing the point where the finger is placed on the touch panel unit 56 is detected (step S611). It is determined to which range the position of the point corresponds (step S612). The pressure from the finger or the touch pen is calculated (step S613). The movement of the point is monitored (step S614). If there is a movement of the point (YES in step S614), the acceleration of the movement of the point or the movement speed of the point is calculated from the movement of the position of the point (step S615).
 そして、この場合、タッチ入力の位置情報、圧力、加速度によってパターンを生成させ(ステップS616)、このパターンをパターン登録部に登録し(ステップS617)、メインルーティン(図36)のステップS602に戻る。そして、アプリ起動設定が実行される。 In this case, a pattern is generated by the position information, pressure, and acceleration of the touch input (step S616), the pattern is registered in the pattern registration unit (step S617), and the process returns to step S602 of the main routine (FIG. 36). Then, the application activation setting is executed.
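The registration flow of FIG. 37 (steps S611 to S617) can be sketched as follows. This Python sketch is illustrative only; the sample structure, function names, and the three-sample acceleration estimate are assumptions, not the patent's implementation. Acceleration is derived from positional movement as described for step S615.

```python
from dataclasses import dataclass

@dataclass
class TouchSample:
    x: float         # X position on the panel (step S611)
    y: float         # Y position on the panel (step S611)
    pressure: float  # force from finger or touch pen (step S613)
    t: float         # sample time in seconds

def acceleration(samples):
    """Estimate acceleration from the last three position samples (step S615)."""
    a, b, c = samples[-3:]
    v1 = ((b.x - a.x) ** 2 + (b.y - a.y) ** 2) ** 0.5 / (b.t - a.t)
    v2 = ((c.x - b.x) ** 2 + (c.y - b.y) ** 2) ** 0.5 / (c.t - b.t)
    return (v2 - v1) / (c.t - b.t)

def register_pattern(samples, registry):
    """Combine position, pressure, and acceleration into one pattern
    and store it in the pattern registration unit (steps S616-S617)."""
    first = samples[0]
    pattern = {
        "x": first.x,
        "y": first.y,
        "pressure": first.pressure,
        "acceleration": acceleration(samples),
    }
    registry.append(pattern)
    return pattern
```

The key point is that the stored pattern combines at least two kinds of physical information rather than position alone.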
 このようにアプリ起動設定が実行された後、アプリ起動の処理手順では、アプリ起動のため、タッチ入力によるパターン認識により、入力パターンが登録パターンに合致することが必要となる。 After the application activation setting is executed in this way, the application activation processing procedure requires that the input pattern matches the registered pattern by pattern recognition by touch input for application activation.
 そこで、このアプリ起動の処理手順では、図38に示すように、アプリを起動するかの判定を行い(ステップS621)、アプリ起動でなければ(ステップS621のNO)、この処理を終了する。アプリ起動であれば(ステップS621のYES)、既述のタッチ入力を行う(ステップS622)。このタッチ入力から得られる既述の物理情報を判断要素としてタッチ入力のパターンを認識する。 Therefore, in this application activation processing procedure, as shown in FIG. 38, it is determined whether the application is activated (step S621). If the application is not activated (NO in step S621), the process is terminated. If the application is activated (YES in step S621), the touch input described above is performed (step S622). The touch input pattern is recognized using the above-described physical information obtained from the touch input as a determination element.
 タッチ入力によって得られた入力パターンが登録パターンに合致するかの判定が行われる(ステップS623)。入力パターンが登録パターンに合致した場合には(ステップS623のYES)、アプリ起動が実行され(ステップS624)、入力パターンが登録パターンに合致しなければ(ステップS623のNO)、アプリ起動不可となる(ステップS625)。この場合、入力パターンと登録パターンとの合致判定は、登録パターンに近似範囲を含んで判断してもよいことは既述の通りである。 It is determined whether the input pattern obtained by touch input matches the registered pattern (step S623). If the input pattern matches the registered pattern (YES in step S623), the application is activated (step S624). If the input pattern does not match the registered pattern (NO in step S623), the application cannot be activated. (Step S625). In this case, as described above, the match determination between the input pattern and the registered pattern may be determined by including the approximate range in the registered pattern.
 そこで、このアプリ起動の具体的な処理手順では、図39に示すように、アプリ起動であるかを監視し(ステップS631)、アプリ起動であれば、タッチ入力を行う。この場合、タッチパネル部56上に指やタッチペンの置かれたポイントの位置(X、Y)を算出する(ステップS632)。この場合、入力パターンが登録パターンに合致するかの第1の判断として、ポイントの位置が座標範囲のどの部分に該当するかを判断する(ステップS633)。即ち、登録時のポイントの位置の前後左右の範囲に入るかどうかを判定する(ステップS634)。例えば、登録時がポイント9b部分の場合、ポイント8a~10a、8b~10b、8c~10cに当てはまるかを判定する。この判定において、ポイントの位置の座標範囲に該当しなければ(ステップS634のNO)、アプリ起動不可となる(ステップS635)。 In the specific processing procedure for starting the application, as shown in FIG. 39, whether an application start is requested is monitored (step S631); if so, touch input is performed. The position (X, Y) of the point where the finger or touch pen is placed on the touch panel unit 56 is calculated (step S632). As a first determination of whether the input pattern matches the registered pattern, it is determined which part of the coordinate range the point position falls in (step S633), that is, whether it falls within the range in front of, behind, to the left of, and to the right of the registered point position (step S634). For example, if the registered position is point 9b, it is determined whether the input falls on points 8a to 10a, 8b to 10b, or 8c to 10c. If the input does not fall within this coordinate range (NO in step S634), the application cannot be activated (step S635).
 この判定において、ポイントの位置の座標範囲に該当すれば(ステップS634のYES)、入力パターンが登録パターンに合致するかの第2の判断として、圧力について、登録パターンと比較し、判定する(ステップS636)。例えば、登録時のポイントの圧力を○○〔N〕とすると、その圧力が所定の圧力範囲として例えば、○○〔N〕×0.8~○○〔N〕×1.2の範囲にあるかを判定する。この圧力判定において、所定の圧力範囲に該当しなければ(ステップS637のNG)、アプリ起動不可となる(ステップS638)。 If the input falls within the coordinate range of the point position (YES in step S634), as a second determination of whether the input pattern matches the registered pattern, the pressure is compared with the registered pattern (step S636). For example, if the pressure at the registered point is ○○ [N], it is determined whether the input pressure lies within a predetermined range, for example, ○○ [N] × 0.8 to ○○ [N] × 1.2. If the pressure does not fall within this range (NG in step S637), the application cannot be activated (step S638).
 このような圧力判定が所定の圧力範囲にあれば(ステップS637のOK)、ポイントの移動を監視し(ステップS639)、ポイントの移動があれば(ステップS639のYES)、ポイントの位置距離移動から加速度を算出し、この加速度について、登録パターンと比較し、判定する(ステップS640)。例えば、登録時のポイントの加速度が△△〔cm/s²〕であれば、その加速度が所定の範囲として例えば、△△〔cm/s²〕×0.8~△△〔cm/s²〕×1.2の範囲に該当するかを判定する(ステップS641)。この加速度が所定の範囲外であれば(ステップS641のNG)、アプリ起動不可となる(ステップS642)。 If the pressure determination is within the predetermined range (OK in step S637), the movement of the point is monitored (step S639). If the point moves (YES in step S639), the acceleration is calculated from the point's positional movement and compared with the registered pattern (step S640). For example, if the acceleration at the registered point is △△ [cm/s²], it is determined whether the input acceleration falls within a predetermined range, for example, △△ [cm/s²] × 0.8 to △△ [cm/s²] × 1.2 (step S641). If the acceleration is outside this range (NG in step S641), the application cannot be activated (step S642).
 また、この加速度が所定の範囲内であれば(ステップS641のOK)、アプリ起動が実行される(ステップS643)。 If this acceleration is within a predetermined range (OK in step S641), application activation is executed (step S643).
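The three-stage match of FIG. 39 can be sketched as follows: position must fall in the neighbourhood of the registered point (steps S633 to S634), pressure within ×0.8 to ×1.2 of the registered value (steps S636 to S637), and acceleration within the same ratio band (steps S640 to S641). Only when all three checks pass is activation allowed (step S643). This is an illustrative Python sketch; the function names and the position tolerance are assumptions, while the ×0.8 to ×1.2 ratios come from the examples above.

```python
def within_ratio(value, registered, low=0.8, high=1.2):
    """True if value lies in [registered*low, registered*high]."""
    return registered * low <= value <= registered * high

def matches(registered, observed, pos_tol=1.0):
    """Three-stage match corresponding to FIG. 39; pos_tol is an
    illustrative stand-in for the 8a-10c neighbourhood check."""
    # First determination: point position within the coordinate neighbourhood.
    if abs(observed["x"] - registered["x"]) > pos_tol:
        return False                     # step S634 NO -> activation denied
    if abs(observed["y"] - registered["y"]) > pos_tol:
        return False
    # Second determination: pressure within the x0.8 to x1.2 band.
    if not within_ratio(observed["pressure"], registered["pressure"]):
        return False                     # step S637 NG -> activation denied
    # Third determination: acceleration within the x0.8 to x1.2 band.
    if not within_ratio(observed["acceleration"], registered["acceleration"]):
        return False                     # step S641 NG -> activation denied
    return True                          # all checks passed: step S643
```

Each stage rejects independently, so an imitator must reproduce all registered quantities at once, not just the touch location.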
 このようにタッチ入力から得られる少なくとも2種類の物理情報を判断要素として認識されるパターンを用いることにより、アプリ起動を許可するので、入力パターンが登録パターンに合致しない場合のアプリ起動を防止できる。また、自由な起動からアプリを防護でき、セキュリティを高めることができる。 Because application activation is permitted only with a pattern recognized from at least two types of physical information obtained from touch input as determination elements, activation is prevented when the input pattern does not match the registered pattern. The application is also protected from unrestricted activation, and security is increased.
〔第6の実施の形態〕 [Sixth Embodiment]
 第6の実施の形態は、ネットワーク通信機能によるウェブサイトへのログインにタッチ入力のパターン判定を用いる構成である。 In the sixth embodiment, touch input pattern determination is used for logging in to a website using a network communication function.
 この第6の実施の形態について、図40、図41、図42及び図43を参照する。図40は、ウェブサイトへのログイン設定の処理手順を示すフローチャート、図41は、パターン登録の処理手順を示すフローチャート、図42は、ウェブサイトへのログインの処理手順を示すフローチャート、図43は、ウェブサイトへのログインの具体的な処理手順を示すフローチャートである。 Reference is made to FIG. 40, FIG. 41, FIG. 42, and FIG. 43 for the sixth embodiment: FIG. 40 is a flowchart showing the processing procedure of login setting for a website, FIG. 41 is a flowchart showing the processing procedure of pattern registration, FIG. 42 is a flowchart showing the processing procedure of login to the website, and FIG. 43 is a flowchart showing a specific processing procedure of login to the website.
 このウェブサイトへのログイン設定の処理手順は、本開示の電子機器、操作検出方法又は操作検出プログラムの一例である。この処理手順では、ログイン設定にタッチパネル部56に対するタッチ入力から得られる既述の物理情報を判断要素としてタッチ入力のパターンを認識し、その登録パターンをログイン情報として設定する。 The processing procedure for setting login to this website is an example of the electronic device, the operation detection method, or the operation detection program of the present disclosure. In this processing procedure, the touch input pattern is recognized using the above-described physical information obtained from the touch input to the touch panel unit 56 for the login setting, and the registered pattern is set as the login information.
 そこで、この処理手順では、図40に示すように、ログイン設定かを判定し(ステップS701)、ログイン設定でなければ(ステップS701のNO)、この処理を終了する。 Therefore, in this processing procedure, as shown in FIG. 40, it is determined whether or not the login setting is set (step S701). If the login setting is not set (NO in step S701), the processing ends.
 ログイン設定であれば(ステップS701のYES)、パターン登録モードを起動し、パターン登録を行う(ステップS702)。パターンを登録したかを判定し(ステップS703)、パターン登録をしなければ(ステップS703のNO)、この処理を終了する。また、パターン登録をすれば(ステップS703のYES)、ログイン設定を実行する(ステップS704)。このログイン設定は、タッチ入力によるパターン登録が関係付けられているので、ウェブサイトへのログインには登録パターンと入力パターンとが合致することが必要である。 If login setting is requested (YES in step S701), the pattern registration mode is activated and pattern registration is performed (step S702). It is determined whether a pattern has been registered (step S703); if not (NO in step S703), this process ends. If a pattern is registered (YES in step S703), login setting is executed (step S704). Since this login setting is associated with pattern registration by touch input, the registered pattern and the input pattern must match for login to the website.
 このログインに用いるパターン登録の処理は、ログイン設定に際し、タッチパネル部56に対するタッチ入力のパターンを登録する処理である。登録されたパターンは、ログインのためのユーザ認証の処理に用いられる。 The pattern registration process used for login is a process of registering a touch input pattern for the touch panel unit 56 at the time of login setting. The registered pattern is used for user authentication processing for login.
 そこで、この処理手順では、図41に示すように、タッチパネル部56上の指が置かれたポイントを表すX軸及びY軸の位置情報を検出する(ステップS711)。ポイントの位置が座標範囲のどの部分に該当するかを判断する(ステップS712)。指やタッチペンからの圧力を算出する(ステップS713)。ポイントの移動を監視する(ステップS714)。ポイントの移動があれば(ステップS714のYES)、ポイントの位置距離移動より、そのポイント移動の加速度を算出する、又は、ポイントの移動速度を算出する(ステップS715)。 Therefore, in this processing procedure, as shown in FIG. 41, position information on the X axis and the Y axis representing the point where the finger on the touch panel unit 56 is placed is detected (step S711). It is determined to which part of the coordinate range the position of the point corresponds (step S712). The pressure from the finger or the touch pen is calculated (step S713). The movement of the point is monitored (step S714). If there is a point movement (YES in step S714), the acceleration of the point movement is calculated or the movement speed of the point is calculated from the position distance movement of the point (step S715).
 そして、この場合、タッチ入力の位置情報、圧力、加速度によってパターンを生成させ(ステップS716)、このパターンをパターン登録部に登録し(ステップS717)、メインルーティン(図40)のステップS702に戻る。そして、ログイン設定が実行される。 In this case, a pattern is generated based on the position information, pressure, and acceleration of the touch input (step S716), the pattern is registered in the pattern registration unit (step S717), and the process returns to step S702 of the main routine (FIG. 40). Then, login settings are executed.
 このようにログイン設定が実行された後、ログインの処理手順では、ログインのため、タッチ入力によるパターン認識により、入力パターンが登録パターンに合致することが必要となる。 After the login setting is executed in this way, the login processing procedure requires that the input pattern matches the registered pattern by pattern recognition by touch input for login.
 そこで、このログインの処理手順では、図42に示すように、ログインであるかの判定を行い(ステップS721)、ログインでなければ(ステップS721のNO)、この処理を終了する。ログインであれば(ステップS721のYES)、既述のタッチ入力を行う(ステップS722)。このタッチ入力から得られる既述の物理情報を判断要素としてタッチ入力のパターンを認識する。 In this login processing procedure, as shown in FIG. 42, it is determined whether a login is requested (step S721); if not (NO in step S721), the process ends. If a login is requested (YES in step S721), the touch input described above is performed (step S722), and the touch input pattern is recognized using the above-described physical information obtained from the touch input as determination elements.
 タッチ入力によって得られた入力パターンが登録パターンに合致するかの判定が行われる(ステップS723)。入力パターンが登録パターンに合致した場合には(ステップS723のYES)、ログインが実行され(ステップS724)、入力パターンが登録パターンに合致しなければ(ステップS723のNO)、ログイン不可となる(ステップS725)。この場合、入力パターンと登録パターンとの合致判定は、登録パターンに近似範囲を含んで判断してもよいことは既述の通りである。 It is determined whether the input pattern obtained by touch input matches the registered pattern (step S723). If the input pattern matches the registration pattern (YES in step S723), login is executed (step S724). If the input pattern does not match the registration pattern (NO in step S723), login is disabled (step S723). S725). In this case, as described above, the match determination between the input pattern and the registered pattern may be determined by including the approximate range in the registered pattern.
 そこで、このログインの具体的な処理手順では、図43に示すように、ログインであるかを監視し(ステップS731)、ログインであれば、タッチ入力を行う。この場合、タッチパネル部56上に指やタッチペンの置かれたポイントの位置(X、Y)を算出する(ステップS732)。この場合、入力パターンが登録パターンに合致するかの第1の判断として、ポイントの位置が座標範囲のどの部分に該当するかを判断する(ステップS733)。即ち、登録時のポイントの位置の前後左右の範囲に入るかどうかを判定する(ステップS734)。例えば、登録時がポイント9b部分の場合、ポイント8a~10a、8b~10b、8c~10cに当てはまるかを判定する。この判定において、ポイントの位置の座標範囲に該当しなければ(ステップS734のNO)、サイトへのログイン不可となる(ステップS735)。 In the specific login processing procedure, as shown in FIG. 43, whether a login is requested is monitored (step S731); if so, touch input is performed. The position (X, Y) of the point where the finger or touch pen is placed on the touch panel unit 56 is calculated (step S732). As a first determination of whether the input pattern matches the registered pattern, it is determined which part of the coordinate range the point position falls in (step S733), that is, whether it falls within the range in front of, behind, to the left of, and to the right of the registered point position (step S734). For example, if the registered position is point 9b, it is determined whether the input falls on points 8a to 10a, 8b to 10b, or 8c to 10c. If the input does not fall within this coordinate range (NO in step S734), login to the site is disabled (step S735).
 この判定において、ポイントの位置の座標範囲に該当すれば(ステップS734のYES)、入力パターンが登録パターンに合致するかの第2の判断として、圧力について、登録パターンと比較し、判定する(ステップS736)。例えば、登録時のポイントの圧力を○○〔N〕とすると、その圧力が所定の圧力範囲として例えば、○○〔N〕×0.8~○○〔N〕×1.2の範囲にあるかを判定する。この圧力判定において、所定の圧力範囲に該当しなければ(ステップS737のNG)、ログイン不可となる(ステップS738)。 If the input falls within the coordinate range of the point position (YES in step S734), as a second determination of whether the input pattern matches the registered pattern, the pressure is compared with the registered pattern (step S736). For example, if the pressure at the registered point is ○○ [N], it is determined whether the input pressure lies within a predetermined range, for example, ○○ [N] × 0.8 to ○○ [N] × 1.2. If the pressure does not fall within this range (NG in step S737), login is disabled (step S738).
 このような圧力判定が所定の圧力範囲にあれば(ステップS737のOK)、ポイントの移動を監視し(ステップS739)、ポイントの移動があれば(ステップS739のYES)、ポイントの位置距離移動から加速度を算出し、この加速度について、登録パターンと比較し、判定する(ステップS740)。例えば、登録時のポイントの加速度が△△〔cm/s²〕であれば、その加速度が所定の範囲として例えば、△△〔cm/s²〕×0.8~△△〔cm/s²〕×1.2の範囲に該当するかを判定する(ステップS741)。この加速度が所定の範囲外であれば(ステップS741のNG)、ログイン不可となる(ステップS742)。 If the pressure determination is within the predetermined range (OK in step S737), the movement of the point is monitored (step S739). If the point moves (YES in step S739), the acceleration is calculated from the point's positional movement and compared with the registered pattern (step S740). For example, if the acceleration at the registered point is △△ [cm/s²], it is determined whether the input acceleration falls within a predetermined range, for example, △△ [cm/s²] × 0.8 to △△ [cm/s²] × 1.2 (step S741). If the acceleration is outside this range (NG in step S741), login is disabled (step S742).
 また、この加速度が所定の範囲内であれば(ステップS741のOK)、ログインが実行される(ステップS743)。 If the acceleration is within a predetermined range (OK in step S741), login is executed (step S743).
 このようにタッチ入力から得られる少なくとも2種類の物理情報を判断要素として認識されるパターンを用いることにより、ログインを許可するので、入力パターンが登録パターンに合致しない場合のログインを防止できる。また、自由なログインを阻止でき、セキュリティを高めることができる。 Because login is permitted only with a pattern recognized from at least two types of physical information obtained from touch input as determination elements, login is prevented when the input pattern does not match the registered pattern. Unrestricted login is also blocked, and security is increased.
〔第7の実施の形態〕 [Seventh Embodiment]
 第7の実施の形態は、既存の登録パターンの共用か又は新たなパターンの登録かを選択する処理である。 The seventh embodiment is a process of selecting whether an existing registration pattern is shared or a new pattern is registered.
 この実施の形態について、図44を参照する。図44は、ロック設定の処理手順を示すフローチャートである。 Referring to FIG. 44 for this embodiment. FIG. 44 is a flowchart illustrating a lock setting process.
 この実施の形態では、ロック設定として、登録パターンの共用か新たなパターン登録かの選択をする処理を併用しているが、このようなパターン登録はロック設定に限定されない。 In this embodiment, lock setting is combined with a process of choosing between sharing an existing registered pattern and registering a new pattern, but such pattern registration is not limited to lock setting.
 そこで、この処理手順では、図44に示すように、ロック設定かを判定し(ステップS801)、ロック設定をしなければ、この処理を終了する。 Therefore, in this processing procedure, as shown in FIG. 44, it is determined whether or not the lock is set (step S801), and if the lock is not set, this processing ends.
 ロック設定であれば、既に登録されている登録パターンをロック解除に用いるかを判定する(ステップS802)。既存の登録パターンをロック解除に用いるのであれば(ステップS802のYES)、ロック設定を実行し(ステップS803)、この処理を終了する。 If it is the lock setting, it is determined whether the registered pattern already registered is used for unlocking (step S802). If an existing registration pattern is used for unlocking (YES in step S802), lock setting is executed (step S803), and this process is terminated.
 既存の登録パターンを用いない場合には(ステップS802のNO)、新たにパターンを設定するかを判定する(ステップS804)。新たにパターンを設定しなければ(ステップS804のNO)、この処理を終了する。 If an existing registered pattern is not used (NO in step S802), it is determined whether a new pattern is set (step S804). If no new pattern is set (NO in step S804), this process ends.
 新たにパターンを設定する場合には(ステップS804のYES)、パターン登録を実行する(ステップS805)。パターン登録は上記実施の形態に記載の通りであるから、その説明を省略する。 When a new pattern is set (YES in step S804), pattern registration is executed (step S805). Since the pattern registration is as described in the above embodiment, its description is omitted.
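The selection flow of FIG. 44 can be sketched as follows. Illustrative Python only; the function, its arguments, and the registry representation are assumptions.

```python
def lock_setting(reuse_existing, register_new, registry, new_pattern=None):
    """Choose between an existing registered pattern and a new one (FIG. 44)."""
    if reuse_existing:             # step S802 YES: share the existing pattern
        return registry[-1]        # lock setting executed with it (step S803)
    if not register_new:           # step S804 NO: setting abandoned
        return None
    registry.append(new_pattern)   # step S805: register the new pattern
    return new_pattern
```

Reusing an existing pattern spares the user a fresh registration; registering a new one keeps, for example, the unlock pattern and the app-start pattern distinct.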
〔比較例〕 [Comparative example]
 比較例は、タッチパネル部に誤動作防止ロックを備える携帯端末装置にあって、終話時に終話ボタンを操作する構成である。 The comparative example is a portable terminal device having a malfunction prevention lock on the touch panel unit, and operates the call end button at the end of the call.
 この比較例について、図45及び図46を参照する。図45は、携帯端末装置の比較例を示す図、図46は、終話の処理手順を示すフローチャートである。 Referring to FIGS. 45 and 46 for this comparative example. FIG. 45 is a diagram illustrating a comparative example of the mobile terminal device, and FIG. 46 is a flowchart illustrating a processing procedure of the end call.
 この携帯端末装置220は図45に示すように、筐体278の前面部に表示部248とともにタッチパネル部256が設置されている。筐体278には表示部248を挟んでマイクロフォン266及びロックキー118と、レシーバ268とが設置され、タッチパネル部256には終話ボタン120が設置されている。終話ボタン120は、ロックキー118の押下によってタッチパネルロック解除の後、操作が有効になり、終話操作が行える。 As shown in FIG. 45, the portable terminal device 220 has a touch panel unit 256 installed together with a display unit 248 on the front surface of a housing 278. A microphone 266 and a lock key 118 and a receiver 268 are installed on the housing 278 with the display unit 248 interposed therebetween, and an end button 120 is installed on the touch panel unit 256. The end button 120 is enabled after the touch panel lock is released by depressing the lock key 118, and the end operation can be performed.
 そこで、この携帯端末装置220の終話の処理手順では、図46に示すように、通話を判定し(ステップS901)、通話中でなければ(ステップS901のNO)、この処理を終了する。 Therefore, in the process of ending the call of the mobile terminal device 220, as shown in FIG. 46, a call is determined (step S901). If the call is not in progress (NO in step S901), the process ends.
 通話中であれば(ステップS901のYES)、タッチパネル部256の誤動作防止動作を開始する。即ち、ユーザが携帯端末装置220を耳に当てると、タッチパネル部256でそれを検知し、タッチパネルロックを設定する(ステップS902)。このタッチパネルロックは、不用意なタッチ入力による終話を防止するため、タッチパネル部256に対するタッチ入力を阻止する処理である。この場合、終話ボタン120の操作は無効となる。 If the call is in progress (YES in step S901), the malfunction prevention operation of the touch panel unit 256 is started. That is, when the user places the portable terminal device 220 on the ear, the touch panel unit 256 detects it and sets the touch panel lock (step S902). This touch panel lock is a process of blocking touch input to the touch panel unit 256 in order to prevent an end of talk due to careless touch input. In this case, the operation of the end call button 120 is invalid.
 終話のため、ロックキー118を押下し(ステップS903)、ロックを解除する(ステップS904)。このロック解除により、タッチパネル部256のタッチ入力及び終話ボタン120の操作が有効となる。 To end the talk, the lock key 118 is pressed (step S903), and the lock is released (step S904). With this unlocking, the touch input on the touch panel unit 256 and the operation of the call end button 120 become effective.
 そこで、終話ボタン120を選択し(ステップS905)、それを押下すると、終話となる(ステップS906)。 Therefore, when the end button 120 is selected (step S905) and pressed, the end of the call is reached (step S906).
 斯かる構成では、ロックキー118が通話中に不用意に操作されると、タッチパネル部256が有効となり、ユーザの予期しない終話を生起させるおそれがある。また、タッチパネルロックが自動化されていても、その解除を容易に行えるため、多重操作では誤動作を回避できない不都合がある。このような課題は上記実施の形態又は後述の他の実施の形態においても解決され、利便性の高い携帯端末装置が実現されている。しかも、上記実施の形態によれば、ロック解除ボタンの設置が不要であり、機械的なスイッチや制御を省略することができ、携帯端末装置のコンパクト化、シンプル化に寄与する。 In such a configuration, if the lock key 118 is inadvertently operated during a call, the touch panel unit 256 becomes active, which may cause a call end the user did not intend. Moreover, even when the touch panel lock is applied automatically, it can be released easily, so this multiple-step operation cannot avoid malfunction. These problems are solved in the above embodiment and in the other embodiments described later, realizing a highly convenient mobile terminal device. Furthermore, according to the above embodiment, no unlock button needs to be installed, and mechanical switches and their control can be omitted, contributing to a more compact and simpler mobile terminal device.
〔他の実施の形態〕 [Other Embodiments]
 (1) 上記実施の形態では、タッチ入力から得られる少なくとも2種類の物理情報を判断要素としてタッチ入力のパターンを認識している。物理情報として、タッチ入力のポイント数、位置、速度、加速度、角度、圧力、角速度を例示しているが、これに限定されない。そこで、物理情報としては、これらの何れか又はこれらの何れかを含んでもよい。 (1) In the above embodiment, the touch input pattern is recognized using at least two types of physical information obtained from the touch input as determination elements. Examples of the physical information include the number of touch input points, position, speed, acceleration, angle, pressure, and angular velocity, but are not limited thereto. Therefore, the physical information may include any one of these or any of these.
 (2) パターン認識は、2種類以上の物理情報の論理積を判断要素とすればよい。一例として、タッチ位置、加速度の論理積(第2の実施の形態:図11)、タッチポイントの数、その位置、加速度、角度及び圧力の変化の論理積(第3の実施の形態:図22、第4の実施の形態:図32)を例示したが、これに限定されない。物理情報の論理積は、種類の異なる物理情報を組み合わせ、その物理情報も少なくとも2種類であればよく、3種類以上を組み合わせ、論理積によってパターンを生成させてもよい。 (2) Pattern recognition may use the logical product of two or more types of physical information as determination elements. As examples, the logical product of touch position and acceleration (second embodiment: FIG. 11) and the logical product of the number of touch points, their positions, acceleration, angle, and pressure changes (third embodiment: FIG. 22; fourth embodiment: FIG. 32) were shown, but the present invention is not limited to these. The logical product may combine physical information of different types; at least two types suffice, and three or more types may be combined to generate a pattern by their logical product.
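The logical-product combination described in point (2) can be sketched as follows. Illustrative Python; the individual predicates and their thresholds are assumptions, the point being only that any number of per-criterion checks can be AND-combined.

```python
def logical_product(*checks):
    """Combine any number of per-criterion checks into one AND decision."""
    def combined(registered, observed):
        return all(check(registered, observed) for check in checks)
    return combined

# Illustrative per-criterion predicates (keys "p" = pressure, "a" = acceleration).
position_ok = lambda r, o: abs(o["x"] - r["x"]) <= 1 and abs(o["y"] - r["y"]) <= 1
pressure_ok = lambda r, o: 0.8 * r["p"] <= o["p"] <= 1.2 * r["p"]
accel_ok    = lambda r, o: 0.8 * r["a"] <= o["a"] <= 1.2 * r["a"]

# Two criteria (as in the second embodiment) or three (third and fourth
# embodiments) are just different combinations of the same building blocks.
match2 = logical_product(position_ok, accel_ok)
match3 = logical_product(position_ok, pressure_ok, accel_ok)
```

Adding a criterion only adds one predicate to the product, so extending from two to three or more types of physical information needs no structural change.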
 (3) 上記実施の形態では、加速度センサ部60を備えているが、これに限定されない。単位時間当たりのタッチ位置の移動から速度を求め、この速度を微分して加速度を算出する構成としてもよい。 (3) In the above embodiment, the acceleration sensor unit 60 is provided, but the present invention is not limited to this. A configuration may be used in which the velocity is obtained from the movement of the touch position per unit time, and the acceleration is calculated by differentiating the velocity.
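The sensorless alternative in point (3), speed from displacement per unit time followed by differentiation of the speed, can be illustrated numerically. Illustrative Python with assumed names:

```python
def speeds(positions, dt):
    """Speed between consecutive (x, y) samples taken every dt seconds."""
    out = []
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        out.append(((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt)
    return out

def accelerations(positions, dt):
    """Finite-difference derivative of the speed sequence."""
    v = speeds(positions, dt)
    return [(v1 - v0) / dt for v0, v1 in zip(v, v[1:])]
```

For touch positions sampled at 1 s intervals at x = 0, 1, 3, 6, the speeds are 1, 2, 3 and the derived acceleration is a constant 1, so no acceleration sensor unit 60 is needed for this criterion.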
 (4) 上記実施の形態では、角速度センサ部64は、携帯端末装置20の角速度を検出しているが、これに限定されない。携帯端末装置20の移動を検出するセンサを備え、そのセンサ出力から角速度を算出してもよい。また、携帯端末装置20の角速度に代え、タッチパネル部56に対するタッチ位置の角速度を算出し、これを物理情報としてもよい。 (4) In the above embodiment, the angular velocity sensor unit 64 detects the angular velocity of the mobile terminal device 20, but is not limited to this. A sensor that detects the movement of the mobile terminal device 20 may be provided, and the angular velocity may be calculated from the sensor output. Moreover, it replaces with the angular velocity of the portable terminal device 20, and the angular velocity of the touch position with respect to the touch panel part 56 is calculated, and it is good also as physical information.
 (5) 上記実施の形態では、電子機器の一例として、携帯端末装置20を例示したが、本開示の電子機器はこれに限定されない。タッチ入力が可能な電子機器であればよい。例えば、パーソナルコンピュータ(PC)320(図47)、折畳み可能な携帯電話機420(図48)、リモートコントローラ520(図49)であってもよいし、タッチ入力が可能な電子機器であれば、ゲーム機器や、自動車のエンジン起動装置であってもよい。 (5) In the above embodiments, the mobile terminal device 20 was illustrated as an example of the electronic device, but the electronic device of the present disclosure is not limited to this; any electronic device capable of touch input may be used. For example, it may be a personal computer (PC) 320 (FIG. 47), a foldable mobile phone 420 (FIG. 48), or a remote controller 520 (FIG. 49), and, as long as it is an electronic device capable of touch input, it may also be a game machine or an automobile engine starting device.
 (6) PC320は、図47に示すように、キーボード側筐体部322と、表示側筐体部324とがヒンジ部326によって折畳み可能に構成され、表示側筐体部324には、表示部48とともにタッチパネル部56が設定されている。タッチパネル部56は、上記実施の形態に準じて構成され、タッチ入力から既述の処理でパターン認識やパターン登録が可能である。このため、このPC320においても、上記実施の形態と同様の機能を実現し、その効果を得ることができる。 (6) As shown in FIG. 47, the PC 320 is configured so that a keyboard-side housing 322 and a display-side housing 324 can be folded about a hinge 326, and the touch panel unit 56 is provided in the display-side housing 324 together with the display unit 48. The touch panel unit 56 is configured according to the above embodiments, and pattern recognition and pattern registration from touch input are possible by the processing already described. Therefore, this PC 320 also realizes the same functions as the above embodiments and obtains the same effects.
 (7) 携帯電話機420は、電子機器又は既述の携帯端末装置20の一例であって、図48に示すように、操作側筐体部422と、表示側筐体部424とがヒンジ部426によって折畳み可能に構成され、表示側筐体部424には、表示部48とともにタッチパネル部56が設定されている。タッチパネル部56は、上記実施の形態に準じて構成され、タッチ入力から既述の処理でパターン認識やパターン登録が可能である。このため、この携帯電話機420においても、上記実施の形態と同様の機能を実現し、その効果を得ることができる。この場合、キーパッド428は、既述の入力部46(図4)に含まれる。 (7) The mobile phone 420 is an example of an electronic device or of the mobile terminal device 20 described above. As shown in FIG. 48, an operation-side housing 422 and a display-side housing 424 can be folded about a hinge 426, and the touch panel unit 56 is provided in the display-side housing 424 together with the display unit 48. The touch panel unit 56 is configured according to the above embodiments, and pattern recognition and pattern registration from touch input are possible by the processing already described. Therefore, this mobile phone 420 also realizes the same functions as the above embodiments and obtains the same effects. In this case, the keypad 428 is included in the input unit 46 (FIG. 4) described above.
 (8) リモートコントローラ520は、図49に示すように、テレビジョン受信機等、各種の電子機器522を電波や音波を用いて遠隔操作する手段である。このようなリモートコントローラ520にも、本開示の電子機器又は携帯端末装置について述べた機能を搭載させ、同様の機能や効果を得ることができる。 (8) The remote controller 520 is means for remotely operating various electronic devices 522 such as a television receiver using radio waves or sound waves, as shown in FIG. Such a remote controller 520 can be equipped with the functions described for the electronic device or the mobile terminal device of the present disclosure, and the same functions and effects can be obtained.
 (9) 自動車のエンジン起動装置に本開示の機能を搭載すれば、ユーザのみが登録したパターンに合致する入力パターンにより、エンジンを起動させることができ、セキュリティを高めることができる。 (9) If the function of the present disclosure is installed in an automobile engine starter, the engine can be started with an input pattern that matches the pattern registered only by the user, and security can be improved.
 (10) 上記実施の形態では、ポイントの圧力〔N〕を例示(図17、図21、図39、図43)しているが、これに限定されない。ポイントの圧力にはポイントに作用する力を面積で除して求められる圧力〔Pa〕を用いてもよい。 (10) In the above embodiment, the point pressure [N] is exemplified (FIGS. 17, 21, 39, and 43), but is not limited thereto. As the pressure at the point, a pressure [Pa] obtained by dividing the force acting on the point by the area may be used.
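The two pressure expressions in point (10) are related by dividing the force acting on the point by the contact area. An illustrative sketch with assumed names:

```python
def to_pascal(force_n, area_m2):
    """Pressure [Pa] = force [N] / contact area [m^2]."""
    return force_n / area_m2
```

For example, a 0.5 N fingertip press over a 1 cm² (0.0001 m²) contact patch corresponds to 5000 Pa, so either quantity can serve as the pressure element of the pattern as long as registration and matching use the same one.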
 As described above, embodiments of the electronic device, the operation detection method, and the operation detection program of the present disclosure have been explained, but the present invention is not limited to the above description. Various modifications and changes can of course be made by those skilled in the art based on the gist of the invention described in the claims or disclosed in the specification, and such modifications and changes are naturally included in the scope of the present invention.
 The electronic device, operation detection method, and operation detection program of the present disclosure recognize a pattern using, as determination elements, at least two types of physical information obtained from touch input, and perform pattern registration and input-pattern determination on that basis. They are therefore useful in that they improve the convenience of devices equipped with a touch input unit.
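The determination described here — comparing an input pattern against a registered pattern, or its approximate range, on at least two kinds of physical information that must both agree — might be sketched as follows. The field names, the choice of point count and pressure as the two determination elements, and the tolerance value are illustrative assumptions, not taken from the specification:

```python
from dataclasses import dataclass

@dataclass
class TouchPattern:
    points: int      # number of simultaneous touch points
    pressure: float  # pressure at the touch point(s) [N]

def matches(candidate: TouchPattern, registered: TouchPattern,
            pressure_tolerance: float = 0.5) -> bool:
    """True when the candidate matches the registered pattern exactly or
    within its approximate range. Both determination elements must agree,
    i.e. a logical AND of the two kinds of physical information."""
    points_ok = candidate.points == registered.points
    pressure_ok = abs(candidate.pressure - registered.pressure) <= pressure_tolerance
    return points_ok and pressure_ok

registered = TouchPattern(points=2, pressure=3.0)
print(matches(TouchPattern(points=2, pressure=3.2), registered))  # True: within approximate range
print(matches(TouchPattern(points=1, pressure=3.0), registered))  # False: point count differs
```

Widening `pressure_tolerance` widens the "approximate range" of the registered pattern; setting it to zero demands an exact match on that element.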
 2 Electronic device
 4 Touch input unit
 6 Pattern recognition unit
 8 Pattern registration unit
 10 Pattern determination unit
 40 Lock control unit
 42 Processor (application activation unit)
 44 Sensor unit
 52 Communication unit (call-end function unit, login function unit)
 54 Storage unit
 56 Touch panel unit
 58 Pressure sensor unit
 60 Acceleration sensor unit
 62 Position sensor unit
 64 Angular velocity sensor unit
 72 Program storage unit
 74 Data storage unit

Claims (6)

  1.  An electronic device comprising:
     a touch input unit;
     a pattern recognition unit that recognizes a pattern using, as determination elements, at least two types of physical information obtained from touch input; and
     a pattern determination unit that, when the pattern recognized by the pattern recognition unit matches a registered pattern registered in advance in a registration unit or falls within an approximate range of the registered pattern, generates an output indicating that the recognized pattern matches the registered pattern or falls within the approximate range of the registered pattern.
  2.  The electronic device according to claim 1, wherein the physical information includes any one or more of the number of touch input points, position, velocity, acceleration, angle, pressure, and angular velocity.
  3.  The electronic device according to claim 1, wherein the pattern recognition unit uses a logical product of two or more types of the physical information as a determination element and recognizes the pattern by the determination element.
  4.  The electronic device according to claim 1, further comprising any one of: a lock function unit that releases a lock in response to the determination output of the pattern determination unit; an application activation unit that starts an application in response to the determination output of the pattern determination unit; a call-end function unit that terminates a call in response to the determination output of the pattern determination unit; and a login function unit that logs in to a website in response to the determination output of the pattern determination unit.
  5.  An operation detection method executed by an electronic device equipped with a touch input unit, the method comprising:
     recognizing a pattern using, as determination elements, at least two types of physical information obtained from touch input; and
     when the recognized pattern matches a registered pattern registered in advance in a registration unit or falls within an approximate range of the registered pattern, generating an output indicating that the recognized pattern matches the registered pattern or falls within the approximate range of the registered pattern.
  6.  An operation detection program executed by an electronic device equipped with a touch input unit, the program causing the electronic device to realize:
     a function of recognizing a pattern using, as determination elements, at least two types of physical information obtained from touch input; and
     a function of, when the recognized pattern matches a registered pattern registered in advance in a registration unit or falls within an approximate range of the registered pattern, generating an output indicating that the recognized pattern matches the registered pattern or falls within the approximate range of the registered pattern.
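Claim 4 couples the determination output to exactly one of several actions: lock release, application launch, call termination, or website login. A minimal dispatch sketch follows; the action names and handler bodies are illustrative assumptions, and a real device would perform the actual unlock, launch, and so on instead of returning a message:

```python
def on_match(action: str) -> str:
    """Dispatch a positive determination output to one of the actions
    listed in claim 4. Handlers here only return a status message."""
    handlers = {
        "release_lock": lambda: "lock released",
        "launch_application": lambda: "application launched",
        "end_call": lambda: "call ended",
        "login": lambda: "logged in to website",
    }
    try:
        return handlers[action]()
    except KeyError:
        raise ValueError(f"unknown action: {action}") from None

print(on_match("release_lock"))  # lock released
```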
PCT/JP2009/071399 2009-12-24 2009-12-24 Electronic device, operation detection method and operation detection program WO2011077525A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2011547140A JPWO2011077525A1 (en) 2009-12-24 2009-12-24 Electronic device, operation detection method, and operation detection program
PCT/JP2009/071399 WO2011077525A1 (en) 2009-12-24 2009-12-24 Electronic device, operation detection method and operation detection program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2009/071399 WO2011077525A1 (en) 2009-12-24 2009-12-24 Electronic device, operation detection method and operation detection program

Publications (1)

Publication Number Publication Date
WO2011077525A1 true WO2011077525A1 (en) 2011-06-30

Family

ID=44195088

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/071399 WO2011077525A1 (en) 2009-12-24 2009-12-24 Electronic device, operation detection method and operation detection program

Country Status (2)

Country Link
JP (1) JPWO2011077525A1 (en)
WO (1) WO2011077525A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6755125B2 (en) * 2016-05-31 2020-09-16 シャープ株式会社 Information processing equipment and programs

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07230374A (en) * 1994-02-17 1995-08-29 Fujitsu Ltd Method and device for generating new window in multiwindow system
JP2009199093A (en) * 2006-06-09 2009-09-03 Apple Inc Touch screen liquid crystal display
JP2009282634A (en) * 2008-05-20 2009-12-03 Canon Inc Information processor, its control method, program and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000056889A (en) * 1998-08-04 2000-02-25 Matsushita Electric Ind Co Ltd Portable terminal device and window control method
JP2000137555A (en) * 1998-11-02 2000-05-16 Sony Corp Information processor, processing method and recording medium
JP4887855B2 (en) * 2006-03-22 2012-02-29 日本電気株式会社 Portable electronic device and control method thereof
US8564544B2 (en) * 2006-09-06 2013-10-22 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014527675A (en) * 2011-09-12 2014-10-16 モトローラ モビリティ エルエルシーMotorola Mobility Llc Use of differential pressure by touch sensor display screen
JP2014527678A (en) * 2011-09-12 2014-10-16 モトローラ モビリティ エルエルシーMotorola Mobility Llc Use of differential pressure by touch sensor display screen
US9069460B2 (en) 2011-09-12 2015-06-30 Google Technology Holdings LLC Using pressure differences with a touch-sensitive display screen
JP2013137722A (en) * 2011-12-28 2013-07-11 Kyocera Corp Device, method, and program
US9323444B2 (en) 2011-12-28 2016-04-26 Kyocera Corporation Device, method, and storage medium storing program
JP2013205986A (en) * 2012-03-27 2013-10-07 Kyocera Corp Electronic device
JP2015518212A (en) * 2012-04-17 2015-06-25 ゼットティーイー コーポレイション Unlocking method, device and electronic terminal
JP2013232047A (en) * 2012-04-27 2013-11-14 Konica Minolta Inc Image processing apparatus, control method of the same, and control program of the same
JP2014142813A (en) * 2013-01-24 2014-08-07 Sharp Corp Electronic apparatus and electronic apparatus operation control program
JP2015115077A (en) * 2013-12-12 2015-06-22 ビステオン グローバル テクノロジーズ インコーポレイテッド Implementing hidden touch surface
JP2017187960A (en) * 2016-04-07 2017-10-12 ソニーモバイルコミュニケーションズ株式会社 Information processor

Also Published As

Publication number Publication date
JPWO2011077525A1 (en) 2013-05-02

Similar Documents

Publication Publication Date Title
WO2011077525A1 (en) Electronic device, operation detection method and operation detection program
US7562459B2 (en) Method for entering commands and/or characters for a portable communication device equipped with a tilt sensor
JP4935669B2 (en) Portable device
WO2012070682A1 (en) Input device and control method of input device
CN103294300B (en) Sensor management apparatus, method, and computer program product
CN100570698C (en) The active keyboard system that is used for hand-held electronic equipment
US9958942B2 (en) Data input device
JP2009048600A (en) Inertia detection input controller, receiver, and interactive system thereof
JP2007219966A (en) Projection input device, and information terminal and charger having projection input device
KR100777107B1 (en) apparatus and method for handwriting recognition using acceleration sensor
EP1779224A2 (en) A touchpad device
JP2009258946A (en) Capacitive touch sensor
US20220253209A1 (en) Accommodative user interface for handheld electronic devices
WO2013137455A1 (en) Information terminal and execution control method
EP1160651A1 (en) Wireless cursor control
JPWO2008111138A1 (en) Information processing system, operation device, and information processing method
TW201327274A (en) Mobile communication device and method for inputting number data thereof
JP5955912B2 (en) Pointing device and portable computer.
JP2000187551A (en) Input device
JP5992380B2 (en) Pointing device, notebook personal computer, and operation method.
KR102232308B1 (en) Smart input device and method for operating the same
WO2016190834A1 (en) Manipulator for controlling an electronic device
KR20080113465A (en) Apparatus for controlling operation of electronic equipment for the use of a car, using haptic device, and electronic system for the use of a car comprising the apparatus
JP2001084099A (en) Track ball
US20200249767A1 (en) Virtual Keyboard Input Method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09852542

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2011547140

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09852542

Country of ref document: EP

Kind code of ref document: A1