WO2015087977A1 - Portable terminal, operation processing method, program, and recording medium - Google Patents

Portable terminal, operation processing method, program, and recording medium

Info

Publication number
WO2015087977A1
Authority
WO
WIPO (PCT)
Prior art keywords
contact
area
display screen
coordinates
smartphone
Application number
PCT/JP2014/082865
Other languages
French (fr)
Japanese (ja)
Inventor
拓也 野津
Original Assignee
シャープ株式会社
Application filed by シャープ株式会社
Publication of WO2015087977A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1626 Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F3/0418 Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G06F3/04186 Touch location disambiguation

Definitions

  • the present invention mainly relates to a portable terminal that receives an operation on a display screen, and an operation processing method using such a portable terminal.
  • Various portable terminals equipped with a touch panel are widely used. Such a portable terminal can accept an input operation with a finger on the display screen of the touch panel. Examples of such portable terminals include the portable terminals disclosed in Patent Documents 1 to 4.
  • The portable terminal of Patent Document 1 can thereby prevent erroneous operations that occur when, for example, the user holds the terminal in one hand and operates the touch panel with the thumb of that hand as shown in FIG. 13, and the middle finger, ring finger, little finger, and/or the ball of the thumb of that hand unintentionally touches the touch panel.
  • Japanese Published Patent Publication JP 2012-234386 A (published November 29, 2012); Japanese Published Patent Publication JP 2011-28603 A (published February 10, 2011); Japanese Published Patent Publication JP 2013-69190 A (published April 18, 2013); Japanese Published Patent Publication JP 2000-39964 A (published February 8, 2000)
  • the portable terminal of Patent Document 1 cannot activate the above-described erroneous operation prevention function unless the user operates the invalid area setting application in advance.
  • In addition, when a user who holds the terminal in a way that touches only a small part of the touch panel runs the invalid area setting application, the portable terminal of Patent Document 1 may fail to prevent the above-described erroneous operation for another user whose identification information is similar to that user's. This is because the size of the area of the touch panel that the hand holding the portable terminal can touch unintentionally depends on the characteristics of the hand (the size of the hand, the length of the fingers, and so on).
  • The portable terminal of Patent Document 2 cannot use its erroneous operation prevention function in a specific situation in which erroneous operation is likely (specifically, while the user holds the terminal in one hand with the ball of the thumb touching the right side surface and the four fingers other than the thumb touching the left side surface, as shown in FIG. 13), because the terminal erroneously recognizes that it is being operated with both hands.
  • the portable terminal of Patent Document 3 cannot activate the erroneous operation prevention function when the user unintentionally touches the corner of the touch panel while performing a continuous touch operation. Furthermore, although the portable terminal of Patent Document 4 can operate the erroneous operation prevention function in the handwriting input mode, it cannot operate the erroneous operation prevention function in the normal display mode.
  • The present invention has been made in view of the above problems, and its main object is to realize a portable terminal capable of reliably activating an erroneous operation prevention function in a specific situation in which erroneous operation is likely to occur.
  • To solve the above problems, a mobile terminal according to one aspect of the present invention includes: a display screen that receives operations designating coordinates; contact detection units provided on each of a plurality of side surfaces of the terminal; a determination unit that determines whether a plurality of the contact detection units are detecting a contact object; and an invalidation unit that invalidates operations designating coordinates within a specific area of the display screen when the determination unit determines that a plurality of the contact detection units are detecting a contact object.
  • An operation processing method according to one aspect of the present invention is an operation processing method performed by a mobile terminal that has a display screen receiving operations designating coordinates and contact detection units provided on each of a plurality of side surfaces of the terminal. The method includes: a determination step of determining whether a plurality of the contact detection units are detecting a contact object; and an invalidation step of invalidating operations designating coordinates within a specific area of the display screen when it is determined in the determination step that a plurality of the contact detection units are detecting a contact object.
  • The portable terminal according to one aspect of the present invention has the effect that the erroneous operation prevention function can be reliably activated in a specific situation in which erroneous operation is likely.
  • Hereinafter, a smartphone 1, which is Embodiment 1 of a mobile terminal according to the present invention, will be described with reference to FIGS. 1 to 4. Note that a smartphone is only one example of a portable terminal; the portable terminal according to the present invention can also be implemented, for example, as a feature phone or a tablet terminal.
  • FIG. 2 is an external view of the smartphone 1. As shown in FIG. 2(a), the smartphone 1 has no frame area, and the entire front surface of the smartphone 1 is covered by a touch-panel display screen. As shown in FIGS. 2(a) and 2(b), a contact sensor 13L is provided on the left side surface L of the smartphone 1, and a contact sensor 13R is provided on the right side surface R of the smartphone 1.
  • That is, while the user holds the smartphone 1 in one hand with the ball of the thumb touching the right side surface and the four fingers other than the thumb touching the left side surface, both the contact sensor 13L and the contact sensor 13R detect the contact of the hand. When contact is detected by both the contact sensor 13L and the contact sensor 13R, the smartphone 1 provides operation invalid areas (specific areas in which touch operations designating coordinates are invalidated) at the right end and the left end of the display screen.
  • Note that the mobile terminal according to the present invention may include not only a display screen with operation invalid areas on the front surface of the terminal but also a touch pad with operation invalid areas on the back surface of the terminal. In that case, when contact is detected by both of the two contact sensors, the portable terminal may provide operation invalid areas at the left and right ends of the display screen and at the left and right ends of the touch pad. In addition, a frame region may be provided around the display screen.
  • FIG. 1 is a block diagram showing a main configuration of the smartphone 1. As illustrated in FIG. 1, the smartphone 1 includes a storage unit 11, a control unit 12, two contact sensors (contact sensors 13R and 13L), and a display unit 14.
  • the storage unit 11 stores an OS (Operating System) program, various application programs, various control programs, various data, and the like.
  • The control unit 12 is a CPU that controls the smartphone 1 as a whole.
  • the control unit 12 also functions as the application execution unit 121, the touch event control unit 122, and the sensor state detection processing unit 123 by reading the corresponding program from the storage unit 11.
  • the application execution unit 121 performs processing according to an application read from the control unit 12 (hereinafter, “application”). For example, the application execution unit 121 displays a user interface screen (UI screen) of the application on the display unit 14.
  • the touch event control unit 122 issues a touch event with the contact coordinates on the display screen as an argument to the foreground application.
  • When both of the two contact sensors detect the touch of a hand, the touch event control unit 122 invalidates touch operations that designate coordinates within the operation invalid areas. That is, while both contact sensors detect a hand contact, the touch event control unit 122 does not issue a touch event for a touch operation that designates only coordinates within an operation invalid area.
  • the touch event control unit 122 includes an operation invalid area specifying unit 1221.
  • the operation invalid area specifying unit 1221 reads data indicating the position and size of the operation invalid area and identifies the operation invalid area.
  • the sensor state detection processing unit 123 determines whether or not both of the two contact sensors are detecting the contact of the hand (contact object).
  • When the contact sensor 13R detects the contact of a hand, it notifies the sensor state detection processing unit 123 of the detection. Similarly, when the contact sensor 13L detects a hand contact, it notifies the sensor state detection processing unit 123 of the detection.
  • the display unit 140 is a touch panel display that displays a user interface screen of the application.
  • FIG. 3 is a flowchart showing the operation of the smartphone 1.
  • FIG. 4 is a diagram illustrating an operation invalid area of the smartphone 1.
  • First, the smartphone 1 checks the states of the two contact sensors (S1). Specifically, the touch event control unit 122 identifies the coordinates touched by the hand on the touch panel (the touch coordinates) and inquires of the sensor state detection processing unit 123 whether both of the two contact sensors are being touched.
  • In response, the sensor state detection processing unit 123 determines whether the notification that a hand contact has been detected has been received from both of the two contact sensors (S2), and notifies the touch event control unit 122 of the determination result.
  • If the smartphone 1 determines in S2 that one or both of the two contact sensors are not being touched (that is, the notification has not been received from one or both of the two contact sensors), it proceeds to S6, described later.
  • If both contact sensors are being touched, the operation invalid area specifying unit 1221 of the touch event control unit 122, having received the notification, reads data indicating the position and size of the operation invalid areas from the storage unit 11. Specifically, it reads data indicating the coordinates of the four corners of the rectangular operation invalid area near the left end of the touch panel and the coordinates of the four corners of the rectangular operation invalid area near the right end of the touch panel (see FIG. 4) (S3).
  • Next, the touch event control unit 122 determines whether the touch coordinates are located within an operation invalid area (S4).
  • When it determines that the touch coordinates are located within an operation invalid area, the touch event control unit 122 treats the touch operation on the display unit 140 as invalid (S5). That is, the touch event control unit 122 ends the operation according to the flowchart of FIG. 3 without issuing a touch event with the touch coordinates as an argument to the application execution unit 121.
  • Otherwise, the smartphone 1 treats the touch operation as valid and performs an operation according to the touch event (S6). Specifically, the touch event control unit 122 issues a touch event with the touch coordinates as an argument to the application execution unit 121, and the application execution unit 121 executes processing corresponding to the touch event. The smartphone 1 then ends the operation according to the flowchart of FIG. 3. (An illustrative sketch of this flow is given below.)
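  • For illustration only, the following Python sketch models the S1 to S6 flow described above. The class and method names (Rect, TouchEventController, is_touched, handle_touch_event) and the coordinate conventions are hypothetical and are not taken from the disclosure; the sketch merely restates the described behaviour: when both side sensors report a hand contact, touch coordinates falling inside the stored edge rectangles are dropped instead of being issued as a touch event.
```python
# Illustrative sketch of the S1-S6 flow of FIG. 3 (hypothetical names).

class Rect:
    """Axis-aligned rectangle given by its corner coordinates (y grows downward)."""

    def __init__(self, left, top, right, bottom):
        self.left, self.top, self.right, self.bottom = left, top, right, bottom

    def contains(self, x, y):
        return self.left <= x <= self.right and self.top <= y <= self.bottom


class TouchEventController:
    """Stand-in for the touch event control unit 122 in this sketch."""

    def __init__(self, left_sensor, right_sensor, invalid_areas, app):
        self.left_sensor = left_sensor      # contact sensor 13L (is_touched() -> bool)
        self.right_sensor = right_sensor    # contact sensor 13R (is_touched() -> bool)
        self.invalid_areas = invalid_areas  # edge rectangles read from storage (S3)
        self.app = app                      # application execution unit

    def on_touch(self, x, y):
        # S1/S2: check whether both side sensors are currently detecting a hand.
        held_with_one_hand = (self.left_sensor.is_touched()
                              and self.right_sensor.is_touched())
        # S4/S5: if so, drop coordinates that fall inside an operation invalid area.
        if held_with_one_hand and any(a.contains(x, y) for a in self.invalid_areas):
            return None                     # no touch event is issued
        # S6: otherwise issue the touch event with the coordinates as its argument.
        return self.app.handle_touch_event(x, y)
```
  • In this sketch, invalid_areas would hold the two rectangles near the left and right ends of the display that are read in S3.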
  • As described above, the smartphone 1 includes: the display screen (display unit 140) that accepts operations designating coordinates; the contact sensors (contact sensor 13R and contact sensor 13L) provided on the two side surfaces of the terminal; the sensor state detection processing unit 123, which determines whether both of the two contact sensors have detected the contact of a hand; and the touch event control unit 122, which invalidates operations on the operation invalid areas of the display screen when it is determined that both of the two contact sensors have detected the contact of a hand.
  • Therefore, when the user simply rests the smartphone 1 on the palm of one hand (that is, the user is not gripping the smartphone 1, so one or both of the two contact detection units detect no hand contact), the erroneous operation prevention function does not operate. In other words, the smartphone 1 does not activate the erroneous operation prevention function in a situation in which erroneous operation is unlikely.
  • On the other hand, when the smartphone 1 is operated with the thumb of the hand holding it and part of that hand (such as the ball of the thumb or the little finger) unintentionally touches an operation invalid area during operation, touch operations designating coordinates within the operation invalid area are invalidated.
  • the smartphone 1 activates an erroneous operation prevention function under a specific situation where an erroneous operation is easily performed.
  • the erroneous operation prevention function works reliably regardless of the type of user who uses the smartphone 1 (for example, a user who does not know that the smartphone 1 has the erroneous operation prevention function).
  • the smartphone 1 does not operate the erroneous operation prevention function under a situation where it is difficult to perform an erroneous operation, and can reliably operate the erroneous operation prevention function under a specific situation where the erroneous operation is easily performed.
  • Note that the smartphone 1 may periodically determine, regardless of whether the user is touching the touch panel, whether the notification has been received from both of the two contact sensors, and may cache information indicating the determination result each time.
  • In that case, when the user touches a point on the touch panel with a hand, the smartphone 1 may perform the processing from S3 onward without performing the processing of S1 and S2. That is, the smartphone 1 may first determine whether to proceed to S3 or to S6 based on the most recently cached information, and then execute the processing of S3 or S6 based on that determination. (A sketch of this caching scheme is given below.)
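  • A minimal sketch of this caching variant follows, assuming a hypothetical is_touched() interface on each sensor and a hypothetical polling interval; neither detail is specified in the disclosure.
```python
# Illustrative sketch of caching the S1/S2 determination (hypothetical names).
import threading
import time


class CachedSensorState:
    """Polls the two contact sensors periodically and caches the latest result,
    so that handling a touch only needs the cached value (skipping S1 and S2)."""

    def __init__(self, left_sensor, right_sensor, interval_s=0.1):
        self.left_sensor = left_sensor
        self.right_sensor = right_sensor
        self.interval_s = interval_s        # polling period (hypothetical value)
        self.both_touched = False           # most recently cached determination
        self._stop = threading.Event()

    def _poll(self):
        while not self._stop.is_set():
            self.both_touched = (self.left_sensor.is_touched()
                                 and self.right_sensor.is_touched())
            time.sleep(self.interval_s)

    def start(self):
        threading.Thread(target=self._poll, daemon=True).start()

    def stop(self):
        self._stop.set()
```
  • On a touch, the controller would branch directly to S3 or S6 depending on the cached both_touched value.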
  • In Embodiment 2, the smartphone 1A is provided with a contact sensor 13AL and a contact sensor 13AR, both of which detect not only the contact of a hand but also the contact area.
  • When both sensors detect a hand contact, the smartphone 1A provides, at the left and right ends of the display screen, operation invalid areas whose width is proportional to the sum of the contact area detected by the contact sensor 13AL and the contact area detected by the contact sensor 13AR.
  • FIG. 5 is a block diagram showing a main configuration of the smartphone 1A.
  • the smartphone 1A includes a storage unit 11, a control unit 12A, two contact sensors (contact sensors 13AR and 13AL), and a display unit 14.
  • The control unit 12A is a CPU that controls the entire smartphone 1A.
  • the control unit 12A also functions as the application execution unit 121, the touch event control unit 122A, and the sensor state detection processing unit 123A by reading the corresponding program from the storage unit 11.
  • Like the touch event control unit 122, the touch event control unit 122A issues touch events with the contact coordinates on the display screen as arguments, but it does not issue a touch event when it detects a touch operation that designates only coordinates within an operation invalid area.
  • the touch event control unit 122A includes an operation invalid area determination unit 1221A.
  • When both of the two contact sensors detect a hand contact and its contact area, the operation invalid area determination unit 1221A reads data indicating the positions of the operation invalid areas (the left end and the right end of the display screen). It then determines to provide, at the left end and the right end of the display screen, operation invalid areas whose width is proportional to the sum of the contact area detected by the contact sensor 13AL and the contact area detected by the contact sensor 13AR.
  • the sensor state detection processing unit 123A determines whether or not both of the two contact sensors are detecting a hand contact.
  • the sensor state detection processing unit 123A includes a contact area specifying unit 1231.
  • When the contact sensor 13AR detects the contact of a hand, it notifies the contact area specifying unit 1231 of the detection and of the contact area. Similarly, when the contact sensor 13AL detects a hand contact, it notifies the contact area specifying unit 1231 of the detection and of the contact area.
  • FIG. 6 is a flowchart showing the operation of the smartphone 1A.
  • FIG. 7 is a diagram illustrating an operation invalid area of the smartphone 1A.
  • the smartphone 1A performs the same processing as S1 in S11, and performs the same processing as S2 in S12.
  • If it is determined in S12 that both contact sensors are being touched, the contact area specifying unit 1231 identifies the contact area between the contact sensor 13AR and the hand and the contact area between the contact sensor 13AL and the hand (S13), and notifies the touch event control unit 122A of each contact area. If it is determined in S12 that one or both of the two contact sensors are not being touched, the smartphone 1A performs the process of S17 (the same process as S6).
  • Next, as shown in FIG. 7, the operation invalid area determination unit 1221A of the touch event control unit 122A determines the size of the operation invalid areas at the right end and the left end of the display screen so that they become larger as the sum of the contact areas increases (S14).
  • the smartphone 1A performs the processing from S15 onward after S14, but S15 to S17 are the same steps as S4 to S6, respectively, and thus description thereof is omitted.
  • Note that the smartphone 1A may be configured to provide, at the left end of the display screen, an operation invalid area whose size corresponds to the contact area detected by the contact sensor 13AL, and, at the right end of the display screen, an operation invalid area whose size corresponds to the contact area detected by the contact sensor 13AR.
  • For example, the smartphone 1A may provide, at the left end of the display screen, an operation invalid area whose size is proportional to the contact area detected by the contact sensor 13AL, and, at the right end of the display screen, an operation invalid area whose width is proportional to the contact area detected by the contact sensor 13AR.
  • Alternatively, the smartphone 1A may identify, from predetermined N classes relating to the size of the contact area (for example, the three classes shown in FIG. 14(a)), the class to which the contact area detected by the contact sensor 13AL belongs, and provide, at the left end of the display screen, an operation invalid area whose size corresponds to the identified class. (A sizing sketch is given below.)
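  • The two sizing rules of Embodiment 2 can be sketched as follows. The proportionality constant, the clamping value, and the class thresholds below are hypothetical figures chosen only for illustration; the disclosure specifies neither concrete values nor units.
```python
# Illustrative sketch of the Embodiment 2 sizing rules (hypothetical values).

def invalid_area_width_proportional(area_left, area_right, k=0.05, max_width_px=200):
    """Width of each edge invalid area proportional to the *sum* of the two
    contact areas (cf. S14), clamped to a hypothetical maximum width."""
    return min(int(k * (area_left + area_right)), max_width_px)


def invalid_area_width_by_class(contact_area,
                                thresholds=(200, 500),      # hypothetical class bounds
                                widths_px=(40, 80, 120)):   # hypothetical widths
    """Class-based sizing: pick the width of the class the contact area falls in
    (cf. the three-stage classes of FIG. 14(a))."""
    for threshold, width in zip(thresholds, widths_px):
        if contact_area < threshold:
            return width
    return widths_px[-1]
```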
  • In Embodiment 3, the smartphone 1B is provided with a contact sensor 13BL and a contact sensor 13BR, both of which detect not only the contact of a hand but also the contact position.
  • The smartphone 1B provides an operation invalid area at the left end of the display screen at a position corresponding to the contact position detected by the contact sensor 13BL (specifically, a position adjacent to that contact position), and provides an operation invalid area at the right end of the display screen at a position corresponding to the contact position detected by the contact sensor 13BR (specifically, a position adjacent to that contact position).
  • the contact sensor 13BL and the contact sensor 13BR may detect the center coordinates of the area where the hand is in contact on the contact detection surface as the contact position.
  • FIG. 8 is a block diagram showing a main configuration of the smartphone 1B.
  • the smartphone 1B includes a storage unit 11, a control unit 12B, two contact sensors (contact sensors 13BR and 13BL), and a display unit 14.
  • The control unit 12B is a CPU that controls the entire smartphone 1B.
  • the control unit 12B also functions as the application execution unit 121, the touch event control unit 122B, and the sensor state detection processing unit 123B by reading the corresponding program from the storage unit 11.
  • Like the touch event control unit 122, the touch event control unit 122B issues touch events with the contact coordinates on the display screen as arguments, but it does not issue a touch event when it detects a touch operation that designates only coordinates within an operation invalid area.
  • the touch event control unit 122B includes an operation invalid area determination unit 1221B.
  • the operation invalid area determination unit 1221B determines the position of the operation invalid area provided at the left end of the display screen based on the contact position detected by the contact sensor 13BL. The position of the operation invalid area provided at the right end of the display screen is determined based on the contact position detected by the contact sensor 13BR.
  • Like the sensor state detection processing unit 123, the sensor state detection processing unit 123B determines whether both of the two contact sensors are detecting a hand contact.
  • the sensor state detection processing unit 123B includes a contact position specifying unit 1232.
  • When it is determined that both of the two contact sensors are detecting the contact of a hand, the contact position specifying unit 1232 identifies the contact position between the contact sensor 13BR and the hand and the contact position between the contact sensor 13BL and the hand.
  • When the contact sensor 13BR detects the contact of a hand, it notifies the contact position specifying unit 1232 of the detection and of the contact position. Similarly, when the contact sensor 13BL detects a hand contact, it notifies the contact position specifying unit 1232 of the detection and of the contact position.
  • FIG. 9 is a flowchart showing the operation of the smartphone 1B.
  • FIG. 10 is a diagram illustrating an operation invalid area of the smartphone 1B.
  • the smartphone 1B performs the same process as S1 in S21, and performs the same process as S2 in S22.
  • If it is determined in S22 that both contact sensors are being touched, the contact position specifying unit 1232 identifies the contact position between the contact sensor 13BR and the hand and the contact position between the contact sensor 13BL and the hand (for example, the center coordinates described above) (S23), and notifies the touch event control unit 122B of each contact position.
  • If it is determined in S22 that one or both of the two contact sensors are not being touched, the smartphone 1B performs the process of S27 (the same process as S6).
  • The operation invalid area determination unit 1221B of the touch event control unit 122B, having received the notification, determines the position of the operation invalid area at the left end of the display screen based on the contact position detected by the contact sensor 13BL, and determines the position of the operation invalid area at the right end of the display screen based on the contact position detected by the contact sensor 13BR (S24). For example, when the contact position detected by the contact sensor 13BL is separated from the bottom surface of the smartphone 1B by a distance d, the operation invalid area determination unit 1221B determines, as shown in FIG. 10, to provide the operation invalid area at a position at the left end of the display screen such that the distance between the center coordinates of the operation invalid area and the lower end of the display screen is d.
  • In other words, the operation invalid area determination unit 1221B determines the area of the display screen adjacent to the contact area between the contact sensor 13BL and the finger to be the operation invalid area.
  • Here, the contact area and an area of the display screen being "adjacent" means that the contact area and that area of the display screen are in contact with each other.
  • the operation invalid area determining unit 1221B determines the position of the operation invalid area at the right end of the display screen in the same manner.
  • the smartphone 1B performs the processing from S25 onward after S24, but S25 to S27 are the same steps as S4 to S6, respectively, and the description thereof is omitted.
  • Note that the operation invalid area may be a semi-elliptical area as shown in FIG. 10(c), a rectangular area as shown in FIG. 10(d), or an area of another shape (for example, a circular area).
  • In FIG. 10(c), the length of the major axis of the semi-ellipse is three times the length of the minor axis; however, the major axis may be less than three times the minor axis, or more than three times (for example, four times). Likewise, in FIG. 10(d), the length of the long side of the rectangle is three times the length of the short side; however, the long side may be less than three times the short side, or more than three times (for example, four times). (A placement sketch is given below.)
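  • A minimal sketch of the placement rule of S24 follows, assuming screen coordinates with y increasing downward and a hypothetical short-side size; the 3:1 aspect ratio follows the rectangular example of FIG. 10(d).
```python
# Illustrative sketch of the Embodiment 3 placement rule (hypothetical sizes).

def left_edge_invalid_rect(d_px, screen_height_px, short_side_px=60):
    """Return (left, top, right, bottom) of a rectangular operation invalid area at
    the left edge of the display whose center lies at distance d from the lower
    end of the display, with the long side three times the short side."""
    long_side_px = 3 * short_side_px
    center_y = screen_height_px - d_px          # y grows downward from the top
    top = max(0, center_y - long_side_px // 2)
    bottom = min(screen_height_px, center_y + long_side_px // 2)
    return (0, top, short_side_px, bottom)
```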
  • the smartphone 1B according to this modification is configured to dynamically change the size of the operation invalid area provided on the display screen while both of the two contact sensors detect contact with the hand.
  • Specifically, the smartphone 1B defines a first area in the display screen whose position corresponds to the contact position detected by the contact sensors, regardless of whether a finger is designating coordinates in the display screen. More specifically, the smartphone 1B defines the first area by the same method as that used above to define the operation invalid area. Furthermore, when a finger designates coordinates within the first area, the smartphone 1B defines a second area corresponding to the position of those coordinates in the display screen.
  • The smartphone 1B then defines the first area as the operation invalid area when no finger designates coordinates within the first area, and defines an area including the first area and the second area as the operation invalid area when a finger does designate coordinates within the first area.
  • FIG. 11 is a flowchart showing the operation of the smartphone 1B according to this modification.
  • FIG. 12 is a diagram illustrating an operation invalid area of the smartphone 1B at the left end of the display screen.
  • In FIG. 12, the area on the display screen surrounded by a wavy line indicates the operation invalid area.
  • the smartphone 1B performs the same process as S1 in S31, and performs the same process as S2 in S32.
  • If it is determined in S32 that both contact sensors are being touched, the contact position specifying unit 1232 identifies the coordinates of the upper end and the lower end of the contact area between the contact sensor 13BR and the hand, and the coordinates of the upper end and the lower end of the contact area between the contact sensor 13BL and the hand. Then, the operation invalid area determination unit 1221B determines whether one or more touch coordinates (target touch coordinates) exist on the touch panel within a certain distance r of any of the four identified coordinates, that is, within the first area described above (S33). In the example of FIG. 12, the touch coordinates A and the touch coordinates B are located within the first area, so it is determined that one or more target touch coordinates exist. If the smartphone 1B determines in S32 that one or both of the two contact sensors are not being touched, it proceeds to S36.
  • If target touch coordinates exist, the touch event control unit 122B determines not to include any of the target touch coordinates in the arguments of the touch operation event (S34). If the smartphone 1B determines in S33 that no target touch coordinates exist, it proceeds to S36.
  • Next, the operation invalid area determination unit 1221B determines whether one or more touch coordinates (target touch coordinates) exist on the touch panel within the certain distance r of any of the coordinates that were determined in S34 not to be included in the arguments of the touch operation event (S35). The smartphone 1B returns to S34 when it determines that one or more target touch coordinates exist, and proceeds to S36 when it determines that none exist. In the example of FIG. 12, in the first pass through S35, the touch coordinates C are located within the distance r of the touch coordinates A (that is, within the second area) and the touch coordinates D are located within the distance r of the touch coordinates B (within the second area), so the process returns to S34.
  • In S36, the touch event control unit 122B determines whether there are touch coordinates (target touch coordinates) that have not been excluded from the arguments of the touch event. In the example of FIG. 12, the touch coordinates F are such coordinates, so it is determined that target touch coordinates exist.
  • When the touch event control unit 122B determines in S36 that target touch coordinates exist, it issues a touch operation event with all of the target touch coordinates as arguments to the application execution unit 121, and the application execution unit 121 performs processing according to the issued touch operation event (S37). The smartphone 1B then ends the operation according to the flowchart of FIG. 11.
  • When it is determined in S36 that no target touch coordinates exist, the touch event control unit 122B does not issue a touch event to the application execution unit 121, and the operation according to the flowchart of FIG. 11 ends. (A sketch of this exclusion loop is given below.)
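  • The loop of S33 to S35 amounts to growing a set of excluded touch points outward from the ends of the side-sensor contact areas, one distance-r hop at a time, until no further points are absorbed; whatever remains is forwarded as the touch operation event (S36/S37). A hedged sketch follows; the function name, the Euclidean distance metric, and the mapping of the sensor contact endpoints into display coordinates are assumptions, not details given in the disclosure.
```python
# Illustrative sketch of the S33-S37 exclusion loop (hypothetical names).
import math


def split_touches(touch_points, contact_endpoints, r):
    """Partition touch coordinates into excluded ones (within r of a sensor contact
    endpoint, or within r of an already excluded touch: S33 to S35) and valid ones
    that remain as the arguments of the touch operation event (S36/S37)."""

    def near(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1]) <= r

    excluded = set()
    # S33: touch points within r of the upper/lower ends of the contact areas.
    frontier = [p for p in touch_points
                if any(near(p, e) for e in contact_endpoints)]
    while frontier:                                      # S34/S35 loop
        excluded.update(frontier)
        frontier = [p for p in touch_points
                    if p not in excluded and any(near(p, q) for q in excluded)]
    valid = [p for p in touch_points if p not in excluded]
    return valid, sorted(excluded)
```
  • In the example of FIG. 12, the points corresponding to A, B, C, and D would end up in the excluded set and the point corresponding to F in the valid list.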
  • Note that a mobile terminal provided with a frame area may be used instead of the smartphone 1B. In that case, the operation invalid area determination unit of that mobile terminal, like the operation invalid area determination unit 1221B, sets the area of the display screen adjacent to the contact area between the contact sensor and the finger as the operation invalid area.
  • Here, the contact area and an area of the display screen being "adjacent" means that the shortest distance between the contact area and that area of the display screen is equal to the shortest distance between the contact area and the display screen as a whole.
  • Since the mobile terminal according to the present invention can take various forms, FIG. 15 shows, taking four types of mobile terminals as examples, the position of the operation invalid area adjacent to the contact area between the contact sensor and the finger in each form.
  • FIG. 15 schematically shows a cross section of the outer edge portion of each mobile terminal (specifically, a cross section taken along a plane parallel to the bottom surface and separated from the bottom surface by the distance d between the bottom surface of the mobile terminal and the center of the contact area between the contact sensor and the finger).
  • In each form, the area of the display screen adjacent to the contact area between the contact sensor and the finger is set as the operation invalid area.
  • the smartphone according to each embodiment may be configured to display an image in the operation invalid area, but may be configured not to display an image in the operation invalid area.
  • The portable terminal according to the present invention is not limited to a terminal whose front surface is covered with a touch panel and in which contact sensors are provided on two opposing side surfaces (the left side surface and the right side surface). That is, the mobile terminal according to the present invention may be a terminal provided with a touch panel that covers the left side surface, the front surface, and the right side surface, and that is not provided with the two contact sensors described above. In this case, the portion of the touch panel that covers the left side surface and the portion that covers the right side surface serve as the contact detection units recited in the claims.
  • In other words, the mobile terminal according to the present invention need only be provided with a contact detection unit on each of a plurality of side surfaces.
  • For example, the mobile terminal according to the present invention may be a terminal A in which contact sensors are provided on two opposing side surfaces (the upper side surface and the lower side surface), or a terminal B in which a contact sensor is provided on each of the four side surfaces (upper, lower, left, and right).
  • The touch event control unit of the terminal A may invalidate touch operations designating coordinates in operation invalid areas comprising the upper end area and the lower end area of the display screen when both of the two contact sensors detect a hand contact.
  • This allows the terminal A to prevent erroneous operations caused by the user's hands unintentionally touching the display screen when the user holds the upper and lower ends of the terminal A with both hands so that the left side surface or the right side surface of the terminal A faces upward.
  • The touch event control unit of the terminal B may invalidate touch operations designating coordinates in operation invalid areas comprising the upper end area and the lower end area of the display screen when, of the four contact sensors, the contact sensor provided on the upper side surface and the contact sensor provided on the lower side surface detect a hand contact.
  • Likewise, the touch event control unit of the terminal B may invalidate touch operations designating coordinates in operation invalid areas comprising the right end area and the left end area of the display screen when, of the four contact sensors, the contact sensor provided on the right side surface and the contact sensor provided on the left side surface detect a hand contact.
  • the smartphone control blocks (particularly, the touch event control unit and the sensor state detection processing unit) of each embodiment may be realized by a logic circuit (hardware) formed in an integrated circuit (IC chip) or the like. Alternatively, it may be realized by software using a CPU (Central Processing Unit).
  • In the latter case, the smartphone includes a CPU that executes the instructions of a program, which is software realizing each function; a ROM (Read Only Memory) or storage device (referred to as a "recording medium") on which the program and various data are recorded so as to be readable by a computer (or CPU); and a RAM (Random Access Memory) into which the program is loaded.
  • As the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used.
  • the program may be supplied to the computer via an arbitrary transmission medium (such as a communication network or a broadcast wave) that can transmit the program.
  • the present invention can also be realized in the form of a data signal embedded in a carrier wave in which the program is embodied by electronic transmission.
  • A mobile terminal (smartphone 1) according to aspect 1 of the present invention includes: a display screen (display unit 140) that receives operations designating coordinates; contact detection units (contact sensors 13L and 13R) provided on each of a plurality of side surfaces of the terminal; a determination unit (sensor state detection processing unit 123) that determines whether a plurality of the contact detection units are detecting a contact object (the contact of a hand); and an invalidation unit (touch event control unit 122) that invalidates operations designating coordinates within a specific area (operation invalid area) of the display screen when the determination unit determines that a plurality of the contact detection units are detecting a contact object.
  • With the above configuration, the portable terminal can reliably activate the erroneous operation prevention function in a specific situation in which erroneous operation is likely.
  • The portable terminal (smartphone 1A) according to aspect 2 of the present invention may, in aspect 1, further include a defining unit (operation invalid area determination unit 1221A) that defines the specific area, and the defining unit may define the specific area such that its area increases as the contact area of the contact object on the contact detection units increases.
  • With the above configuration, the specific area is defined so that its area becomes larger when the hand holding the terminal is large (that is, when the hand tends to touch the display screen unintentionally). Therefore, when the portable terminal is operated by a user with a large hand, there is the further effect that erroneous operation can be prevented more reliably than before.
  • The mobile terminal (smartphone 1B) according to aspect 3 of the present invention may, in aspect 1, further include a defining unit (operation invalid area determination unit 1221B) that defines the specific area, and the defining unit may define the specific area such that it is positioned at a position corresponding to the contact position of the contact object on the contact detection unit.
  • A portable terminal according to aspect 4 of the present invention is the portable terminal according to aspect 3, in which the defining unit defines a first area in the display screen, positioned according to the contact position, regardless of whether a finger is designating coordinates in the display screen, and defines, when a finger designates coordinates within the first area, a second area corresponding to the position of those coordinates in the display screen. The defining unit may define the first area as the specific area when no finger designates coordinates within the first area, and may define an area including the first area and the second area as the specific area when a finger designates coordinates within the first area.
  • The mobile terminal according to aspect 5 of the present invention is the mobile terminal according to any one of aspects 1 to 4, in which the specific area may be defined such that, for each of the plurality of contact detection units detecting the contact object, the end region of the display screen on the side surface on which that contact detection unit is provided is part of the specific area.
  • An operation processing method according to one aspect of the present invention is an operation processing method performed by a portable terminal provided with a display screen accepting operations designating coordinates and with contact detection units on each of a plurality of side surfaces of the terminal. The method includes a determination step of determining whether a plurality of the contact detection units are detecting a contact object, and an invalidation step of invalidating operations designating coordinates within a specific area of the display screen when it is determined in the determination step that a plurality of the contact detection units are detecting a contact object.
  • the operation processing method has the same effects as the portable terminal according to aspect 1 of the present invention.
  • The mobile terminal according to each aspect of the present invention may be realized by a computer. In this case, a program that realizes each unit of the mobile terminal on the computer by causing the computer to operate as the determination unit and the invalidation unit, and a computer-readable recording medium on which the program is recorded, also fall within the scope of the present invention.
  • the present invention can be suitably used for various portable information terminals such as smartphones.

Abstract

 The present invention realizes a portable terminal capable of reliably actuating an erroneous-operation-preventing function in a specific situation in which an erroneous operation may easily be performed. A smartphone (1) is provided with a first contact sensor (13R), a second contact sensor, and a touch-panel display unit at the right-side face, the left-side face, and the front face of the smartphone, respectively. While both of the two contact sensors detect a touch of a hand, an operation-invalid area is provided in the display unit.

Description

Portable terminal, operation processing method, program, and recording medium
 The present invention mainly relates to a portable terminal that receives operations on a display screen, and to an operation processing method using such a portable terminal.
 Various portable terminals equipped with a touch panel (for example, tablet terminals, smartphones, and the like) are widely used. Such portable terminals can accept input operations made with a finger on the display screen of the touch panel. Examples of such portable terminals include the portable terminals disclosed in Patent Documents 1 to 4.
 In a portable terminal equipped with a touch panel (particularly a terminal with a narrow frame area, or a terminal without a frame area as shown in FIG. 13), the user may unintentionally touch the edge of the touch panel with the hand holding the terminal and thereby perform an unintended erroneous operation; the portable terminals of Patent Documents 1 to 4 can prevent such erroneous operations to some extent.
 For example, in the portable terminal of Patent Document 1, when a user uses an invalid area setting application to set an area of the touch panel that the user touches as an invalid area, contact of a hand with that invalid area by a user whose identification information is the same as or similar to that user's is not treated as input (see paragraphs 0016 to 0028 and FIGS. 3 and 5 of Patent Document 1).
 Thereby, the portable terminal of Patent Document 1 can prevent erroneous operations that occur when, for example, the user holds the terminal in one hand and operates the touch panel with the thumb of that hand as shown in FIG. 13, and the middle finger, ring finger, little finger, and/or the ball of the thumb of that hand unintentionally touches the touch panel.
Japanese Published Patent Publication JP 2012-234386 A (published November 29, 2012); Japanese Published Patent Publication JP 2011-28603 A (published February 10, 2011); Japanese Published Patent Publication JP 2013-69190 A (published April 18, 2013); Japanese Published Patent Publication JP 2000-39964 A (published February 8, 2000)
 However, the portable terminal of Patent Document 1 cannot activate the above-described erroneous operation prevention function unless the user operates the invalid area setting application in advance. In addition, when a user who holds the terminal in a way that touches only a small part of the touch panel runs the invalid area setting application, the portable terminal of Patent Document 1 may fail to prevent the above-described erroneous operation for another user whose identification information is similar to that user's. This is because the size of the area of the touch panel that the hand holding the portable terminal can touch unintentionally depends on the characteristics of the hand (the size of the hand, the length of the fingers, and so on).
 The portable terminal of Patent Document 2 cannot use its erroneous operation prevention function in a specific situation in which erroneous operation is likely (specifically, while the user holds the terminal in one hand with the ball of the thumb touching the right side surface and the four fingers other than the thumb touching the left side surface, as shown in FIG. 13), because the terminal erroneously recognizes that it is being operated with both hands.
 The portable terminal of Patent Document 3 cannot activate its erroneous operation prevention function when the user unintentionally touches a corner of the touch panel while performing a continuous touch operation. Furthermore, although the portable terminal of Patent Document 4 can operate its erroneous operation prevention function in the handwriting input mode, it cannot do so in the normal display mode.
 The present invention has been made in view of the above problems, and its main object is to realize a portable terminal capable of reliably activating an erroneous operation prevention function in a specific situation in which erroneous operation is likely to occur.
 To solve the above problems, a mobile terminal according to one aspect of the present invention includes: a display screen that receives operations designating coordinates; contact detection units provided on each of a plurality of side surfaces of the terminal; a determination unit that determines whether a plurality of the contact detection units are detecting a contact object; and an invalidation unit that invalidates operations designating coordinates within a specific area of the display screen when the determination unit determines that a plurality of the contact detection units are detecting a contact object.
 To solve the above problems, an operation processing method according to one aspect of the present invention is an operation processing method performed by a mobile terminal that has a display screen receiving operations designating coordinates and contact detection units provided on each of a plurality of side surfaces of the terminal. The method includes: a determination step of determining whether a plurality of the contact detection units are detecting a contact object; and an invalidation step of invalidating operations designating coordinates within a specific area of the display screen when it is determined in the determination step that a plurality of the contact detection units are detecting a contact object.
 The portable terminal according to one aspect of the present invention has the effect that the erroneous operation prevention function can be reliably activated in a specific situation in which erroneous operation is likely.
FIG. 1 is a block diagram showing the configuration of a smartphone according to Embodiment 1 of the present invention. FIG. 2 is an external view of the smartphone according to each embodiment of the present invention. FIG. 3 is a flowchart showing the operation of the smartphone of FIG. 1. FIG. 4 is a diagram illustrating operation invalid areas of the smartphone of FIG. 1. FIG. 5 is a block diagram showing the configuration of a smartphone according to Embodiment 2 of the present invention. FIG. 6 is a flowchart showing the operation of the smartphone of FIG. 5. FIG. 7 is a diagram illustrating operation invalid areas of the smartphone of FIG. 5. FIG. 8 is a block diagram showing the configuration of a smartphone according to Embodiment 3 of the present invention. FIG. 9 is a flowchart showing the operation of the smartphone of FIG. 8. FIG. 10 is a diagram for explaining operation invalid areas of the smartphone of FIG. 8. FIG. 11 is a flowchart showing an operation according to a modification of the smartphone of FIG. 8. FIG. 12 is a diagram illustrating operation invalid areas defined when the smartphone of FIG. 8 performs the operation of the modification shown in FIG. 11. FIG. 13 is an external view of a conventional smartphone. FIG. 14 is a diagram schematically illustrating data referred to by the smartphone of FIG. 5 in order to determine the area of the operation invalid areas. FIG. 15 is a diagram schematically showing cross sections of the outer edge portions of portable terminals according to various forms of the present invention.
<Embodiment 1>
 Hereinafter, a smartphone 1, which is Embodiment 1 of a mobile terminal according to the present invention, will be described with reference to FIGS. 1 to 4. Note that a smartphone is only one example of a portable terminal; the portable terminal according to the present invention can also be implemented, for example, as a feature phone or a tablet terminal.
[Outline of smartphone 1]
 FIG. 2 is an external view of the smartphone 1. As shown in FIG. 2(a), the smartphone 1 has no frame area, and the entire front surface of the smartphone 1 is covered by a touch-panel display screen. As shown in FIGS. 2(a) and 2(b), a contact sensor 13L is provided on the left side surface L of the smartphone 1, and a contact sensor 13R is provided on the right side surface R of the smartphone 1.
 すなわち、ユーザが右側面に母指球を接触させつつ左側面に母指以外の4本の指を接触させるようにして片手でスマートフォン1を持っている間、接触センサ13L及び接触センサ13Rの両方が手の接触を検知するようになっている。そして、接触センサ13L及び接触センサ13Rの両方によって接触が検知されると、スマートフォン1は、表示画面の右端及び左端に操作無効領域(領域内の座標を指定するタッチ操作が無効化される特定の領域)を設けるようになっている。 That is, while the user holds the smartphone 1 with one hand so that the finger ball is in contact with the right side and four fingers other than the thumb are in contact with the left side, both the contact sensor 13L and the contact sensor 13R are used. Is designed to detect hand contact. Then, when contact is detected by both the contact sensor 13L and the contact sensor 13R, the smartphone 1 detects that the operation invalid area (a touch operation for designating coordinates in the area is invalidated at the right end and the left end of the display screen). Area).
Note that the mobile terminal according to the present invention may include, in addition to the display screen with operation invalid areas on its front surface, a touch pad with operation invalid areas on its back surface. In this case, when contact is detected by both of the two contact sensors, the mobile terminal may provide operation invalid areas at the left and right ends of the display screen and also at the left and right ends of the touch pad. The mobile terminal according to the present invention may also have a frame area around the display screen.
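By way of illustration only, the following is a minimal sketch, in Kotlin, of how an operation invalid area could be represented as a rectangle in display-screen coordinates and how a touch coordinate could be tested against it. The Point and InvalidArea types and the shouldSuppress helper are hypothetical names introduced here; they are not part of the embodiment itself.

    // Hypothetical sketch: an operation invalid area as an axis-aligned rectangle
    // in display-screen coordinates, with a containment test for a touch point.
    data class Point(val x: Int, val y: Int)

    data class InvalidArea(val left: Int, val top: Int, val right: Int, val bottom: Int) {
        // True when the point lies inside this rectangle (inclusive bounds).
        fun contains(p: Point): Boolean = p.x in left..right && p.y in top..bottom
    }

    // A touch is suppressed only when both side sensors report contact of the hand
    // and the touch coordinate falls inside at least one operation invalid area.
    fun shouldSuppress(
        touch: Point,
        leftSensorTouched: Boolean,
        rightSensorTouched: Boolean,
        areas: List<InvalidArea>
    ): Boolean = leftSensorTouched && rightSensorTouched && areas.any { it.contains(touch) }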
Next, the configuration of the main parts of the smartphone 1 will be described with reference to FIG. 1.
[Configuration of smartphone 1]
FIG. 1 is a block diagram showing the main configuration of the smartphone 1. As shown in FIG. 1, the smartphone 1 includes a storage unit 11, a control unit 12, two contact sensors (contact sensors 13R and 13L), and a display unit 14.
(Storage unit 11)
The storage unit 11 stores an OS (Operating System) program, various application programs, various control programs, various data, and the like.
(Control unit 12)
The control unit 12 is a CPU that centrally controls the smartphone 1 as a whole. By reading the corresponding programs from the storage unit 11, the control unit 12 also functions as an application execution unit 121, a touch event control unit 122, and a sensor state detection processing unit 123.
(Application execution unit 121)
The application execution unit 121 performs processing corresponding to an application (hereinafter, "app") read out by the control unit 12. For example, the application execution unit 121 displays a user interface screen (UI screen) of the app on the display unit 14.
(Touch event control unit 122)
The touch event control unit 122 issues, to the foreground app, a touch event whose argument is the contact coordinates on the display screen.
In this connection, when both of the two contact sensors detect the contact of a hand, the touch event control unit 122 invalidates touch operations that designate coordinates within an operation invalid area. That is, when both of the two contact sensors detect the contact of a hand, the touch event control unit 122 does not issue a touch event even if it detects a touch operation that designates only coordinates within an operation invalid area.
The touch event control unit 122 also includes an operation invalid area specifying unit 1221.
(Operation invalid area specifying unit 1221)
When both of the two contact sensors detect the contact of a hand, the operation invalid area specifying unit 1221 reads out data indicating the position and size of the operation invalid areas and thereby specifies the operation invalid areas.
(Sensor state detection processing unit 123)
The sensor state detection processing unit 123 determines whether or not both of the two contact sensors are in a state of detecting the contact of a hand (contact object).
(Contact sensor 13R and contact sensor 13L)
When the contact sensor 13R detects the contact of a hand, it notifies the sensor state detection processing unit 123 that the contact of a hand has been detected. Similarly, when the contact sensor 13L detects the contact of a hand, it notifies the sensor state detection processing unit 123 that the contact of a hand has been detected.
(Display unit 14)
The display unit 14 is a touch-panel display that displays, for example, the user interface screens of apps.
[Operation of smartphone 1]
Next, the operation performed by the smartphone 1 when the user touches one point on the touch panel (display screen) will be described with reference to FIGS. 3 and 4. FIG. 3 is a flowchart showing this operation, and FIG. 4 is a diagram illustrating the operation invalid areas of the smartphone 1.
First, as shown in FIG. 3, the smartphone 1 checks the states of the two contact sensors (S1). Specifically, the touch event control unit 122 specifies the coordinates on the touch panel touched by the hand (touch coordinates) and queries the sensor state detection processing unit 123 as to whether both of the two contact sensors are being touched.
In response, the sensor state detection processing unit 123 determines whether or not it has received, from both of the two contact sensors, the notification that the contact of a hand has been detected (S2).
When it determines in S2 that both of the two contact sensors are being touched (that is, that the above notification has been received from both of them), the sensor state detection processing unit 123 notifies the touch event control unit 122 of the determination result. When the smartphone 1 determines in S2 that one or both of the two contact sensors are not being touched (that is, that the above notification has not been received from one or both of them), it proceeds to S6 described later.
Upon receiving the notification, the operation invalid area specifying unit 1221 of the touch event control unit 122 reads out data indicating the position and size of the operation invalid areas from the storage unit 11. For example, the operation invalid area specifying unit 1221 reads out data indicating the coordinates of the four corners of a rectangular operation invalid area near the left end of the touch panel (see FIG. 4) and the coordinates of the four corners of a rectangular operation invalid area near the right end of the touch panel (see FIG. 4) (S3).
Based on the read data, the touch event control unit 122 then determines whether or not the touch coordinates are located within an operation invalid area (S4).
When it determines that the touch coordinates are located within an operation invalid area, the touch event control unit 122 displays information indicating that the touch operation is invalid on the display unit 14 (S5). That is, the touch event control unit 122 ends the operation according to the flowchart of FIG. 3 without issuing a touch event having the touch coordinates as an argument to the application execution unit 121.
On the other hand, when the smartphone 1 determines that the touch coordinates are not located within an operation invalid area, it proceeds to S6.
In S6, the smartphone 1 treats the touch operation as valid and operates according to the touch event. Specifically, the touch event control unit 122 issues a touch event having the touch coordinates as an argument to the application execution unit 121, and the application execution unit 121 executes processing corresponding to the touch event. After S6, the smartphone 1 ends the operation according to the flowchart of FIG. 3.
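The flow of FIG. 3 (S1 to S6) could be sketched roughly as follows, reusing the hypothetical Point and InvalidArea helpers from the earlier sketch. The function parameters stand in for the roles of the sensor state detection processing unit 123, the storage unit 11, the display unit 14, and the application execution unit 121; they are not the actual interfaces of those units.

    // Hypothetical sketch of the S1-S6 flow for a single touch coordinate.
    fun handleTouch(
        touch: Point,
        bothSensorsTouched: () -> Boolean,          // S1/S2: query the sensor state
        loadInvalidAreas: () -> List<InvalidArea>,  // S3: read area position/size data
        showInvalidNotice: () -> Unit,              // S5: show that the touch was ignored
        dispatchTouchEvent: (Point) -> Unit         // S6: issue the touch event to the app
    ) {
        if (bothSensorsTouched()) {
            val areas = loadInvalidAreas()
            if (areas.any { it.contains(touch) }) { // S4: touch inside an invalid area?
                showInvalidNotice()                 // S5: no touch event is issued
                return
            }
        }
        dispatchTouchEvent(touch)                   // S6: the touch operation is valid
    }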
(Advantages of smartphone 1)
As described above, the smartphone 1 includes the display screen (display unit 14) that accepts operations designating coordinates, the contact sensors provided on the respective side surfaces of the terminal (contact sensor 13R and contact sensor 13L), the sensor state detection processing unit 123 that determines whether or not both of the two contact sensors are detecting the contact of a hand, and the touch event control unit 122 that invalidates operations on the operation invalid areas in the display screen when it is determined that both of the two contact sensors are detecting the contact of a hand.
With the above configuration, the smartphone 1 does not activate the erroneous-operation prevention function when, for example, the user simply rests the smartphone 1 on the palm of one hand (that is, when the user is not gripping the smartphone 1 and one or both of the two contact detection units detect no contact of a hand). In other words, the smartphone 1 does not activate the erroneous-operation prevention function in situations where erroneous operations are unlikely. On the other hand, when the smartphone 1 is being operated with the thumb of the hand gripping it and a part of that hand (such as the ball of the thumb or the little finger) unintentionally touches an operation invalid area during the operation, the smartphone 1 invalidates touch operations designating coordinates within the operation invalid area. That is, the smartphone 1 activates the erroneous-operation prevention function in the specific situations where erroneous operations are likely. Moreover, the erroneous-operation prevention function works reliably for any user of the smartphone 1, including, for example, a user who does not know that the smartphone 1 has such a function.
Accordingly, the smartphone 1 can refrain from activating the erroneous-operation prevention function in situations where erroneous operations are unlikely, while reliably activating it in the specific situations where erroneous operations are likely.
(Additional note 1 of Embodiment 1)
In Embodiment 1, when the user touches one point on the touch panel, the smartphone 1 determines in S2 whether or not the notification that the contact of a hand has been detected has been received from both of the two contact sensors. However, the present invention is not limited to this.
That is, regardless of whether the user is touching the touch panel, the smartphone 1 may periodically determine whether or not the notification that the contact of a hand has been detected has been received from both of the two contact sensors, and may cache information indicating the determination result each time. Then, when the user touches one point on the touch panel, the smartphone 1 may skip the processing of S1 and S2 and execute the processing from S3 onward. In this case, the smartphone 1 may first decide whether to proceed to S3 or to S6 based on the most recently cached information, and then execute the processing of S3 or S6 based on that decision.
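A minimal sketch of this polling-and-caching variant follows, assuming a fixed polling interval of 100 ms (a value chosen only for illustration) and a hypothetical bothSensorsTouched query standing in for the sensor notifications.

    import java.util.concurrent.Executors
    import java.util.concurrent.TimeUnit
    import java.util.concurrent.atomic.AtomicBoolean

    // Hypothetical sketch: poll the two contact sensors periodically and cache the
    // result, so that the touch path only reads the cached flag (skipping S1 and S2).
    class GripStateCache(private val bothSensorsTouched: () -> Boolean) {
        private val cached = AtomicBoolean(false)
        private val scheduler = Executors.newSingleThreadScheduledExecutor()

        fun start(periodMillis: Long = 100L) {
            scheduler.scheduleAtFixedRate(
                { cached.set(bothSensorsTouched()) }, 0L, periodMillis, TimeUnit.MILLISECONDS
            )
        }

        // Decides whether the touch handling proceeds to S3 or directly to S6.
        fun isGripped(): Boolean = cached.get()

        fun stop() {
            scheduler.shutdown()
        }
    }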
(Additional note 2 of Embodiment 1)
S5, in which information indicating that the touch operation is invalid is displayed, may be omitted.
 <実施形態2>
 以下、図2、及び、図5~図7を参照して本発明に係る携帯端末の別の一実施形態に係るスマートフォンについて説明する。なお、説明の便宜上、前記実施形態にて説明した部材と全く同じ機能又は略同じ機能を有する部材については、同じ符号を付記し、基本的にその説明を省略する。
<Embodiment 2>
Hereinafter, a smartphone according to another embodiment of the mobile terminal according to the present invention will be described with reference to FIG. 2 and FIGS. For convenience of explanation, members having exactly the same or substantially the same functions as those described in the above embodiment are denoted by the same reference numerals and description thereof is basically omitted.
 〔スマートフォン1Aの概要〕
 本実施形態に係るスマートフォン1Aの外観は、図2に示すように、スマートフォン1の外観と同じである。
[Outline of smartphone 1A]
The appearance of the smartphone 1A according to the present embodiment is the same as the appearance of the smartphone 1 as shown in FIG.
 スマートフォン1Aは、ユーザが右側面に母指球を接触させつつ左側面に母指以外の4本の指を接触させるようにして片手でスマートフォン1Aを持っている間、接触センサ13AL及び接触センサ13ARの両方が手の接触及び接触面積を検知するようになっている。そして、2つの接触センサの両方によって接触が検知されると、スマートフォン1Aは、接触センサ13ALが検知した接触面積と接触センサ13ARが検知した接触面積との和に比例した広さの操作無効領域を表示画面の左端および右端に設けるようになっている。 While the smartphone 1A holds the smartphone 1A with one hand so that the user touches the right side with the thumb ball and the left side with four fingers other than the thumb, the contact sensor 13AL and the contact sensor 13AR Both are adapted to detect hand contact and contact area. When contact is detected by both of the two contact sensors, the smartphone 1A displays an operation invalid area having a width proportional to the sum of the contact area detected by the contact sensor 13AL and the contact area detected by the contact sensor 13AR. They are provided at the left and right ends of the display screen.
 次に、スマートフォン1Aの要部構成について図5を参照して説明する。 Next, the configuration of the main part of the smartphone 1A will be described with reference to FIG.
 〔スマートフォン1Aの構成〕
 図5はスマートフォン1Aの要部構成を示すブロック図である。図5に示すように、スマートフォン1Aは、記憶部11、制御部12A、2つの接触センサ(接触センサ13AR、13AL)、及び、表示部14を備えている。
[Configuration of Smartphone 1A]
FIG. 5 is a block diagram showing a main configuration of the smartphone 1A. As illustrated in FIG. 5, the smartphone 1A includes a storage unit 11, a control unit 12A, two contact sensors (contact sensors 13AR and 13AL), and a display unit 14.
 (制御部12A)
 制御部12Aはスマートフォン1A全体を統括して制御するCPUである。制御部12Aは、対応するプログラムを記憶部11から読み出すことで、アプリ実行部121、タッチイベント制御部122A、及び、センサ状態検出処理部123Aとしても機能する。
(Control unit 12A)
The control unit 12A is a CPU that controls the entire smartphone 1A. The control unit 12A also functions as the application execution unit 121, the touch event control unit 122A, and the sensor state detection processing unit 123A by reading the corresponding program from the storage unit 11.
 (タッチイベント制御部122A)
 タッチイベント制御部122と同様に、タッチイベント制御部122Aは、表示画面上の接触座標を引数とするタッチイベントを発行するが、操作無効領域内の座標のみを指定するタッチ操作を検出してもタッチイベントを発行しないようになっている。タッチイベント制御部122Aは、操作無効領域決定部1221Aを備えている。
(Touch event control unit 122A)
Similar to the touch event control unit 122, the touch event control unit 122A issues a touch event with the contact coordinates on the display screen as an argument, but even if it detects a touch operation that specifies only the coordinates in the operation invalid area. The touch event is not issued. The touch event control unit 122A includes an operation invalid area determination unit 1221A.
 (操作無効領域決定部1221A)
 2つの接触センサの両方が手の接触及び接触面積を検知すると、操作無効領域決定部1221Aは、操作無効領域の位置(表示画面の左端及び右端)を示すデータを読み出す。操作無効領域決定部1221Aは、接触センサ13ALが検知した接触面積と接触センサ13ARが検知した接触面積との和に比例した広さの操作無効領域を表示画面の左端および右端に設ける事を決定する。
(Operation invalid area determination unit 1221A)
When both of the two contact sensors detect the hand contact and the contact area, the operation invalid region determination unit 1221A reads data indicating the position of the operation invalid region (the left end and the right end of the display screen). The invalid operation area determination unit 1221A determines to provide operation invalid areas having a width proportional to the sum of the contact area detected by the contact sensor 13AL and the contact area detected by the contact sensor 13AR at the left end and the right end of the display screen. .
 (センサ状態検出処理部123A)
 センサ状態検出処理部123と同様に、センサ状態検出処理部123Aは、2つの接触センサの両方が手の接触を検知している状態にあるか否かを判定する。センサ状態検出処理部123Aは、接触面積特定部1231を備えている。
(Sensor state detection processing unit 123A)
Similar to the sensor state detection processing unit 123, the sensor state detection processing unit 123A determines whether or not both of the two contact sensors are detecting a hand contact. The sensor state detection processing unit 123A includes a contact area specifying unit 1231.
 (接触面積特定部1231)
 接触面積特定部1231は、2つの接触センサの両方が手の接触を検知している状態にあると判定した場合に、接触センサ13ARと手との接触面積、及び、接触センサ13ALと手との接触面積を特定する。
(Contact area identification part 1231)
When the contact area specifying unit 1231 determines that both of the two contact sensors are detecting the contact of the hand, the contact area between the contact sensor 13AR and the hand and the contact sensor 13AL and the hand Identify the contact area.
 (接触センサ13AR及び接触センサ13AL)
 接触センサ13ARは、手の接触を検知した場合、手の接触を検知した旨と接触面積とを接触面積特定部1231に通知する。同様に、接触センサ13ALは、手の接触を検知した場合、手の接触を検知した旨と接触面積とを接触面積特定部1231に通知する。
(Contact sensor 13AR and contact sensor 13AL)
When the contact sensor 13AR detects the contact of the hand, the contact sensor 13AR notifies the contact area specifying unit 1231 that the contact of the hand is detected and the contact area. Similarly, when the contact sensor 13AL detects a hand contact, the contact sensor 13AL notifies the contact area specifying unit 1231 that the hand contact has been detected and the contact area.
 〔スマートフォン1Aの動作〕
 次に、図6及び図7を参照しながら、ユーザがタッチパネル(表示画面)上の1点に手を接触させた場合のスマートフォン1Aの動作を説明する。図6は当該動作を示すフローチャートであり、図7は、スマートフォン1Aの操作無効領域を例示した図である。
[Operation of smartphone 1A]
Next, the operation of the smartphone 1 </ b> A when the user touches one point on the touch panel (display screen) will be described with reference to FIGS. 6 and 7. FIG. 6 is a flowchart showing the operation, and FIG. 7 is a diagram illustrating an operation invalid area of the smartphone 1A.
 図6に示すように、まず、スマートフォン1Aは、S11にてS1と同様の処理を行い、S12にてS2と同様の処理を行う。 As shown in FIG. 6, first, the smartphone 1A performs the same processing as S1 in S11, and performs the same processing as S2 in S12.
When it determines in S12 that both of the two contact sensors are being touched, the contact area specifying unit 1231 specifies the contact area between the contact sensor 13AR and the hand and the contact area between the contact sensor 13AL and the hand (S13), and notifies the touch event control unit 122A of each contact area. When the smartphone 1A determines in S12 that one or both of the two contact sensors are not being touched, it performs the processing of S17 (the same processing as S6).
 上記通知を受けたタッチイベント制御部122Aの操作無効領域決定部1221Aは、図7に示すように、接触面積の和が大きいほど表示画面の右端及び左端の操作無効領域が大きくなるように操作無効領域の大きさを決定する(S14)。 Upon receiving the notification, the operation invalid area determination unit 1221A of the touch event control unit 122A invalidates the operation so that the operation invalid areas on the right end and the left end of the display screen increase as the sum of the contact areas increases, as shown in FIG. The size of the area is determined (S14).
 スマートフォン1Aは、S14の後にS15以降の処理を行うが、S15~S17は、それぞれ、S4~S6と同じ工程であるので、その説明は省略する。 The smartphone 1A performs the processing from S15 onward after S14, but S15 to S17 are the same steps as S4 to S6, respectively, and thus description thereof is omitted.
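The size determination of S14 could be sketched as follows, assuming that the operation invalid areas are full-height strips along the left and right edges, that the strip width grows linearly with the sum of the two detected contact areas, and that it is clamped to half the screen width. The proportionality constant and the reuse of the hypothetical InvalidArea type from the earlier sketch are assumptions for illustration.

    // Hypothetical sketch of S14: strip width proportional to the summed contact area.
    fun edgeInvalidAreas(
        screenWidth: Int,
        screenHeight: Int,
        leftContactArea: Double,          // contact area reported by sensor 13AL (arbitrary units)
        rightContactArea: Double,         // contact area reported by sensor 13AR (arbitrary units)
        pixelsPerAreaUnit: Double = 0.5   // assumed proportionality constant
    ): List<InvalidArea> {
        val width = ((leftContactArea + rightContactArea) * pixelsPerAreaUnit)
            .toInt()
            .coerceAtMost(screenWidth / 2)   // keep the two strips from overlapping
        return listOf(
            InvalidArea(0, 0, width, screenHeight),                        // left-edge strip
            InvalidArea(screenWidth - width, 0, screenWidth, screenHeight) // right-edge strip
        )
    }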
 (実施形態2の付記事項)
 スマートフォン1Aは、表示画面の左端に接触センサ13ALが検知した接触面積に応じた広さの操作無効領域を設け、表示画面の右端に接触センサ13ARが検知した接触面積に応じた広さの操作無効領域を設けるように構成されていてもよい。
(Additional notes of embodiment 2)
The smartphone 1A provides an operation invalid area having a size corresponding to the contact area detected by the contact sensor 13AL at the left end of the display screen, and an operation invalidity corresponding to the contact area detected by the contact sensor 13AR at the right end of the display screen. You may be comprised so that an area | region may be provided.
For example, the smartphone 1A may be configured to provide, at the left end of the display screen, an operation invalid area whose size is proportional to the contact area detected by the contact sensor 13AL, and to provide, at the right end of the display screen, an operation invalid area whose size is proportional to the contact area detected by the contact sensor 13AR. Alternatively, the smartphone 1A may identify, from among predetermined N classes of contact-area size (for example, the three classes in (a) of FIG. 14), the class to which the contact area detected by the contact sensor 13AL belongs, and provide an operation invalid area of a size corresponding to the identified class at the left end of the display screen. Similarly, the smartphone 1A may identify, from among predetermined M classes of contact-area size (where M may or may not equal N; for example, the two classes in (b) or (c) of FIG. 14), the class to which the contact area detected by the contact sensor 13AR belongs, and provide an operation invalid area of a size corresponding to the identified class at the right end of the display screen.
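The class-based variant could be implemented as a small threshold table of the kind schematically shown in FIG. 14. The thresholds and strip widths below are invented illustrative values, not the values of FIG. 14.

    // Hypothetical sketch: map a detected contact area to one of N classes
    // (a FIG. 14 style table), each class carrying a fixed invalid-area width.
    data class AreaClass(val maxArea: Double, val stripWidthPx: Int)

    // Illustrative 3-class table for one side of the screen (values are made up).
    val threeClassTable = listOf(
        AreaClass(maxArea = 200.0, stripWidthPx = 40),
        AreaClass(maxArea = 400.0, stripWidthPx = 80),
        AreaClass(maxArea = Double.MAX_VALUE, stripWidthPx = 120)
    )

    // Returns the strip width of the first class whose upper bound covers the area.
    fun stripWidthFor(contactArea: Double, classes: List<AreaClass>): Int =
        classes.first { contactArea <= it.maxArea }.stripWidthPx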
 <実施形態3>
 以下、図2、及び、図8~図10を参照して本発明に係る携帯端末の更に別の一実施形態に係るスマートフォンについて説明する。なお、説明の便宜上、前記実施形態にて説明した部材と全く同じ機能又は略同じ機能を有する部材については、同じ符号を付記し、基本的にその説明を省略する。
<Embodiment 3>
Hereinafter, a smartphone according to still another embodiment of the mobile terminal according to the present invention will be described with reference to FIG. 2 and FIGS. For convenience of explanation, members having exactly the same or substantially the same functions as those described in the above embodiment are denoted by the same reference numerals and description thereof is basically omitted.
 〔スマートフォン1Bの概要〕
 本実施形態に係るスマートフォン1Bの外観は、図2に示すように、スマートフォン1及びスマートフォン1Aの外観と同じである。
[Outline of smartphone 1B]
The appearance of the smartphone 1B according to the present embodiment is the same as the appearance of the smartphone 1 and the smartphone 1A as shown in FIG.
 スマートフォン1Bは、ユーザが右側面に母指球を接触させつつ左側面に母指以外の4本の指を接触させるようにして片手でスマートフォン1Bを持っている間、接触センサ13BL及び接触センサ13BRの両方が手の接触及び接触位置を検知するようになっている。そして、2つの接触センサの両方によって接触が検知されると、スマートフォン1Bは、操作無効領域を、表示画面の左端における、接触センサ13BLが検知した接触位置に応じた位置(具体的には、接触位置に隣接する位置)に設ける。これと同時に、スマートフォン1Bは、操作無効領域を、表示画面の右端における、接触センサ13BRが検知した接触位置に応じた位置(具体的には、接触位置に隣接する位置)に設ける。 While the smartphone 1B holds the smartphone 1B with one hand so that the user touches the thumb ball on the right side and four fingers other than the thumb on the left side, the contact sensor 13BL and the contact sensor 13BR Both are adapted to detect hand contact and contact position. When contact is detected by both of the two contact sensors, the smartphone 1B sets the operation invalid area at a position corresponding to the contact position detected by the contact sensor 13BL at the left end of the display screen (specifically, the contact (Position adjacent to the position). At the same time, the smartphone 1B provides the operation invalid area at a position (specifically, a position adjacent to the contact position) according to the contact position detected by the contact sensor 13BR at the right end of the display screen.
 なお、接触センサ13BL及び接触センサ13BRは、接触検出面において手が接触している領域の中心座標を接触位置として検知してもよい。 It should be noted that the contact sensor 13BL and the contact sensor 13BR may detect the center coordinates of the area where the hand is in contact on the contact detection surface as the contact position.
 次に、スマートフォン1Bの要部構成について図8を参照して説明する。 Next, the configuration of the main part of the smartphone 1B will be described with reference to FIG.
 〔スマートフォン1Bの構成〕
 図8はスマートフォン1Bの要部構成を示すブロック図である。図8に示すように、スマートフォン1Bは、記憶部11、制御部12B、2つの接触センサ(接触センサ13BR、13BL)、及び、表示部14を備えている。
[Configuration of Smartphone 1B]
FIG. 8 is a block diagram showing a main configuration of the smartphone 1B. As illustrated in FIG. 8, the smartphone 1B includes a storage unit 11, a control unit 12B, two contact sensors (contact sensors 13BR and 13BL), and a display unit 14.
 (制御部12B)
 制御部12Bはスマートフォン1B全体を統括して制御するCPUである。制御部12Bは、対応するプログラムを記憶部11から読み出すことで、アプリ実行部121、タッチイベント制御部122B、及び、センサ状態検出処理部123Bとしても機能する。
(Control unit 12B)
The control unit 12B is a CPU that controls the overall smartphone 1B. The control unit 12B also functions as the application execution unit 121, the touch event control unit 122B, and the sensor state detection processing unit 123B by reading the corresponding program from the storage unit 11.
 (タッチイベント制御部122B)
 タッチイベント制御部122と同様に、タッチイベント制御部122Bは、表示画面上の接触座標を引数とするタッチイベントを発行するが、操作無効領域内の座標のみを指定するタッチ操作を検出してもタッチイベントを発行しないようになっている。タッチイベント制御部122Bは、操作無効領域決定部1221Bを備えている。
(Touch event control unit 122B)
Similar to the touch event control unit 122, the touch event control unit 122B issues a touch event with the contact coordinates on the display screen as an argument, but even if it detects a touch operation that specifies only the coordinates in the operation invalid area. The touch event is not issued. The touch event control unit 122B includes an operation invalid area determination unit 1221B.
 (操作無効領域決定部1221B)
 2つの接触センサの両方が手の接触及び接触位置を検知すると、操作無効領域決定部1221Bは、表示画面の左端に設ける操作無効領域の位置を接触センサ13BLが検知した接触位置に基づいて決定し、表示画面の右端に設ける操作無効領域の位置を接触センサ13BRが検知した接触位置に基づいて決定する。
(Operation invalid area determination unit 1221B)
When both of the two contact sensors detect the contact and the contact position of the hand, the operation invalid area determination unit 1221B determines the position of the operation invalid area provided at the left end of the display screen based on the contact position detected by the contact sensor 13BL. The position of the operation invalid area provided at the right end of the display screen is determined based on the contact position detected by the contact sensor 13BR.
 (センサ状態検出処理部123B)
 センサ状態検出処理部123と同様に、センサ状態検出処理部123Bは、2つの接触センサの両方が手の接触を検知している状態にあるか否かを判定する。センサ状態検出処理部123Bは、接触位置特定部1232を備えている。
(Sensor state detection processing unit 123B)
Similar to the sensor state detection processing unit 123, the sensor state detection processing unit 123B determines whether or not both of the two contact sensors are detecting a hand contact. The sensor state detection processing unit 123B includes a contact position specifying unit 1232.
 (接触位置特定部1232)
 接触位置特定部1232は、2つの接触センサの両方が手の接触を検知している状態にあると判定した場合に、接触センサ13BRと手との接触位置、及び、接触センサ13BLと手との接触位置を特定する。
(Contact position specifying unit 1232)
When the contact position specifying unit 1232 determines that both of the two contact sensors are detecting the contact of the hand, the contact position of the contact sensor 13BR and the hand, and the contact sensor 13BL and the hand Identify the contact location.
(Contact sensor 13BR and contact sensor 13BL)
When the contact sensor 13BR detects the contact of a hand, it notifies the contact position specifying unit 1232 that the contact of a hand has been detected, together with the contact position. Similarly, when the contact sensor 13BL detects the contact of a hand, it notifies the contact position specifying unit 1232 that the contact of a hand has been detected, together with the contact position.
 〔スマートフォン1Bの動作〕
 次に、図9及び図10を参照しながら、ユーザがタッチパネル(表示画面)上の1点に手を接触させた場合のスマートフォン1Bの動作を説明する。図9は当該動作を示すフローチャートであり、図10はスマートフォン1Bの操作無効領域を例示した図である。
[Operation of smartphone 1B]
Next, the operation of the smartphone 1 </ b> B when the user touches one point on the touch panel (display screen) will be described with reference to FIGS. 9 and 10. FIG. 9 is a flowchart showing the operation, and FIG. 10 is a diagram illustrating an operation invalid area of the smartphone 1B.
 図9に示すように、まず、スマートフォン1Bは、S21にてS1と同様の処理を行い、S22にてS2と同様の処理を行う。 As shown in FIG. 9, first, the smartphone 1B performs the same process as S1 in S21, and performs the same process as S2 in S22.
When it determines in S22 that both of the two contact sensors are being touched, the contact position specifying unit 1232 specifies the contact position between the contact sensor 13BR and the hand and the contact position between the contact sensor 13BL and the hand (for example, the center coordinates described above) (S23), and notifies the touch event control unit 122B of each contact position. When the smartphone 1B determines in S22 that one or both of the two contact sensors are not being touched, it performs the processing of S27 (the same processing as S6).
Upon receiving the notification, the operation invalid area determination unit 1221B of the touch event control unit 122B determines the position of the operation invalid area at the left end of the display screen based on the contact position detected by the contact sensor 13BL, and determines the position of the operation invalid area at the right end of the display screen based on the contact position detected by the contact sensor 13BR (S24). For example, when the contact position detected by the contact sensor 13BL is at a distance d from the bottom surface of the smartphone 1B, the operation invalid area determination unit 1221B determines to provide the operation invalid area at a position, at the left end of the display screen, such that the distance between the center coordinates of the operation invalid area and the lower end of the display screen is d, as shown in (a) to (d) of FIG. 10. That is, the operation invalid area determination unit 1221B determines that the area in the display screen adjacent to the "contact region with the finger on the contact sensor 13BL" is to be an operation invalid area. Here, in the present embodiment, a contact region and an area in the display screen being "adjacent" means that the contact region and the "area in the display screen" are in contact with each other. The operation invalid area determination unit 1221B determines the position of the operation invalid area at the right end of the display screen in the same manner.
 スマートフォン1Bは、S24の後にS25以降の処理を行うが、S25~S27は、それぞれ、S4~S6と同じ工程であるので、その説明は省略する。 The smartphone 1B performs the processing from S25 onward after S24, but S25 to S27 are the same steps as S4 to S6, respectively, and the description thereof is omitted.
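The position determination of S24 could be sketched as follows, assuming screen coordinates with the origin at the top-left corner, an operation invalid area of fixed (illustrative) size, and a contact position reported as a distance d, in pixels, from the bottom of the screen; the InvalidArea type is the hypothetical one from the earlier sketch.

    // Hypothetical sketch of S24: place a fixed-size invalid area at a screen edge so
    // that its vertical center lies at distance d from the bottom edge of the screen.
    fun invalidAreaNearContact(
        screenWidth: Int,
        screenHeight: Int,
        distanceFromBottomPx: Int,   // d: contact position reported by the side sensor
        leftEdge: Boolean,           // true for the left edge, false for the right edge
        halfHeightPx: Int = 60,      // assumed half-height of the invalid area
        widthPx: Int = 40            // assumed width of the invalid area
    ): InvalidArea {
        val centerY = screenHeight - distanceFromBottomPx   // origin at the top-left corner
        val top = (centerY - halfHeightPx).coerceAtLeast(0)
        val bottom = (centerY + halfHeightPx).coerceAtMost(screenHeight)
        return if (leftEdge) {
            InvalidArea(0, top, widthPx, bottom)
        } else {
            InvalidArea(screenWidth - widthPx, top, screenWidth, bottom)
        }
    }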
 なお、操作無効領域は、図10の(c)に示すような半楕円の領域であってもよいし、図10の(d)に示すような矩形の領域であってもよいし、その他の形状の領域(例えば、円形の領域)であってもよい。また、図10の(c)では、半楕円の長径の長さが短径の長さの3倍になっているが、半楕円の長径の長さは短径の長さの3倍未満(例えば2倍)であってもよいし、長径の長さは短径の長さの3倍より大きくてもよい(例えば4倍であってもよい)。また、図10の(d)では、矩形の長辺の長さが短辺の長さの3倍になっているが、矩形の長辺の長さは短辺の長さの3倍未満(例えば2倍)であってもよいし、長辺の長さは短辺の長さの3倍より大きくてもよい(例えば4倍であってもよい)。 The operation invalid area may be a semi-elliptical area as shown in (c) of FIG. 10, a rectangular area as shown in (d) of FIG. It may be a shape area (for example, a circular area). In FIG. 10C, the length of the major axis of the semi-ellipse is three times the length of the minor axis, but the length of the major axis of the semi-ellipse is less than three times the length of the minor axis ( For example, the length of the major axis may be larger than three times the length of the minor axis (for example, it may be four times). In FIG. 10D, the length of the long side of the rectangle is three times the length of the short side, but the length of the long side of the rectangle is less than three times the length of the short side ( For example, the length of the long side may be larger than three times the length of the short side (for example, it may be four times).
 <実施形態3の変形例>
 以下、図2、図11及び図12を参照しながらスマートフォン1Bの一変形例について説明する。本変形例に係るスマートフォン1Bは、2つの接触センサの両方が手の接触を検知している間、表示画面に設ける操作無効領域の広さを動的に変化させるようになっている。
<Modification of Embodiment 3>
Hereinafter, a modification of the smartphone 1 </ b> B will be described with reference to FIGS. 2, 11, and 12. The smartphone 1B according to this modification is configured to dynamically change the size of the operation invalid area provided on the display screen while both of the two contact sensors detect contact with the hand.
 具体的には、スマートフォン1Bは、表示画面内の座標を指し示す指の有無に関わりなく接触センサが検知した接触位置に応じた位置となるように表示画面内の第1の領域を規定する。より具体的には、スマートフォン1Bは、この第1の領域を、図10の操作無効領域を規定する方法と同じ方法を用いて規定する。さらに、スマートフォン1Bは、指が第1の領域内の座標を指し示している場合には表示画面内に当該座標の位置に応じた第2の領域を規定する。 Specifically, the smartphone 1B defines the first region in the display screen so that the position corresponds to the contact position detected by the contact sensor regardless of the presence or absence of a finger indicating the coordinates in the display screen. More specifically, the smartphone 1B defines the first area by using the same method as the method for defining the operation invalid area in FIG. Furthermore, when the finger points to the coordinates in the first area, the smartphone 1B defines a second area corresponding to the position of the coordinates in the display screen.
 そして、スマートフォン1Bは、指が第1の領域内の座標を指し示していない場合には第1の領域を操作無効領域として規定し、指が第1の領域内の座標を指し示している場合には第1の領域と第2の領域とを含む領域を操作無効領域として規定する。 Then, the smartphone 1B defines the first area as an operation invalid area when the finger does not point to the coordinates in the first area, and when the finger points to the coordinates within the first area. An area including the first area and the second area is defined as an operation invalid area.
 本変形例に係るスマートフォン1Bの構成は既に説明した図8に示した通りの構成であるので、本変形例に係るスマートフォン1Bの構成については改めて触れない。以下、本変形例に係るスマートフォン1Bの動作について図11および図12を参照しながら説明する。 Since the configuration of the smartphone 1B according to the present modification is the configuration shown in FIG. 8 already described, the configuration of the smartphone 1B according to the present modification will not be touched again. Hereinafter, the operation of the smartphone 1 </ b> B according to this modification will be described with reference to FIGS. 11 and 12.
 〔変形例に係るスマートフォン1Bの動作〕
 図11は当該動作を示すフローチャートであり、図12は、表示画面の左端における、スマートフォン1Bの操作無効領域を例示した図である。なお、図12では、表示画面上の領域であって波線の円周に囲まれた領域が操作無効領域を示している。
[Operation of Smartphone 1B according to Modification]
FIG. 11 is a flowchart showing the operation, and FIG. 12 is a diagram illustrating an operation invalid area of the smartphone 1B at the left end of the display screen. In FIG. 12, an area on the display screen and surrounded by a wavy line indicates an operation invalid area.
 図11に示すように、まず、スマートフォン1Bは、S31にてS1と同様の処理を行い、S32にてS2と同様の処理を行う。 As shown in FIG. 11, first, the smartphone 1B performs the same process as S1 in S31, and performs the same process as S2 in S32.
When it determines in S32 that both of the two contact sensors are being touched, the contact position specifying unit 1232 specifies the coordinates of the upper end and the lower end of the contact region between the contact sensor 13BR and the hand, and the coordinates of the upper end and the lower end of the contact region between the contact sensor 13BL and the hand. The operation invalid area determination unit 1221B then determines whether or not there are one or more touch coordinates on the touch panel (target touch coordinates) located within a fixed distance r of any of the four specified coordinates, that is, within the first region described above (S33). In the example of FIG. 12, the touch coordinates A and the touch coordinates B are located within the first region, so it is determined that one or more target touch coordinates exist. When the smartphone 1B determines in S32 that one or both of the two contact sensors are not being touched, it proceeds to S36.
 S33または後述のS35にて対象タッチ座標が1つ以上存在すると判定された場合、タッチイベント制御部122Bは、全ての対象タッチ座標をタッチ操作イベントの引数に含めないことを決定する(S34)。なお、スマートフォン1Bは、S33において対象タッチ座標が存在しないと判定した場合、S36に進む。 When it is determined in S33 or S35 described later that one or more target touch coordinates exist, the touch event control unit 122B determines not to include all target touch coordinates in the argument of the touch operation event (S34). If the smartphone 1B determines that the target touch coordinates do not exist in S33, the process proceeds to S36.
 その後、操作無効領域決定部1221Bは、S34にてタッチ操作イベントの引数に含めないことを決定した1つ以上の座標のうち何れかの座標から一定距離r内に位置するような、タッチパネル上のタッチ座標(対象タッチ座標)が1つ以上存在するか否かを判定する(S35)。スマートフォン1Bは、対象タッチ座標が1つ以上存在すると判定した場合にはS34に戻り、対象タッチ座標が存在しないと判定した場合にはS36に進む。なお、図12の例では、初回のS35では、タッチ座標Cがタッチ座標Aから一定距離r内(第2の領域内)に位置し、タッチ座標Dがタッチ座標Bから一定距離r内(第2の領域内)に位置するので、対象タッチ座標が1つ以上存在すると判定する。2回目のS35では、タッチ座標Eがタッチ座標Dから一定距離r内に位置するので対象タッチ座標が1つ以上存在すると判定する。一方、3回目のS35では、残りのどのタッチ座標もタッチ座標Eから一定距離r内に位置しないので対象タッチ座標が存在しないと判定する。 After that, the operation invalid area determination unit 1221B is located on the touch panel such that the operation invalid area determination unit 1221B is located within a certain distance r from any one of the one or more coordinates determined not to be included in the argument of the touch operation event in S34. It is determined whether one or more touch coordinates (target touch coordinates) exist (S35). The smartphone 1B returns to S34 when it is determined that one or more target touch coordinates exist, and proceeds to S36 when it is determined that no target touch coordinates exist. In the example of FIG. 12, in the first time S35, the touch coordinate C is located within the fixed distance r (within the second region) from the touch coordinate A, and the touch coordinate D is within the fixed distance r (the second distance) from the touch coordinate B. 2), it is determined that one or more target touch coordinates exist. In S35 of the second time, since the touch coordinates E are located within a certain distance r from the touch coordinates D, it is determined that one or more target touch coordinates exist. On the other hand, in the third S35, since no remaining touch coordinates are located within the predetermined distance r from the touch coordinates E, it is determined that the target touch coordinates do not exist.
 S36において、タッチイベント制御部122Bは、タッチイベントの引数に含めないことが決定されていないタッチ座標(対象タッチ座標)が存在するか否かを判定する。図12の例では、タッチ座標Fが対象タッチ座標に該当するので、対象タッチ座標が存在すると判定する。 In S36, the touch event control unit 122B determines whether there is touch coordinates (target touch coordinates) that are not determined to be included in the argument of the touch event. In the example of FIG. 12, since the touch coordinates F correspond to the target touch coordinates, it is determined that the target touch coordinates exist.
When it determines in S36 that target touch coordinates exist, the touch event control unit 122B issues a touch operation event having all of the target touch coordinates as arguments to the application execution unit 121. The application execution unit 121 then executes processing according to the issued touch operation event (S37). After S37, the smartphone 1B ends the operation according to the flowchart of FIG. 11. On the other hand, when it determines in S36 that no target touch coordinates exist, the touch event control unit 122B ends the operation according to the flowchart of FIG. 11 without issuing a touch event having touch coordinates as an argument to the application execution unit 121.
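The grouping of S33 to S35 amounts to growing a set of excluded touches outward from seed coordinates (here, the upper and lower ends of each sensor's contact region), repeatedly adding any touch within the fixed distance r of a touch excluded in the previous round; the touches that remain are the ones dispatched in S37. A minimal sketch, assuming Euclidean distance and the hypothetical Point type from earlier:

    import kotlin.math.hypot

    // Hypothetical sketch of S33-S35: starting from seed coordinates, repeatedly
    // exclude every touch within distance r of a touch excluded in the previous
    // round; the touches left over are the ones dispatched to the app (S36/S37).
    fun partitionTouches(
        touches: List<Point>,
        seeds: List<Point>,   // e.g. upper and lower ends of each sensor's contact region
        r: Double
    ): Pair<List<Point>, List<Point>> {   // (excluded touches, dispatched touches)
        fun close(a: Point, b: Point): Boolean =
            hypot((a.x - b.x).toDouble(), (a.y - b.y).toDouble()) <= r

        val excluded = mutableListOf<Point>()
        val remaining = touches.toMutableList()
        var batch = remaining.filter { t -> seeds.any { close(it, t) } }        // S33
        while (batch.isNotEmpty()) {
            excluded += batch                                                   // S34
            remaining.removeAll(batch)
            val previous = batch
            batch = remaining.filter { t -> previous.any { close(it, t) } }     // S35
        }
        return excluded to remaining
    }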
 (実施形態3の付記事項)
 スマートフォン1Bの代わりに額縁領域が設けられた携帯端末を用いてもよい。この場合、携帯端末の操作無効領域決定部は、操作無効領域決定部1221Bと同様に「接触センサにおける指との接触領域」に隣接する表示画面内の領域を操作無効領域とすることを決定する。ここで、接触領域と表示画面内の領域とが「隣接する」とは、当該接触領域と当該「表示画面内の領域」との間の最短距離が当該接触領域と表示画面の全領域との間の最短距離に等しいことを意味している。
(Additional notes of Embodiment 3)
A mobile terminal provided with a frame area may be used instead of the smartphone 1B. In this case, the operation invalid area determination unit of the mobile terminal determines that the area in the display screen adjacent to the “contact area with the finger in the contact sensor” is set as the operation invalid area, similarly to the operation invalid area determination unit 1221B. . Here, the contact area and the area in the display screen are “adjacent” means that the shortest distance between the contact area and the “area in the display screen” is the difference between the contact area and the entire area of the display screen. It is equal to the shortest distance between.
 なお、本発明に係る携帯端末は様々な形態をとることができるが、4種類の形態の携帯端末を例に挙げて各形態における「接触センサにおける指との接触領域」に隣接する操作無効領域の位置を図15に示した。 Although the mobile terminal according to the present invention can take various forms, the operation invalid area adjacent to the “contact area with the finger in the contact sensor” in each form by taking four types of mobile terminals as examples. The position of is shown in FIG.
 図15は、各携帯端末の外縁部分の断面(具体的には、携帯端末の底面と、接触センサにおける指との接触領域の中心と、の間の距離dだけ底面から離れた「底面に平行な面」で切った断面)を模式的に示した図である。図15に示すように、どの形態の携帯端末も、「接触センサにおける指との接触領域」に隣接する「表示画面内の領域」を操作無効領域とするようになっている。 FIG. 15 is a cross-sectional view of an outer edge portion of each mobile terminal (specifically, “parallel to the bottom surface” separated from the bottom surface by a distance d between the bottom surface of the mobile terminal and the center of the contact area of the contact sensor with the finger. It is the figure which showed typically the cross section cut off by the "face." As shown in FIG. 15, in any form of mobile terminal, an “area in the display screen” adjacent to the “contact area with the finger in the contact sensor” is set as an operation invalid area.
 (実施形態1~3の付記事項1)
 各実施形態に係るスマートフォンは、操作無効領域に画像を表示するように構成されていてもよいが、操作無効領域に画像を表示しないように構成されていてもよい。
(Appendix 1 of Embodiments 1 to 3)
The smartphone according to each embodiment may be configured to display an image in the operation invalid area, but may be configured not to display an image in the operation invalid area.
 また、本発明に係る携帯端末は、前面がタッチパネルに覆われ、対向する2つの側面(左側面及び右側面)に接触センサが設けられた端末に限定されない。すなわち、本発明に係る携帯端末は、左側面、前面及び右側面の3面を覆うタッチパネルが設けられた端末であって上述の2つの接触センサが設けられていない端末であってもよい。この場合、タッチパネルのうちの左側面を覆う部分と該表示部のうちの右側面を覆う部分とが特許請求の範囲における接触検出部としての役割を担うことになる。 Also, the portable terminal according to the present invention is not limited to a terminal whose front surface is covered with a touch panel and contact sensors are provided on two opposing side surfaces (left side surface and right side surface). That is, the mobile terminal according to the present invention may be a terminal provided with a touch panel that covers the left side, the front side, and the right side and is not provided with the two contact sensors described above. In this case, the portion of the touch panel that covers the left side surface and the portion of the display unit that covers the right side surface serve as a contact detection unit in the claims.
 (実施形態1~3の付記事項2)
 本発明に係る携帯端末は、複数の側面に接触センサが設けられたものであればよい。例えば、本発明に係る携帯端末は、対向する2つの側面(上側面及び下側面)に接触センサが設けられた端末Aであってもよいし、上下左右の4つの側面の各々に接触センサが設けられた端末Bであってもよい。
(Appendix 2 of Embodiments 1 to 3)
The portable terminal which concerns on this invention should just be provided with the contact sensor in the some side surface. For example, the mobile terminal according to the present invention may be a terminal A in which contact sensors are provided on two opposing side surfaces (upper side surface and lower side surface), or a contact sensor may be provided on each of four upper, lower, left, and right side surfaces. The terminal B provided may be used.
 端末Aのタッチイベント制御部は、2つの接触センサの両方が手の接触を検知している場合、表示画面の上端の領域および下端の領域を含む操作無効領域内の座標を指定するタッチ操作を無効にするようになっていてもよい。この場合、端末Aは、ユーザが端末Aの左側面又は右側面が上方を向くように両手で端末Aの上端及び下端を持ってカメラ撮影をしようとしている時に、ユーザの手が意図せず表示画面に触れてしまうことによる誤操作が発生するのを防ぐことができる。 The touch event control unit of the terminal A performs a touch operation for designating coordinates in an operation invalid area including the upper end area and the lower end area of the display screen when both of the two contact sensors detect a hand contact. It may be made invalid. In this case, the terminal A is displayed unintentionally by the user's hand when the user is holding the upper and lower ends of the terminal A with both hands so that the left side or the right side of the terminal A faces upward. It is possible to prevent an erroneous operation caused by touching the screen.
 また、端末Bのタッチイベント制御部は、4つの接触センサのうち上側面に設けられた接触センサと下側面に設けられた接触センサとの両方が手の接触を検知している場合、表示画面の上端の領域および下端の領域を含む操作無効領域内の座標を指定するタッチ操作を無効にするようになっていてもよい。同様に、端末Bのタッチイベント制御部は、4つの接触センサのうち右側面に設けられた接触センサと左側面に設けられた接触センサとの両方が手の接触を検知している場合、表示画面の右端の領域および左端の領域を含む操作無効領域内の座標を指定するタッチ操作を無効にするようになっていてもよい。 Further, the touch event control unit of the terminal B displays the display screen when the contact sensor provided on the upper side and the contact sensor provided on the lower side of the four contact sensors detect the hand contact. The touch operation for designating the coordinates in the operation invalid area including the upper end area and the lower end area may be invalidated. Similarly, the touch event control unit of the terminal B displays when the contact sensor provided on the right side and the contact sensor provided on the left side of the four contact sensors detect a hand contact. The touch operation that specifies the coordinates in the operation invalid area including the right end area and the left end area of the screen may be invalidated.
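For the terminal A and terminal B described above, the choice of which screen edges receive operation invalid areas could be driven by which opposing pair of contact sensors detects contact. The following sketch assumes hypothetical edge names and boolean sensor flags; it is not an interface defined by the embodiments.

    // Hypothetical sketch for the four-sensor terminal B: choose which screen edges
    // receive operation invalid areas based on which opposing sensor pair is gripped.
    enum class Edge { TOP, BOTTOM, LEFT, RIGHT }

    fun edgesToDisable(
        topTouched: Boolean,
        bottomTouched: Boolean,
        leftTouched: Boolean,
        rightTouched: Boolean
    ): Set<Edge> {
        val edges = mutableSetOf<Edge>()
        if (topTouched && bottomTouched) {
            edges += Edge.TOP
            edges += Edge.BOTTOM
        }
        if (leftTouched && rightTouched) {
            edges += Edge.LEFT
            edges += Edge.RIGHT
        }
        return edges
    }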
 〔ソフトウェアによる実現例〕
 各実施形態のスマートフォンの制御ブロック(特に、タッチイベント制御部、及び、センサ状態検出処理部)は、集積回路(ICチップ)等に形成された論理回路(ハードウェア)によって実現してもよいし、CPU(Central Processing Unit)を用いてソフトウェアによって実現してもよい。
[Example of software implementation]
The smartphone control blocks (particularly, the touch event control unit and the sensor state detection processing unit) of each embodiment may be realized by a logic circuit (hardware) formed in an integrated circuit (IC chip) or the like. Alternatively, it may be realized by software using a CPU (Central Processing Unit).
 後者の場合、スマートフォンは、各機能を実現するソフトウェアであるプログラムの命令を実行するCPU、上記プログラム及び各種データがコンピュータ(又はCPU)で読み取り可能に記録されたROM(Read Only Memory)又は記憶装置(これらを「記録媒体」と称する)、上記プログラムを展開するRAM(Random Access Memory)などを備えている。そして、コンピュータ(又はCPU)が上記プログラムを上記記録媒体から読み取って実行することにより、本発明の目的が達成される。上記記録媒体としては、「一時的でない有形の媒体」、例えば、テープ、ディスク、カード、半導体メモリ、プログラマブルな論理回路などを用いることができる。また、上記プログラムは、該プログラムを伝送可能な任意の伝送媒体(通信ネットワークや放送波等)を介して上記コンピュータに供給されてもよい。なお、本発明は、上記プログラムが電子的な伝送によって具現化された、搬送波に埋め込まれたデータ信号の形態でも実現され得る。 In the latter case, the smartphone has a CPU that executes instructions of a program that is software that realizes each function, a ROM (Read Only Memory) or a storage device in which the program and various data are recorded so as to be readable by a computer (or CPU) (These are referred to as “recording media”), and a RAM (Random Access Memory) for expanding the program. And the objective of this invention is achieved when a computer (or CPU) reads the said program from the said recording medium and runs it. As the recording medium, a “non-temporary tangible medium” such as a tape, a disk, a card, a semiconductor memory, a programmable logic circuit, or the like can be used. The program may be supplied to the computer via an arbitrary transmission medium (such as a communication network or a broadcast wave) that can transmit the program. The present invention can also be realized in the form of a data signal embedded in a carrier wave in which the program is embodied by electronic transmission.
[Summary]
A mobile terminal (smartphone 1) according to Aspect 1 of the present invention includes: a display screen (display unit 14) that accepts operations designating coordinates; contact detection units (contact sensors 13L and 13R) provided on each of a plurality of side surfaces of the terminal; a determination unit (sensor state detection processing unit 123) that determines whether or not a plurality of the contact detection units are detecting a contact object (the contact of a hand); and an invalidation unit (touch event control unit 122) that, when the determination unit determines that a plurality of the contact detection units are detecting the contact object, invalidates operations designating coordinates within a specific area (operation invalid area) in the display screen.
 上記の構成によれば、上記携帯端末は、誤操作がされ易い特定の状況下では誤操作防止機能を確実に働かせることができる、という効果を奏する。 According to the above configuration, the portable terminal has an effect that the erroneous operation prevention function can be surely operated under a specific situation in which an erroneous operation is easily performed.
 本発明の態様2に係る携帯端末(スマートフォン1A)は、上記態様1において、上記特定の領域を規定する規定部(操作無効領域決定部1221A)を更に備え、上記規定部は、上記接触検出部における上記接触物の接触面積が大きいほど上記特定の領域の面積が大きくなるように上記特定の領域を規定してもよい。 The portable terminal (smart phone 1A) according to aspect 2 of the present invention further includes a defining unit (operation invalid area determining unit 1221A) that defines the specific area in the aspect 1, and the defining unit includes the contact detecting unit. The specific area may be defined such that the area of the specific area increases as the contact area of the contact object increases.
 上記の構成によれば、上記携帯端末は、図13に示す持ち方をされた場合、該端末を握る手が大きい(すなわち、手が意図せず表示画面に接触しやすい)ときほど上記特定の領域の面積が大きくなるように上記特定の領域を規定する。従って、上記携帯端末は、手の大きいユーザにより操作された場合において、誤操作を従来よりも確実に防ぐことができるという更なる効果を奏する。 According to the above configuration, when the portable terminal is held as shown in FIG. 13, the above-mentioned specific terminal becomes more specific when the hand holding the terminal is large (that is, the hand tends to touch the display screen unintentionally). The specific region is defined so that the area of the region becomes large. Therefore, when the portable terminal is operated by a user with a large hand, there is a further effect that erroneous operation can be prevented more reliably than before.
 本発明の態様3に係る携帯端末(スマートフォン1B)は、上記態様1において、上記特定の領域を規定する規定部(操作無効領域決定部1221B)を更に備え、上記規定部は、上記特定の領域が上記接触検出部における上記接触物の接触位置に応じた位置に位置するように、上記特定の領域を規定してもよい。 The mobile terminal (smart phone 1B) according to aspect 3 of the present invention further includes a defining part (operation invalid area determining part 1221B) that defines the specific area in the aspect 1, and the defining part includes the specific area. The specific region may be defined such that the specific region is positioned at a position corresponding to the contact position of the contact object in the contact detection unit.
 上記の構成によれば、上記携帯端末は、図13に示す持ち方をされた場合、誤操作を従来よりも確実に防ぐことができるという更なる効果を奏する。 According to the above configuration, when the portable terminal is held as shown in FIG. 13, there is a further effect that erroneous operation can be prevented more reliably than in the past.
 本発明の態様4に係る携帯端末は、上記態様3において、上記規定部が、上記表示画面内の座標を指し示す指の有無に関わりなく上記表示画面内に第1の領域を規定するとともに、指が第1の領域内の座標を指し示している場合には上記表示画面内に当該座標の位置に応じた第2の領域を規定するようになっており、上記規定部は、指が第1の領域内の座標を指し示していない場合には第1の領域を上記特定の領域として規定し、指が第1の領域内の座標を指し示している場合には第1の領域と第2の領域とを含む領域を上記特定の領域として規定してもよい。 A portable terminal according to aspect 4 of the present invention is the portable terminal according to aspect 3, wherein the defining unit defines the first area in the display screen regardless of the presence or absence of a finger indicating the coordinates in the display screen. Indicates the second area corresponding to the position of the coordinate in the display screen, and the defining unit is configured such that the finger is the first area. When the coordinates in the area are not pointed, the first area is defined as the specific area, and when the finger points to the coordinates in the first area, the first area and the second area An area including the above may be defined as the specific area.
 本発明の態様5に係る携帯端末は、上記態様1から態様4までのいずれかの態様において、上記特定の領域が、上記接触物を検出している複数の接触検出部の各々について上記表示画面の端部の領域であって当該接触検出部が設けられている側面側の端部の領域が上記特定の領域の一部となるように、規定された領域であってもよい。 The mobile terminal according to aspect 5 of the present invention is the mobile terminal according to any one of the aspects 1 to 4 described above, wherein the specific area is the display screen for each of the plurality of contact detection units detecting the contact object. It may be a region that is defined such that the region of the end portion on the side surface side where the contact detection unit is provided is part of the specific region.
 本発明の態様6に係る操作処理方法は、座標を指定する操作を受け付ける表示画面が設けられ、自端末の側面に接触検出部が設けられた携帯端末による操作処理方法であって、上記接触検出部は、上記携帯端末の複数の側面の各々に設けられており、接触物を複数の上記接触検出部の両方が検出しているか否かを判定する判定ステップと、上記判定ステップにて接触物を複数の上記接触検出部が検出していると判定された場合に、上記表示画面内の特定の領域内の座標を指定する操作を無効化する無効化ステップを含んでいる。 An operation processing method according to aspect 6 of the present invention is an operation processing method by a portable terminal provided with a display screen for accepting an operation for designating coordinates, and provided with a contact detection unit on a side surface of the terminal itself. Are provided on each of the plurality of side surfaces of the portable terminal, and a determination step for determining whether or not both of the plurality of contact detection units detect the contact object, and the contact object in the determination step. When it is determined that a plurality of the contact detection units are detecting, a disabling step of disabling an operation of designating coordinates in a specific area in the display screen is included.
 上記の構成によれば、上記操作処理方法は、本発明の態様1に係る携帯端末と同様の作用効果を奏する。 According to the above configuration, the operation processing method has the same effects as the portable terminal according to aspect 1 of the present invention.
 The portable terminal according to each aspect of the present invention may be realized by a computer. In that case, a program that causes the computer to operate as the determination unit and the invalidation unit of the portable terminal, thereby realizing those units on the computer, and a computer-readable recording medium on which the program is recorded also fall within the scope of the present invention.
 The present invention is not limited to the embodiments described above, and various modifications are possible within the scope of the claims. Embodiments obtained by appropriately combining technical means disclosed in different embodiments are also included in the technical scope of the present invention. Furthermore, new technical features can be formed by combining the technical means disclosed in the respective embodiments.
 The present invention can be suitably used for various portable information terminals such as smartphones.
 1, 1A, 1B          Smartphone (portable terminal)
 12, 12A, 12B       Control unit
 122, 122A, 122B    Touch event control unit (invalidation unit, invalidation means)
 1221A, 1221B       Operation-invalid area determination unit (defining unit, defining means)
 123, 123A, 123B    Sensor state detection processing unit (determination unit, determination means)
 13, 13AL, 13BL     Contact sensor (contact detection unit)
 13, 13AR, 13BR     Contact sensor (contact detection unit)
 14                 Display unit (display screen)

Claims (8)

  1.  A portable terminal comprising:
      a display screen that accepts an operation designating coordinates;
      a contact detection unit provided on each of a plurality of side surfaces of the terminal;
      a determination unit that determines whether a plurality of the contact detection units are detecting a contact object; and
      an invalidation unit that invalidates an operation designating coordinates within a specific area of the display screen when the determination unit determines that a plurality of the contact detection units are detecting the contact object.
  2.  The portable terminal according to claim 1, further comprising a defining unit that defines the specific area,
      wherein the defining unit defines the specific area such that the larger the contact area of the contact object on the contact detection unit, the larger the area of the specific area.
  3.  The portable terminal according to claim 1, further comprising a defining unit that defines the specific area,
      wherein the defining unit defines the specific area such that the specific area is located at a position corresponding to the contact position of the contact object on the contact detection unit.
  4.  The portable terminal according to claim 3, wherein the defining unit defines a first area in the display screen regardless of whether a finger is pointing at coordinates in the display screen and, when a finger is pointing at coordinates within the first area, defines a second area in the display screen according to the position of those coordinates, and
      the defining unit defines the first area as the specific area when no finger is pointing at coordinates within the first area, and defines an area including the first area and the second area as the specific area when a finger is pointing at coordinates within the first area.
  5.  The portable terminal according to any one of claims 1 to 4, wherein the specific area is an area defined such that, for each of the plurality of contact detection units detecting the contact object, the area at the edge of the display screen on the side surface on which that contact detection unit is provided forms part of the specific area.
  6.  An operation processing method for a portable terminal provided with a display screen that accepts an operation designating coordinates and with a contact detection unit on each of a plurality of side surfaces of the terminal, the method comprising:
      a determination step of determining whether a plurality of the contact detection units are detecting a contact object; and
      an invalidation step of invalidating an operation designating coordinates within a specific area of the display screen when it is determined in the determination step that a plurality of the contact detection units are detecting the contact object.
  7.  A program for causing a computer to function as the determination unit and the invalidation unit included in the portable terminal according to any one of claims 1 to 5.
  8.  A computer-readable recording medium on which the program according to claim 7 is recorded.
PCT/JP2014/082865 2013-12-13 2014-12-11 Portable terminal, operation processing method, program, and recording medium WO2015087977A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013258598A JP6159243B2 (en) 2013-12-13 2013-12-13 Portable terminal, operation processing method, program, and recording medium
JP2013-258598 2013-12-13

Publications (1)

Publication Number Publication Date
WO2015087977A1 true WO2015087977A1 (en) 2015-06-18

Family

ID=53371276

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/082865 WO2015087977A1 (en) 2013-12-13 2014-12-11 Portable terminal, operation processing method, program, and recording medium

Country Status (2)

Country Link
JP (1) JP6159243B2 (en)
WO (1) WO2015087977A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113031802A (en) * 2019-12-09 2021-06-25 华为终端有限公司 Touch area adjusting method and device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011028603A (en) * 2009-07-28 2011-02-10 Nec Casio Mobile Communications Ltd Terminal device and program
JP2011119959A (en) * 2009-12-03 2011-06-16 Nec Corp Portable terminal
JP2013088929A (en) * 2011-10-14 2013-05-13 Panasonic Corp Input device, information terminal, input control method and input control program
JP2013228836A (en) * 2012-04-25 2013-11-07 Konica Minolta Inc Operation display device, operation display method, and program
JP2013235468A (en) * 2012-05-10 2013-11-21 Fujitsu Ltd Mobile terminal and mobile terminal cover

Also Published As

Publication number Publication date
JP2015115002A (en) 2015-06-22
JP6159243B2 (en) 2017-07-05

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14870349

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14870349

Country of ref document: EP

Kind code of ref document: A1