JP6284459B2 - Terminal device - Google Patents


Info

Publication number: JP6284459B2
Application number: JP2014187682A
Authority: JP (Japan)
Prior art keywords: display, object, display position, smartphone, terminal device
Legal status: Active (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Other languages: Japanese (ja)
Other versions: JP2016062156A (en)
Inventor: 笹 雅明
Original Assignee: シャープ株式会社 (Sharp Corporation)
Priority date: (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application JP2014187682A filed by シャープ株式会社; published as JP2016062156A; granted and published as JP6284459B2


Description

  The present invention relates to a portable terminal device.

  Portable terminal devices such as smartphones are conventionally known. For example, Patent Document 1 discloses, as such a terminal device, a display terminal device including display means, operation determination means, and display control means.

  The display means of the display terminal device displays a display area whose size corresponds to that of the display screen. The operation determination means determines whether a predetermined operation has been performed; specifically, it determines whether a pointing operation (a cursor movement) to a predetermined display position within the display screen (specifically, an edge of the display area) has been performed. When the operation determination means determines that the predetermined operation has been performed, the display control means shifts the display area so that its edge moves toward the center of the display screen, and reduces the shifted display area.

  Terminal devices having a narrow bezel (frame), so-called narrow-frame or ultra-narrow-frame terminal devices, are also conventionally known. A terminal device with a narrow bezel is superior to one with a wide bezel in that its display area can be made larger.

JP 2010-33413 A

  When the bezel width is reduced as described above, it becomes difficult to select, with a finger of the other hand (e.g., the index finger of the right hand), an object (e.g., an icon) located near a finger of the gripping hand (e.g., the thumb of the left hand). A terminal device with a narrow bezel may therefore be inferior to one with a wide bezel in terms of selecting objects displayed at the edge of the display screen.

  In the display terminal device disclosed in Patent Document 1, the display area can be shifted and reduced. However, to make the display terminal device perform these display operations, the user must perform a cursor-moving operation. It is therefore troublesome for the user to have the display terminal device perform the shifted and reduced display.

  The present invention has been made in view of the above problems, and an object thereof is to provide a portable terminal device with which an object displayed at the edge of the display can be selected easily.

  According to one aspect of the present invention, the terminal device is a portable device including a touch screen in which a touch panel overlies a display. The terminal device includes first detection means for detecting that the terminal device is gripped by the user, display control means for displaying a predetermined object at a first display position on the display, and second detection means for detecting that an object capable of selecting the displayed object has approached the bezel of the terminal device. When the gripping of the terminal device is detected and the approach of the object to the bezel is detected, the display control means switches the display position of the displayed object from the first display position to a second display position that is closer to the center of the display area of the display than the first display position.

  According to the present invention, an object displayed at the edge of the display can be selected easily.

FIG. 1 is a front view of the smartphone 1 according to the present embodiment.
FIG. 2 is a diagram illustrating an overview of the processing of the smartphone 1 when the user operates it while holding it.
FIG. 3 is a diagram illustrating an overview of the processing of the smartphone 1 when the user operates it while it is placed on a desk.
FIG. 4 is a flowchart illustrating the flow of processing executed by the smartphone 1.
FIG. 5 is a block diagram illustrating the functional configuration of the smartphone 1.
FIG. 6 is a diagram illustrating the hardware configuration of the smartphone 1.
FIG. 7 is a diagram illustrating a modification of the reduction mode.
FIG. 8 is a front view of the smartphone 1A according to the present embodiment.
FIG. 9 is a diagram illustrating an operation example in which an icon selection operation by the user is invalidated.
FIG. 10 is a diagram illustrating an operation example in which an icon selection operation by the user is validated.
FIG. 11 is a flowchart illustrating the flow of processing executed by the smartphone 1A.
FIG. 12 is a block diagram illustrating the functional configuration of the smartphone 1A.

  Hereinafter, terminal devices according to embodiments of the present invention will be described with reference to the drawings. In the following description, the same reference numerals denote the same members; their names and functions are also the same, and detailed description thereof will not be repeated. Examples of the terminal device include various portable devices such as a smartphone, a phablet, a tablet, a PDA (Personal Digital Assistant), and a digital music player. In the following embodiments, a smartphone will be described as an example of the terminal device.

[Embodiment 1]
(A. Appearance and built-in sensor)
FIG. 1 is a front view of the smartphone 1 according to the present embodiment. Referring to FIG. 1, the smartphone 1 is a terminal device having a narrow bezel (a so-called narrow-frame or ultra-narrow-frame terminal device). The smartphone 1 includes a bezel 10, a touch screen 108, and sensors 191 to 194. The touch screen 108 is an input/output device in which a touch panel overlies a display. In the present embodiment, the bezel 10 is described as part of the housing of the smartphone 1.

  The sensors 191 to 194 are used to detect contact by a user's hand (specifically, a finger). Each of the sensors 191 to 194 is typically a capacitive sensor. The sensors 191 to 194 are incorporated near the left, upper, right, and lower edges of the bezel 10, respectively. The smartphone 1 detects whether the user is gripping it based on the detected changes in capacitance. The number and positions of the sensors are not limited to the above.
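Grip detection from the four capacitive sensors can be sketched as follows. The threshold value, the two-sensor rule, and all names are illustrative assumptions; the patent only states that gripping is detected from changes in capacitance.

```python
# Sketch of grip detection from four capacitive bezel sensors, ordered
# (left, top, right, bottom). Threshold and rule are assumptions.

GRIP_THRESHOLD = 30  # capacitance change indicating skin contact (arbitrary units)

def is_gripped(sensor_deltas):
    """Return True if the capacitance changes suggest the device is held.

    A grip typically touches at least two edges (e.g. thumb on the left,
    fingers on the right), so contact on two or more sensors is required.
    """
    touched = [d for d in sensor_deltas if d >= GRIP_THRESHOLD]
    return len(touched) >= 2

# Thumb on the left edge, three fingers on the right edge:
print(is_gripped([55, 2, 48, 1]))  # True
# Lying flat on a desk, no contact:
print(is_gripped([3, 1, 2, 0]))    # False
```

A real implementation would also debounce the readings over time; this sketch only shows the instantaneous decision.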

(B. Outline of processing)
(B1. First operation example)
FIG. 2 is a diagram for explaining an overview of the processing of the smartphone 1 when the user operates the smartphone 1 while holding it.

  With reference to FIG. 2, the state (A) represents a state in which the user holds the smartphone 1 with the left hand. In this case, the CPU 101 (see FIG. 6) of the smartphone 1 determines that the smartphone 1 is being held based on outputs from the sensors 191 to 194. In the state (A), the smartphone 1 displays a plurality of icons including the icon 201.

  The state (B), which follows the state (A), represents a state in which the user, still holding the smartphone 1 with the left hand, brings the index finger of the right hand toward the icon 201 in order to select the icon 201 at the edge of the display screen.

  The state (C), which follows the state (B), represents a state in which the index finger of the right hand has come still closer to the icon 201 while the user holds the smartphone 1 with the left hand. In this case, the CPU 101 detects a change in capacitance using the outputs from the sensors 191 to 194. In other words, the smartphone 1 uses the outputs from the sensors 191 to 194 to determine that an object that changes the capacitance, such as a finger or a stylus pen, is approaching the bezel 10. In the state (C), the CPU 101 typically determines that the object is approaching the bezel 10 using the output from the sensor 191.

  When the CPU 101 determines that the object has approached the bezel 10, it stops the full-screen display and performs screen display in the reduction mode, thereby shifting each icon toward the center of the screen. "Full-screen display" is a display mode in which the entire displayable area of the display is used. Hereinafter, the display mode that performs full-screen display is also referred to as the "full-screen mode".

  The state (D), which follows the state (C), represents a state in which the user has selected the icon 201 with the index finger of the right hand. The state (E), which follows the state (D), represents a state in which the user has lifted the index finger of the right hand from the touch screen 108. In this case, the CPU 101 executes the process associated with the icon 201 and shows its result on the display of the touch screen 108. As an example, the smartphone 1 displays a map on the display in the full-screen mode.

  If no icon is selected in the state (C), the smartphone 1 changes the display screen from the state (C) to the state (F). Specifically, when the state (C) continues for a predetermined time (for example, 5 seconds), the smartphone 1 changes the display screen from the state (C) to the state (F); that is, it switches the display mode from the reduction mode back to the full-screen mode (normal mode), which performs full-screen display.

  As described above, when the smartphone 1 detects the approach of the object to the bezel 10 while being gripped, it performs screen display in the reduction mode as shown in the state (C), shifting each icon toward the center of the screen. The user can therefore select an icon displayed at the edge of the display in the full-screen mode (for example, the icon 201) more easily than when the full-screen mode is maintained. Thus, even though the bezel of the smartphone 1 is narrow, the user can easily select an icon displayed at the edge of the display.

(B2. Second operation example)
FIG. 3 is a diagram for explaining an overview of the processing of the smartphone 1 when the user operates the smartphone 1 while it is placed on a desk.

  Referring to FIG. 3, the state (A) represents a state in which the smartphone 1 is placed on a desk and is not held by the user. In this case, the CPU 101 determines, based on the outputs from the sensors 191 to 194, that the smartphone 1 is not gripped. In the state (A), the smartphone 1 displays a plurality of icons including the icon 201, as in the state (A) of FIG. 2.

  The state (B), which follows the state (A), represents a state in which the user brings the index finger of the right hand toward the icon 201 without holding the smartphone 1, in order to select the icon 201 at the edge of the display screen.

  The state (C), which follows the state (B), represents a state in which the index finger of the right hand has been brought still closer to the icon 201 without the smartphone 1 being held. In this case, the CPU 101 uses the outputs from the sensors 191 to 194 to determine that an object that changes the capacitance, such as a finger or a stylus pen, is approaching the bezel 10. Even when the CPU 101 determines that the object has approached the bezel 10, it continues the full-screen display because the smartphone 1 is not gripped. That is, unlike the case where the smartphone 1 is gripped (the state (C) in FIG. 2), the smartphone 1 does not switch the display mode to the reduction mode.

  The state (D), which follows the state (C), represents a state in which the user has selected the icon 201 with the index finger of the right hand. The state (E), which follows the state (D), represents a state in which the user has lifted the index finger of the right hand from the touch screen 108. In this case, the CPU 101 executes the process associated with the icon 201 and shows its result on the display of the touch screen 108. The smartphone 1 displays a map on the display, as in the state (E) of FIG. 2.

(B3. Summary of operation examples)
The processing of the smartphone 1 in the above operation examples, described with a focus on the icon 201, is as follows.

  (1) The smartphone 1 detects that it is gripped by the user. The smartphone 1 also displays the predetermined icon 201 at a predetermined first display position on the display. Furthermore, the smartphone 1 detects that an object capable of selecting the icon 201 (a finger in the cases of FIGS. 2 and 3) has approached the bezel 10 of the smartphone 1. When the smartphone 1 detects the approach of the object to the bezel 10 while detecting that it is gripped by the user, it switches the display position of the icon 201 from the first display position to a second display position that is closer to the center of the display area of the display than the first display position.

  According to the above configuration, when the smartphone 1 detects the approach of the object to the bezel 10 while being gripped, it switches the display position of the icon 201 from the first display position to the second display position, which is closer to the center of the display area (see the state (C) in FIG. 2).

  The user can therefore select the icon 201, displayed at the edge of the display in the full-screen mode, more easily than when the icon 201 remains at the first display position (that is, in the full-screen mode). Thus, even though the bezel 10 of the smartphone 1 is narrow, the user can easily select an icon displayed at the edge of the display.

  (2) The smartphone 1 switches the display position of the icon 201 from the first display position to the second display position by reducing the screen displayed on the display. According to this configuration, all the icons displayed on the display before the reduction can be displayed while their positional relationships are maintained.
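The reduction described in (2) can be sketched as a uniform scaling about the screen center, which moves every icon toward the center while preserving the icons' positional relationships. The scale factor and the center-anchored scaling are illustrative assumptions; the patent does not specify them.

```python
def to_reduced(x, y, screen_w, screen_h, scale=0.75):
    """Map a full-screen coordinate to its reduction-mode position.

    The whole screen is scaled about its center, so every icon keeps
    its positional relationship while moving toward the center.
    The scale factor 0.75 is an illustrative assumption.
    """
    cx, cy = screen_w / 2, screen_h / 2
    return cx + (x - cx) * scale, cy + (y - cy) * scale

# An icon at the left edge of a 1080x1920 screen moves inward:
print(to_reduced(0, 960, 1080, 1920))    # (135.0, 960.0)
# An icon already at the center does not move:
print(to_reduced(540, 960, 1080, 1920))  # (540.0, 960.0)
```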

  (3) The touch panel detects the selection of the icon 201 by the object. When the selection of the icon 201 is not detected for a predetermined time after the smartphone 1 switches the display position of the icon 201 from the first display position to the second display position, the smartphone 1 switches the display position of the icon 201 back from the second display position to the first display position.

  According to this configuration, the smartphone 1 automatically returns the icon 201 to its initial display position when the icon 201 is not selected within the predetermined time. The user therefore does not need to perform any operation to return the display position of the icon 201.

(C. Control structure)
FIG. 4 is a flowchart illustrating the flow of processing executed by the smartphone 1. Referring to FIG. 4, in step S2, the CPU 101 performs screen display in the full-screen mode. In step S4, the CPU 101 determines whether the smartphone 1 is gripped, based on the outputs of the sensors 191 to 194.

  If the CPU 101 determines that the smartphone 1 is not gripped (NO in step S4), it returns the process to step S2. If it determines that the smartphone 1 is gripped (YES in step S4), the CPU 101 determines in step S6 whether an object (a finger or a stylus pen) has approached the bezel 10.

  If the CPU 101 determines that no object is approaching (NO in step S6), it returns the process to step S2. If it determines that an object has approached (YES in step S6), the CPU 101 performs screen display in the reduction mode in step S8; that is, it switches the display mode from the full-screen mode to the reduction mode.

  In step S10, the CPU 101 determines whether a touch operation (selection operation) on an object has been received within a predetermined time after switching to the reduction mode. If it determines that no touch operation has been received (NO in step S10), the CPU 101 returns the process to step S2. If it determines that a touch operation has been received (YES in step S10), the CPU 101 converts, in step S12, the coordinate value of the touch position into the corresponding coordinate value in the full-screen mode. In step S14, the CPU 101 executes the process associated with the selected object based on the converted coordinate value. In this case, the CPU 101 also switches the display mode from the reduction mode back to the full-screen mode.
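The flowchart of FIG. 4 can be sketched as a single control cycle. The `dev` driver object and all of its method names are hypothetical; `FakeDevice` is a stand-in used only to exercise the flow, and its 0.75 scale factor is an assumption.

```python
def handle_cycle(dev, timeout=5.0):
    """One pass through the flowchart (steps S2 to S14).

    `dev` is a hypothetical device driver; method names are assumptions.
    """
    dev.show_full_screen()               # S2: full-screen mode
    if not dev.is_gripped():             # S4: grip check
        return
    if not dev.object_near_bezel():      # S6: bezel-proximity check
        return
    dev.show_reduced_screen()            # S8: switch to reduction mode
    touch = dev.wait_for_touch(timeout)  # S10: selection within the time limit
    if touch is None:
        return                           # no selection: revert on the next cycle
    x, y = dev.to_full_coords(*touch)    # S12: coordinate conversion
    dev.run_action_at(x, y)              # S14: execute the associated process
    dev.show_full_screen()               # back to full-screen mode

class FakeDevice:
    """Minimal stand-in used to exercise the control flow."""
    def __init__(self, gripped, near, touch):
        self.gripped, self.near, self.touch = gripped, near, touch
        self.log = []
    def show_full_screen(self): self.log.append("full")
    def show_reduced_screen(self): self.log.append("reduced")
    def is_gripped(self): return self.gripped
    def object_near_bezel(self): return self.near
    def wait_for_touch(self, timeout): return self.touch
    def to_full_coords(self, x, y): return (x / 0.75, y / 0.75)
    def run_action_at(self, x, y): self.log.append(("run", x, y))

dev = FakeDevice(gripped=True, near=True, touch=(150, 300))
handle_cycle(dev)
print(dev.log)  # ['full', 'reduced', ('run', 200.0, 400.0), 'full']
```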

(D. Functional configuration)
FIG. 5 is a block diagram for explaining a functional configuration of the smartphone 1. With reference to FIG. 5, the smartphone 1 includes a detection unit 151, a control unit 152, a display unit 153, and a storage unit 154. The detection unit 151 includes a grip detection unit 1511, a proximity detection unit 1512, and a touch operation detection unit 1513. The control unit 152 includes a display control unit 1521 and a process execution unit 1522.

  The display unit 153 corresponds to the display of the touch screen 108. The storage unit 154 is a memory such as the flash memory 104 (see FIG. 6). The storage unit 154 stores an operating system and various application programs in advance.

  The detection unit 151 corresponds to a configuration in which the sensors 191 to 194, the CPU 101 (see FIG. 6), and the touch panel (see FIG. 6) are combined. Specifically, the detection unit 151 is realized by the CPU 101 executing an application program stored in the storage unit 154.

  The grip detection unit 1511 detects that the smartphone 1 is gripped by the user. The proximity detection unit 1512 detects that an object capable of selecting a displayed object has approached the bezel 10 of the smartphone 1.

  The touch operation detection unit 1513 detects that an object has touched the touch panel 1082. More specifically, the touch operation detection unit 1513 detects that a displayed object has been selected by the object. Note that the touch operation detection unit 1513 corresponds to a combination of the CPU 101 and the touch panel.

  The control unit 152 corresponds to the CPU 101. Specifically, the control unit 152 is realized by the CPU 101 executing an application program stored in the storage unit 154.

  The display control unit 1521 performs processing based on the detection results of the grip detection unit 1511 and the proximity detection unit 1512. Specifically, when it is determined that the object has approached the bezel 10, the display control unit 1521 stops the full-screen display and performs screen display in the reduction mode, thereby shifting each icon toward the center of the screen.

  Focusing on the icon 201, the display control unit 1521 displays the icon 201 at the first display position of the display in the full-screen mode, which is the default display mode. When the approach of the object to the bezel 10 is detected while the gripping of the smartphone 1 by the user is detected, the display control unit 1521 switches the display position of the icon 201 from the first display position to the second display position, which is closer to the center of the display area of the display than the first display position.

  Specifically, the display control unit 1521 switches the display position of the icon 201 from the first display position to the second display position by reducing the screen displayed on the display unit 153. Furthermore, when the selection of an icon such as the icon 201 is not detected for a predetermined time after the display position of the icon 201 is switched from the first display position to the second display position, the display control unit 1521 switches the display position of the icon 201 back from the second display position to the first display position.

  The process execution unit 1522 executes processing according to the detection result of the touch operation detection unit 1513. Specifically, the process execution unit 1522 receives a notification of the display mode type (full-screen mode or reduction mode) from the display control unit 1521. When it receives a notification that the mode is the reduction mode, the process execution unit 1522 performs the coordinate conversion described above.

  For example, when the selection of the icon 201 is detected by the touch operation detection unit 1513 in the reduction mode, the process execution unit 1522 calculates the coordinate value of the icon 201 in the full-screen mode by coordinate conversion. The process execution unit 1522 then identifies the process associated with the icon 201 using the converted coordinate value and executes the identified process. In this case, the display control unit 1521 causes the display unit 153 to display a screen based on that process.
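The coordinate conversion performed by the process execution unit 1522 can be sketched as the inverse of the reduction-mode scaling. A center-anchored uniform scaling and the 0.75 scale factor are illustrative assumptions, since the patent does not specify the mapping.

```python
def reduced_to_full(x, y, screen_w, screen_h, scale=0.75):
    """Invert the reduction-mode scaling: map a touch coordinate in the
    reduced screen back to the corresponding full-screen coordinate.
    The center-anchored scaling and scale factor are assumptions.
    """
    cx, cy = screen_w / 2, screen_h / 2
    return cx + (x - cx) / scale, cy + (y - cy) / scale

# Touching the shifted icon at (135, 960) on a 1080x1920 screen resolves
# to the icon's original full-screen position at the left edge:
print(reduced_to_full(135, 960, 1080, 1920))  # (0.0, 960.0)
```

Hit-testing against the full-screen icon layout with the converted coordinate lets the same process table serve both display modes.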

(E. Hardware configuration)
FIG. 6 is a diagram illustrating the hardware configuration of the smartphone 1. Referring to FIG. 6, the smartphone 1 includes at least a CPU 101 that executes programs, a ROM (Read Only Memory) 102, a RAM (Random Access Memory) 103, a flash memory 104, operation keys 105, a speaker 106, a camera 107, a touch screen 108, an acceleration sensor 111, a wireless communication IF (Interface) 112, an antenna 121, and the sensors 191 to 194. The touch screen 108 includes a display 1081 and a touch panel 1082. The components 101 to 108, 111, 112, and 191 to 194 are connected to one another by a data bus.

  The antenna 121 is connected to the wireless communication IF 112. The antenna 121 and the wireless communication IF 112 are used for wireless communication with other mobile terminals, fixed telephones, and PCs (Personal Computers) via a base station, for example.

  The ROM 102 is a nonvolatile semiconductor memory that stores a boot program for the smartphone 1 in advance. The flash memory 104 is also a nonvolatile semiconductor memory and may be, as an example, a NAND type. The flash memory 104 stores, in a nonvolatile manner, various data such as the operating system of the smartphone 1, various programs for controlling the smartphone 1, data generated by the smartphone 1, and data acquired from devices external to the smartphone 1.

  The processing in the smartphone 1 is realized by the hardware components and software executed by the CPU 101. Such software may be stored in the flash memory 104 in advance. The software may also be stored in a memory card or another storage medium (not shown) and distributed as a program product. Alternatively, the software may be provided as a downloadable program product by an information provider connected to the so-called Internet. Such software is downloaded via the antenna 121 and the wireless communication IF 112, temporarily stored in the flash memory 104, and then stored in the flash memory 104 in the form of an executable program. The CPU 101 reads the program from the flash memory 104 and executes it.

An essential part of the present invention can be said to be the software stored in the flash memory 104 or another storage medium, or the software downloadable via a network. The recording medium is not limited to a DVD-ROM, CD-ROM, FD, or hard disk; it may be any medium that carries the program in a fixed manner, such as a magnetic tape, a cassette tape, an optical disc, an optical card, or a semiconductor memory such as a mask ROM, EPROM, EEPROM, or flash ROM. The recording medium is a non-transitory medium readable by a computer. The program here includes not only a program directly executable by the CPU but also a program in source form, a compressed program, an encrypted program, and the like.

(F. Modification)
FIG. 7 is a diagram for explaining a modification of the reduction mode; that is, it illustrates a reduction mode of a different form from the reduction mode shown in the state (C) of FIG. 2.

  As described above, the CPU 101 uses the outputs from the sensors 191 to 194 to determine that an object that changes the capacitance, such as a finger or a stylus pen, is approaching the bezel 10. For example, in the state (C) of FIG. 2, the CPU 101 determines that the object is approaching the bezel 10 using the output from the sensor 191. That is, the CPU 101 can determine that the object is approaching the vicinity of the sensor 191 (the portion of the bezel 10 near the sensor 191).

  In this case, when the CPU 101 determines that the object has approached the sensor 191, it stops the full-screen display and performs screen display in the reduction mode. Specifically, the CPU 101 shifts each icon toward the lower right of the screen, so that the icon 201 moves toward the center of the screen. Even with this configuration, an icon displayed at the edge of the display 1081 in the full-screen mode (for example, the icon 201) can be selected easily. The same effect can also be obtained when the CPU 101 shifts each icon toward the upper right of the screen.
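The corner-directed shift of this modification can be sketched as scaling the screen about a corner instead of the center, so that icons near the approached edge land closer to mid-screen. The scale factor and the fractional-anchor formulation are illustrative assumptions.

```python
def reduce_toward_corner(x, y, screen_w, screen_h, scale=0.75,
                         anchor=(1.0, 1.0)):
    """Scale the screen about an anchor point given as fractions of the
    screen size ((1.0, 1.0) is the lower-right corner, (0.5, 0.5) the
    center). Scale factor and anchor choice are assumptions.
    """
    ax, ay = screen_w * anchor[0], screen_h * anchor[1]
    return ax + (x - ax) * scale, ay + (y - ay) * scale

# An icon near the lower-left edge of a 1080x1920 screen moves toward
# the lower right when the anchor is the lower-right corner:
print(reduce_toward_corner(0, 1800, 1080, 1920))  # (270.0, 1830.0)
```

Choosing the anchor corner based on which sensor reported the approach (e.g. the lower-right corner for the left-edge sensor 191) reproduces the behavior described for FIG. 7.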

[Embodiment 2]
In the present embodiment, a configuration for preventing the erroneous selection of an icon by the user will be described. Specifically, a configuration in which an icon selection operation by the user is invalidated in a certain situation will be described.

(G. Appearance and built-in sensor)
FIG. 8 is a front view of the smartphone 1A according to the present embodiment. Referring to FIG. 8, the smartphone 1A is a terminal device with a narrow bezel, like the smartphone 1 according to Embodiment 1. Like the smartphone 1, the smartphone 1A includes the bezel 10, the touch screen 108, and the sensors 191 to 194.

  The smartphone 1A has an XYZ coordinate system unique to the smartphone 1A. The smartphone 1A determines the position of the hand gripping it (for example, the positions on the bezel 10 where fingers are in contact) based on the outputs from the sensors 191 to 194. As described later in detail, the smartphone 1A also determines the proximity position of an object on the bezel 10 based on the outputs from the sensors 191 to 194.

  Instead of the three-dimensional coordinate system, the smartphone 1A may have a two-dimensional XY coordinate system without a coordinate axis in the thickness direction of the smartphone 1A. In the present embodiment as well, the bezel 10 is described as part of the housing of the smartphone 1A.

(H. Outline of processing)
(H1. First operation example)
FIG. 9 is a diagram for explaining an operation example in which an icon selection operation by the user is invalidated.

  Referring to FIG. 9, the state (A) represents a state in which the user holds the smartphone 1A with the left hand. In this case, the CPU 101 of the smartphone 1A determines, based on the outputs from the sensors 191 to 194, that the smartphone 1A is gripped. Specifically, the CPU 101 detects the places where fingers are in contact. Typically, the CPU 101 calculates the coordinate values of the contact positions on the side surfaces of the housing based on the outputs from the sensors 191 to 194, taking the center position of each contact area, such as that of a finger, as the contact position.

  In the state (A), the CPU 101 detects, based on the output from the sensor 191, that a finger is in contact with a portion of the housing near the icon 201. Further, based on the output from the sensor 193, the CPU 101 detects that three fingers are in contact with the portion of the housing on the side opposite the thumb, across the touch screen 108. To be exact, the CPU 101 calculates the position where each finger is in contact as a coordinate value.
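Taking the center of a contact area as the contact position can be sketched as a weighted centroid over a linear sensor's readings. The per-segment reading format and units are assumptions; the patent only states that the center position of the contact area is used.

```python
def contact_position(readings):
    """Estimate where a finger touches a bezel edge from a linear
    capacitive sensor's per-segment readings.

    `readings` maps a coordinate along the edge (e.g. in mm, an assumed
    unit) to the capacitance change at that segment; the contact position
    is the capacitance-weighted centroid of the contact area.
    """
    total = sum(readings.values())
    if total == 0:
        return None  # no contact detected on this edge
    return sum(pos * c for pos, c in readings.items()) / total

# Three adjacent segments report contact; the centroid falls between them:
print(contact_position({40: 10, 45: 30, 50: 10}))  # 45.0
```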

  The state (B), which follows the state (A), represents a state in which the tip of the left thumb has been moved away from the touch screen 108 while the base of the left thumb remains in contact with the housing. That is, the state (B) represents a state in which the tip of the thumb has been moved away from the icon 201.

  The state (C), which follows the state (B), represents a state in which, with the base of the left thumb in contact with the housing as in the state (B), the thumb has come closer to the icon 201 than in the state (B). Specifically, the state (C) represents a state in which the thumb is closer to the bezel 10 (that is, to the sensor 191) than in the state (B).

  The state (D), which follows the state (C), represents a state in which, with the base of the left thumb in contact with the housing as in the state (C), the thumb touches the area where the icon 201 is displayed. That is, the state (D) represents a state in which a touch operation is performed.

  Even when the icon 201 is selected by the series of operations described above, the smartphone 1A invalidates the selection rather than validating it. Specifically, the CPU 101 does not bring the icon 201 into an active (selected) state, and does not execute the process associated with the icon 201.

  In more detail, when the proximity of an object such as a finger is detected while the smartphone 1A is gripped, the CPU 101 determines whether the distance between the position of the gripping hand and the proximity position of the object on the bezel 10 is shorter than a predetermined threshold. When it determines that the distance is shorter than the threshold, the CPU 101 invalidates the selection of the icon 201 by the object.
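The distance-threshold check described above can be sketched as follows. The threshold value, the coordinate units, and the function names are illustrative assumptions; the patent only says the distance is compared with "a predetermined threshold".

```python
import math

INVALIDATION_THRESHOLD_MM = 25.0  # illustrative value; the patent only
                                  # refers to "a predetermined threshold"

def selection_enabled(grip_pos, approach_pos,
                      threshold=INVALIDATION_THRESHOLD_MM):
    """Return False (selection invalidated) when the approaching object
    is closer to the gripping hand's contact point than the threshold.

    Positions are (x, y) coordinates on the bezel in the device's own
    coordinate system; millimeter units are an assumption.
    """
    return math.dist(grip_pos, approach_pos) >= threshold

# Approach right next to the thumb's contact point: invalidated.
print(selection_enabled((0, 40), (0, 50)))  # False
# Approach farther up the edge (near icon 202): allowed.
print(selection_enabled((0, 40), (0, 90)))  # True
```

This rule treats an approach adjacent to the gripping thumb as an unintended touch, which matches the behavior contrasted in FIGS. 9 and 10.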

  With this configuration, the selection of an icon not intended by the user, that is, an erroneous operation, can be prevented.

(H2. Second operation example)
FIG. 10 is a diagram for explaining an operation example in which an icon selection operation by the user is validated.

  Referring to FIG. 10, the state (A) represents the same state as the state (A) in FIG. 9, and the state (B) represents the same state as the state (B) in FIG. 9; their description will therefore not be repeated here. Note that the smartphone 1A displays an icon 202 above the icon 201 on the screen (on the positive side of the Y axis).

  The state (C), which follows the state (B), represents a state in which, with the base of the left thumb in contact with the housing as in the state (B), the thumb has come closer to the icon 202 than in the state (B). Specifically, the state (C) represents a state in which the thumb is closer to the bezel 10 (that is, to the sensor 191) than in the state (B).

  When the proximity of an object such as a finger is detected while the smartphone 1A is gripped, the CPU 101 determines whether the distance between the position of the gripping hand and the proximity position of the object on the bezel 10 is shorter than the predetermined threshold. In the state (C), the CPU 101 determines that the distance is not shorter than (that is, is equal to or longer than) the threshold, and switches the display mode from the full-screen mode to the reduction mode.

  State (D), which follows state (C), represents a state in which the user selects the icon 202 with the thumb of the right hand. When the icon 202 is selected by the series of operations described above, the CPU 101 validates the selection instead of invalidating it. Specifically, the CPU 101 brings the icon 202 into an active state (selected state).

  State (E), which follows state (D), represents a state in which the user has released the thumb of the right hand from the touch screen 108. In this case, the CPU 101 of the smartphone 1A executes the process associated with the icon 202 on the display 1081 of the touch screen 108. As an example, the smartphone 1A displays a calendar on the display 1081 in the full-screen mode.

  If no icon is selected in state (C), the smartphone 1A changes the state of the display screen from state (C) to state (F). Specifically, when state (C) continues for a predetermined time (for example, 5 seconds), the smartphone 1A changes the state of the display screen from state (C) to state (F). That is, the smartphone 1A switches the display mode from the reduced mode back to the full-screen mode (normal mode), in which full-screen display is performed.
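The revert-on-timeout behavior (state (C) returning to state (F)) can be sketched with an explicit clock parameter, purely as an assumed illustration; the class and method names are invented for this sketch and do not appear in the patent:

```python
REVERT_TIMEOUT_S = 5.0  # example value taken from the description above

class DisplayModeController:
    """Minimal sketch of the described mode switching."""

    def __init__(self):
        self.mode = "full_screen"
        self._reduced_since = None

    def enter_reduced_mode(self, now):
        self.mode = "reduced"
        self._reduced_since = now

    def on_icon_selected(self):
        # A selection in reduced mode cancels the pending revert.
        self._reduced_since = None

    def tick(self, now):
        # Revert to full-screen mode once reduced mode has been idle
        # for the predetermined time with no icon selected.
        if (self.mode == "reduced" and self._reduced_since is not None
                and now - self._reduced_since >= REVERT_TIMEOUT_S):
            self.mode = "full_screen"
            self._reduced_since = None

ctrl = DisplayModeController()
ctrl.enter_reduced_mode(now=0.0)
ctrl.tick(now=3.0)
print(ctrl.mode)  # reduced
ctrl.tick(now=5.0)
print(ctrl.mode)  # full_screen
```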

(H3. Summary of operation examples)
(1) The process of the smartphone 1A in the above operation example will be described with a focus on the icon 201 as follows.

  The smartphone 1A detects that it is gripped by the user. In addition, the smartphone 1A displays the predetermined icon 201 at a predetermined first display position on the display. Furthermore, the smartphone 1A detects that an object (typically a finger) capable of selecting the icon 201 has approached the bezel 10 of the smartphone 1A.

  Based on the proximity of the object being detected while the grip is detected, the smartphone 1A determines whether the distance between the position of the gripping hand and the proximity position of the object on the bezel 10 is shorter than a predetermined threshold. Since the smartphone 1A determines that the distance is shorter than the threshold, it invalidates the selection of the icon 201 by the object. Therefore, the smartphone 1A does not execute the process associated with the icon 201.

  In this case, the smartphone 1A does not switch the display mode from the full screen mode to the reduced mode.

  (2) The process of the smartphone 1A in the above operation example will be described with a focus on the icon 202 as follows.

  The smartphone 1A detects that it is gripped by the user. In addition, the smartphone 1A displays the predetermined icon 202 at a predetermined first display position on the display. Furthermore, the smartphone 1A detects that an object (typically a finger) capable of selecting the icon 202 has approached the bezel 10 of the smartphone 1A.

  Based on the proximity of the object being detected while the grip is detected, the smartphone 1A determines whether the distance between the position of the gripping hand and the proximity position of the object on the bezel 10 is shorter than a predetermined threshold. Since the smartphone 1A determines that the distance is longer than the threshold, it switches the display position of the icon 202 from the first display position to a second display position that is closer to the center of the display area of the display than the first display position. That is, the smartphone 1A switches the display mode from the full-screen mode to the reduced mode.
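A minimal sketch of computing a second display position closer to the screen center, assuming simple linear interpolation toward the center; the function name and the factor 0.7 are arbitrary illustrative choices, not values from the patent:

```python
def second_display_position(first_pos, screen_center, shrink=0.7):
    """Move an icon from its first display position toward the display
    center; an interpolation factor below 1 pulls the icon inward."""
    fx, fy = first_pos
    cx, cy = screen_center
    return (cx + (fx - cx) * shrink, cy + (fy - cy) * shrink)

# An icon near the top edge of a 1080x1920 display moves toward the center.
print(second_display_position((540.0, 100.0), (540.0, 960.0)))
```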

  Furthermore, the smartphone 1A validates the selection of the icon 202 by the object in response to accepting the user's touch operation on the icon 202 in the reduced mode. That is, the smartphone 1A executes the process associated with the icon 202.

(I. Control structure)
FIG. 11 is a flowchart for explaining the flow of processing executed by the smartphone 1A. Referring to FIG. 11, the series of processes in FIG. 11 differs from the series of processes shown in FIG. 4 according to Embodiment 1 in that the process of step S22 is added. Accordingly, only the contents related to step S22 will be described below, and the description of the other steps S2, S4, S6, S8, S10, S12, and S14 will not be repeated here.

  If a positive determination is made in step S6, the smartphone 1A determines in step S22 whether or not the distance between the position of the gripping hand and the proximity position of the object on the bezel 10 is shorter than a threshold.

  If the distance is determined to be shorter (YES in step S22), the CPU 101 returns the process to step S2. That is, the CPU 101 performs control to invalidate the icon selection. If the distance is determined not to be shorter (NO in step S22), the CPU 101 advances the process to step S8. That is, the CPU 101 does not perform control to invalidate the icon selection.
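The branch added at step S22 can be summarized in a short sketch; the function name and the return labels are assumptions used only to mirror the flowchart's structure:

```python
def step_s22_branch(distance, threshold):
    """Mirror of step S22: a short distance means the approaching object
    is likely part of the gripping hand, so icon selection is invalidated
    and processing returns to step S2; otherwise it proceeds to step S8."""
    if distance < threshold:
        return "invalidate_and_return_to_S2"
    return "proceed_to_S8"

print(step_s22_branch(5.0, 15.0))   # invalidate_and_return_to_S2
print(step_s22_branch(30.0, 15.0))  # proceed_to_S8
```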

(J. Functional configuration)
FIG. 12 is a block diagram for explaining a functional configuration of the smartphone 1A. Referring to FIG. 12, the smartphone 1A includes a detection unit 151, a control unit 152A, a display unit 153, and a storage unit 154. The detection unit 151 includes a grip detection unit 1511, a proximity detection unit 1512, and a touch operation detection unit 1513. The control unit 152A includes a determination unit 1525, a display control unit 1526, an invalidation unit 1527, and a process execution unit 1528.

  The control unit 152A corresponds to the CPU 101. Specifically, the control unit 152A is realized by the CPU 101 executing an application program stored in the storage unit 154.

  Based on the proximity of the object being detected while the grip is detected, the determination unit 1525 determines whether the distance between the position of the gripping hand and the proximity position of the object on the bezel 10 is shorter than a predetermined threshold. The determination unit 1525 notifies the display control unit 1526 and the invalidation unit 1527 of the determination result.

  The display control unit 1526 performs processing based on the detection result of the grip detection unit 1511 and the detection result of the proximity detection unit 1512. Specifically, when it is determined that the distance between the position of the gripping hand and the proximity position of the object on the bezel 10 is longer than the predetermined threshold, the display control unit 1526 stops the full-screen display and displays the screen in the reduced mode, thereby shifting each icon toward the center of the screen.

  Focusing on the icon 202, the display control unit 1526 displays the icon 202 at the first display position of the display in the full-screen mode, which is the default display mode. When it is detected that the user is gripping the smartphone 1A and that the object has approached the bezel 10, the display control unit 1526 switches the display position of the icon 202 from the first display position to the second display position, which is closer to the center of the display area of the display than the first display position.

  When it is determined that the distance between the position of the gripping hand and the proximity position of the object on the bezel 10 is shorter than the predetermined threshold, the invalidation unit 1527 invalidates the icon selection detected by the touch operation detection unit 1513. Specifically, the invalidation unit 1527 instructs the process execution unit 1528 not to execute the process associated with the icon selected by the touch operation. For example, when the user operation illustrated in FIG. 9 is performed, the invalidation unit 1527 invalidates the selection of the icon 201.

(K. Modification)
In the first and second embodiments, the configuration in which the approach of an object (a finger or a stylus pen) is determined using the outputs from the sensors 191 to 194 has been described as an example. However, the present invention is not limited to this. For example, if the touch panel 1082 is of the capacitive type, the smartphones 1 and 1A may be configured so that the CPU 101 detects the approach of an object using a change in the capacitance of the touch panel 1082 instead of the sensors 191 to 194.
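The capacitive alternative mentioned above could, as an assumed sketch, detect approach from a rise in measured capacitance over a baseline; the function name, the units, and the single-delta threshold are illustrative only:

```python
def object_is_near(cap_baseline, cap_reading, delta_threshold):
    """A nearby finger raises the capacitance measured by a capacitive
    panel; report proximity when the rise exceeds a threshold."""
    return (cap_reading - cap_baseline) > delta_threshold

print(object_is_near(100.0, 112.0, 8.0))  # True: capacitance rose by 12
print(object_is_near(100.0, 103.0, 8.0))  # False: a rise of 3 is noise-level
```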

  The embodiment disclosed this time is an exemplification, and the present invention is not limited to the above contents. The scope of the present invention is defined by the terms of the claims and is intended to include any modifications within the scope and meaning equivalent to the terms of the claims.

  1, 1A smartphone, 10 bezel, 104 flash memory, 108 touch screen, 151 detection unit, 152, 152A control unit, 153 display unit, 191, 192, 193, 194 sensor, 201, 202 icon, 1081 display, 1082 touch panel, 1511 grip detection unit, 1512 proximity detection unit, 1513 touch operation detection unit, 1521, 1526 display control unit, 1522, 1528 process execution unit, 1525 determination unit, 1527 invalidation unit.

Claims (5)

  1. A portable terminal device having a touch screen that includes a touch panel and a display, comprising:
    first detection means for detecting that the terminal device is gripped by a user;
    display control means for displaying a predetermined object at a first display position of the display; and
    second detection means for detecting that an object capable of selecting the object is in proximity to a bezel of the terminal device,
    wherein, when it is detected that the user is gripping the terminal device and the proximity of the object is detected, the display control means switches the display position of the object from the first display position to a second display position that is closer to the center of the display area of the display than the first display position.
  2. The terminal device according to claim 1, wherein the display control means switches the display position of the object from the first display position to the second display position by reducing the screen displayed on the display.
  3. The terminal device according to claim 1, wherein the touch panel detects selection of the object by the object, and
    the display control means switches the display position of the object from the first display position to the second display position and, when selection of the object is not detected for a predetermined time, switches the display position of the object from the second display position back to the first display position.
  4. A portable terminal device having a touch screen that includes a touch panel and a display, comprising:
    first detection means for detecting that the terminal device is gripped by a user;
    display control means for displaying a predetermined object at a first display position of the display;
    second detection means for detecting that an object capable of selecting the object is in proximity to a bezel of the terminal device;
    execution means for executing a process associated with the object based on the object being selected by the object;
    determining means for determining, based on the proximity being detected while the gripping is detected, whether or not a distance between the position of the gripping hand and the proximity position of the object on the bezel is shorter than a predetermined threshold; and
    invalidating means for invalidating the selection of the object by the object when it is determined that the distance is shorter than the threshold.
  5. The terminal device according to claim 4, wherein, when the distance is longer than the threshold, the display control means switches the display position of the object from the first display position to a second display position that is closer to the center of the display area than the first display position.
JP2014187682A 2014-09-16 2014-09-16 Terminal device Active JP6284459B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2014187682A JP6284459B2 (en) 2014-09-16 2014-09-16 Terminal device


Publications (2)

Publication Number Publication Date
JP2016062156A JP2016062156A (en) 2016-04-25
JP6284459B2 true JP6284459B2 (en) 2018-02-28

Family

ID=55797845

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2014187682A Active JP6284459B2 (en) 2014-09-16 2014-09-16 Terminal device

Country Status (1)

Country Link
JP (1) JP6284459B2 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018143412A1 (en) * 2017-02-06 2018-08-09 シャープ株式会社 Display device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012014459A (en) * 2010-06-30 2012-01-19 Brother Ind Ltd Display apparatus, image display method and image display program
JP5323010B2 (en) * 2010-07-05 2013-10-23 レノボ・シンガポール・プライベート・リミテッド Information input device, screen layout method thereof, and computer-executable program
KR20130102298A (en) * 2012-03-07 2013-09-17 주식회사 팬택 Mobile device and method for controlling display of mobile device
WO2013187370A1 (en) * 2012-06-15 2013-12-19 京セラ株式会社 Terminal device
JP5798103B2 (en) * 2012-11-05 2015-10-21 株式会社Nttドコモ terminal device, screen display method, program

Also Published As

Publication number Publication date
JP2016062156A (en) 2016-04-25


Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20170323

TRDD Decision of grant or rejection written
A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20171220

A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20180109

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20180130

R150 Certificate of patent or registration of utility model

Ref document number: 6284459

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150