JP6096100B2 - Electronic device, control method, and control program

Info

Publication number
JP6096100B2
Authority
JP (Japan)
Prior art keywords
display area, operation, display, smartphone, detected
Prior art date
Legal status (assumed; not a legal conclusion)
Active
Application number
JP2013246511A
Other languages
Japanese (ja)
Other versions
JP2015106181A (en)
Inventors
長谷川 純一 (Junichi Hasegawa)
能町 那由 (Nayu Nomachi)
須藤 智浩 (Tomohiro Sudo)
Original Assignee
京セラ株式会社 (Kyocera Corporation)
Application filed by 京セラ株式会社 (Kyocera Corporation)
Priority to JP2013246511A
Publication of JP2015106181A
Application granted
Publication of JP6096100B2

Description

  The present application relates to an electronic device, a control method, and a control program. In particular, the present application relates to an electronic device having a touch screen, a control method for controlling the electronic device, and a control program for controlling the electronic device.

  Touch screen devices equipped with a touch screen display are known. Touch screen devices include, but are not limited to, smartphones and tablets. A touch screen device detects gestures made with a finger, pen, or stylus via the touch screen, and operates according to the detected gesture. An example of operation according to a detected gesture is described in Patent Document 1.

  The basic operation of the touch screen device is realized by an OS (Operating System) installed in the device. Examples of the OS installed in the touch screen device include, but are not limited to, Android (registered trademark), BlackBerry (registered trademark) OS, iOS, Symbian (registered trademark) OS, and Windows (registered trademark) Phone.

International Publication No. 2008/086302

  The touch screen device enables intuitive screen operation by the user, but there is still room for improvement in screen operation on touch screen devices. For this reason, there is a need for an electronic device, a control method, and a control program that can improve the convenience of screen operations for the user.

  An electronic apparatus according to one aspect includes: a touch screen display that displays a screen having a first display area and a second display area; and a controller that displays a first image in the first display area and displays, in the second display area, information on a position where an operation is detected in the first display area.

  A control method according to one aspect is a method for controlling an electronic device including a touch screen display, and includes the steps of: displaying a screen having a first display area and a second display area on the touch screen display; displaying a first image in the first display area; and displaying, in the second display area, information on a position where an operation is detected in the first display area.

  A control program according to one aspect causes an electronic device including a touch screen display to execute the steps of: displaying a screen having a first display area and a second display area on the touch screen display; displaying a first image in the first display area; and displaying, in the second display area, information on a position where an operation is detected in the first display area.

FIG. 1 is a front view of the smartphone.
FIG. 2 is a rear view of the smartphone.
FIG. 3 is a block diagram of the smartphone.
FIG. 4 is a diagram for explaining an example of the screen configuration of the smartphone.
FIG. 5 is a diagram illustrating a first example of display control by the smartphone.
FIG. 6 is a flowchart illustrating the processing procedure of the first example of display control by the smartphone.
FIG. 7 is a diagram illustrating a second example of display control by the smartphone.
FIG. 8 is a flowchart illustrating the processing procedure of the second example of display control by the smartphone.
FIG. 9 is a diagram illustrating a modification of the second example of display control by the smartphone.
FIG. 10 is a diagram illustrating a third example of display control by the smartphone.
FIG. 11 is a flowchart illustrating the processing procedure of the third example of display control by the smartphone.
FIG. 12 is a diagram illustrating a fourth example of display control by the smartphone.
FIG. 13 is a flowchart illustrating the processing procedure of the fourth example of display control by the smartphone.
FIG. 14 is a diagram illustrating a fifth example of display control by the smartphone.
FIG. 15 is a flowchart illustrating the processing procedure of the fifth example of display control by the smartphone.
FIG. 16 is a diagram illustrating a sixth example of display control by the smartphone.
FIG. 17 is a flowchart illustrating the processing procedure of the sixth example of display control by the smartphone.
FIG. 18 is a diagram illustrating a seventh example of display control by the smartphone.
FIG. 19 is a flowchart illustrating the processing procedure of the seventh example of display control by the smartphone.

  Embodiments for carrying out the present invention will be described in detail with reference to the drawings. In the following, a smartphone is described as an example of an electronic device provided with a touch screen display.

(Embodiment)
The overall configuration of the smartphone 1 according to the embodiment will be described with reference to FIGS. 1 and 2. As shown in FIGS. 1 and 2, the smartphone 1 has a housing 20. The housing 20 includes a front face 1A, a back face 1B, and side faces 1C1 to 1C4. The front face 1A is the front surface of the housing 20. The back face 1B is the back surface of the housing 20. The side faces 1C1 to 1C4 are the side surfaces that connect the front face 1A and the back face 1B. Hereinafter, the side faces 1C1 to 1C4 may be collectively referred to as the side face 1C without specifying which face.

  The smartphone 1 includes a touch screen display 2, buttons 3A to 3C, an illuminance sensor 4, a proximity sensor 5, a receiver 7, a microphone 8, and a camera 12 on the front face 1A. The smartphone 1 has a speaker 11 and a camera 13 on the back face 1B. The smartphone 1 has buttons 3D to 3F and a connector 14 on the side face 1C. Hereinafter, the buttons 3A to 3F may be collectively referred to as the button 3 without specifying which button.

  The touch screen display 2 includes a display 2A and a touch screen 2B. In the example of FIG. 1, the display 2A and the touch screen 2B are each substantially rectangular, but their shapes are not limited to this. The display 2A and the touch screen 2B can each take any shape, such as a square or a circle. In the example of FIG. 1, the display 2A and the touch screen 2B overlap, but their arrangement is not limited to this. For example, the display 2A and the touch screen 2B may be arranged side by side or apart from each other. In the example of FIG. 1, the long sides of the display 2A are along the long sides of the touch screen 2B, and the short sides of the display 2A are along the short sides of the touch screen 2B, but the manner in which the display 2A and the touch screen 2B overlap is not limited to this. When the display 2A and the touch screen 2B are arranged so as to overlap, for example, one or more sides of the display 2A need not be along any side of the touch screen 2B.

  The display 2A includes a liquid crystal display (LCD: Liquid Crystal Display), an organic EL display (OELD: Organic Electro-Luminescence Display), or an inorganic EL display (IELD: Inorganic Electro-Luminescence Display). The display 2A displays characters, images, symbols, graphics, and the like.

  The touch screen 2B detects contact of a finger, a pen, a stylus, or the like with the touch screen 2B. The touch screen 2B can detect the positions at which a plurality of fingers, pens, styluses, or the like contact it. In the following description, a finger, pen, stylus, or the like that contacts the touch screen 2B may be referred to as a “contact object”.

  The detection method of the touch screen 2B may be any method, such as a capacitive method, a resistive film method, a surface acoustic wave method (ultrasonic method), an infrared method, an electromagnetic induction method, or a load detection method. In the following description, to simplify the explanation, it is assumed that the user operates the smartphone 1 by touching the touch screen 2B with a finger.

  The smartphone 1 determines the type of gesture based on at least one of: the contact detected by the touch screen 2B, the position at which the contact is detected, changes in that position, the interval between detected contacts, and the number of times contact is detected. A gesture is an operation performed on the touch screen 2B. Gestures identified by the smartphone 1 include, but are not limited to, touch, long touch, release, swipe, tap, double tap, long tap, drag, flick, pinch in, and pinch out.

  “Touch” is a gesture in which a finger contacts the touch screen 2B. The smartphone 1 determines a gesture in which a finger contacts the touch screen 2B to be a touch. “Long touch” is a gesture in which a finger contacts the touch screen 2B for longer than a certain period of time. The smartphone 1 determines a gesture in which a finger contacts the touch screen 2B for longer than a certain time to be a long touch.

  “Release” is a gesture in which a finger leaves the touch screen 2B. The smartphone 1 determines that a gesture in which a finger leaves the touch screen 2B is a release. “Swipe” is a gesture in which a finger moves while touching the touch screen 2B. The smartphone 1 determines a gesture that moves while the finger is in contact with the touch screen 2B as a swipe.

  A “tap” is a gesture in which a release follows a touch. The smartphone 1 determines a touch followed by a release to be a tap. A “double tap” is a gesture in which a touch followed by a release is performed twice in succession. The smartphone 1 determines such a gesture to be a double tap.

  “Long tap” is a gesture for releasing following a long touch. The smartphone 1 determines a gesture for releasing following a long touch as a long tap. “Drag” is a gesture for performing a swipe starting from an area where a movable object is displayed. The smartphone 1 determines, as a drag, a gesture for performing a swipe starting from an area where a movable object is displayed.

  “Flick” is a gesture in which a finger leaves the touch screen 2B while moving after touching it. In other words, a “flick” is a gesture in which a release is performed while the finger is moving, following a touch. The smartphone 1 determines a gesture in which the finger leaves the touch screen 2B while moving to be a flick. A flick is often performed while the finger moves in one direction. Flicks include an “upper flick,” in which the finger moves upward on the screen, a “lower flick,” in which the finger moves downward, a “right flick,” in which the finger moves rightward, and a “left flick,” in which the finger moves leftward. The movement of the finger in a flick is often quicker than in a swipe.

  “Pinch in” is a gesture in which a plurality of fingers are swiped in a direction approaching each other. The smartphone 1 determines, as a pinch-in, a gesture in which the distance between the position of a finger detected by the touch screen 2B and the position of another finger is shortened. “Pinch out” is a gesture of swiping a plurality of fingers away from each other. The smartphone 1 determines, as a pinch-out, a gesture that increases the distance between the position of one finger and the position of another finger detected by the touch screen 2B.

  In the following description, a gesture performed with one finger may be referred to as a “single touch gesture”, and a gesture performed with two or more fingers may be referred to as a “multi-touch gesture”. Multi-touch gestures include, for example, pinch-in and pinch-out. Taps, flicks, swipes, and the like are single-touch gestures when performed with one finger, and multi-touch gestures when performed with two or more fingers.
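  As a rough illustration of how such gesture types can be derived from contact data, the following sketch classifies a completed single-finger contact by its travel distance, duration, and release speed. The thresholds and the ContactRecord model are hypothetical, not values from this application:

```kotlin
import kotlin.math.hypot

// Hypothetical thresholds; the application does not specify concrete values.
const val LONG_TOUCH_MS = 500L        // contact held longer than this is "long"
const val TAP_SLOP_PX = 10.0          // travel below this still counts as a tap
const val FLICK_SPEED_PX_PER_MS = 1.0 // release speed above this turns a swipe into a flick

data class ContactRecord(
    val downX: Double, val downY: Double, val downTimeMs: Long,
    val upX: Double, val upY: Double, val upTimeMs: Long,
)

// Classifies a completed single-finger contact into one of the gestures
// described above, using travel distance, duration, and speed.
fun classify(c: ContactRecord): String {
    val travel = hypot(c.upX - c.downX, c.upY - c.downY)
    val durationMs = (c.upTimeMs - c.downTimeMs).coerceAtLeast(1L)
    val speed = travel / durationMs
    return when {
        travel < TAP_SLOP_PX && durationMs < LONG_TOUCH_MS -> "tap"
        travel < TAP_SLOP_PX -> "long tap"
        speed >= FLICK_SPEED_PX_PER_MS -> "flick"
        else -> "swipe"
    }
}

fun main() {
    println(classify(ContactRecord(0.0, 0.0, 0L, 2.0, 1.0, 120L)))   // tap
    println(classify(ContactRecord(0.0, 0.0, 0L, 300.0, 0.0, 150L))) // flick
}
```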

  The smartphone 1 operates according to these gestures determined via the touch screen 2B, which realizes intuitive, easy-to-use operability for the user. The operation the smartphone 1 performs according to the determined gesture may differ depending on the screen displayed on the display 2A. In the following description, to simplify the explanation, the fact that “the touch screen 2B detects contact and the smartphone 1 determines the gesture type to be X based on the detected contact” may be described as “the smartphone detects X” or “the controller detects X.”

  FIG. 3 is a block diagram of the smartphone 1. The smartphone 1 includes a touch screen display 2, a button 3, an illuminance sensor 4, a proximity sensor 5, a communication unit 6, a receiver 7, a microphone 8, a storage 9, a controller 10, a speaker 11, cameras 12 and 13, a connector 14, an acceleration sensor 15, an azimuth sensor 16, and a gyroscope 17.

  As described above, the touch screen display 2 includes the display 2A and the touch screen 2B. The display 2A displays characters, images, symbols, graphics, or the like. The touch screen 2B detects contact. The controller 10 detects gestures on the smartphone 1. Specifically, the controller 10 detects an operation (gesture) on the touch screen 2B (touch screen display 2) in cooperation with the touch screen 2B.

  The button 3 is operated by the user. The button 3 includes buttons 3A to 3F. The controller 10 detects an operation on the button 3 by cooperating with the button 3. The operation on the button 3 includes, for example, click, double click, triple click, push, and multi-push, but is not limited thereto.

  The buttons 3A to 3C are, for example, a home button, a back button, or a menu button. The button 3D is, for example, a power on / off button of the smartphone 1. The button 3D may also serve as a sleep / sleep release button. The buttons 3E and 3F are volume buttons, for example.

  The illuminance sensor 4 detects the illuminance of the ambient light of the smartphone 1. Illuminance indicates light intensity, brightness, or luminance. The illuminance sensor 4 is used for adjusting the luminance of the display 2A, for example. The proximity sensor 5 detects the presence of a nearby object without contact. The proximity sensor 5 detects the presence of an object based on a change in a magnetic field or a change in a feedback time of an ultrasonic reflected wave. For example, the proximity sensor 5 detects that the touch screen display 2 is brought close to the face. The illuminance sensor 4 and the proximity sensor 5 may be configured as one sensor. The illuminance sensor 4 may be used as a proximity sensor.

  The communication unit 6 communicates wirelessly. The communication methods supported by the communication unit 6 are wireless communication standards. Examples of wireless communication standards include cellular phone communication standards such as 2G, 3G, and 4G. Examples of cellular phone communication standards include LTE (Long Term Evolution), W-CDMA (Wideband Code Division Multiple Access), CDMA2000, PDC (Personal Digital Cellular), GSM (registered trademark), and PHS (Personal Handy-phone System). Further examples of wireless communication standards include WiMAX (Worldwide Interoperability for Microwave Access), IEEE 802.11, Bluetooth (registered trademark), IrDA (Infrared Data Association), and NFC (Near Field Communication). The communication unit 6 may support one or more of the communication standards described above.

  The receiver 7 and the speaker 11 are sound output units. The receiver 7 and the speaker 11 output the sound signal transmitted from the controller 10 as sound. The receiver 7 is used, for example, to output the other party's voice during a call. The speaker 11 is used for outputting a ring tone and music, for example. One of the receiver 7 and the speaker 11 may also function as the other. The microphone 8 is a sound input unit. The microphone 8 converts the user's voice or the like into a sound signal and transmits the sound signal to the controller 10.

  The storage 9 stores programs and data. The storage 9 is also used as a work area for temporarily storing the processing result of the controller 10. The storage 9 may include any non-transitory storage medium such as a semiconductor storage medium and a magnetic storage medium. The storage 9 may include a plurality of types of storage media. The storage 9 may include a combination of a portable storage medium such as a memory card, an optical disk, or a magneto-optical disk and a storage medium reader. The storage 9 may include a storage device used as a temporary storage area such as a RAM (Random Access Memory).

  The programs stored in the storage 9 include an application executed in the foreground or the background and a control program that supports the operation of the application. For example, the application displays a screen on the display 2A, and causes the controller 10 to execute processing according to a gesture detected via the touch screen 2B. The control program is, for example, an OS. The application and the control program may be installed in the storage 9 via wireless communication by the communication unit 6 or a non-transitory storage medium.

  The storage 9 stores, for example, a control program 9A, a map application 9B, a browser application 9C, and setting data 9Z. The map application 9B provides a function for displaying map data of the current location or an arbitrary place, a function for enlarging or reducing the displayed map data, and the like. The browser application 9C provides a WEB browsing function for displaying a WEB page. The setting data 9Z includes information related to various settings related to the operation of the smartphone 1.

  The control program 9A provides functions related to various controls for operating the smartphone 1. The control program 9A realizes a call by controlling the communication unit 6, the receiver 7, the microphone 8, and the like, for example. The function provided by the control program 9A includes a function of performing various controls such as changing information displayed on the display 2A according to a gesture detected via the touch screen 2B. The function provided by the control program 9A may be used in combination with a function provided by another program such as the map application 9B.

  The controller 10 is an arithmetic processing device. Arithmetic processing devices include, for example, a CPU (Central Processing Unit), an SoC (System-on-a-Chip), an MCU (Micro Control Unit), and an FPGA (Field-Programmable Gate Array), but are not limited thereto. The controller 10 controls various operations of the smartphone 1 to realize various functions.

  Specifically, the controller 10 executes instructions included in the programs stored in the storage 9 while referring to the data stored in the storage 9 as necessary. The controller 10 then controls the functional units according to the data and instructions, thereby realizing various functions. The functional units include, for example, the display 2A, the communication unit 6, the receiver 7, and the speaker 11, but are not limited thereto. The controller 10 may change its control according to the detection results of detection units. The detection units include, for example, the touch screen 2B, the button 3, the illuminance sensor 4, the proximity sensor 5, the microphone 8, the camera 12, the camera 13, the acceleration sensor 15, the azimuth sensor 16, and the gyroscope 17, but are not limited thereto.

  For example, by executing the control program 9A, the controller 10 executes various controls such as changing information displayed on the display 2A according to a gesture detected via the touch screen 2B.

  The camera 12 is an in-camera that captures an object facing the front face 1A. The camera 13 is an out camera that captures an object facing the back face 1B.

  The connector 14 is a terminal to which another device is connected. The connector 14 may be a general-purpose terminal such as USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), Light Peak (Thunderbolt (registered trademark)), or an earphone-microphone connector. The connector 14 may be a dedicated terminal such as a dock connector. Devices connected to the connector 14 include, but are not limited to, external storage, speakers, and communication devices.

  The acceleration sensor 15 detects the direction and magnitude of acceleration acting on the smartphone 1. The azimuth sensor 16 detects the direction of geomagnetism. The gyroscope 17 detects the angle and angular velocity of the smartphone 1. The detection results of the acceleration sensor 15, the azimuth sensor 16, and the gyroscope 17 are used in combination to detect changes in the position and orientation of the smartphone 1.

  In FIG. 3, some or all of the programs and data stored in the storage 9 may be downloaded from another device through wireless communication by the communication unit 6. Some or all of the programs and data stored in the storage 9 may be stored in a non-transitory storage medium readable by a reading device included in the storage 9, or in a non-transitory storage medium readable by a reading device connected to the connector 14. Non-transitory storage media include, but are not limited to, optical disks such as CD (registered trademark), DVD (registered trademark), and Blu-ray (registered trademark), magneto-optical disks, magnetic storage media, memory cards, and solid-state storage media.

  The configuration of the smartphone 1 shown in FIG. 3 is an example, and may be changed as appropriate within a range that does not impair the gist of the present invention. For example, the number and type of the buttons 3 are not limited to the example of FIG. 3. Instead of the buttons 3A to 3C, the smartphone 1 may include buttons in a numeric keypad layout or a QWERTY layout for operations related to the screen. The smartphone 1 may include only one button, or no buttons for screen operations at all. In the example illustrated in FIG. 3, the smartphone 1 includes two cameras, but it may include only one camera or no camera. In the example illustrated in FIG. 3, the smartphone 1 includes three types of sensors for detecting position and orientation, but it may omit some of these sensors, or may include another type of sensor for detecting at least one of position and posture.

  The screen configuration of the smartphone 1 will be described with reference to FIG. 4. As shown in FIG. 4, the touch screen display 2 (display 2A) of the smartphone 1 has an aspect ratio of 21:9. The smartphone 1 can therefore display a screen 60 having an aspect ratio of 21:9. A 21:9 aspect ratio is often used in movies.

  On the other hand, the display 102A of the smartphone 101 shown as a comparative example has an aspect ratio of 16:9. The smartphone 101 can therefore display a screen 160 having an aspect ratio of 16:9. The 16:9 aspect ratio is currently the most common aspect ratio for smartphone displays.

  Here, if the same screen as the screen 160 is displayed on the display 2A of the smartphone 1, a 16:9 area is used for display and the remaining 5:9 area may be left blank. That is, when a screen created for a smartphone with a display of the common aspect ratio is displayed on the display 2A, an area where nothing is displayed may result. Of course, in the case of a scrollable screen, the 5:9 area can be utilized by displaying there a portion that would otherwise be reached by scrolling. The screen can also be stretched into the 5:9 area by increasing the spacing between items arranged on the screen.

  However, when the screen is stretched, the spacing between optimally arranged items increases, and the operability or visibility of the screen may suffer. Therefore, the smartphone 1 is configured to provide a first display area 61 with an aspect ratio of 16:9 and a second display area 62 with an aspect ratio of 5:9 on the screen 60, and to display different information in each area. For example, the smartphone 1 displays a first image in the first display area 61, and displays, in the second display area 62, information on the position where an operation is detected in the first display area 61. The function of displaying different information in the first display area 61 and the second display area 62 may be provided by the individual application program that provides the screen 60, or by the control program 9A.
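  For concreteness, the arithmetic of this split can be sketched as follows. The pixel dimensions are hypothetical; only the partition of 21:9 into 16:9 plus 5:9 comes from the text:

```kotlin
data class Area(val widthPx: Int, val heightPx: Int)

// Splits a 21:9 portrait screen into a 16:9 first display area and the
// remaining 5:9 second display area, both spanning the full width.
fun splitScreen(widthPx: Int, heightPx: Int): Pair<Area, Area> {
    require(heightPx * 9 == widthPx * 21) { "expected a 21:9 portrait screen" }
    val firstHeight = widthPx * 16 / 9        // 16:9 portion
    val secondHeight = heightPx - firstHeight // 5:9 remainder
    return Area(widthPx, firstHeight) to Area(widthPx, secondHeight)
}

fun main() {
    val (first, second) = splitScreen(1080, 2520) // hypothetical 21:9 panel
    println(first)  // Area(widthPx=1080, heightPx=1920) -> 16:9
    println(second) // Area(widthPx=1080, heightPx=600)  -> 5:9
}
```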

  FIG. 4 shows an example in which a screen of the common aspect ratio is displayed unchanged in the first display area 61, and the area that would otherwise be a margin is used as the second display area 62. However, the first display area 61 and the second display area 62 are not limited to this. The area ratio, arrangement, shape, orientation, and the like of the first display area 61 and the second display area 62 may be changed arbitrarily according to the aspect ratio of the display, the type of information to be displayed, and so on.

  A first example of display control by the smartphone 1 will be described with reference to FIG. 5. FIG. 5 is a diagram illustrating the first example of display control by the smartphone 1. In step S1, the smartphone 1 executes the browser application 9C and displays the screen 60 provided by the browser application 9C on the touch screen display 2 (display 2A). Specifically, the smartphone 1 acquires a WEB page by Internet communication using the communication unit 6, based on the function provided by the browser application 9C, and displays a first image obtained by rendering the WEB page in the first display area 61. In step S1, nothing is displayed in the second display area 62.

  In step S2, the user's finger F1 touches the first display area 61 of the screen 60. When the smartphone 1 detects an operation on the first display area 61, it displays information on the position where the operation was detected in the second display area 62. The information regarding the position where the operation is detected is, for example, a second image obtained by enlarging or reducing the portion of the first image corresponding to that position. When the display magnification of the WEB page displayed in the first display area 61 is smaller than a threshold, the smartphone 1 may enlarge the portion of the first image corresponding to the detected position and display it in the second display area 62. When the display magnification is not smaller than the threshold, the smartphone 1 may reduce that portion and display it in the second display area 62.

  Furthermore, the smartphone 1 displays an index object 64, which indicates the position where the operation was detected in the first display area 61, superimposed on the enlarged or reduced portion of the first image in the second display area 62.
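  A minimal sketch of this threshold rule follows. The threshold and the scale values are assumptions, since the application gives no concrete numbers:

```kotlin
// Hypothetical threshold: below it the touched portion is enlarged in the
// second display area, at or above it the portion is reduced.
const val MAGNIFICATION_THRESHOLD = 1.0

data class SecondAreaContent(val centerX: Int, val centerY: Int, val scale: Double)

// Decides how the second display area renders the region around a detected
// touch; the index object is drawn at the same center so that it overlaps
// the scaled copy of the first image.
fun secondAreaFor(touchX: Int, touchY: Int, pageMagnification: Double): SecondAreaContent {
    val scale = if (pageMagnification < MAGNIFICATION_THRESHOLD) 2.0 else 0.5
    return SecondAreaContent(touchX, touchY, scale)
}

fun main() {
    println(secondAreaFor(200, 340, 0.75)) // zoomed-out page -> enlarged (scale=2.0)
    println(secondAreaFor(200, 340, 1.50)) // zoomed-in page  -> reduced  (scale=0.5)
}
```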

  In step S2, a link object 63 exists near the position where the smartphone 1 detected the operation. For this reason, in step S2, the link object 63 is enlarged and displayed in the second display area 62 in the same manner as its surroundings. The link object 63 includes a reference to another WEB page. When the smartphone 1 detects an operation of selecting the link object 63, it changes the WEB page displayed on the screen 60 to the WEB page referred to by the link object 63. The operation for selecting the link object 63 is, for example, a tap.

  Thus, in step S2, the enlarged link object 63 and its surroundings, together with the index object 64 indicating the position where the operation was detected in the first display area 61, are displayed in the second display area 62. When a WEB page is displayed at a reduced size, it becomes easy to survey the entire page, but fine operations are difficult with a finger, which has a relatively large contact area, so a desired link may not be operated correctly. By displaying the enlarged link object 63 and its surroundings together with the index object 64 in the second display area 62, the user can easily grasp the relationship between the currently operated position and the position of the link object 63, which makes it easy to operate the link object 63.

  In step S3, the user adjusts the position of the finger F1 based on the grasped positional relationship and taps the link object 63 in the first display area 61. At this time, in the second display area 62, the index object 64 is displayed so as to overlap the link object 63. When the smartphone 1 detects a tap on the link object 63 in the first display area 61, it changes the WEB page displayed on the screen 60 to the WEB page referred to by the link object 63.

  As a result, in step S4, the screen 60 is updated, and the first image obtained by reproducing the WEB page referred to by the link object 63 is displayed in the first display area 61. The second display area 62 has returned to a state in which nothing is displayed.

  As described above, when the smartphone 1 detects an operation on the first display area 61 while displaying the first image there, it displays information on the position where the operation was detected in the second display area 62. As a result, the smartphone 1 can display information on the portion of the first image that interests the user in the second display area 62 while keeping the first image displayed in the first display area 61.

  Furthermore, the smartphone 1 displays an index object 64 indicating the position where the operation is detected in the first display area 61 in the second display area 62. Thereby, the smartphone 1 can easily show the user the positional relationship between the portion of the first image in which the user is interested and the position where the user has actually operated.

  A processing procedure of a first example of display control by the smartphone 1 will be described with reference to FIG. FIG. 6 is a flowchart illustrating a processing procedure of a first example of display control by the smartphone 1. The processing procedure shown in FIG. 6 is realized by the controller 10 executing the control program 9A. The processing procedure shown in FIG. 6 is executed when the controller 10 displays the screen 60 on the display 2A. In the following description, the link object may be referred to as a first object and the index object may be referred to as a second object.

  As illustrated in FIG. 6, in step S101 the controller 10 of the smartphone 1 displays a first image in the first display area 61 of the screen 60. In step S102, the controller 10 determines whether an operation has been detected in the first display area 61. If no operation is detected in the first display area 61 (No at step S102), the controller 10 proceeds to step S107.

  In step S107, the controller 10 determines whether to change the screen 60 to another screen. The case where the screen 60 is changed to another screen includes, for example, a case where the user performs an operation for switching the screen and a case where the screen is forcibly switched by interrupt processing. When the screen 60 is not changed to another screen (step S107, No), the controller 10 returns to step S102. When the screen 60 is changed to another screen (step S107, Yes), the controller 10 ends the processing procedure shown in FIG.

  When an operation is detected in the first display area 61 (Yes at step S102), the controller 10 proceeds to step S103. In step S103, the controller 10 displays, in the second display area 62, information regarding the position where the operation was detected in the first display area 61. In step S104, the controller 10 displays, in the second display area 62, a second object indicating the position where the operation was detected in the first display area 61.

  Subsequently, in step S105, the controller 10 determines whether selection of the first object has been detected. When selection of the first object is detected (Yes at step S105), the controller 10 proceeds to step S106. In step S106, the controller 10 displays an image corresponding to the first object in the first display area 61 and clears the second display area 62. If selection of the first object has not been detected (No at step S105), step S106 is not executed.

  Thereafter, the controller 10 proceeds to step S107 already described.
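  The flow of steps S101 to S107 can be sketched as an event loop like the following. The event types and the printed actions are illustrative stand-ins for the actual display processing:

```kotlin
// Illustrative event types standing in for touch screen input.
sealed interface Event
data class AreaOneOperation(val x: Int, val y: Int) : Event
data class FirstObjectSelected(val objectId: String) : Event
object ScreenChange : Event

fun runFirstExample(events: List<Event>) {
    println("S101: display first image in first display area")
    for (event in events) {
        when (event) {
            is AreaOneOperation -> {    // S102: operation detected in area 61
                println("S103: show info for (${event.x}, ${event.y}) in second area")
                println("S104: show index object at (${event.x}, ${event.y})")
            }
            is FirstObjectSelected -> { // S105: selection of first object detected
                println("S106: open ${event.objectId} in first area, clear second area")
            }
            is ScreenChange -> {        // S107: screen changes, procedure ends
                println("S107: screen changed, end")
                return
            }
        }
    }
}

fun main() {
    runFirstExample(listOf(AreaOneOperation(120, 400), FirstObjectSelected("link63"), ScreenChange))
}
```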

  When the smartphone 1 detects an operation for selecting the first object displayed in the second display area 62, it may execute the same processing as when it detects an operation for selecting the first object displayed in the first display area 61.

  A second example of display control by the smartphone 1 will be described with reference to FIG. FIG. 7 is a diagram illustrating a second example of display control by the smartphone 1.

  In step S11, the smartphone 1 executes the browser application 9C and displays the screen 60 provided by the browser application 9C on the touch screen display 2 (display 2A). Specifically, the smartphone 1 acquires a WEB page by Internet communication using the communication unit 6, based on the function provided by the browser application 9C, and displays a first image obtained by rendering the WEB page in the first display area 61. In step S11, nothing is displayed in the second display area 62.

  In step S11, the user's finger F1 touches the first display area 61 of the screen 60. A link object 63a and a link object 63b are arranged in the vicinity of the touched position. When the smartphone 1 detects an operation on the first display area 61, it displays information on the position where the operation was detected in the first display area 61 in the second display area 62.

  In the second example of display control, among the objects near the position where the operation is detected in the first image displayed in the first display area 61, the smartphone 1 displays only those objects to which processing for an operation is assigned in the second display area 62. When the smartphone 1 detects an operation on an object to which processing is assigned, it executes the processing assigned to that object. Objects to which processing for an operation is assigned are, for example, link objects and button objects. When a WEB page is displayed on the screen 60, the objects to which processing for an operation is assigned can be extracted from the objects in the vicinity of the detected position by analyzing the HTML file corresponding to the WEB page.
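  A sketch of this extraction as a hit test follows. The PageObject model and the search radius are assumptions; a real implementation would obtain the actionable objects by analyzing the HTML file as described above:

```kotlin
import kotlin.math.hypot

// Simplified page model: only position and whether an action is assigned.
data class PageObject(val id: String, val x: Double, val y: Double, val actionable: Boolean)

// Keeps only the actionable objects within a radius of the touch position.
fun actionableObjectsNear(
    objects: List<PageObject>, touchX: Double, touchY: Double, radiusPx: Double = 80.0,
): List<PageObject> =
    objects.filter { it.actionable && hypot(it.x - touchX, it.y - touchY) <= radiusPx }

fun main() {
    val page = listOf(
        PageObject("link63a", 100.0, 200.0, actionable = true),
        PageObject("link63b", 130.0, 240.0, actionable = true),
        PageObject("plainText", 110.0, 210.0, actionable = false),
        PageObject("farLink", 600.0, 900.0, actionable = true),
    )
    println(actionableObjectsNear(page, 110.0, 220.0).map { it.id }) // [link63a, link63b]
}
```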

  In step S12, the smartphone 1 displays the link object 63a and the link object 63b, which are arranged in the vicinity of the position where the operation was detected in the first display area 61, in the second display area 62. Furthermore, the smartphone 1 displays an index object 64 indicating the position where the operation was detected in the second display area 62.

  As described above, when the smartphone 1 detects an operation on the first display area 61 while displaying the first image there, it displays, as the information on the detected position, the objects to which processing for an operation is assigned in the second display area 62. As a result, the user can easily grasp which objects near the operated position have processing for an operation assigned to them.

  Furthermore, the smartphone 1 displays an index object 64 indicating the position where the operation is detected in the first display area 61 in the second display area 62. Thereby, the smartphone 1 can easily show the user the positional relationship between the object that is likely to be operated by the user and the actually operated position.

  A processing procedure of a second example of display control by the smartphone 1 will be described with reference to FIG. FIG. 8 is a flowchart illustrating a processing procedure of a second example of display control by the smartphone 1. The processing procedure shown in FIG. 8 is realized by the controller 10 executing the control program 9A. The processing procedure shown in FIG. 8 is executed when the controller 10 displays the screen 60 on the display 2A.

  As shown in FIG. 8, in step S201 the controller 10 of the smartphone 1 displays a first image in the first display area 61 of the screen 60. In step S202, the controller 10 determines whether an operation has been detected in the first display area 61. When no operation is detected in the first display area 61 (No at step S202), the controller 10 proceeds to step S207.

  In step S207, the controller 10 determines whether to change the screen 60 to another screen. The case where the screen 60 is changed to another screen includes, for example, a case where the user performs an operation for switching the screen and a case where the screen is forcibly switched by interrupt processing. When the screen 60 is not changed to another screen (step S207, No), the controller 10 returns to step S202. When the screen 60 is changed to another screen (step S207, Yes), the controller 10 ends the processing procedure shown in FIG.

  When an operation is detected in the first display area 61 (Yes at step S202), the controller 10 proceeds to step S203. In step S203, the controller 10 displays, in the second display area 62, the first objects arranged in the vicinity of the position where the operation was detected in the first display area 61. In step S204, the controller 10 displays, in the second display area 62, a second object indicating the position where the operation was detected in the first display area 61.

  Subsequently, the controller 10 determines whether selection of the first object has been detected as step S205. When the selection of the first object is detected (step S205, Yes), the controller 10 proceeds to step S206. In step S206, the controller 10 displays an image corresponding to the first object in the first display area 61 and clears the second display area 62. When the selection of the first object is not detected (No at Step S205), the controller 10 does not execute Step S206.

  Thereafter, the controller 10 proceeds to step S207 already described.

  When the smartphone 1 detects an operation for selecting the first object displayed in the second display area 62, it may execute the same processing as when it detects an operation for selecting the first object displayed in the first display area 61.

  A modification of the second example of display control by the smartphone 1 will be described with reference to FIG. FIG. 9 is a diagram illustrating a modification of the second example of display control by the smartphone 1.

  In step S21, the smartphone 1 executes the browser application 9C and displays the screen 60 provided by the browser application 9C on the touch screen display 2 (display 2A). Specifically, the smartphone 1 acquires a WEB page by Internet communication using the communication unit 6, based on the function provided by the browser application 9C, and displays a first image obtained by rendering the WEB page in the first display area 61. In step S21, nothing is displayed in the second display area 62.

  In step S21, the user's finger F1 touches the first display area 61 of the screen 60. A link object 63a and a link object 63b are arranged in the vicinity of the touched position. When the smartphone 1 detects an operation on the first display area 61, it displays information on the position where the operation was detected in the first display area 61 in the second display area 62.

  In the second example of display control described above, the smartphone 1 displays the objects to which processing for an operation is assigned, located near the position where the operation was detected in the first display area 61, in the second display area 62 while maintaining their arrangement in the first display area. In this modification, by contrast, the smartphone 1 displays those objects in the second display area 62 in a list format. In step S22, a list including a list item 62a corresponding to the link object 63a and a list item 62b corresponding to the link object 63b is displayed in the second display area 62.

  As described above, when the smartphone 1 detects an operation on the first display area 61 while displaying the first image there, it displays, as information on the detected position, a list of the objects to which processing for an operation is assigned in the second display area 62. As a result, the user can even more easily grasp which objects near the operated position in the first display area 61 have processing for an operation assigned to them.

  When the smartphone 1 detects an operation for selecting a list item displayed in the second display area 62, it may execute the same processing as when it detects an operation for selecting the corresponding first object displayed in the first display area 61.

  A third example of display control by the smartphone 1 will be described with reference to FIG. 10. FIG. 10 is a diagram illustrating the third example of display control by the smartphone 1. In step S31, the smartphone 1 executes the browser application 9C and displays the screen 60 provided by the browser application 9C on the touch screen display 2 (display 2A). Specifically, the smartphone 1 acquires a WEB page by Internet communication using the communication unit 6, based on the function provided by the browser application 9C, and displays a first image obtained by rendering the WEB page in the first display area 61. In step S31, nothing is displayed in the second display area 62.

  The smartphone 1 has a function of detecting the position of an object close to the touch screen 2B. Specifically, by increasing the sensitivity of the touch screen 2B, the smartphone 1 can three-dimensionally detect not only an object in contact with the touch screen display 2 but also the position of an object close to it. That is, the smartphone 1 can detect the coordinates of the position on the surface of the touch screen display 2 that is closest to the nearby object, as well as the distance between the nearby object and the surface of the touch screen display 2. The method for detecting an object located near the touch screen display 2 is not limited to this. For example, the smartphone 1 may detect the position of a nearby object based on an image captured by the camera 12 or on the detection signals of a plurality of light-receiving elements arranged on the touch screen display 2.

  In step S32, the user's finger F1 is close to the first display area 61 of the screen 60, within the range the smartphone 1 can detect. When the smartphone 1 detects an object close to the first display area 61, it displays, in the second display area 62, a second image obtained by enlarging the portion of the first image closest to the nearby object. The display magnification of the second image is determined based on the distance between the nearby object and the touch screen display 2. Specifically, the display magnification of the second image increases as the distance between the nearby object and the touch screen display 2 decreases. The second image is therefore enlarged as the finger F1 approaches the touch screen display 2 and reduced as the finger F1 moves away from it.
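  The distance-to-magnification relationship can be sketched as below. The detection range, scale limits, and linear curve are assumptions; the text only specifies that the magnification increases as the distance decreases:

```kotlin
// Maps the hover distance to a display magnification: the closer the finger,
// the larger the second image. Range and scale limits are assumptions.
fun hoverMagnification(
    distanceMm: Double, maxDistanceMm: Double = 30.0,
    minScale: Double = 1.0, maxScale: Double = 4.0,
): Double {
    val clamped = distanceMm.coerceIn(0.0, maxDistanceMm)
    // Linear fall-off from maxScale at contact to minScale at the range edge.
    return maxScale - (maxScale - minScale) * (clamped / maxDistanceMm)
}

fun main() {
    println(hoverMagnification(0.0))  // 4.0 (finger almost touching)
    println(hoverMagnification(15.0)) // 2.5
    println(hoverMagnification(30.0)) // 1.0 (edge of the detection range)
}
```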

  In step S33, the user's finger F1 moves horizontally along the surface of the touch screen display 2. When the smartphone 1 detects that the object is moving substantially horizontally, it keeps displaying the second image as it is, without moving the portion of the first image shown as the second image and without changing the display magnification of the second image. As a result, the user can move the finger F1 to a position close to the second display area 62 while the contents of the first display area 61 and the second display area 62 are maintained.

  The determination of substantially horizontal movement preferably tolerates the vertical displacement that naturally occurs when the user tries to move the finger F1 horizontally. The smartphone 1 preferably determines that the finger F1 is moving substantially horizontally even when the locus of movement is inclined upward or downward within a predetermined range. The predetermined range is, for example, about ±10 degrees.
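  A sketch of this test follows, treating the trajectory as substantially horizontal when its inclination from the display surface stays within the stated ±10-degree tolerance. The millimeter units are an assumption:

```kotlin
import kotlin.math.abs
import kotlin.math.atan2
import kotlin.math.hypot

// Treats a finger trajectory as substantially horizontal when its inclination
// from the display surface is within the tolerance (about +/-10 degrees).
fun isSubstantiallyHorizontal(
    dxMm: Double, dyMm: Double, dzMm: Double, toleranceDeg: Double = 10.0,
): Boolean {
    val alongSurface = hypot(dxMm, dyMm) // movement parallel to the display
    if (alongSurface == 0.0) return false
    val inclinationDeg = Math.toDegrees(atan2(abs(dzMm), alongSurface))
    return inclinationDeg <= toleranceDeg
}

fun main() {
    println(isSubstantiallyHorizontal(20.0, 0.0, 1.0)) // true  (~2.9 degrees)
    println(isSubstantiallyHorizontal(20.0, 0.0, 8.0)) // false (~21.8 degrees)
}
```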

  In step S34, the user's finger F1 taps the link object 63 in the second display area 62. When the smartphone 1 detects contact of an object with the touch screen display 2, it determines what operation was performed and executes processing according to the determined operation. When the smartphone 1 detects a tap on the link object 63, it displays an image corresponding to the link object 63 in the first display area 61 in step S35.

  As described above, the smartphone 1 changes the display magnification of the information displayed in the second display area 62 according to the distance between the object close to the first display area 61 and the touch screen display 2. Thereby, the smartphone 1 can change the display magnification of the information displayed in the second display area 62 by an intuitive operation, and can improve the convenience for the user.

  A processing procedure of a third example of display control by the smartphone 1 will be described with reference to FIG. FIG. 11 is a flowchart illustrating a processing procedure of a third example of display control by the smartphone 1. The processing procedure shown in FIG. 11 is realized by the controller 10 executing the control program 9A. The processing procedure shown in FIG. 11 is executed when the controller 10 displays the screen 60 on the display 2A.

  As shown in FIG. 11, in step S301 the controller 10 of the smartphone 1 displays the first image in the first display area 61 of the screen 60. In step S302, the controller 10 determines whether an object close to or in contact with the touch screen display 2 has been detected. When no object is detected (No at step S302), the controller 10 proceeds to step S308.

  In step S308, the controller 10 determines whether to change the screen 60 to another screen. The case where the screen 60 is changed to another screen includes, for example, a case where the user performs an operation for switching the screen and a case where the screen is forcibly switched by interrupt processing. When the screen 60 is not changed to another screen (No at Step S308), the controller 10 returns to Step S302. When the screen 60 is changed to another screen (step S308, Yes), the controller 10 ends the processing procedure shown in FIG.

  When an object is detected (Yes at step S302), the controller 10 determines in step S303 whether the object has touched the touch screen display 2. When there is contact (Yes at step S304), the controller 10 determines in step S305 what operation was performed. When no operation is detected (No at step S306), the controller 10 proceeds to step S308 already described. When an operation is detected (Yes at step S306), the controller 10 executes processing corresponding to the operation in step S307. The controller 10 then proceeds to step S308 already described.

  When there is no contact (step S304, No), the controller 10 proceeds to step S309. In step S309, the controller 10 determines whether the object is moving substantially horizontally with respect to the touch screen display 2. When the object has not moved substantially horizontally (step S309, No), the controller 10 proceeds to step S310. In step S310, the controller 10 displays a portion corresponding to the detection position in the first image as the second image in the second display area 62 at a display magnification corresponding to the distance between the object and the touch screen display 2. When the object is moving almost horizontally (step S309, Yes), step S310 is not executed.

  Thereafter, the controller 10 proceeds to step S308 already described.

  In this example, when the object is moving substantially horizontally with respect to the touch screen display 2, the contents of the first display area 61 and the second display area 62 are maintained, but the display control is not limited to this. When the smartphone 1 detects that the object is moving substantially horizontally with respect to the touch screen display 2, it may scroll the contents of one or both of the first display area 61 and the second display area 62 according to the moving direction.

  A fourth example of display control by the smartphone 1 will be described with reference to FIG. FIG. 12 is a diagram illustrating a fourth example of display control by the smartphone 1.

  In step S41, the smartphone 1 executes the map application 9B and displays the screen 60 provided by the map application 9B on the touch screen display 2 (display 2A). Specifically, the smartphone 1 acquires map data by Internet communication using the communication unit 6, based on the function provided by the map application 9B, and displays a first image generated from the map data in the first display area 61 of the screen 60. Some information may be displayed in the second display area 62, or nothing may be displayed.

  In step S42, the user starts a pinch out in the first display area 61 using the fingers F1 and F2 in order to enlarge the first image displayed in the first display area 61. In this example, the user performs the pinch out using both hands, but a pinch out can also be performed with one hand.

  In the smartphone 1, a threshold is set for the operation amount of a pinch operation in the first display area 61. The operation amount of a pinch operation is, for example, the total movement amount of the two fingers. When a pinch operation in the first display area 61 is started, the smartphone 1 displays the portion of the first image corresponding to the center of the pinch start position in the second display area 62 as the second image, at the same display magnification as the first image. Then, until the operation amount of the pinch operation in the first display area 61 reaches the threshold, the smartphone 1 does not enlarge the first image displayed in the first display area 61, as in step S43, but instead enlarges the second image displayed in the second display area 62 according to the operation amount.

  Enlargement by a pinch operation is normally performed with reference to the center of the pinch start position (the center between the positions where the two fingers start moving). However, it may be difficult to place two fingers accurately on the touch screen display 2. The user may therefore notice only after the enlargement has started that the fingers were placed in a shifted position. In this case, because the erroneous operation is noticed after the enlargement has been executed, the user may not easily grasp which part of the pre-enlargement image was accidentally enlarged.

  By enlarging only the second image in the second display area 62 until the operation amount of the pinch reaches the threshold, the user can execute the pinch while confirming whether the desired portion is being enlarged. Furthermore, since the first image in the first display area 61 is not enlarged, the user can easily correct the finger position upon noticing that the fingers were placed incorrectly. To obtain this effect, the threshold for the operation amount of the pinch operation is set to a value that gives the user enough leeway to check, by looking at the second image, whether the desired position is being enlarged.

  When the user confirms that the desired position is being enlarged, the user continues the pinch. When the operation amount of the pinch operation reaches the threshold, the first image displayed in the first display area 61 starts to be enlarged according to the operation amount, as in step S44. When the operation amount reaches the threshold, the smartphone 1 may instantly enlarge the first image to the magnification corresponding to the operation amount up to that point. Alternatively, the smartphone 1 may take one to several seconds to gradually enlarge the first image until it reaches the magnification corresponding to the operation amount. Thereafter, the first image in the first display area 61 is enlarged or reduced according to the user's pinch operations.
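  The two-stage behavior can be sketched as follows. The threshold value and the scale-per-pixel gain are assumptions; only the staging (preview first, first image after the threshold) reflects the text:

```kotlin
// Accumulates the total travel of the pinch fingers; until it reaches the
// threshold only the second-area preview is scaled, afterwards the first
// image follows as well.
class PinchController(
    private val thresholdPx: Double = 120.0, // hypothetical threshold
    private val scalePerPx: Double = 0.01,   // hypothetical gain
) {
    var firstImageScale = 1.0
        private set
    var secondImageScale = 1.0
        private set
    private var travelledPx = 0.0

    fun onPinchMove(fingerTravelPx: Double) {
        travelledPx += fingerTravelPx
        secondImageScale = 1.0 + scalePerPx * travelledPx
        if (travelledPx >= thresholdPx) {
            // Threshold reached: the first display area now follows the gesture.
            firstImageScale = secondImageScale
        }
    }
}

fun main() {
    val pinch = PinchController()
    pinch.onPinchMove(60.0)  // below the threshold: preview only
    println("${pinch.firstImageScale} / ${pinch.secondImageScale}") // ~1.0 / ~1.6
    pinch.onPinchMove(80.0)  // total 140 px: threshold passed
    println("${pinch.firstImageScale} / ${pinch.secondImageScale}") // ~2.4 / ~2.4
}
```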

  A processing procedure of a fourth example of display control by the smartphone 1 will be described with reference to FIG. FIG. 13 is a flowchart illustrating a processing procedure of a fourth example of display control by the smartphone 1. The processing procedure shown in FIG. 13 is realized by the controller 10 executing the control program 9A. The processing procedure shown in FIG. 13 is executed when the controller 10 displays the screen 60 on the display 2A.

  As illustrated in FIG. 13, in step S401 the controller 10 of the smartphone 1 displays a first image in the first display area 61 of the screen 60. In step S402, the controller 10 determines whether a pinch operation on the first display area 61 has been detected. When no pinch operation is detected (No at step S402), the controller 10 proceeds to step S406.

  In step S406, the controller 10 determines whether to change the screen 60 to another screen. The case where the screen 60 is changed to another screen includes, for example, a case where the user performs an operation for switching the screen and a case where the screen is forcibly switched by interrupt processing. When the screen 60 is not changed to another screen (step S406, No), the controller 10 returns to step S402. When the screen 60 is changed to another screen (step S406, Yes), the controller 10 ends the processing procedure shown in FIG.

  When a pinch operation is detected (step S402, Yes), the controller 10 proceeds to step S403. In step S403, the controller 10 determines whether the operation amount of the pinch operation has reached the threshold. When the operation amount has not reached the threshold (step S403, No), the controller 10 proceeds to step S404. In step S404, the controller 10 retains the magnification of the first display area 61, sets the portion of the first image corresponding to the operation position as the second image, and displays it in the second display area 62 at a magnification corresponding to the operation amount. Then, the controller 10 proceeds to step S406 already described.

  When the operation amount has reached the threshold value (step S403, Yes), the controller 10 proceeds to step S405. In step S405, the controller 10 changes the magnification of the first image displayed in the first display area 61 according to the operation amount, and displays the first image at the changed magnification. Then, the controller 10 proceeds to step S406 already described.

  A fifth example of display control by the smartphone 1 will be described with reference to FIG. 14. FIG. 14 is a diagram illustrating the fifth example of display control by the smartphone 1.

  In step S51, the smartphone 1 executes the map application 9B and displays the screen 60 provided by the map application 9B on the touch screen display 2 (display 2A). Specifically, based on the function provided by the map application 9B, the smartphone 1 acquires map data through Internet communication using the communication unit 6 and displays a first image generated from the map data in the first display area 61 of the screen 60. Some information may be displayed in the second display area 62, or nothing may be displayed there.

  In step S52, the finger F1 of the user who wants to enlarge the map data displayed in the first display area 61 double-taps the first display area 61. When the smartphone 1 detects the double-tap operation on the first display area 61, in step S53 it displays, in the second display area 62, a second image (information) obtained by enlarging the map data of the first display area 61. The magnification of the second image may be determined in advance, or may be determined according to the type, number, interval, intensity, and the like of the operation. The operation for changing the magnification of the second image may be another operation such as a long touch instead of a double tap.

  The smartphone 1 displays, in the first display area 61 and the second display area 62, the index object 64 indicating the position corresponding to the position where the operation was detected in the first display area 61. Furthermore, the smartphone 1 displays a magnification object 66 indicating the magnification of the second image in the second display area 62. In step S53, a magnification object 66 indicating that the magnification of the second image is "5" is displayed in the second display area 62.

  As described above, when the smartphone 1 is displaying map data in the first display area 61 and detects a double-tap operation on the first display area 61, it displays the enlarged or reduced second image and the display magnification of the second image in the second display area 62. Thereby, the smartphone 1 can present to the user an enlargement of the first image displayed in the first display area 61 with a simple operation. In this embodiment, the case where the smartphone 1 enlarges the first image when a predetermined operation on the first image is detected has been described, but the present invention is not limited to this. For example, the smartphone 1 may be configured to reduce the first image when a predetermined operation on the first image is detected.
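  The following Kotlin fragment is a hypothetical sketch of this double-tap handling. The data class, the field names, and the default magnification of 5 (borrowed from the "5" shown by the magnification object 66 in step S53) are assumptions for illustration, not the patent's implementation.

```kotlin
data class Point(val x: Int, val y: Int)

// State of the second display area (62) and its on-screen indicators.
class PreviewState {
    var previewCenter: Point? = null  // portion of the first image shown as the second image
    var previewMagnification = 1f     // value shown by the magnification object (66)
    var indexPosition: Point? = null  // position marked by the index object (64)
}

// A double tap at `tapAt` enlarges the map data around that position into the
// second display area and records where the operation was detected.
fun onDoubleTap(state: PreviewState, tapAt: Point, magnification: Float = 5f) {
    state.previewCenter = tapAt
    state.previewMagnification = magnification
    state.indexPosition = tapAt
}

fun main() {
    val state = PreviewState()
    onDoubleTap(state, Point(x = 120, y = 200))
    println("center=${state.previewCenter} magnification=${state.previewMagnification}")
}
```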

  A processing procedure of the fifth example of display control by the smartphone 1 will be described with reference to FIG. 15. FIG. 15 is a flowchart illustrating the processing procedure of the fifth example of display control by the smartphone 1. The processing procedure shown in FIG. 15 is realized by the controller 10 executing the control program 9A. The processing procedure shown in FIG. 15 is executed when the controller 10 displays the screen 60 on the display 2A.

  As illustrated in FIG. 15, in step S501 the controller 10 of the smartphone 1 displays a first image in the first display area 61 of the screen 60. In step S502, the controller 10 determines whether an operation on the first display area 61 has been detected. When no operation is detected (step S502, No), the controller 10 proceeds to step S506.

  In step S506, the controller 10 determines whether to change the screen 60 to another screen. The case where the screen 60 is changed to another screen includes, for example, a case where the user performs an operation for switching the screen and a case where the screen is forcibly switched by interrupt processing. When the screen 60 is not changed to another screen (step S506, No), the controller 10 returns to step S502. When the screen 60 is changed to another screen (step S506, Yes), the controller 10 ends the processing procedure shown in FIG. 15.

  When an operation is detected (step S502, Yes), the controller 10 proceeds to step S503. In step S503, the controller 10 changes the display magnification of the portion of the first image corresponding to the operation position and displays it in the second display area 62 as the second image. In step S504, the controller 10 displays the display magnification in the second display area 62. In step S505, the controller 10 displays the second object indicating the operation position in the first display area 61 and the second display area 62. Then, the controller 10 proceeds to step S506 already described.

  A sixth example of display control by the smartphone 1 will be described with reference to FIG. 16. FIG. 16 is a diagram illustrating the sixth example of display control by the smartphone 1.

  In step S61, the smartphone 1 executes the map application 9B and displays the screen 60 provided by the map application 9B on the touch screen display 2 (display 2A). Specifically, based on the function provided by the map application 9B, the smartphone 1 acquires map data through Internet communication using the communication unit 6 and displays a first image generated from the map data in the first display area 61 of the screen 60. Some information may be displayed in the second display area 62, or nothing may be displayed there.

  In step S62, the fingers F1 and F2 of the user who wants to enlarge the map data displayed in the first display area 61 perform a pinch-out operation and then finish the pinch-out operation.

  When the smartphone 1 detects the pinch operation on the first display area 61, in step S63 it displays the first image that was displayed in the first display area 61 as the second image in the second display area 62. That is, the smartphone 1 displays the first image before enlargement in the second display area 62 as the second image. The second image is the first image centered on the operation position at the same magnification as the first image. The smartphone 1 displays the index object 64 indicating the pinch-out operation position in the second display area 62. Then, also in step S63, the smartphone 1 enlarges the first image at a magnification corresponding to the operation amount and displays it in the first display area 61.

  As described above, when the smartphone 1 detects a pinch operation on the first display area 61, it displays the first image before enlargement or reduction as the second image in the second display area 62, and enlarges or reduces the first image at a magnification corresponding to the operation amount. Thereby, even after the user enlarges or reduces the first image in the first display area 61, the smartphone 1 allows the user to check the first image before enlargement or reduction displayed in the second display area 62.

  After the pinch-out operation on the first display area 61 is completed, the user continues by performing the pinch-out operation twice more in the vicinity of the operation position. Each time the smartphone 1 detects a pinch-out, it changes the first image displayed in the first display area 61 to the magnification corresponding to the operation amount. In this case, the smartphone 1 does not change the second image displayed in the second display area 62. In step S64, the smartphone 1 displays, in the second display area 62, a plurality of index objects 64 indicating the operation positions at which the pinch-outs were detected. For example, the plurality of index objects 64 may be displayed in different display colors or may be displayed so as to indicate the order of the operations. Thereby, by referring to the plurality of index objects 64 displayed in the second display area 62, the user can confirm the history of operation positions on the first image before enlargement or reduction.
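  A minimal sketch of this bookkeeping, under assumed names (ZoomHistory, onPinch): the first pinch snapshots the un-zoomed first image into the second display area once, and every pinch thereafter only appends an index object and rescales the first image, mirroring steps S603 to S606 of the flowchart described next.

```kotlin
data class Point(val x: Int, val y: Int)

class ZoomHistory {
    var snapshotTaken = false                    // second image = first image before any zoom
    var mainMagnification = 1f                   // magnification of the first display area (61)
    val indexPositions = mutableListOf<Point>()  // index objects (64), kept in operation order

    fun onPinch(at: Point, newMagnification: Float) {
        if (!snapshotTaken) {
            snapshotTaken = true                 // first pinch: keep the pre-zoom image (cf. step S604)
        }
        indexPositions += at                     // mark where the pinch was detected (cf. step S605)
        mainMagnification = newMagnification     // rescale only the first image (cf. step S606)
    }
}

fun main() {
    val h = ZoomHistory()
    h.onPinch(Point(50, 80), newMagnification = 2f)
    h.onPinch(Point(55, 82), newMagnification = 3f)
    println("snapshot=${h.snapshotTaken} magnification=${h.mainMagnification} marks=${h.indexPositions.size}")
}
```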

  A processing procedure of the sixth example of display control by the smartphone 1 will be described with reference to FIG. 17. FIG. 17 is a flowchart illustrating the processing procedure of the sixth example of display control by the smartphone 1. The processing procedure shown in FIG. 17 is realized by the controller 10 executing the control program 9A. The processing procedure shown in FIG. 17 is executed when the controller 10 displays the screen 60 on the display 2A.

  As illustrated in FIG. 17, in step S601 the controller 10 of the smartphone 1 displays a first image in the first display area 61 of the screen 60. In step S602, the controller 10 determines whether a pinch operation on the first display area 61 has been detected. When a pinch operation is not detected (step S602, No), the controller 10 proceeds to step S607.

  In step S607, the controller 10 determines whether to change the screen 60 to another screen. The case where the screen 60 is changed to another screen includes, for example, a case where the user performs an operation for switching the screen and a case where the screen is forcibly switched by interrupt processing. When the screen 60 is not changed to another screen (step S607, No), the controller 10 returns to step S602. When the screen 60 is changed to another screen (step S607, Yes), the controller 10 ends the processing procedure shown in FIG. 17.

  When a pinch operation is detected (step S602, Yes), the controller 10 proceeds to step S603. In step S603, the controller 10 determines whether the second image is being displayed in the second display area 62. When it is not being displayed (step S603, No), the controller 10 proceeds to step S604. In step S604, the controller 10 displays the first image currently displayed in the first display area 61 in the second display area 62 as the second image. When it is being displayed (step S603, Yes), the detected pinch operation is a continuation of an ongoing pinch operation, and the controller 10 skips step S604.

  In step S605, the controller 10 displays the second object indicating the pinch position in the second display area 62. In step S606, the controller 10 changes the magnification of the first image displayed in the first display area 61 according to the operation amount of the pinch operation and displays the first image at the changed magnification. Then, the controller 10 proceeds to step S607 already described.

  A seventh example of display control by the smartphone 1 will be described with reference to FIG. 18. FIG. 18 is a diagram illustrating the seventh example of display control by the smartphone 1. In step S71, the smartphone 1 executes the browser application 9C and displays the screen 60 provided by the browser application 9C on the touch screen display 2 (display 2A). Specifically, based on the function provided by the browser application 9C, the smartphone 1 acquires a WEB page through Internet communication using the communication unit 6 and displays a first image obtained by rendering the WEB page in the first display area 61.

  The smartphone 1 displays, in the second display area 62 as the second image, the area of the first image corresponding to the portion displayed in the first display area 61. The area corresponding to the portion displayed in the first display area 61 is, for example, a part of the portion of the first image displayed in the first display area 61, the whole of it, or a wider area including it. The initial display magnification of the second image is a preset magnification. The display magnification of the second image may be changeable according to a user operation.

  Furthermore, the smartphone 1 displays, in the first display area 61, an indicator 61a that indicates which part of the first image rendered from the WEB page is displayed in the first display area 61.

  In step S72, the user's finger F1 performs a scroll operation on the first display area 61. Specifically, the user performs a downward flick in the first display area 61 with the finger F1.

  When the smartphone 1 detects the scroll operation on the first display area 61, in step S73 it scrolls the first display area 61 in the operation direction at a first speed corresponding to the scroll operation. Furthermore, the smartphone 1 scrolls the second display area 62 at a second speed different from the first speed at which the first display area 61 is scrolled.

  When a second image obtained by enlarging the first image displayed in the first display area 61 is displayed in the second display area 62, scrolling the second display area 62 at the same first speed as the first display area 61 would make the second display area 62 scroll at a speed that is difficult to visually follow. For this reason, the smartphone 1 sets the second speed to a speed slower than the first speed. Thereby, even when the smartphone 1 scrolls the second display area 62 in conjunction with the first display area 61, the user can visually follow the display content of the second display area 62. The second speed may be a constant speed regardless of the speed of the scroll operation. The second speed is preferably a speed at which the user can recognize the content of the information being scrolled.

  Furthermore, when the smartphone 1 detects a scroll stop operation in the first display area 61 or the second display area 62, it stops the scrolling of the display area in which the scroll stop operation was detected. The smartphone 1 then changes the portion displayed in the other display area so that it matches the portion displayed in the display area in which the scroll stop operation was detected. The scroll stop operation is, for example, a touch or a long touch.

  For example, assume that a portion near the center of the first image is being scrolled in the first display area 61 while a portion near the top of the first image is still being scrolled in the second display area 62. At this time, when a scroll stop operation is detected in the first display area 61, the smartphone 1 displays the portion near the center of the first image in the first display area 61, and also displays the portion near the center of the first image as the second image in the second display area 62. Conversely, when the scroll stop operation is detected in the second display area 62, the smartphone 1 displays the portion near the top of the first image as the second image in the second display area 62, and also displays the portion near the top of the first image in the first display area 61.

  As described above, when the smartphone 1 detects a scroll operation on the first display area 61, it scrolls the first display area 61 at the first speed corresponding to the operation and scrolls the second display area 62 at the second speed slower than the first speed. Thereby, even when the scroll operation on the first display area 61 is unintentionally performed at a high speed, the user can check the display contents in the second display area 62, which scrolls at a low speed, and stop the scrolling at a desired position.

  In step S74, the user's finger F1 performs a scroll stop operation on the second display area 62, whose scroll speed is low. In the example shown in step S74, the user's finger F1 performs the scroll stop operation on an area corresponding to "Kits" in the second image scroll-displayed in the second display area 62. In step S75, the smartphone 1 stops the scrolling of the second display area 62 and changes the portion of the first image displayed in the first display area 61 to a portion that includes the area corresponding to "Kits".
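  As an illustration of the two-speed scrolling and the stop synchronization, here is a minimal Kotlin sketch. The class name LinkedScroller, the fixed slow-down factor, and the one-dimensional offset model are assumptions made for the example; the patent only requires that the second speed differ from (and here be slower than) the first speed.

```kotlin
// Models two linked scroll positions: the first display area (61) scrolls at the
// first speed, the second display area (62) at a slower, still-readable second speed.
class LinkedScroller(private val slowFactor: Float = 0.2f) {
    var mainOffset = 0f     // scroll position of the first display area
    var previewOffset = 0f  // scroll position of the second display area

    fun onScroll(firstSpeed: Float, dtSeconds: Float) {
        mainOffset += firstSpeed * dtSeconds
        previewOffset += firstSpeed * slowFactor * dtSeconds  // second speed < first speed
    }

    // A scroll stop operation (e.g. touch or long touch) in either area stops both
    // and aligns the other area to the area in which the stop was detected.
    fun onStop(stoppedInSecondArea: Boolean) {
        if (stoppedInSecondArea) mainOffset = previewOffset
        else previewOffset = mainOffset
    }
}

fun main() {
    val s = LinkedScroller()
    s.onScroll(firstSpeed = 1000f, dtSeconds = 0.5f)  // main scrolls to 500, preview to 100
    s.onStop(stoppedInSecondArea = true)              // stop in the second area, as in step S74
    println("main=${s.mainOffset} preview=${s.previewOffset}") // both aligned to 100.0
}
```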

  A processing procedure of the seventh example of display control by the smartphone 1 will be described with reference to FIG. 19. FIG. 19 is a flowchart illustrating the processing procedure of the seventh example of display control by the smartphone 1. The processing procedure shown in FIG. 19 is realized by the controller 10 executing the control program 9A. The processing procedure shown in FIG. 19 is executed when the controller 10 displays the screen 60 on the display 2A.

  As illustrated in FIG. 19, in step S701 the controller 10 of the smartphone 1 displays a first image in the first display area 61 of the screen 60. Furthermore, the controller 10 displays, in the second display area 62, a second image corresponding to the portion of the first image displayed in the first display area 61. In step S702, the controller 10 determines whether a scroll operation on the first display area 61 has been detected. When a scroll operation is detected (step S702, Yes), the controller 10 proceeds to step S703. In step S703, the controller 10 scrolls the first display area 61 at the first speed corresponding to the scroll operation. In step S704, the controller 10 scrolls the second display area 62 at the second speed. When no scroll operation is detected (step S702, No), steps S703 and S704 are not executed.

  In step S705, the controller 10 determines whether an operation other than the scroll operation has been detected. When an operation other than the scroll operation is detected (step S705, Yes), the controller 10 proceeds to step S706. In step S706, the controller 10 performs processing according to the operation. When no operation other than the scroll operation is detected (step S705, No), step S706 is not executed.

  Thereafter, in step S707, the controller 10 determines whether to change the screen 60 to another screen. The case where the screen 60 is changed to another screen includes, for example, a case where the user performs an operation for switching the screen and a case where the screen is forcibly switched by interrupt processing. When the screen 60 is not changed to another screen (step S707, No), the controller 10 returns to step S702. When the screen 60 is changed to another screen (step S707, Yes), the controller 10 ends the processing procedure shown in FIG. 19.

  The embodiments disclosed in the present application can be modified without departing from the gist and scope of the invention. Furthermore, the embodiments disclosed in the present application and modifications thereof can be combined as appropriate. For example, the above embodiments may be modified as follows.

  For example, each program shown in FIG. 5 may be divided into a plurality of modules, or may be combined with other programs.

  In the above embodiments, a smartphone has been described as an example of an electronic device including a touch screen. However, the electronic device according to the appended claims is not limited to a smartphone, and may be an electronic device other than a smartphone. Examples of electronic devices include, but are not limited to, mobile phones, tablets, portable personal computers, digital cameras, media players, electronic book readers, navigators, and game machines. The electronic device according to the appended claims may also be a stationary electronic device. Examples of stationary electronic devices include, but are not limited to, desktop personal computers, automatic teller machines (ATMs), and television receivers.

  In the above embodiments, the case where the smartphone 1 arranges the second display area 62 on the right side or the lower side of the first display area 61 has been described, but the arrangement of the second display area 62 is not limited to this. For example, the smartphone 1 may arrange the second display area 62 on the upper side or the left side of the first display area 61, in the vicinity of the operation position, or the like.

  In the above embodiments, the case where the second display area 62 is a display area of the screen 60 other than the first display area 61 has been described, but the present invention is not limited to this. For example, the smartphone 1 may be configured so that the first display area 61 is the entire display area of the screen 60 and the second display area 62 is another screen displayed over a part of the screen 60.

  What is displayed in the second display area 62, or whether nothing is displayed there, when the first image is displayed for the first time in the first display area 61 of the screen 60 may be changed as appropriate. For example, when the first image is displayed for the first time in the first display area 61, the first display area 61 and the second display area 62 may be treated as one display area with the first image displayed across both, and when a predetermined operation in the first display area 61 is detected, information related to the position where the operation was detected may be displayed in the second display area 62. Alternatively, when the first image is displayed for the first time in the first display area 61, nothing may be displayed in the second display area 62, and when a predetermined operation in the first display area 61 is detected, information regarding the position where the operation was detected may be displayed in the second display area 62.

  The characterizing embodiments have been described in order to disclose the technology according to the appended claims fully and clearly. However, the appended claims should not be limited to the above-described embodiments, and should be construed to embody all modifications and alternative configurations that those skilled in the art can create within the scope of the basic matters described in this specification.

DESCRIPTION OF SYMBOLS 1 Smartphone 2 Touch screen display 2A Display 2B Touch screen 3 Button 4 Illuminance sensor 5 Proximity sensor 6 Communication unit 7 Receiver 8 Microphone 9 Storage 9A Control program 9B Map application 9C Browser application 9Z Setting data 10 Controller 11 Speaker 12, 13 Camera 14 Connector 15 Acceleration sensor 16 Direction sensor 17 Gyroscope 20 Housing

Claims (13)

  1. An electronic device comprising:
    a touch screen display that displays a screen having a first display area and a second display area; and
    a controller that displays a first image in the first display area and displays, in the second display area, information obtained by enlarging or reducing the vicinity of a position where an operation is detected in the first display area,
    wherein, when the controller detects a pinch operation on the first display area and the detected operation amount of the pinch operation has not reached a preset threshold, the controller enlarges or reduces the information while maintaining the display size of the first image displayed in the first display area, and, when the detected operation amount reaches the threshold, performs control to change the magnification of the first image displayed in the first display area according to the operation amount.
  2. The electronic device according to claim 1, wherein
    the first image has a first object for displaying another image, and
    the controller displays information on the first object in the second display area when the first object is present in the vicinity of the position where the operation is detected.
  3. The electronic device according to claim 2, wherein the controller further displays a second object indicating a position where the operation is detected in the second display area.
  4. The electronic device according to claim 2, wherein the controller causes the second display area to display only the first object among objects near the position where the operation is detected.
  5. The electronic device according to claim 3, wherein the controller displays a list of the plurality of first objects in the second display area when there are a plurality of the first objects near the position where the operation is detected.
  6. The electronic device according to claim 1, wherein
    the operation is an operation of bringing an object close to the first display area, and
    the controller displays the information in the second display area at a magnification according to a distance between the touch screen display and the object brought close to it.
  7. The electronic device according to claim 6, wherein the controller maintains the display of the information when it detects a movement of the approached object in a direction along the surface of the touch screen display.
  8. The electronic device according to claim 1, wherein the controller displays a magnification in the second display area together with the information.
  9. The electronic device according to claim 1, wherein, when the controller detects a pinch operation on the first display area, the controller displays, in the second display area together with the information, a second object indicating the detected operation position of the pinch operation, and controls the first image displayed in the first display area to be enlarged or reduced according to an operation amount of the pinch operation.
  10. The electronic device according to claim 9, wherein, when the pinch operation is detected a plurality of times, the controller displays in the second display area the position where each pinch operation was detected.
  11. The electronic device according to claim 1, wherein
    the first display area and the second display area are scrollable display areas, and
    when the controller detects a scroll operation on the first display area while the information is displayed in the second display area, the controller scrolls the first display area at a first speed corresponding to the detected scroll operation and scrolls the second display area at a second speed different from the first speed.
  12. A method for controlling an electronic device including a touch screen display, the method comprising:
    displaying a screen having a first display area and a second display area on the touch screen display;
    displaying a first image in the first display area;
    displaying, in the second display area, information obtained by enlarging or reducing the vicinity of a position where an operation is detected in the first display area;
    detecting a pinch operation on the first display area; and
    when the detected operation amount of the pinch operation has not reached a preset threshold, enlarging or reducing the information while maintaining the display size of the first image displayed in the first display area, and, when the detected operation amount reaches the threshold, performing control to change the magnification of the first image displayed in the first display area according to the operation amount.
  13. A control program causing an electronic device including a touch screen display to execute:
    displaying a screen having a first display area and a second display area on the touch screen display;
    displaying a first image in the first display area;
    displaying, in the second display area, information obtained by enlarging or reducing the vicinity of a position where an operation is detected in the first display area;
    detecting a pinch operation on the first display area; and
    when the detected operation amount of the pinch operation has not reached a preset threshold, enlarging or reducing the information while maintaining the display size of the first image displayed in the first display area, and, when the detected operation amount reaches the threshold, performing control to change the magnification of the first image displayed in the first display area according to the operation amount.
JP2013246511A 2013-11-28 2013-11-28 Electronic device, control method, and control program Active JP6096100B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2013246511A JP6096100B2 (en) 2013-11-28 2013-11-28 Electronic device, control method, and control program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2013246511A JP6096100B2 (en) 2013-11-28 2013-11-28 Electronic device, control method, and control program

Publications (2)

Publication Number Publication Date
JP2015106181A JP2015106181A (en) 2015-06-08
JP6096100B2 true JP6096100B2 (en) 2017-03-15

Family

ID=53436278

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2013246511A Active JP6096100B2 (en) 2013-11-28 2013-11-28 Electronic device, control method, and control program

Country Status (1)

Country Link
JP (1) JP6096100B2 (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3560500B2 * 1999-06-04 2004-09-02 Fujitsu Ten Ltd Navigation device
JP4481289B2 * 2006-11-14 2010-06-16 Konami Digital Entertainment Co Ltd Item selection device, item selection method, and program
JP5001182B2 * 2008-01-10 2012-08-15 Panasonic Corp Display control apparatus, electronic device, display control method, and program
JP5554972B2 * 2009-11-30 2014-07-23 Micware Co Ltd Map information processing apparatus, map information processing method, and program
JP2011253468A * 2010-06-03 2011-12-15 Aisin Aw Co Ltd Display device, display method and display program
JP2012094008A * 2010-10-27 2012-05-17 Kyocera Corp Portable electronic equipment
JP5828800B2 * 2012-04-23 2015-12-09 Panasonic Intellectual Property Corporation of America Display device, display control method, and program
JP2013050972A * 2012-10-22 2013-03-14 Seiko Epson Corp Portable information apparatus, electronic book and program

Also Published As

Publication number Publication date
JP2015106181A (en) 2015-06-08

Similar Documents

Publication Publication Date Title
EP2924550B1 (en) Split-screen display method and electronic device thereof
US8976129B2 (en) Portable electronic device and method of controlling same
KR101012300B1 (en) User interface apparatus of mobile station having touch screen and method thereof
US8493333B2 (en) Method of displaying information by using touch input in mobile terminal
KR101387270B1 (en) Mobile terminal for displaying menu information accordig to trace of touch signal
CN103181089B (en) The method of controlling a mobile terminal in response to a touch screen and multi-touch input device
US10275151B2 (en) Apparatus and method for cursor control and text selection and editing based on gesture-based touch inputs received in a virtual keyboard display area
CN104272240B (en) Systems and methods for modifying a virtual keyboard on the user interface
US10254927B2 (en) Device, method, and graphical user interface for manipulating workspace views
KR20100041867A (en) Method, apparatus and computer program product for facilitating data entry using an offset connection element
US9304668B2 (en) Method and apparatus for customizing a display screen of a user interface
US8654076B2 (en) Touch screen hover input handling
US9298265B2 (en) Device, method, and storage medium storing program for displaying a paused application
JP2012505444A (en) Portable electronic device and method for controlling portable electronic device
JP5771585B2 (en) apparatus, method, and program
JP6086689B2 (en) Apparatus and program
CN102171639A (en) Live preview of open windows
US9423952B2 (en) Device, method, and storage medium storing program
US9448691B2 (en) Device, method, and storage medium storing program
JP6017891B2 (en) Apparatus, method, and program
US9817544B2 (en) Device, method, and storage medium storing program
US20140068478A1 (en) Data display method and apparatus
JP6194162B2 (en) Apparatus, method, and program
CN103197848A (en) System and method for executing an e-book reading application in an electronic device
RU2677591C2 (en) Apparatus and method for deleting item on touch screen display

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20160216

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20161026

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20161101

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20161227

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20170131

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20170215

R150 Certificate of patent or registration of utility model

Ref document number: 6096100

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150