WO2012077273A1 - Electronic device - Google Patents

Electronic device

Info

Publication number
WO2012077273A1
WO2012077273A1 (PCT/JP2011/005929)
Authority
WO
WIPO (PCT)
Prior art keywords
display
display data
display area
data
area
Prior art date
Application number
PCT/JP2011/005929
Other languages
French (fr)
Japanese (ja)
Inventor
Tatsuya Suzuki (鈴木 達也)
Original Assignee
Panasonic Corporation (パナソニック株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Corporation (パナソニック株式会社)
Publication of WO2012077273A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • the present invention relates to an electronic device provided with detection means such as a touch panel for detecting an operation on a display screen.
  • Patent Document 1 discloses a portable terminal that improves the user's operability by determining the area of the touch panel screen that can be operated with a finger of the user's holding hand and displaying items such as buttons and links within that area.
  • An object of the present invention is to solve the above problems and to provide an electronic device that includes detection means, such as a touch panel, for detecting an operation on a display screen and that offers better operability than the prior art.
  • An electronic device according to the present invention is an electronic device provided with detection means for detecting an operation on a display screen, comprising control means that converts predetermined first display data into at least one piece of second display data, combines the first display data with each piece of second display data, displays the combined display data on the display screen, and detects a predetermined operation within the display area of one of the pieces of second display data via the detection means. The control means executes a predetermined process when the detection means detects, within the display area of one of the pieces of second display data, an operation instructing execution of that process.
  • The control means displays an indicator corresponding to the predetermined operation within the display area of the first display data.
  • When the detection means detects the predetermined operation at a first position within the display area of one piece of the second display data, the control means converts the coordinates of the first position into the coordinates of the second position corresponding to the first position within the display area of the first display data, and displays the indicator at the second position.
  • When the detection means detects a movement operation within the display area of one piece of the second display data, the control means displays the indicator within the display area of the first display data so that the indicator moves in the direction of the movement operation by a movement amount corresponding to that of the movement operation.
  • The control means suppresses display of the second display data and displays only the first display data on the display screen while the process is executed.
  • The control means reduces the first display data to a similar shape and converts it into each piece of second display data.
  • Alternatively, the control means reduces the first display data to a non-similar shape and converts it into each piece of second display data.
  • The control means combines the first display data with each piece of second display data so that at least a part of the display area of the second display data overlaps the display area of the first display data.
  • The control means may instead combine the first display data with each piece of second display data so that the display area of the second display data does not overlap the display area of the first display data.
  • The control means combines the first display data with each piece of second display data so that the display area of the second display data is positioned at the lower part of the display screen.
  • The control means converts the first display data into two pieces of second display data.
  • According to the electronic device of the present invention, predetermined first display data is converted into at least one piece of second display data, the first display data is combined with each piece of second display data, and the combined display data is displayed on the display screen; when the detection means detects, within the display area of one of the pieces of second display data, an operation instructing execution of a predetermined process, the control means executes that process. The user can therefore perform, simply by operating within the display area of the second display data, the same operations as within the display area of the first display data, so an electronic device with better operability than the prior art can be provided.
  • FIG. 4 shows a display example of the display 4 in step S7 of FIG. 3 while the user taps the application icon 74, and FIG. 5 shows a display example of the display 4 in step S7 of FIG. 3 when the user swipes within the display area 12.
  • FIG. 6 shows a display example of the display 4 when the video playback application program is executed in step S12 of FIG. 3.
  • FIG. 7 shows another display example of the display 4 in step S7 of FIG. 3 while the user taps the application icon 74.
  • FIG. 8 shows another display example of the display 4 in step S7 of FIG. 3 when the user double-taps the application icon 74.
  • FIG. 9 shows a display example of the display areas 11A and 12A according to the second embodiment of the present invention.
  • FIG. 10 shows a display example of the display areas 11B and 12B according to the first modification of the second embodiment of the present invention.
  • FIG. 12 is a flowchart showing step S3A of the control processing according to the third embodiment of the present invention; FIG. 13 shows a display example of the display 4 in step S3A of FIG. 12; FIG. 14 shows a display example of the icon IC2 according to the fourth embodiment; FIG. 15 shows a display example of the display areas 11, 12R and 12L according to the fifth embodiment; and FIG. 16 is a block diagram showing the configuration of an information terminal device 100A according to the sixth embodiment.
  • In the present embodiment and the following embodiments, the electronic device according to the present invention is described using as an example the information terminal device 100, which includes the touch panel 2 for detecting user operations on the display screen of the display 4.
  • FIG. 1 is a block diagram showing the configuration of the information terminal device 100 according to the first embodiment of the present invention
  • FIG. 2 is a front view of the information terminal device 100 of FIG.
  • an information terminal device 100 includes a CPU (Central Processing Unit) 1, a touch panel 2, an interface 3, a display (also called a monitor) 4, a ROM (Read Only Memory) 5, a DRAM (Dynamic Random Access Memory) 6, buttons 7 to 9, and a housing 10 (see FIG. 2).
  • As will be described in detail later, the information terminal device 100 is an electronic device that includes the touch panel 2 for detecting operations on the display screen of the display 4, together with a CPU 1 that converts predetermined first display data into second display data, combines the first display data with the second display data, displays the combined display data on the display screen of the display 4, and detects a predetermined operation within the display area 12 of the second display data via the touch panel 2.
  • When the touch panel 2 detects, within the display area 12, an operation instructing execution of a predetermined process, the CPU 1 executes that process. The CPU 1 also displays a finger icon IC1 corresponding to the detected operation within the display area 11 of the first display data.
  • the ROM 5 stores in advance the various software programs that are required for the operation of the information terminal device 100 and executed by the CPU 1.
  • the DRAM 6 is used as the working area of the CPU 1; when the CPU 1 executes a program stored in the ROM 5, the DRAM 6 holds the executable program for the corresponding function, the data needed to run it, and temporary data generated during execution.
  • the interface 3 performs predetermined interface processing related to video display, such as signal conversion, on display data from the CPU 1 and outputs the processed display data to the display 4 for display.
  • the display 4 is a display device such as a liquid crystal display (LCD) that is provided on the front surface of the housing 10 of the information terminal device 100 and functions as the display device for various GUI (Graphical User Interface) programs.
  • the touch panel 2 is a detection unit that detects a user operation on the display screen of the display 4, and includes a transparent film provided on the surface of the display 4 and a touch detection unit.
  • the touch detection means detects the position and movement of the user's finger touching the film and, based on the detected position and movement, generates a detection signal S2 containing the finger coordinates (xf, yf) and the operation content, which it outputs to the CPU 1.
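The detection signal S2 thus bundles the finger coordinates with the recognized operation content. A minimal sketch of such a record, with hypothetical field names (the patent does not specify a data layout):

```python
from dataclasses import dataclass

@dataclass
class DetectionSignal:
    """Stand-in for the detection signal S2: finger coordinates
    (xf, yf) in the xy coordinate system of the display screen,
    plus the classified operation content."""
    xf: int
    yf: int
    operation: str  # e.g. "tap", "double_tap" or "swipe"
```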
  • As shown in FIG. 2, the upper left corner of the display screen of the display 4 is defined as the origin O1 of the xy coordinate system, the rightward direction in FIG. 2 as the x-axis direction, and the downward direction as the y-axis direction.
  • the user's operations include a tap (also called a click), in which the touch panel 2 is struck once, a double tap, in which two taps are made in succession, and a swipe, in which the finger slides across the touch panel 2 while touching it.
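How the touch detection means distinguishes these three operations is not spelled out; the sketch below classifies a press/release pair using hypothetical distance and time thresholds:

```python
SWIPE_MIN_DISTANCE = 10.0  # pixels; hypothetical threshold
DOUBLE_TAP_WINDOW = 0.3    # seconds; hypothetical threshold

def classify(press, release, prev_tap_time=None):
    """Classify one press/release pair into a tap, double tap, or
    swipe. `press` and `release` are (x, y, t) tuples; `prev_tap_time`
    is the time of the preceding tap, if any."""
    (x0, y0, t0), (x1, y1, _) = press, release
    distance = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    if distance >= SWIPE_MIN_DISTANCE:
        return "swipe"       # finger slid while touching the panel
    if prev_tap_time is not None and t0 - prev_tap_time <= DOUBLE_TAP_WINDOW:
        return "double_tap"  # two taps in quick succession
    return "tap"             # a single strike of the panel
```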
  • the information terminal device 100 further includes a button 7 for turning the information terminal device 100 on and off, a button 8 for displaying a predetermined menu screen on the display 4, and a button 9 for displaying on the display 4 a display area 12 (also called a sub-screen), described in detail later.
  • the CPU 1 is connected to and controls the touch panel 2, the interface 3, the ROM 5, and the DRAM 6, and executes various software functions based on the detection signal S2 from the touch panel 2 and on information indicating whether the buttons 7 to 9 have been operated.
  • the user can operate the information terminal device 100 by operating the buttons 7 to 9 or touching the touch panel 2 with a finger.
  • FIG. 3 is a flowchart showing a control process executed by the CPU 1 of FIG.
  • The CPU 1 of FIG. 1 executes the control processing of FIG. 3 when, for example, the user operates the button 8.
  • First, in step S1, the CPU 1 displays, over the entire display screen of the display 4, the first display data of a menu screen containing application icons 51 to 61 (see FIG. 2) for instructing execution of the programs of applications such as a video display application and a character input application.
  • Next, in step S2, the CPU 1 determines whether an operation instructing display of the second display data, described later, has been performed on the touch panel 2 or the button 9; if YES, the process proceeds to step S3, and if NO, the determination of step S2 is repeated. For example, the CPU 1 judges YES in step S2 when the application icon 51 displayed on the display 4 is tapped and when the button 9 is operated.
  • In step S3 of FIG. 3, the CPU 1 reduces the first display data to a similar shape and converts it into the second display data; the display content of the first display data, such as the application icons 51 to 61, is reduced as-is at the same aspect ratio.
  • In step S4, which follows step S3, the first display data is combined with the second display data and the combined display data is displayed on the display 4. Specifically, the CPU 1 combines the first display data with the second display data so that the first display data is shown in the display area 11 covering the entire display screen of the display 4 and the second display data in the display area 12 in the lower right corner of the screen (the entire display area 12 hiding a part of the display area 11), and generates the combined display data.
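A minimal Pillow-based sketch of steps S3 and S4, assuming the display data can be treated as an image; the scale factor is a hypothetical choice, and the patent does not prescribe any particular implementation:

```python
from PIL import Image

def compose_subscreen(first: Image.Image, scale: float = 0.35) -> Image.Image:
    """Reduce the first display data at the same aspect ratio (the
    'similar shape' reduction of step S3) and paste the copy into the
    lower right corner, where it hides that part of display area 11
    (the combination of step S4)."""
    w, h = first.size
    second = first.resize((int(w * scale), int(h * scale)))        # step S3
    combined = first.copy()                                        # area 11
    combined.paste(second, (w - second.width, h - second.height))  # area 12
    return combined
```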
  • FIG. 2 shows a display example of the display 4 in step S4 of FIG. 3.
  • application icons 51 to 61 are displayed in the display area 11 corresponding to the entire display screen of the display 4.
  • application icons 71 to 81, corresponding respectively to the application icons 51 to 61, are displayed in the display area 12 in the lower right corner of the display area 11. The user can thus tap the application icons 71 to 81 in the display area 12 while holding the information terminal device 100 with the right hand.
  • In step S5, the CPU 1 determines, based on the detection signal S2 from the touch panel 2, whether the finger position on the touch panel 2 is within the display area 12; if YES, the process proceeds to step S6.
  • If NO, another touch detection process is executed in step S11 based on the detection signal S2, and the process then returns to step S5.
  • This other touch detection process is, for example, a process that, when the finger position on the touch panel 2 is on one of the application icons 51 to 61, executes the application software program corresponding to the application icon at the finger position.
  • In step S6, the CPU 1 converts the coordinates (xf, yf) of the finger in the xy coordinate system of the display area 11 into the coordinates (pf, qf) in the pq coordinate system of the display area 12.
  • As shown in FIG. 2, the upper left corner of the display area 12 is defined as the origin O2 of the pq coordinate system, the rightward direction in FIG. 2 as the p-axis direction, and the downward direction as the q-axis direction.
  • Then, in step S7, a finger icon IC1, indicating the position in the display area 11 that corresponds to the actual finger position in the display area 12, is displayed at the coordinates (pf, qf) interpreted in the xy coordinate system.
  • For example, if the coordinates (pf, qf) in the pq coordinate system are (50, 100), the finger icon IC1 is displayed at the coordinates (50, 100) in the xy coordinate system.
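A minimal sketch of the two conversions of steps S6 and S7; the origin of display area 12 and the reduction scale are hypothetical parameters:

```python
def xy_to_pq(xf, yf, origin2):
    """Step S6: convert screen coordinates (xf, yf) into sub-screen
    coordinates (pf, qf), where origin2 is the (x, y) position of the
    upper left corner O2 of display area 12."""
    return xf - origin2[0], yf - origin2[1]

def icon_position(pf, qf, scale=1.0):
    """Step S7: position of the finger icon IC1 in the xy system.
    The patent's worked example places IC1 directly at (pf, qf),
    i.e. scale == 1.0; mapping a point of the reduced copy onto the
    geometrically corresponding point of the full screen would divide
    by the reduction ratio, which the hypothetical `scale` covers."""
    return pf / scale, qf / scale

# With the example values: pq (50, 100) -> IC1 at xy (50, 100).
assert icon_position(50, 100) == (50.0, 100.0)
```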
  • In step S8, the CPU 1 determines, based on the detection signal S2, whether one of the application icons 71 to 81 has been tapped; if YES, the process proceeds to step S9, and if NO, it returns to step S5. In step S9, the CPU 1 determines whether the tapped application icon corresponds to the character input application; if YES, the process proceeds to step S10, and if NO, to step S12. Note that, in the case of NO in steps S5 and S8, the CPU 1 repeatedly executes the processes of steps S6 and S7 at predetermined time intervals.
  • FIG. 4 is a diagram illustrating a display example of the display 4 in step S7 of FIG. 3 when the user taps the application icon 74.
  • FIG. 5 is a diagram illustrating a display example of the display 4 in step S7 of FIG. 3 when the user swipes within the display area 12.
  • In FIG. 4, the finger icon IC1 is displayed at a fixed position on the application icon 54, which corresponds to the tapped application icon 74.
  • In FIG. 5, the finger icon IC1 moves within the display area 11, its position being updated at predetermined time intervals.
  • In step S10 of FIG. 3, the CPU 1 suppresses display of the second display data, displays only the first display data on the display 4, starts and executes the character input application program corresponding to the tapped application icon, and returns to step S1.
  • In step S12, the CPU 1 starts and executes the application program corresponding to the tapped application icon while keeping the second display data displayed in the display area 12, and returns to step S1.
  • In step S12, the CPU 1 continues to display the finger icon IC1 by repeatedly executing the processes of steps S6 and S7 at predetermined time intervals.
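A minimal sketch of the branch across steps S9, S10, and S12, with hypothetical names for the application object and the callbacks:

```python
def launch_tapped_app(app, set_subscreen_visible, run_program):
    """Hide the sub-screen only for the character input application,
    where display area 12 would interfere with typing (step S10);
    otherwise keep it visible while the application runs (step S12)."""
    if app.kind == "character_input":   # step S9: YES
        set_subscreen_visible(False)    # step S10: only area 11 remains
    else:                               # step S9: NO
        set_subscreen_visible(True)     # step S12: area 12 stays shown
    run_program(app)                    # start the tapped application
```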
  • FIG. 6 is a diagram showing a display example of the display 4 when the video playback application program is executed in step S12 of FIG. 3.
  • As described above, the CPU 1 reduces the first display data to a similar shape and converts it into the second display data (see step S3 in FIG. 3), and combines the first display data with the second display data so that the entire display area 12 of the second display data overlaps the display area 11 of the first display data (see step S4 in FIG. 3). More specifically, as shown in FIGS. 2 and 4 to 6, the display screen of the display 4 contains the display area 11, which corresponds to the entire display screen and shows the first display data, and the display area 12, which shows the second display data obtained by reducing the first display data to a similar shape.
  • The display area 12 is arranged over the display area 11 so as to hide its lower right corner. The display area 11 shows characters, video, the application icons 51 to 61 for instructing application start-up, the finger icon IC1, and the like, while the display area 12 shows characters, video, the corresponding application icons 71 to 81, and the like.
  • When the CPU 1 detects via the touch panel 2 that the user has tapped within the display area 12, it converts the finger coordinates (xf, yf) in the xy coordinate system into the coordinates (pf, qf) in the pq coordinate system of the display area 12, and then displays the finger icon IC1, indicating the position in the display area 11 that corresponds to the actual finger position in the display area 12, at the coordinates (pf, qf) interpreted in the xy coordinate system. That is, the CPU 1 converts the actual finger position in the display area 12 into the corresponding position in the display area 11 and displays the finger icon IC1 at the converted position.
  • Thus, the user can operate the application icons 71 to 81 in the display area 12 while watching the finger icon IC1 in the display area 11.
  • In the portable terminal described in Patent Document 1, items such as buttons and links are gathered and displayed near the finger, so the displayed items are hard to see and hard to operate. In the present embodiment, by contrast, the user does not need to look at the application icons 71 to 81 in the display area 12 while operating, since the finger icon IC1 in the display area 11 shows the operation, and operability is therefore improved.
  • The finger position is measured at predetermined time intervals, converted into a position in the display area 11, and the finger icon IC1 is redisplayed at each interval, so the user can operate within the display area 12 while observing the movement of the finger icon IC1 in the display area 11. In other words, an operation performed in the display area 12 acts as an operation in the display area 11. Furthermore, since the display area 12 is displayed at the bottom of the display 4, the user can operate the information terminal device 100 by touching the display area 12, located under the fingers of the holding hand, while holding the device with one hand. The user can therefore swipe, a characteristic touch-panel operation, or start an application by tapping the application icons 71 to 81, all while holding the information terminal device 100 with one hand.
  • When the CPU 1 detects via the touch panel 2 an operation in the display area 12 instructing execution of the character input application (for example, a tap on the application icon corresponding to the character input application among the application icons 71 to 81), it executes the character input application program (see step S10 in FIG. 3). At this time, the CPU 1 suppresses display of the second display data and displays only the first display data on the display screen of the display 4; the display area 12 is therefore not shown while the character input application runs and does not interfere with character input. When executing an application other than a character input application, such as a video playback application, the CPU 1 keeps displaying the display area 12 (see step S12 in FIG. 3), so the user can, for example, touch the display area 12 to perform operations such as fast-forwarding.
  • In the present embodiment, the finger icon IC1 is displayed on the display 4, but the present invention is not limited to this.
  • For example, the CPU 1 may detect a user operation such as a swipe or a tap in the display area 12 and display an indicator corresponding to the operation in the display area 11.
  • FIG. 7 is a diagram illustrating another display example of the display 4 in step S7 of FIG. 3 while the user taps the application icon 74; in FIG. 7, an arrow-shaped cursor IC2 is displayed instead of the finger icon IC1.
  • FIG. 8 is a diagram showing another display example of the display 4 in step S7 of FIG. 3 when the user double-taps the application icon 74; in FIG. 8, a finger icon IC3 with a star indicating the double tap is displayed when the user double-taps an icon.
  • In the present embodiment, the first display data is a menu screen containing the application icons 51 to 61, and the CPU 1, on detecting that one of the application icons 71 to 81 has been tapped, executes the application program corresponding to the tapped icon; however, the present invention is not limited to this.
  • It suffices that the CPU 1 executes a process when an operation instructing execution of that process is detected within the display area 12.
  • For example, the first display data may be display data containing moving-image display data together with playback and stop buttons for the moving image; in that case, when the CPU 1 detects that the stop button displayed in the display area 12 has been tapped, it executes a process that stops the moving-image display.
  • In the present embodiment, the CPU 1 suppresses display of the second display data when executing the character input application, but the present invention is not limited to this. For example, whether to display the second display data may be determined according to the type of application to be executed. The CPU 1 may also suppress display of the second display data when it detects that the user has tapped an area outside the display area 12.
  • In the first embodiment, the CPU 1 combines the first display data with the second display data so that the entire display area 12 hides a part of the display area 11.
  • The second embodiment differs only in that the CPU 1 combines the first display data with the second display data so that the display area 12A of the second display data does not overlap the display area 11A of the first display data; the rest of the configuration and operation are the same as in the first embodiment.
  • FIG. 9 is a diagram showing a display example of the display areas 11A and 12A according to the second embodiment of the present invention.
  • In FIG. 9, the CPU 1 reduces the first display data to a similar shape and converts it into the second display data, displays the first display data in the display area 11A in the upper left part of the display 4 and the second display data in the display area 12A in the lower right corner of the display 4, combines the first display data with the second display data so that the display area 12A does not overlap the display area 11A, and displays the combined display data on the display screen of the display 4.
  • Also in this case, the user can perform, by operating within the display area 12A, the same operations as within the display area 11A.
  • FIG. 10 is a diagram showing a display example of the display areas 11B and 12B according to the first modification of the second embodiment of the present invention.
  • This modification differs from the second embodiment only in that the CPU 1 combines the first display data with the second display data so that a part of the display area 12B overlaps a part of the display area 11B; the rest of the configuration and operation are the same as in the second embodiment.
  • In FIG. 10, the CPU 1 reduces the first display data to a similar shape and converts it into the second display data, displays the first display data in the display area 11B in the upper left part of the display 4 and the second display data in the display area 12B in the lower right corner of the display 4, combines them so that a part of the display area 12B overlaps a part of the display area 11B, and displays the combined display data on the display 4.
  • Also in this case, the user can perform, by operating within the display area 12B, the same operations as within the display area 11B.
  • FIG. 11 is a diagram showing a display example of the display areas 11C and 12C according to the second modification of the second embodiment of the present invention.
  • This modification differs from the second embodiment only in that the CPU 1 displays the first and second display data in display areas 11C and 12C that have the same size and do not overlap each other; the rest of the configuration and operation are the same as in the second embodiment.
  • In FIG. 11, the CPU 1 converts the first display data into second display data having the same shape as the first display data, displays the first display data in the display area 11C in the upper left part of the display 4 and the second display data in the display area 12C in the lower right corner of the display 4, combines the first display data with the second display data so that the display area 12C does not overlap the display area 11C, and displays the combined display data on the display 4.
  • Also in this case, the user can perform, by operating within the display area 12C, the same operations as within the display area 11C.
  • In this case, display data having the same display size as the entire display area of the display 4 may be converted in advance, by similar-shape reduction, into first display data having the same display size as the display area 11A, 11B, or 11C.
  • In the above embodiments, the CPU 1 reduced the first display data to a similar shape and converted it into the second display data, or converted it into second display data having the same shape as the first display data; however, the present invention is not limited to this. It suffices that an operation in the display area of the first display data can be performed by the user operating within the display area of the second display data.
  • FIG. 12 is a flowchart showing step S3A of the control process according to the third embodiment of the present invention.
  • The control processing executed by the CPU 1 in the present embodiment replaces step S3 of the control processing of FIG. 3 with step S3A; the other steps S1, S2, and S4 to S12 are the same as in FIG. 3.
  • In step S3A, the CPU 1 reduces the first display data to a non-similar shape and converts it into the second display data; the display content of the first display data, such as the application icons 51 to 61, is reduced with its proportions changed.
  • In step S4 following step S3A, the first display data is combined with the second display data, and the combined display data is displayed on the display 4.
  • Specifically, the CPU 1 combines the first display data with the second display data so that the first display data is shown in the display area 11D at the top of the display screen of the display 4 and the second display data in the display area 12D at the bottom of the display screen (the display areas 11D and 12D do not overlap), and generates the combined display data.
  • FIG. 13 is a diagram showing a display example of the display 4 in step S3A of FIG. 12.
  • In FIG. 13, the application icons 51, 52, 53, 54, and so on are arranged in a grid pattern in the display area 11D, while the corresponding application icons 71, 72, 73, 74, and so on are rearranged into a single row in the display area 12D.
  • When the finger coordinates (xf, yf) fall on one of the application icons 71, 72, 73, and so on in the display area 12D, the CPU 1 converts the finger coordinates (xf, yf) into predetermined coordinates within the corresponding application icon 51, 52, 53, and so on in the display area 11D, and displays an indicator such as the finger icon IC1 at the converted coordinates.
  • Also in this case, the user can perform, by operating within the display area 12D, the same operations as within the display area 11D.
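A minimal sketch of this non-similar mapping: the i-th icon of the single row in display area 12D is mapped back to the centre of the matching icon in the grid of display area 11D. The column count and cell sizes are hypothetical layout parameters:

```python
def row_to_grid_center(i, columns, cell_w, cell_h):
    """Map the i-th icon of the one-line layout (area 12D) onto the
    centre of the corresponding icon in the grid layout (area 11D)."""
    col, row = i % columns, i // columns
    return ((col + 0.5) * cell_w, (row + 0.5) * cell_h)

# With a 4-column grid of 100x100 cells, the 5th icon in the row
# corresponds to the first icon of the second grid row.
assert row_to_grid_center(4, columns=4, cell_w=100, cell_h=100) == (50.0, 150.0)
```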
  • In the present embodiment, the display area 12D does not overlap the display area 11D, but the present invention is not limited to this; at least a part of the display area 12D may overlap the display area 11D.
  • FIG. 14 is a diagram showing a display example of the icon IC2 according to the fourth embodiment of the present invention.
  • In the above embodiments, the CPU 1 made the actual finger positions in the display areas 12, 12A, 12B, 12C, and 12D correspond one-to-one with the position of an indicator such as the finger icon IC1 in the display areas 11, 11A, 11B, 11C, and 11D.
  • In the present embodiment, by contrast, when the CPU 1 detects a finger movement operation in the display area 12 based on the detection signal S2 from the touch panel 2, it displays the cursor IC2 in the display area 11 so that the cursor IC2 moves in the direction of the detected movement operation by a movement amount corresponding to the detected amount of movement.
  • Thus, the user can use the display area 12 as a touch pad.
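A minimal sketch of this relative, touch-pad-style behavior; the gain (the movement-amount ratio) is a hypothetical parameter:

```python
class TouchpadCursor:
    """Display area 12 acting as a touch pad that drives cursor IC2."""

    def __init__(self, x=0.0, y=0.0, gain=1.0):
        self.x, self.y = x, y  # position of cursor IC2 in display area 11
        self.gain = gain       # hypothetical ratio of cursor to finger movement

    def on_move(self, dx, dy):
        """Apply a movement operation (dx, dy) detected in area 12:
        the cursor moves in the same direction by a proportional amount."""
        self.x += dx * self.gain
        self.y += dy * self.gain
        return self.x, self.y
```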
  • In the above embodiments, the first display data is converted into the second display data and displayed in the display area 12; however, the present invention is not limited to this, and display data other than the first and second display data may be displayed in the display area 12.
  • FIG. 15 is a diagram showing a display example of the display areas 11, 12R and 12L according to the fifth embodiment of the present invention.
  • In the above embodiments, the CPU 1 converted the first display data into one piece of second display data.
  • In the present embodiment, the CPU 1 reduces the first display data to a similar shape and converts it into two pieces of second display data. The CPU 1 then combines the first display data with the two pieces of second display data so that the first display data is displayed in the display area 11 covering the entire display screen of the display 4, one piece of second display data is displayed in the display area 12R in the lower right corner of the display 4, and the other piece is displayed in the display area 12L in the lower left corner of the display 4.
  • The display areas 12R and 12L are used for operation with the right hand and the left hand, respectively.
  • In the present embodiment, the touch panel 2 detects pressure in addition to the finger coordinates (xf, yf) and the operation content, and generates and outputs to the CPU 1 a detection signal S2 containing the finger coordinates (xf, yf), the operation content, and the pressure. Based on the detection signal S2 from the touch panel 2, the CPU 1 selects whichever of the display areas 12R and 12L is pressed harder and, based on the user's operation within the selected display area, displays the finger icon IC1 in the display area 11 as in the first embodiment (see FIG. 15).
  • Because the operation in whichever of the display areas 12R and 12L is pressed harder is adopted, only the operation of one finger becomes active even if the user touches the display 4 with fingers of both hands. The information terminal device 100 can therefore be operated with a finger of either hand.
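A minimal sketch of this pressure-based arbitration, assuming the detection signal reports a pressure value per touched area; the names are hypothetical:

```python
def select_active_area(pressures):
    """Of the simultaneously touched areas (e.g. '12L' and '12R'),
    adopt only the one pressed harder; returns None if neither is
    touched."""
    return max(pressures, key=pressures.get) if pressures else None

# Example: the right-hand area wins when it is pressed harder.
assert select_active_area({"12L": 0.3, "12R": 0.8}) == "12R"
```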
  • In the present embodiment, the CPU 1 selects whichever of the display areas 12R and 12L is pressed harder and displays the finger icon IC1 based on the user's operation within the selected display area; however, the present invention is not limited to this.
  • For example, the CPU 1 may select whichever of the display areas 12R and 12L is touched first, or the display area selected by a predetermined operation such as operating the button 9.
  • FIG. 16 is a block diagram showing a configuration of an information terminal device 100A according to the sixth embodiment of the present invention.
  • In FIG. 16, the information terminal device 100A includes display instruction means 200, display control means 201, operation means 202, a monitor 102, conversion means 203, and processing means 204.
  • The monitor 102 corresponds to the interface 3 and the display 4 in FIG. 1, the operation means 202 corresponds to the touch panel 2 in FIG. 1, and the display instruction means 200 corresponds to the buttons 7 to 9 and the touch panel 2 in FIG. 1.
  • The display control means 201, the conversion means 203, and the processing means 204 correspond to the CPU 1 in FIG. 1 and execute the control processing of the CPU 1 according to each of the above embodiments and their modifications.
  • When the user operates with a finger within the display area 11, 11A, 11B, 11C, 11D, 12, 12A, 12B, 12C, 12D, 12L, or 12R of the display screen of the monitor 102, the operation means 202 detects the position and movement of the finger and determines the user's operation content from the detected position and movement. For example, when the user clicks the application icon 51 displayed in the display area 11 of FIG. 2, the operation means 202 determines that the user's operation content is an instruction to start the application corresponding to the application icon 51.
  • The processing means 204 executes the process corresponding to the operation content determined by the operation means 202; when the user's operation content is an instruction to start the application corresponding to the application icon 51, the program of that application is actually executed.
  • When the user instructs display of the display areas 12, 12A, 12B, 12C, 12D, or 12L and 12R, the display instruction means 200 notifies the display control means 201 of the instruction. For example, the display area 12 is displayed when the user touches a button for displaying it (for example, the application icon 52 of FIG. 2) shown in the display area 11.
  • When notified by the display instruction means 200 to display the display areas 12, 12A, 12B, 12C, 12D, or 12L and 12R, the display control means 201 displays them on the display screen of the monitor 102; for example, the display area 12 is displayed on the display screen of the monitor 102 so as to hide a part of the display area 11.
  • The conversion means 203 generates the second display data to be displayed in the display area 12 by reducing the display content of the display area 11 as it is.
  • This embodiment has the same effects as the first to fifth embodiments and their modifications.
  • In the above embodiments, the second display data is displayed in the display areas 12, 12A, 12B, 12C, 12D, 12L, and 12R at the lower part of the display 4, but the present invention is not limited to this; the second display data may be displayed in any area of the display 4.
  • However, by combining the first display data with the second display data so that the display area of the second display data is located at the lower part of the display screen of the display 4, the user can touch the display area of the second display data with the holding hand.
  • the present invention has been described by taking the information terminal device 100 as an example, but the present invention is not limited to this.
  • The present invention can be applied to any electronic device provided with detection means, such as the touch panel 2, that detects operations on a display screen such as the display 4, and makes it possible to operate the entire display screen by operating a sub-screen displayed at the position of the fingers holding the electronic device. For this reason, according to the present invention, even a tablet-type electronic device that a user would generally be assumed to hold can be operated while simply being held.
  • As described in detail above, according to the electronic device of the present invention, predetermined first display data is converted into at least one piece of second display data, the first display data is combined with each piece of second display data, the combined display data is displayed on the display screen, and the control means executes a predetermined process when the detection means detects, within the display area of one of the pieces of second display data, an operation instructing execution of that process. The user can therefore perform, simply by operating within the display area of the second display data, the same operations as within the display area of the first display data, so an electronic device with better operability than the prior art can be provided.
  • 1 ... CPU; 2 ... touch panel; 3 ... interface; 4 ... display; 5 ... ROM; 6 ... DRAM; 7 to 9 ... buttons; 10 ... housing; 11, 11A, 11B, 11C, 11D, 12, 12A, 12B, 12C, 12D, 12L, 12R ... display areas; 51 to 61, 71 to 81 ... application icons; 100, 100A ... information terminal device; 102 ... monitor; 200 ... display instruction means; 201 ... display control means; 202 ... operation means; 203 ... conversion means; 204 ... processing means; IC1, IC3 ... finger icons; IC2 ... cursor.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

First display data including application icons (51 to 61) is converted into second display data by reducing its size while maintaining a similar shape, and the first and second display data are combined such that the first display data is displayed in a display area (11) and the second display data in a display area (12). When a touch is detected in the display area (12), the touched position is converted into the corresponding position in the display area (11), and a finger icon (IC1) is displayed at the converted position.

Description

Electronic device
The present invention relates to an electronic device provided with detection means, such as a touch panel, for detecting an operation on a display screen.
Many electronic devices equipped with touch panels have been commercialized, and in recent years the size of the touch panels in such devices has been increasing. As a touch panel grows larger, the range over which the user must move the hand becomes wider than on a conventional small electronic device with a relatively small touch panel, and usability deteriorates.
Patent Document 1 discloses a portable terminal that improves the user's operability by determining the area of the touch panel screen that can be operated with a finger of the user's holding hand and displaying items such as buttons and links within that area.
JP 2001-79442 A.
In the portable terminal described in Patent Document 1, items such as buttons and links are gathered and displayed near the finger, so finger operations characteristic of a touch panel, such as swiping, cannot be performed. Moreover, the gathered items are hard to see, so operability is not necessarily improved.
An object of the present invention is to solve the above problems and to provide an electronic device that includes detection means, such as a touch panel, for detecting an operation on a display screen and that offers better operability than the prior art.
An electronic device according to the present invention is an electronic device provided with detection means for detecting an operation on a display screen, comprising control means that converts predetermined first display data into at least one piece of second display data, combines the first display data with each piece of second display data, displays the combined display data on the display screen, and detects a predetermined operation within the display area of one of the pieces of second display data via the detection means. The control means executes a predetermined process when the detection means detects, within the display area of one of the pieces of second display data, an operation instructing execution of that process.
In the electronic device, the control means displays an indicator corresponding to the predetermined operation within the display area of the first display data.
In the electronic device, when the detection means detects the predetermined operation at a first position within the display area of one piece of the second display data, the control means converts the coordinates of the first position into the coordinates of the second position corresponding to the first position within the display area of the first display data, and displays the indicator at the second position.
Furthermore, in the electronic device, when the detection means detects a movement operation within the display area of one piece of the second display data, the control means displays the indicator within the display area of the first display data so that the indicator moves in the direction of the movement operation by a movement amount corresponding to that of the movement operation.
Still further, in the electronic device, the control means suppresses display of the second display data and displays only the first display data on the display screen while the process is executed.
In the electronic device, the control means reduces the first display data to a similar shape and converts it into each piece of second display data.
Further, in the electronic device, the control means may reduce the first display data to a non-similar shape and convert it into each piece of second display data.
Still further, in the electronic device, the control means combines the first display data with each piece of second display data so that at least a part of the display area of the second display data overlaps the display area of the first display data.
In the electronic device, the control means may instead combine the first display data with each piece of second display data so that the display area of the second display data does not overlap the display area of the first display data.
Further, in the electronic device, the control means combines the first display data with each piece of second display data so that the display area of the second display data is positioned at the lower part of the display screen.
Still further, in the electronic device, the control means converts the first display data into two pieces of second display data.
According to the electronic device of the present invention, predetermined first display data is converted into at least one piece of second display data, the first display data is combined with each piece of second display data, and the combined display data is displayed on the display screen; when the detection means detects, within the display area of one of the pieces of second display data, an operation instructing execution of a predetermined process, the control means executes that process. The user can therefore perform, simply by operating within the display area of the second display data, the same operations as within the display area of the first display data, so an electronic device with better operability than the prior art can be provided.
FIG. 1 is a block diagram showing the configuration of the information terminal device 100 according to the first embodiment of the present invention. FIG. 2 is a front view of the information terminal device 100 of FIG. 1. FIG. 3 is a flowchart showing the control processing executed by the CPU 1 of FIG. 1. FIG. 4 shows a display example of the display 4 in step S7 of FIG. 3 while the user taps the application icon 74. FIG. 5 shows a display example of the display 4 in step S7 of FIG. 3 when the user swipes within the display area 12. FIG. 6 shows a display example of the display 4 when the video playback application program is executed in step S12 of FIG. 3. FIG. 7 shows another display example of the display 4 in step S7 of FIG. 3 while the user taps the application icon 74. FIG. 8 shows another display example of the display 4 in step S7 of FIG. 3 when the user double-taps the application icon 74. FIG. 9 shows a display example of the display areas 11A and 12A according to the second embodiment of the present invention. FIG. 10 shows a display example of the display areas 11B and 12B according to the first modification of the second embodiment. FIG. 11 shows a display example of the display areas 11C and 12C according to the second modification of the second embodiment. FIG. 12 is a flowchart showing step S3A of the control processing according to the third embodiment. FIG. 13 shows a display example of the display 4 in step S3A of FIG. 12. FIG. 14 shows a display example of the icon IC2 according to the fourth embodiment. FIG. 15 shows a display example of the display areas 11, 12R and 12L according to the fifth embodiment. FIG. 16 is a block diagram showing the configuration of the information terminal device 100A according to the sixth embodiment.
Embodiments according to the present invention will be described below with reference to the drawings. The same components are denoted by the same reference numerals.
First embodiment.
In the present embodiment and each of the following embodiments, the electronic device according to the present invention is described using as an example the information terminal device 100, which includes the touch panel 2 for detecting user operations on the display screen of the display 4.
FIG. 1 is a block diagram showing the configuration of the information terminal device 100 according to the first embodiment of the present invention, and FIG. 2 is a front view of the information terminal device 100 of FIG. 1. In FIG. 1, the information terminal device 100 includes a CPU (Central Processing Unit) 1, a touch panel 2, an interface 3, a display (also called a monitor) 4, a ROM (Read Only Memory) 5, a DRAM (Dynamic Random Access Memory) 6, buttons 7 to 9, and a housing 10 (see FIG. 2).
As will be described in detail later, the information terminal device 100 is an electronic device that includes the touch panel 2 for detecting operations on the display screen of the display 4, together with a CPU 1 that converts predetermined first display data into second display data, combines the first display data with the second display data, displays the combined display data on the display screen of the display 4, and detects a predetermined operation within the display area 12 of the second display data via the touch panel 2. When the touch panel 2 detects, within the display area 12, an operation instructing execution of a predetermined process, the CPU 1 executes that process. The CPU 1 also displays the finger icon IC1 corresponding to the detected operation within the display area 11 of the first display data.
In FIG. 1, the ROM 5 stores in advance the various software programs that are required for the operation of the information terminal device 100 and executed by the CPU 1. The DRAM 6 is used as the working area of the CPU 1; when the CPU 1 executes a program stored in the ROM 5, the DRAM 6 holds the executable program for the corresponding function, the data needed to run it, and temporary data generated during execution.
In FIG. 1, the interface 3 performs predetermined interface processing related to video display, such as signal conversion, on display data from the CPU 1 and outputs the processed display data to the display 4 for display. In FIGS. 1 and 2, the display 4 is a display device such as a liquid crystal display (LCD), is provided on the front surface of the housing 10 of the information terminal device 100, and functions as the display device for various GUI (Graphical User Interface) programs. In FIG. 1, the touch panel 2 is detection means for detecting user operations on the display screen of the display 4 and comprises a transparent film provided on the surface of the display 4 and touch detection means. The touch detection means detects the position and movement of the user's finger touching the film and, based on the detected position and movement, generates a detection signal S2 containing the finger coordinates (xf, yf) and the operation content, which it outputs to the CPU 1. As shown in FIG. 2, the upper left corner of the display screen of the display 4 is defined as the origin O1 of the xy coordinate system, the rightward direction in FIG. 2 as the x-axis direction, and the downward direction as the y-axis direction. The user's operations include a tap (also called a click), in which the touch panel 2 is struck once, a double tap, in which two taps are made in succession, and a swipe, in which the finger slides across the touch panel 2 while touching it.
Further, in FIGS. 1 and 2, the information terminal device 100 includes a button 7 for turning the information terminal device 100 on and off, a button 8 for displaying a predetermined menu screen on the display 4, and a button 9 for displaying on the display 4 a display area 12 (also referred to as a sub-screen) described in detail later.
In FIG. 1, the CPU 1 is connected to and controls the touch panel 2, the interface 3, the ROM 5, and the DRAM 6, and executes various software functions based on the detection signal S2 from the touch panel 2 and on information indicating whether the buttons 7 to 9 have been operated. The user can operate the information terminal device 100 by operating the buttons 7 to 9 or by touching the touch panel 2 with a finger.
FIG. 3 is a flowchart showing the control process executed by the CPU 1 of FIG. 1. The CPU 1 executes the control process of FIG. 3 when, for example, the user operates the button 8. First, in step S1, the CPU 1 displays, on the entire display screen of the display 4, the first display data of a menu screen containing application icons 51 to 61 (see FIG. 2) for instructing execution of the programs of individual applications, such as a video display application and a character input application. Next, in step S2, the CPU 1 determines whether an operation instructing display of the second display data, described later, has been performed on the touch panel 2 or the button 9; if YES, the process proceeds to step S3, whereas if NO, the determination of step S2 is repeated. For example, the CPU 1 determines YES in step S2 when the application icon 51 displayed on the display 4 is tapped, or when the button 9 is operated.
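Steps S1 and S2 amount to a wait loop: show the menu, then poll until an operation requesting the sub-screen arrives. A compact sketch of that loop, with hypothetical event names:

```python
def wait_for_subscreen_request(events) -> bool:
    """Steps S1/S2: after the menu (first display data) is shown, repeat
    the step S2 test until an operation requesting display of the second
    display data arrives, such as a tap on application icon 51 or a
    press of button 9; then proceed to step S3."""
    for ev in events:
        if ev in ("tap_icon_51", "press_button_9"):
            return True  # YES at step S2
    return False  # event stream ended without a request

print(wait_for_subscreen_request(iter(["swipe", "press_button_9"])))  # True
```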
In step S3 of FIG. 3, the CPU 1 reduces the first display data while preserving its proportions (similar reduction) and converts it into the second display data. At this time, the CPU 1 shrinks the display contents of the first display data, such as the application icons 51 to 61, as they are, keeping their shape similar. Following step S3, in step S4 the CPU 1 combines the first display data with the second display data and displays the combined display data on the display 4. Specifically, the CPU 1 combines the first display data with the second display data so that the first display data is shown in the display area 11 covering the entire display screen of the display 4 and the second display data is shown in the display area 12 at the lower-right corner of the display screen (that is, the entire display area 12 hides a part of the display area 11), generating the combined display data. FIG. 2 shows a display example of the display 4 in step S4 of FIG. 3. In FIG. 2, the application icons 51 to 61 are displayed in the display area 11 corresponding to the entire display screen of the display 4, and application icons 71 to 81, corresponding respectively to the application icons 51 to 61, are displayed in the display area 12 at the lower-right corner of the display area 11. This allows the user to tap the application icons 71 to 81 in the display area 12 while holding the information terminal device 100 with the right hand.
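The composition of steps S3 and S4 comes down to scaling the full-screen image by one factor and pasting the copy over the lower-right corner. The sketch below illustrates this with Pillow; the library choice, the 0.4 scale factor, and the name `compose_subscreen` are assumptions for illustration, not part of the patent.

```python
from PIL import Image  # Pillow; any raster library with resize/paste works

def compose_subscreen(first: Image.Image, scale: float = 0.4) -> Image.Image:
    """Steps S3/S4: similar (proportion-preserving) reduction of the
    first display data, then overlay of the reduced copy (display
    area 12) on the lower-right corner of display area 11."""
    w, h = first.size
    second = first.resize((int(w * scale), int(h * scale)))  # step S3
    combined = first.copy()                                  # step S4
    combined.paste(second, (w - second.width, h - second.height))
    return combined

# A blank 480x800 image stands in for the menu screen of FIG. 2.
menu = Image.new("RGB", (480, 800), "white")
frame = compose_subscreen(menu)
```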
Next, in step S5, the CPU 1 determines, based on the detection signal S2 from the touch panel 2, whether the finger position on the touch panel 2 is within the display area 12; if YES, the process proceeds to step S6, whereas if NO, another touch detection process is executed in step S11 based on the detection signal S2, and the process returns to step S5. An example of this other touch detection process is one in which, when the finger position on the touch panel 2 is on one of the application icons 51 to 61, the program of the application software corresponding to the application icon at the finger position is executed.
Further, in step S6, the CPU 1 converts the finger coordinates (xf, yf) in the xy coordinate system of the display area 11 into coordinates (pf, qf) in the pq coordinate system of the display area 12. As shown in FIG. 2, the upper-left corner of the display area 12 is defined as the origin O2 of the pq coordinate system, the rightward direction in FIG. 2 as the p-axis direction, and the downward direction in FIG. 2 as the q-axis direction.
Then, in step S7, the CPU 1 displays, at the coordinates (pf, qf) in the xy coordinate system, a finger icon IC1 indicating the position in the display area 11 that corresponds to the actual finger position in the display area 12. For example, when the coordinates (pf, qf) in the pq coordinate system are (50, 100), the finger icon IC1 is displayed at the coordinates (50, 100) in the xy coordinate system.
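Because the display area 12 is a similar reduction of the full screen, dividing the finger's offset from the origin O2 by the reduction ratio yields pq coordinates that can be reused directly as xy coordinates, which matches the (50, 100) example above. The sketch below assumes this reading; the function name and the concrete geometry are illustrative.

```python
def xy_to_pq(xf: int, yf: int, o2: tuple[int, int], scale: float) -> tuple[int, int]:
    """Step S6: convert the finger coordinates (xf, yf) in the xy system
    of display area 11 into (pf, qf) in the pq system of display area 12,
    whose origin O2 sits at `o2` in xy coordinates. Dividing by the
    reduction ratio expresses the point in full-screen units, so (pf, qf)
    can be reused as xy coordinates when placing the finger icon IC1 in
    step S7."""
    x0, y0 = o2
    return round((xf - x0) / scale), round((yf - y0) / scale)

# Display area 12: a 0.4x reduction with origin O2 at (288, 480) on a
# 480x800 screen. A touch at xy (308, 520) inside area 12 ...
pf, qf = xy_to_pq(308, 520, o2=(288, 480), scale=0.4)
assert (pf, qf) == (50, 100)  # ... puts IC1 at xy (50, 100) in area 11
```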
In FIG. 3, in step S8 following step S7, the CPU 1 determines, based on the detection signal S2, whether one of the application icons 71 to 81 has been tapped; if YES, the process proceeds to step S9, whereas if NO, the process returns to step S5. Then, in step S9, the CPU 1 determines whether the tapped application icon corresponds to the character input application; if YES, the process proceeds to step S10, whereas if NO, the process proceeds to step S12. While the results of steps S5 and S8 remain NO, the CPU 1 repeatedly executes the processes of steps S6 and S7 at predetermined time intervals.
FIG. 4 is a diagram showing a display example of the display 4 in step S7 of FIG. 3 when the user is tapping the application icon 74, and FIG. 5 is a diagram showing a display example of the display 4 in step S7 of FIG. 3 when the user swipes within the display area 12. As shown in FIG. 4, while the user is tapping the application icon 74, the finger icon IC1 is displayed at a fixed position on the application icon 54 corresponding to the application icon 74. As shown in FIG. 5, while the user is swiping within the display area 12, the finger icon IC1 moves within the display area 11 at predetermined time intervals.
In step S10 of FIG. 3, the CPU 1 prohibits display of the second display data, displays only the first display data on the display 4, starts and executes the program of the character input application corresponding to the tapped application icon, and returns to step S1. In step S12, the CPU 1 starts and executes the program of the application corresponding to the tapped application icon while keeping the second display data displayed in the display area 12, and returns to step S1. In step S12, the CPU 1 displays the finger icon IC1 by repeatedly executing the processes of steps S6 and S7 at predetermined time intervals. FIG. 6 is a diagram showing a display example of the display 4 when a video playback application program is executed in step S12 of FIG. 3.
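The branch across steps S9, S10, and S12 reduces to: hide the sub-screen only for the character input application. A minimal sketch under that reading; the `App`, `UI`, and `launch_app` names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class App:
    name: str
    is_character_input: bool  # the test made in step S9

class UI:
    def __init__(self) -> None:
        self.subscreen_visible = True  # display area 12 shown (step S4)

    def run(self, app: "App") -> None:
        print(f"running {app.name}; sub-screen shown: {self.subscreen_visible}")

def launch_app(app: App, ui: UI) -> None:
    """Steps S9/S10/S12: hide the sub-screen (display area 12) only for
    the character input application so it cannot obstruct text entry;
    any other application keeps area 12 available for remote operation."""
    if app.is_character_input:
        ui.subscreen_visible = False  # step S10: first display data only
    ui.run(app)                       # start the application program

ui = UI()
launch_app(App("video playback", False), ui)   # step S12: area 12 stays
launch_app(App("character input", True), ui)   # step S10: area 12 hidden
```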
As described above, the CPU 1 reduces the first display data while preserving its proportions and converts it into the second display data (see step S3 of FIG. 3). Further, the CPU 1 combines the first display data with the second display data so that the entire display area 12 of the second display data overlaps the display area 11 of the first display data (see step S4 of FIG. 3). More specifically, as shown in FIG. 2 and FIGS. 4 to 6, the display screen of the display 4 includes the display area 11, which corresponds to the entire display screen of the display 4 and displays the first display data, and the display area 12, which displays the second display data obtained by reducing the first display data with a similar shape. In the present embodiment, the display area 12 is placed over the display area 11 so as to hide its lower-right corner. The display area 11 shows characters, the application icons 51 to 61 for instructing application launch, video, and the finger icon IC1, while the display area 12 shows characters, the application icons 71 to 81 for instructing application launch, video, and the like.
According to the present embodiment, as shown in FIGS. 4 and 5, when the CPU 1 detects via the touch panel 2 that the user has tapped within the display area 12, it converts the finger coordinates (xf, yf) in the xy coordinate system into the coordinates (pf, qf) in the pq coordinate system of the display area 12, and then displays, at the coordinates (pf, qf) in the xy coordinate system, the finger icon IC1 indicating the position in the display area 11 corresponding to the actual finger position in the display area 12. In other words, the CPU 1 converts the actual finger position in the display area 12 into the corresponding position in the display area 11 and displays the finger icon IC1 at the converted position. The user can therefore operate the application icons 71 to 81 in the display area 12 while watching the finger icon IC1 in the display area 11. The portable terminal of Patent Document 1, which gathers and displays items such as buttons and links near the finger, has the problem that the displayed items become hard to see and hard to operate; according to the present embodiment, the user need not watch the application icons 71 to 81 in the display area 12 while operating them, so operability can be improved.
As shown in FIG. 5, according to the present embodiment, while the user moves a finger over the display area 12, the finger position is measured at predetermined time intervals, each measured position is converted into a position in the display area 11, and the finger icon IC1 is displayed at each interval. The user can therefore operate within the display area 12 while watching the finger icon IC1 move within the display area 11; that is, an operation performed in the display area 12 effects the corresponding operation in the display area 11. Furthermore, since the display area 12 is shown in the lower part of the display 4, the user can operate the information terminal device 100 by touching the display area 12, which appears at the position of the fingers of the holding hand, while holding the device in one hand. The user can thus perform a swipe, a characteristic touch-panel operation, or tap the application icons 71 to 81 to launch applications, all while holding the information terminal device 100 in one hand.
Further, according to the present embodiment, when the CPU 1 detects via the touch panel 2, within the display area 12, an operation instructing execution of the character input application (for example, a tap on the application icon among the application icons 71 to 81 that corresponds to the character input application), it executes the program of the character input application (see step S10 of FIG. 3). At this time, the CPU 1 prohibits display of the second display data and displays only the first display data on the display screen of the display 4. Since the display area 12 is not shown while the character input application is running, it never gets in the way of character entry. When running anything other than a character input application, such as a video playback application, the CPU 1 keeps the display area 12 visible (see step S12 of FIG. 3), so the user can, for example, touch within the display area 12 to perform operations such as fast-forwarding.
Although the finger icon IC1 is displayed on the display 4 in the present embodiment, the present invention is not limited to this; it suffices for the CPU 1 to detect a user operation such as a swipe or tap in the display area 12 and to display an indicator corresponding to that operation within the display area 11. FIG. 7 is a diagram showing another display example of the display 4 in step S7 of FIG. 3 when the user taps the application icon 74; in FIG. 7, an arrow-shaped cursor IC2 is displayed instead of the finger icon IC1. FIG. 8 is a diagram showing another display example of the display 4 in step S7 of FIG. 3 when the user double-taps the application icon 74; in FIG. 8, when the user double-taps an icon, a finger icon IC3 containing a star mark indicating the double tap is displayed.
In the present embodiment, the first display data is a menu screen containing the application icons 51 to 61, and when the CPU 1 detects that one of the application icons 71 to 81 has been tapped, it executes the program of the application corresponding to the tapped icon; however, the present invention is not limited to this. It suffices for the CPU 1 to execute a predetermined process when it detects, via the touch panel 2 and within the display area 12 of the second display data, an operation instructing execution of that process. For example, the first display data may be display data containing a moving image together with playback and stop buttons for the moving image; in that case, when the CPU 1 detects that the stop button displayed within the display area 12 has been tapped, it executes the process of stopping display of the moving image.
In the present embodiment, the CPU 1 prohibits display of the second display data when executing the character input application, but the present invention is not limited to this. For example, whether to display the second display data may be decided according to the type of application being executed. The CPU 1 may also prohibit display of the second display data when it detects that the user has tapped an area other than the display area 12.
Second embodiment.
In the first embodiment, in step S4 of FIG. 3, the CPU 1 combined the first display data with the second display data so that the entire display area 12 hides a part of the display area 11. The present embodiment differs from the first embodiment only in that the CPU 1 combines the first display data with the second display data so that the display area 12A of the second display data does not overlap the display area 11A of the first display data; the other configuration and operation are the same as in the first embodiment.
FIG. 9 is a diagram showing a display example of the display areas 11A and 12A according to the second embodiment of the present invention. As shown in FIG. 9, the CPU 1 reduces the first display data with a similar shape to convert it into the second display data, combines the first display data with the second display data so that the first display data is displayed in the display area 11A in the upper-left part of the display 4, the second display data is displayed in the display area 12A at the lower-right corner of the display 4, and the display area 12A does not overlap the display area 11A, and then displays the combined display data on the display screen of the display 4.
Therefore, according to the present embodiment, as in the first embodiment, by operating within the display area 12A the user can perform the same operations as when operating within the display area 11A.
First modification of the second embodiment.
FIG. 10 is a diagram showing a display example of the display areas 11B and 12B according to a first modification of the second embodiment of the present invention. This modification differs from the second embodiment only in that the CPU 1 combines the first display data with the second display data so that a part of the display area 12B overlaps a part of the display area 11B; the other configuration and operation are the same as in the second embodiment. As shown in FIG. 10, in this modification the CPU 1 reduces the first display data with a similar shape to convert it into the second display data, displays the first display data in the display area 11B in the upper-left part of the display 4 and the second display data in the display area 12B at the lower-right corner of the display 4, combines the first display data with the second display data so that a part of the display area 12B overlaps a part of the display area 11B, and displays the combined display data on the display 4.
Therefore, according to this modification, as in the first embodiment, by operating within the display area 12B the user can perform the same operations as when operating within the display area 11B.
Second modification of the second embodiment.
FIG. 11 is a diagram showing a display example of the display areas 11C and 12C according to a second modification of the second embodiment of the present invention. This modification differs from the second embodiment only in that the CPU 1 combines the first display data with the second display data so that the first and second display data are displayed in display areas 11C and 12C that have the same size and do not overlap each other; the other configuration and operation are the same as in the second embodiment. In this modification, as shown in FIG. 11, the CPU 1 converts the first display data into second display data having the same shape as the first display data, displays the first display data in the display area 11C in the upper-left part of the display 4 and the second display data in the display area 12C at the lower-right corner of the display 4, combines the first display data with the second display data so that the display area 12C does not overlap the display area 11C, and displays the combined display data on the display 4.
Therefore, according to this modification, as in the first embodiment, by operating within the display area 12C the user can perform the same operations as when operating within the display area 11C.
In the second embodiment and its modifications, display data having the same display size as the entire display area of the display 4 may be reduced in advance, with a similar shape, into first display data having the same display size as the display area 11A, 11B, or 11C.
Third embodiment.
In each of the embodiments and modifications above, the CPU 1 converted the first display data into the second display data either by reducing it with a similar shape or by producing second display data of the same shape as the first display data. The present invention, however, is not limited to this; it suffices that a user operation within the display area of the second display data effects the corresponding operation within the display area of the first display data.
FIG. 12 is a flowchart showing step S3A of the control process according to the third embodiment of the present invention. As shown in FIG. 12, the control process executed by the CPU 1 in this embodiment is the control process of FIG. 3 with step S3 replaced by step S3A; steps S1 to S2 and S4 to S12 are the same as in FIG. 3. In step S3A of FIG. 12, the CPU 1 reduces the first display data with a non-similar shape and converts it into the second display data; at this time, the display contents of the first display data, such as the application icons 51 to 61, are reduced non-proportionally. Following step S3A, in step S4 the CPU 1 combines the first display data with the second display data and displays the combined display data on the display 4. Specifically, the CPU 1 combines the first display data with the second display data so that the first display data is displayed in the display area 11D in the upper part of the display screen of the display 4 and the second display data is displayed in the display area 12D in the lower part of the display screen (so that the display areas 11D and 12D do not overlap each other), generating the combined display data.
FIG. 13 is a diagram showing a display example of the display 4 in step S3A of FIG. 12. As shown in FIG. 13, the application icons 51, 52, 53, 54, ... are arranged in a grid in the display area 11D, while the corresponding application icons 71, 72, 73, 74, ... are rearranged into a single row in the display area 12D.
Further, when the finger coordinates (xf, yf) lie on one of the application icons 71, 72, 73, ... in the display area 12D, the CPU 1 converts the finger coordinates (xf, yf) into predetermined coordinates within the application icon 51, 52, 53, ... in the display area 11D that corresponds to the icon containing (xf, yf), and displays an indicator such as the finger icon IC1 at the converted coordinates.
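Under this non-similar arrangement the conversion is icon-to-icon rather than a uniform scaling: find which sub-screen icon contains the touch, then return a representative point of the matching full-screen icon. A sketch under these assumptions; the rectangle layout and helper names are illustrative.

```python
Rect = tuple[int, int, int, int]  # (left, top, width, height)

def contains(r: Rect, x: int, y: int) -> bool:
    l, t, w, h = r
    return l <= x < l + w and t <= y < t + h

def center(r: Rect) -> tuple[int, int]:
    l, t, w, h = r
    return l + w // 2, t + h // 2

def map_icon_touch(xf: int, yf: int, row_icons: list[Rect], grid_icons: list[Rect]):
    """Third embodiment: the i-th icon in the single row of display
    area 12D corresponds to the i-th icon in the grid of display
    area 11D. Returns the point where the indicator (e.g. the finger
    icon IC1) is drawn, or None if the touch hits no icon."""
    for row_rect, grid_rect in zip(row_icons, grid_icons):
        if contains(row_rect, xf, yf):
            return center(grid_rect)
    return None

# Icons 71, 72, ... in a row at the bottom; icons 51, 52, ... in a grid.
row = [(i * 60, 740, 60, 60) for i in range(8)]
grid = [(40 + (i % 4) * 110, 40 + (i // 4) * 110, 100, 100) for i in range(8)]
print(map_icon_touch(130, 770, row, grid))  # icon 73 -> center of icon 53
```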
Therefore, according to the present embodiment, as in the first embodiment, by operating within the display area 12D the user can perform the same operations as when operating within the display area 11D.
In the present embodiment the display area 12D did not overlap the display area 11D, but the present invention is not limited to this; at least a part of the display area 12D may overlap the display area 11D.
Fourth embodiment.
FIG. 14 is a diagram showing a display example of the icon IC2 according to the fourth embodiment of the present invention. In each of the embodiments and modifications described above, the actual finger position within the display area 12, 12A, 12B, 12C, or 12D corresponded one-to-one to the position of an indicator, such as the finger icon IC1, within the display area 11, 11A, 11B, 11C, or 11D. In the present embodiment, by contrast, when the CPU 1 detects a finger movement operation within the display area 12 based on the detection signal S2 from the touch panel 2, it displays the cursor IC2 so that the cursor IC2 moves within the display area 11 in the direction of the detected movement operation by a movement amount corresponding to the amount of that movement. According to the present embodiment, the user can use the display area 12 as a touch pad.
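In this touch-pad mode the sub-screen reports relative motion rather than an absolute mapping. A minimal sketch under that reading; the class name `TouchpadCursor` and the gain factor are illustrative assumptions.

```python
class TouchpadCursor:
    """Fourth embodiment: move the cursor IC2 in display area 11 by the
    direction and (scaled) amount of the finger movement detected in
    display area 12, instead of mapping absolute positions one-to-one."""

    def __init__(self, x: int, y: int, gain: float = 2.5):
        self.x, self.y = x, y  # cursor IC2 position in display area 11
        self.gain = gain       # movement amount per unit of finger movement
        self._last = None      # previous finger position in display area 12

    def on_touch(self, xf: int, yf: int) -> tuple[int, int]:
        if self._last is not None:
            dx, dy = xf - self._last[0], yf - self._last[1]
            self.x += round(dx * self.gain)
            self.y += round(dy * self.gain)
        self._last = (xf, yf)
        return self.x, self.y

    def on_release(self) -> None:
        self._last = None      # the next touch starts a new stroke

cursor = TouchpadCursor(240, 400)
for pos in [(300, 700), (310, 700), (320, 705)]:  # a short swipe in area 12
    print(cursor.on_touch(*pos))
```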
The present embodiment may also be applied to the second and third embodiments and their modifications.
In the present embodiment the first display data was converted into the second display data and displayed in the display area 12, but the present invention is not limited to this; display data other than the first and second display data may be displayed in the display area 12.
Fifth embodiment.
FIG. 15 is a diagram showing a display example of the display areas 11, 12R, and 12L according to the fifth embodiment of the present invention. In each of the embodiments above, the CPU 1 converted the first display data into a single piece of second display data. In the present embodiment, by contrast, the CPU 1 reduces the first display data with a similar shape and converts it into two pieces of second display data. The CPU 1 then combines the first display data with the two pieces of second display data so that the first display data is displayed in the display area 11 covering the entire display screen of the display 4, one piece of second display data is displayed in the display area 12R at the lower-right corner of the display 4, and the other piece is displayed in the display area 12L at the lower-left corner of the display 4. The display areas 12R and 12L are used for operation with the right hand and the left hand, respectively.
In the present embodiment, the touch panel 2 detects the pressing force in addition to the finger coordinates (xf, yf) and the operation type, and generates a detection signal S2 containing the finger coordinates (xf, yf), the operation type, and the pressing force, which it outputs to the CPU 1. Based on the detection signal S2 from the touch panel 2, the CPU 1 selects whichever of the display areas 12R and 12L is pressed more strongly and, based on the user's operation within the selected display area, displays the finger icon IC1 in the display area 11 as in the first embodiment (see FIG. 15). According to the present embodiment, the operation in whichever of the display areas 12R and 12L is pressed more strongly is adopted, so even if the user is touching the display 4 with both left and right fingers, only the operation of one finger becomes active. The information terminal device 100 can therefore be operated with a finger of either of the user's hands.
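Arbitration between the two sub-screens reduces to comparing the latest pressure readings and forwarding only the stronger side's events. A sketch under that reading; the event shape and names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class PressEvent:
    area: str        # "12R" or "12L"
    xf: int
    yf: int
    pressure: float  # pressing force reported by the touch panel 2

def active_event(events: list[PressEvent]) -> PressEvent | None:
    """Fifth embodiment: when both display areas 12R and 12L are touched,
    adopt only the operation in the more strongly pressed area."""
    if not events:
        return None
    return max(events, key=lambda e: e.pressure)

# The right thumb presses harder than the left -> 12R's operation wins.
simultaneous = [PressEvent("12L", 30, 760, 0.2), PressEvent("12R", 300, 760, 0.6)]
print(active_event(simultaneous).area)  # 12R
```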
In the present embodiment, the CPU 1 selected whichever of the display areas 12R and 12L was pressed more strongly and displayed the finger icon IC1 based on the user's operation within the selected display area, but the present invention is not limited to this. Of the display areas 12R and 12L, the CPU 1 may instead select the one that was touched first, or the one chosen by a predetermined operation such as operating the button 9.
Sixth embodiment.
FIG. 16 is a block diagram showing the configuration of an information terminal device 100A according to the sixth embodiment of the present invention. In FIG. 16, the information terminal device 100A comprises display instruction means 200, display control means 201, operation means 202, a monitor 102, conversion means 203, and processing means 204. The monitor 102 corresponds to the interface 3 and the display 4 of FIG. 1, the operation means 202 corresponds to the touch panel 2 of FIG. 1, and the display instruction means 200 corresponds to the buttons 7 to 9 and the touch panel 2 of FIG. 1. Further, the display control means 201, the conversion means 203, and the processing means 204 correspond to the CPU 1 of FIG. 1 and execute the control processes of the CPU 1 according to the embodiments and modifications described above.
In FIG. 16, when the user operates with a finger within the display area 11, 11A, 11B, 11C, 11D, 12, 12A, 12B, 12C, 12D, 12L, or 12R of the display screen of the monitor 102, the operation means 202 detects the position and movement of the finger and determines the content of the user's operation from the detected position and movement information. For example, when the user clicks the application icon 51 displayed in the display area 11 of FIG. 2, the operation means 202 determines that the user's operation is an instruction to launch the application corresponding to the application icon 51.
Also in FIG. 16, the processing means 204 executes the process corresponding to the operation content determined by the operation means 202. For example, when the user's operation is an instruction to launch the application corresponding to the application icon 51, the processing means 204 actually executes the program of that application. Further, when the user instructs display of the display area 12, 12A, 12B, 12C, 12D, or 12L and 12R, by pressing the button 9 or by an operation instruction via the operation means 202, the display instruction means 200 notifies the display control means 201 of that instruction. For example, when the user touches a button displayed in the display area 11 for displaying the display area 12 (for example, the application icon 52 in FIG. 2), the display area 12 is displayed.
Further, when notified by the display instruction means 200 that the display area 12, 12A, 12B, 12C, 12D, or 12L and 12R is to be displayed, the display control means 201 displays that display area on the display screen of the monitor 102. Thereby, for example, as shown in FIG. 2, the display area 12 is displayed on the display screen of the monitor 102 so as to hide a part of the display area 11. At this time, the conversion means 203 generates the second display data to be displayed in the display area 12 by reducing the display contents of the display area 11 as they are.
The present embodiment achieves the same effects as the first to fifth embodiments and their modifications.
In each of the embodiments and modifications above, the second display data was displayed in the display area 12, 12A, 12B, 12C, 12D, 12L, or 12R in the lower part of the display 4, but the present invention is not limited to this; the second display data may be displayed in any area of the display 4. Preferably, however, the first display data is combined with the second display data so that the display area of the second display data is located in the lower part of the display screen of the display 4. Since users generally hold the lower part of the display 4 when operating the information terminal device 100, combining the data so that the display area of the second display data lies in the lower part of the display screen allows the user to touch that display area with the holding hand.
In the embodiments and modifications above, the present invention was described taking the information terminal device 100 as an example, but the present invention is not limited to this. The present invention is applicable to any electronic device provided with detection means, such as the touch panel 2, for detecting operations on a display screen such as the display 4. It is therefore possible to provide an electronic device in which operations over the entire display screen can be performed by operating a sub-screen displayed at the position of the fingers of the hand holding the device. Consequently, according to the present invention, even for a tablet-type electronic device that a user would ordinarily be expected to hold with both hands, the user can operate the device using only the sub-screen at hand.
As described above, the electronic device according to the present invention includes control means that converts predetermined first display data into at least one piece of second display data, combines the first display data with each piece of second display data, displays the combined display data on the display screen, and, when the detection means detects, within the display area of one of the at least one piece of second display data, an operation instructing execution of a predetermined process, executes that process. The user can therefore perform, merely by operating within the display area of the second display data, the same operations as those performed within the display area of the first display data. An electronic device with better operability than the prior art can thus be provided.
1 ... CPU,
2 ... touch panel,
3 ... interface,
4 ... display,
5 ... ROM,
6 ... DRAM,
7 to 9 ... buttons,
10 ... housing,
11, 11A, 11B, 11C, 11D, 12, 12A, 12B, 12C, 12D, 12L, 12R ... display areas,
51 to 61, 71 to 81 ... application icons,
100, 100A ... information terminal devices,
102 ... monitor,
200 ... display instruction means,
201 ... display control means,
202 ... operation means,
203 ... conversion means,
204 ... processing means,
IC1, IC3 ... finger icons,
IC2 ... cursor.

Claims (11)

1. An electronic device provided with detection means for detecting an operation on a display screen, comprising control means for converting predetermined first display data into at least one piece of second display data, combining the first display data with each piece of second display data, displaying the combined display data on the display screen, and detecting, by the detection means, a predetermined operation within a display area of one of the at least one piece of second display data, wherein the control means executes a predetermined process when the detection means detects, within the display area of one of the at least one piece of second display data, an operation instructing execution of the process.
2. The electronic device according to claim 1, wherein the control means displays an indicator corresponding to the predetermined operation within a display area of the first display data.
3. The electronic device according to claim 2, wherein, when the detection means detects the predetermined operation at a first position within the display area of one of the at least one piece of second display data, the control means converts the coordinates of the first position into the coordinates of a second position within the display area of the first display data corresponding to the first position, and displays the indicator at the second position.
4. The electronic device according to claim 2, wherein, when the detection means detects a movement operation within the display area of one of the at least one piece of second display data, the control means displays the indicator so that, within the display area of the first display data, the indicator moves in the movement direction of the movement operation by a movement amount corresponding to the movement amount of the movement operation.
5. The electronic device according to any one of claims 1 to 4, wherein, when executing the process, the control means prohibits display of the second display data and displays only the first display data on the display screen.
6. The electronic device according to any one of claims 1 to 5, wherein the control means reduces the first display data with a similar shape to convert it into each piece of the second display data.
7. The electronic device according to any one of claims 1 to 5, wherein the control means reduces the first display data with a non-similar shape to convert it into each piece of the second display data.
8. The electronic device according to claim 6 or 7, wherein the control means combines the first display data with each piece of the second display data so that at least a part of the display area of each piece of the second display data overlaps the display area of the first display data.
9. The electronic device according to claim 6 or 7, wherein the control means combines the first display data with each piece of the second display data so that the display area of each piece of the second display data does not overlap the display area of the first display data.
10. The electronic device according to any one of claims 1 to 9, wherein the control means combines the first display data with each piece of the second display data so that the display area of each piece of the second display data is located in a lower part of the display screen.
11. The electronic device according to any one of claims 1 to 10, wherein the control means converts the first display data into two pieces of the second display data.
PCT/JP2011/005929 2010-12-07 2011-10-24 Electronic device WO2012077273A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US42046110P 2010-12-07 2010-12-07
JP2010272463 2010-12-07
JP2010-272463 2010-12-07
US61/420,461 2010-12-07

Publications (1)

Publication Number Publication Date
WO2012077273A1 true WO2012077273A1 (en) 2012-06-14

Family

ID=46206789

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/005929 WO2012077273A1 (en) 2010-12-07 2011-10-24 Electronic device

Country Status (1)

Country Link
WO (1) WO2012077273A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05330289A (en) * 1992-05-29 1993-12-14 Hitachi Software Eng Co Ltd Electronic blackboard device
JPH11259237A (en) * 1998-03-12 1999-09-24 Ricoh Co Ltd Picture display device
JP2006018348A (en) * 2004-06-30 2006-01-19 Hitachi Ltd Input/display system and its method in using large screen display
JP2009064209A (en) * 2007-09-06 2009-03-26 Sharp Corp Information display device
JP2009122837A (en) * 2007-11-13 2009-06-04 Sharp Corp Information display device, information display method, program, and recording medium

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9244544B2 (en) 2011-07-29 2016-01-26 Kddi Corporation User interface device with touch pad enabling original image to be displayed in reduction within touch-input screen, and input-action processing method and program
JP2013030050A (en) * 2011-07-29 2013-02-07 Kddi Corp Screen pad inputting user interface device, input processing method, and program
WO2013018480A1 (en) * 2011-07-29 2013-02-07 Kddi株式会社 User interface device comprising touch pad for shrinking and displaying source image within screen capable of touch input, input processing method and program
KR102019116B1 (en) 2012-07-04 2019-09-06 엘지전자 주식회사 Terminal and method for controlling the same
KR20140005481A (en) * 2012-07-04 2014-01-15 엘지전자 주식회사 Terminal and method for controlling the same
EP2685369A3 (en) * 2012-07-12 2015-07-01 Samsung Electronics Co., Ltd Method and mobile device for adjusting size of touch input window
CN102915201A (en) * 2012-09-17 2013-02-06 广东欧珀移动通信有限公司 One-hand operation method of large-screen touch screen mobile phone
EP2752753A3 (en) * 2013-01-02 2016-11-30 Samsung Display Co., Ltd. Terminal and method for operating the same
JP2014153948A (en) * 2013-02-08 2014-08-25 International Business Maschines Corporation Control apparatus and control program
JP2014178790A (en) * 2013-03-14 2014-09-25 Ricoh Co Ltd Projection system, projection device, projection program, and projection method
EP2799971A3 (en) * 2013-05-03 2014-11-12 Samsung Electronics Co., Ltd. Method of operating touch screen and electronic device thereof
US9652056B2 (en) 2013-05-03 2017-05-16 Samsung Electronics Co., Ltd. Touch-enable cursor control method and electronic device thereof
EP2799971A2 (en) * 2013-05-03 2014-11-05 Samsung Electronics Co., Ltd. Method of operating touch screen and electronic device thereof
EP2806339A1 (en) * 2013-05-24 2014-11-26 Samsung Electronics Co., Ltd Method and apparatus for displaying a picture on a portable device
US10691291B2 (en) 2013-05-24 2020-06-23 Samsung Electronics Co., Ltd. Method and apparatus for displaying picture on portable device
CN103488419A (en) * 2013-08-26 2014-01-01 宇龙计算机通信科技(深圳)有限公司 Operating method of communication terminal and communication terminal
US9696882B2 (en) 2013-08-28 2017-07-04 Lenovo (Beijing) Co., Ltd. Operation processing method, operation processing device, and control method
CN104516654A (en) * 2013-09-26 2015-04-15 联想(北京)有限公司 Operation processing method and device
CN103543913A (en) * 2013-10-25 2014-01-29 小米科技有限责任公司 Terminal device operation method and device, and terminal device
JP2015106418A (en) * 2013-11-29 2015-06-08 株式会社 ハイヂィープ Virtual touch pad operation method and terminal performing the same
US10073613B2 (en) 2013-12-03 2018-09-11 Huawei Technologies Co., Ltd. Processing method and apparatus, and terminal
CN104380238A (en) * 2013-12-03 2015-02-25 华为技术有限公司 Processing method, device and terminal
CN103927080A (en) * 2014-03-27 2014-07-16 小米科技有限责任公司 Method and device for controlling control operation
CN104898976A (en) * 2015-06-03 2015-09-09 北京百纳威尔科技有限公司 Method for using small screen window on mobile device and mobile device
JPWO2017022031A1 (en) * 2015-07-31 2018-02-22 マクセル株式会社 Information terminal equipment
JP2019083965A (en) * 2017-11-06 2019-06-06 株式会社カプコン Game program and game system
CN113260426A (en) * 2018-12-28 2021-08-13 株式会社万代南梦宫娱乐 Game system, processing method, and information storage medium
WO2021227628A1 (en) * 2020-05-14 2021-11-18 华为技术有限公司 Electronic device and interaction method therefor
WO2024114234A1 (en) * 2022-11-30 2024-06-06 华为技术有限公司 Single-handed operation method and electronic device

Similar Documents

Publication Publication Date Title
WO2012077273A1 (en) Electronic device
JP4372188B2 (en) Information processing apparatus and display control method
US8638315B2 (en) Virtual touch screen system
JP5718042B2 (en) Touch input processing device, information processing device, and touch input control method
JP5691464B2 (en) Information processing device
TWI588734B (en) Electronic apparatus and method for operating electronic apparatus
JP2011028524A (en) Information processing apparatus, program and pointing method
TWI434202B (en) Electronic apparatus with touch screen and associated displaying control method
JP5848732B2 (en) Information processing device
EP2530573B1 (en) Touch control method and electronic apparatus
JP5197533B2 (en) Information processing apparatus and display control method
CA2766528A1 (en) A user-friendly process for interacting with informational content on touchscreen devices
JP2009276819A (en) Method for controlling pointing device, pointing device and computer program
JP2011253252A (en) Electronic device and input control method of the same
JP3850570B2 (en) Touchpad and scroll control method using touchpad
WO2014006806A1 (en) Information processing device
JP2008257629A (en) Touch type input device
JP5275429B2 (en) Information processing apparatus, program, and pointing method
KR20160019762A (en) Method for controlling touch screen with one hand
JP2011081447A (en) Information processing method and information processor
KR101179584B1 (en) Virtual mouse display method on touchscreen and computer readable recording medium storing program performing the method
JP5414134B1 (en) Touch-type input system and input control method
JP4856136B2 (en) Movement control program
JP5458130B2 (en) Electronic device and input control method
JP2014241078A (en) Information processing apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11846786

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11846786

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP