WO2012077273A1 - Electronic device - Google Patents

Electronic device

Info

Publication number
WO2012077273A1
WO2012077273A1 (PCT/JP2011/005929)
Authority
WO
WIPO (PCT)
Prior art keywords
display
display data
display area
data
area
Prior art date
Application number
PCT/JP2011/005929
Other languages
English (en)
Japanese (ja)
Inventor
鈴木 達也
Original Assignee
パナソニック株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by パナソニック株式会社
Publication of WO2012077273A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • the present invention relates to an electronic device provided with detection means such as a touch panel for detecting an operation on a display screen.
  • Patent Document 1 discloses a portable terminal that improves a user's operability by determining an area on the touch panel screen that can be operated with a finger of the hand holding the terminal and displaying items such as buttons and links within that area.
  • An object of the present invention is to solve the above problems, and to provide an electronic device that includes a detection means such as a touch panel for detecting an operation on a display screen and that can improve operability as compared with the prior art.
  • An electronic device according to the present invention is an electronic device provided with detection means for detecting an operation on a display screen, and with control means that converts predetermined first display data into at least one piece of second display data, combines the first display data with the second display data, displays the combined display data on the display screen, and detects, via the detection means, a predetermined operation within the display area of one of the at least one piece of second display data. The control means is characterized in that it executes a predetermined process when the detection means detects, within the display area of one of the at least one piece of second display data, an operation instructing execution of that process.
  • control means displays an indicator corresponding to the predetermined operation in a display area of the first display data.
  • when the detection means detects the predetermined operation at a first position within the display area of one piece of the at least one piece of second display data, the control means converts the coordinates of the first position into the coordinates of a second position corresponding to the first position within the display area of the first display data, and displays the indicator at the second position.
  • when the detection means detects a movement operation within the display area of one piece of the at least one piece of second display data, the control means displays the indicator in the display area of the first display data so that the indicator moves in the direction of the movement operation by a movement amount corresponding to that of the movement operation.
  • control means prohibits the display of the second display data and displays only the first display data on the display screen when the process is executed.
  • control means reduces the first display data to a similar shape (i.e., scales it while preserving its aspect ratio) and converts it into the second display data.
  • control means reduces the first display data to a non-similar shape (changing its aspect ratio or layout) and converts it into the second display data.
  • control means combines the first display data with each piece of second display data so that at least a part of the display area of the second display data overlaps the display area of the first display data.
  • control means combines the first display data with the second display data so that the display area of the second display data does not overlap the display area of the first display data.
  • control means combines the first display data with the second display data so that the display area of the second display data is positioned at a lower portion of the display screen.
  • control means converts the first display data into two second display data.
  • predetermined first display data is converted into at least one piece of second display data, and the first display data is combined with each piece of second display data.
  • therefore, the user can perform the same operation as an operation performed in the display area of the first display data simply by operating within the display area of the second display data, so an electronic device with better operability than the prior art can be provided.
  • FIG. 4 is a diagram showing a display example of the display 4 in step S7 of FIG. 3 when the user taps the application icon 74.
  • FIG. 5 is a diagram showing a display example of the display 4 in step S7 of FIG. 3 when the user swipes within the display area 12.
  • FIG. 6 is a diagram showing a display example of the display 4 when the video playback application program is executed in step S12 of FIG. 3.
  • FIG. 7 is a diagram showing another display example of the display 4 in step S7 of FIG. 3 when the user taps the application icon 74.
  • FIG. 8 is a diagram showing another display example of the display 4 in step S7 of FIG. 3 when the user double-taps the application icon 74.
  • FIG. 9 is a diagram showing a display example of the display areas 11A and 12A according to the second embodiment of the present invention.
  • FIG. 10 is a diagram showing a display example of the display areas 11B and 12B according to the first modification of the second embodiment of the present invention.
  • FIG. 11 is a diagram showing a display example of the display areas 11C and 12C according to the second modification of the second embodiment of the present invention.
  • FIG. 12 is a flowchart showing step S3A of the control process according to the third embodiment of the present invention.
  • FIG. 13 is a diagram showing a display example of the display 4 in step S3A of FIG. 12.
  • FIG. 14 is a diagram showing a display example of the icon IC2 according to the fourth embodiment of the present invention.
  • FIG. 15 is a diagram showing a display example of the display areas 11, 12R, and 12L according to the fifth embodiment of the present invention.
  • FIG. 16 is a block diagram showing the configuration of the information terminal device 100A according to the sixth embodiment of the present invention.
  • an electronic device according to the present invention will be described using the information terminal device 100 including the touch panel 2 that detects a user operation on the display screen of the display 4 as an example.
  • FIG. 1 is a block diagram showing the configuration of the information terminal device 100 according to the first embodiment of the present invention
  • FIG. 2 is a front view of the information terminal device 100 of FIG.
  • an information terminal device 100 includes a CPU (Central Processing Unit) 1, a touch panel 2, an interface 3, a display (also called a monitor) 4, a ROM (Read Only Memory) 5, a DRAM (Dynamic Random Access Memory) 6, buttons 7 to 9, and a housing 10 (see FIG. 2).
  • the information terminal device 100 is an electronic device that includes the touch panel 2, which detects an operation on the display screen of the display 4, and a CPU 1 that converts predetermined first display data into second display data, displays on the display screen of the display 4 the display data obtained by combining the first display data with the second display data, and detects, via the touch panel 2, a predetermined operation within the display area 12 of the second display data.
  • the CPU 1 is characterized in that when the touch panel 2 detects an operation instructing execution of a predetermined process within the display area 12, the CPU 1 executes the process. Further, the CPU 1 is characterized in that the finger icon IC1 corresponding to the detected predetermined operation is displayed in the display area 11 of the first display data.
  • the ROM 5 stores various software programs that are necessary for the operation of the information terminal device 100 and executed by the CPU 1 in advance.
  • the DRAM 6 is used as a working area of the CPU 1; when the CPU 1 executes a program stored in the ROM 5, the DRAM 6 stores the data necessary for executing the function corresponding to that program and the temporary data generated during execution.
  • the interface 3 executes predetermined interface processing related to video display processing such as signal conversion on the display data from the CPU 1, and outputs the processed display data to the display 4 for display.
  • the display 4 is a display device such as a liquid crystal display (LCD (Liquid Crystal Display)), is provided on the front surface of the housing 10 of the information terminal device 100, and functions as a display device for various GUI (Graphical User Interface) programs.
  • the touch panel 2 is a detection unit that detects a user operation on the display screen of the display 4, and includes a transparent film provided on the surface of the display 4 and a touch detection unit.
  • the touch detection means detects the position and movement of the user's finger touching the above-described film, generates a detection signal S2 that includes the coordinates (xf, yf) of the finger and the operation content determined from the detected position and movement, and outputs it to the CPU 1.
  • the upper left corner of the display screen of the display 4 is defined as the origin O1 of the xy coordinate system
  • the right direction in FIG. 2 is defined as the x axis direction
  • the lower direction in FIG. 2 is defined as the y-axis direction.
  • the user's operation includes a tap (also referred to as a click) for tapping the touch panel 2 only once, a double tap for tapping twice in succession, and a swipe for sliding the finger while touching the touch panel 2.
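  • For illustration only, the following is a minimal sketch of how the detection signal S2 and the operation content described above might be modeled; the names Gesture, TouchSignal, and all field names are hypothetical and are not taken from the patent.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Gesture(Enum):
    TAP = auto()         # tapping the touch panel 2 only once (a click)
    DOUBLE_TAP = auto()  # tapping twice in succession
    SWIPE = auto()       # sliding the finger while touching the panel

@dataclass
class TouchSignal:
    """Hypothetical model of the detection signal S2: the finger
    coordinates (xf, yf) in the xy coordinate system plus the
    operation content determined from the position and movement."""
    xf: float
    yf: float
    gesture: Gesture
```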
  • the information terminal device 100 includes a button 7 for turning the information terminal device 100 on and off, a button 8 for displaying a predetermined menu screen on the display 4, and a button 9 for displaying on the display 4 a display area 12 (also referred to as a sub-screen) described in detail later.
  • the CPU 1 is connected to the touch panel 2, the interface 3, the ROM 5, and the DRAM 6 to control them, and based on the detection signal S2 from the touch panel 2 and information indicating whether or not the buttons 7 to 9 are operated. Perform various software functions.
  • the user can operate the information terminal device 100 by operating the buttons 7 to 9 or touching the touch panel 2 with a finger.
  • FIG. 3 is a flowchart showing the control process executed by the CPU 1 of FIG. 1.
  • the CPU 1 executes the control process of FIG. 3 when, for example, the user operates the button 8.
  • in step S1, the CPU 1 displays, on the entire display screen of the display 4, first display data of a menu screen including application icons 51 to 61 (see FIG. 2) for instructing execution of the programs of applications such as a video display application and a character input application.
  • in step S2, the CPU 1 determines whether an operation instructing display of the second display data (described later) has been performed on the touch panel 2 or the button 9. If YES, the process proceeds to step S3; if NO, step S2 is repeated. For example, the CPU 1 determines YES in step S2 when the application icon 51 displayed on the display 4 is tapped or when the button 9 is operated.
  • in step S3 of FIG. 3, the CPU 1 reduces the first display data to a similar shape and converts it into second display data; at this time, the display content of the first display data, such as the application icons 51 to 61, is reduced as it is.
  • in step S4, the CPU 1 combines the first display data with the second display data and displays the combined display data on the display 4. Specifically, the CPU 1 combines the first display data with the second display data so that the first display data is displayed in the display area 11 occupying the entire display screen of the display 4 and the second display data is displayed in the display area 12 in the lower-right corner of the display screen (the entire display area 12 hides a part of the display area 11), thereby generating the combined display data.
  • FIG. 2 shows a display example of the display 4 in step S4 of FIG. 3.
  • application icons 51 to 61 are displayed in the display area 11 corresponding to the entire display screen of the display 4.
  • application icons 71 to 81 corresponding to the application icons 51 to 61 are displayed in the display area 12 in the lower right corner portion of the display area 11, respectively. Thereby, the user can tap the application icons 71 to 81 in the display area 12 while holding the information terminal device 100 with the right hand.
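  • As a rough illustration of the similar-shape reduction and composition of steps S3 and S4, the sketch below scales the full-screen display area 11 by a single factor for both axes (hence "similar shape") and anchors the resulting display area 12 in the lower-right corner of the screen; the Rect type, the function name, and the scale value are assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float  # left edge in the xy coordinate system
    y: float  # top edge
    w: float  # width
    h: float  # height

def subscreen_rect(screen: Rect, scale: float = 0.35) -> Rect:
    """Similar-shape reduction: one scale factor for both axes,
    anchored so that the display area 12 sits in the lower-right
    corner and hides that part of the display area 11."""
    w, h = screen.w * scale, screen.h * scale
    return Rect(screen.x + screen.w - w, screen.y + screen.h - h, w, h)

# e.g. a 480x800 portrait screen yields a 168x280 display area 12
area_12 = subscreen_rect(Rect(0, 0, 480, 800))
```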
  • in step S5, the CPU 1 determines, based on the detection signal S2 from the touch panel 2, whether the finger position on the touch panel 2 is within the display area 12, and proceeds to step S6 if YES.
  • if NO, another touch detection process is executed in step S11 based on the detection signal S2, and the process returns to step S5.
  • the other touch detection process is, for example, a process that, when the finger position on the touch panel 2 is on one of the application icons 51 to 61, executes the application software program corresponding to the application icon at the finger position.
  • in step S6, the CPU 1 converts the coordinates (xf, yf) of the finger in the xy coordinate system of the display area 11 into coordinates (pf, qf) in the pq coordinate system of the display area 12.
  • the upper left corner of the display area 12 is defined as the origin O2 of the pq coordinate system
  • the right direction in FIG. 2 is defined as the p-axis direction
  • the lower direction in FIG. 2 is defined as the q-axis direction.
  • in step S7, a finger icon IC1, indicating the position in the display area 11 corresponding to the actual position of the finger in the display area 12, is displayed at the coordinates (pf, qf) interpreted in the xy coordinate system.
  • for example, when the coordinates (pf, qf) in the pq coordinate system are (50, 100), the finger icon IC1 is displayed at the coordinates (50, 100) in the xy coordinate system.
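  • A minimal sketch of the coordinate conversion of steps S6 and S7, reusing the hypothetical Rect above. In the patent's example the pq coordinates are reinterpreted directly as xy coordinates ((50, 100) maps to (50, 100)); the icon_position helper below additionally rescales by the reduction ratio, which is an assumption for the general case in which the two areas differ in size.

```python
def to_pq(xf: float, yf: float, area12: Rect) -> tuple[float, float]:
    """Step S6: finger coordinates in the xy system -> pq coordinates
    relative to O2, the upper-left corner of the display area 12."""
    return xf - area12.x, yf - area12.y

def icon_position(pf: float, qf: float,
                  area11: Rect, area12: Rect) -> tuple[float, float]:
    """Step S7 (generalized): map a pq position in the display area 12
    to the corresponding xy position in the display area 11 by undoing
    the similar-shape reduction."""
    sx = area11.w / area12.w  # equals area11.h / area12.h for a
    sy = area11.h / area12.h  # similar-shape (aspect-preserving) reduction
    return area11.x + pf * sx, area11.y + qf * sy
```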
  • in step S8, the CPU 1 determines, based on the detection signal S2, whether one of the application icons 71 to 81 has been tapped. If YES, the process proceeds to step S9; if NO, it returns to step S5. In step S9, the CPU 1 determines whether the tapped application icon corresponds to the character input application; if YES, the process proceeds to step S10, and if NO, to step S12. Note that while steps S5 and S8 result in NO, the CPU 1 repeatedly executes the processes of steps S6 and S7 at predetermined time intervals.
  • FIG. 4 is a diagram illustrating a display example of the display 4 in step S7 of FIG. 3 when the user taps the application icon 74.
  • FIG. 5 is a diagram illustrating a display example of the display 4 in step S7 of FIG. 3 when the user swipes within the display area 12.
  • in FIG. 4, the finger icon IC1 is displayed at a fixed position on the application icon 54 corresponding to the application icon 74.
  • in FIG. 5, the finger icon IC1 moves within the display area 11 at predetermined time intervals, following the swipe.
  • in step S10 of FIG. 3, the CPU 1 prohibits display of the second display data, displays only the first display data on the display 4, starts and executes the character input application program corresponding to the tapped application icon, and returns to step S1.
  • in step S12, the CPU 1 starts and executes the program of the application corresponding to the tapped application icon while keeping the second display data displayed in the display area 12, and returns to step S1. During step S12, the CPU 1 also displays the finger icon IC1 by repeatedly executing the processes of steps S6 and S7 at predetermined time intervals.
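  • The following is a loose, hypothetical transcription of one pass through steps S5 to S12 of FIG. 3, reusing the TouchSignal, to_pq, and icon_position sketches above; the ui object bundling the drawing and application-launching callbacks is invented for illustration and is not part of the patent.

```python
def handle_signal(s2: TouchSignal, area11: Rect, area12: Rect, ui) -> None:
    """One iteration of steps S5-S12 of FIG. 3 (hypothetical)."""
    pf, qf = to_pq(s2.xf, s2.yf, area12)                 # step S6
    if not (0 <= pf < area12.w and 0 <= qf < area12.h):  # step S5: NO
        ui.other_touch_processing(s2)                    # step S11
        return
    ui.draw_finger_icon(icon_position(pf, qf, area11, area12))  # step S7
    icon = ui.tapped_app_icon(pf, qf, s2.gesture)        # step S8
    if icon is None:
        return
    if ui.is_character_input_app(icon):                  # step S9: YES
        ui.hide_sub_screen()      # step S10: display area 12 is hidden
    ui.run_app(icon)              # step S10 / S12: start the application
```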
  • FIG. 6 is a diagram showing a display example of the display 4 when the video playback application program is executed in step S12 of FIG. 3.
  • as described above, in the present embodiment, the CPU 1 reduces the first display data to a similar shape and converts it into second display data (see step S3 in FIG. 3), and combines the first display data with the second display data so that the entire display area 12 of the second display data overlaps the display area 11 of the first display data (see step S4 in FIG. 3). More specifically, as shown in FIGS. 2 and 4 to 6, the display screen of the display 4 includes the display area 11, which corresponds to the entire display screen and displays the first display data, and the display area 12, which displays the second display data obtained by reducing the first display data to a similar shape.
  • the display area 12 is arranged over the display area 11 so as to hide its lower-right corner. The display area 11 displays the application icons 51 to 61 for instructing the start of applications such as character input, video, and the finger icon IC1, while the display area 12 displays the corresponding application icons 71 to 81, video, and the like.
  • when the CPU 1 detects via the touch panel 2 that the user has tapped within the display area 12, it converts the finger coordinates (xf, yf) in the xy coordinate system into coordinates (pf, qf) in the pq coordinate system of the display area 12, and displays the finger icon IC1, indicating the position in the display area 11 corresponding to the actual finger position in the display area 12, at the coordinates (pf, qf) interpreted in the xy coordinate system. That is, the CPU 1 converts the actual finger position in the display area 12 into the corresponding position in the display area 11 and displays the finger icon IC1 at the converted position.
  • the user can operate the application icons 71 to 81 in the display area 12 while looking at the finger icon IC1 in the display area 11.
  • in the portable terminal described in Patent Document 1, items such as buttons and links are gathered and displayed near the finger, so the displayed items are difficult to see and difficult to operate. In the present embodiment, by contrast, the user does not need to look at the application icons 71 to 81 in the display area 12 while operating, so operability can be improved.
  • the finger position is measured at predetermined time intervals, converted into the corresponding position in the display area 11, and the finger icon IC1 is displayed there at those intervals. The user can therefore operate within the display area 12 while watching the movement of the finger icon IC1 in the display area 11; that is, an operation in the display area 12 produces the corresponding operation in the display area 11. Further, since the display area 12 is displayed at the bottom of the display 4, the user can operate the information terminal device 100 by touching the display area 12, displayed at the position of the fingers of the holding hand, while holding the device with one hand. Accordingly, the user can swipe, a characteristic touch panel operation, or start an application by tapping the application icons 71 to 81, all while holding the information terminal device 100 with one hand.
  • when the CPU 1 detects via the touch panel 2 an operation within the display area 12 instructing execution of the character input application (for example, a tap on the application icon corresponding to the character input application among the application icons 71 to 81), it executes the character input application program (see step S10 in FIG. 3). At this time, the CPU 1 prohibits display of the second display data and displays only the first display data on the display screen of the display 4; since the display area 12 is not displayed while the character input application runs, it does not interfere with character input. When executing an application other than a character input application, such as a video playback application, the CPU 1 keeps the display area 12 displayed (see step S12 in FIG. 3); the user can then, for example, touch the display area 12 to perform operations such as fast-forwarding.
  • in the above description, the finger icon IC1 is displayed on the display 4, but the present invention is not limited to this.
  • the CPU 1 may detect a user operation such as swipe or tap in the display area 12 and display an indicator corresponding to the operation in the display area 11.
  • FIG. 7 is a diagram illustrating another display example of the display 4 in step S7 of FIG. 3 when the user taps the application icon 74. In FIG. 7, an arrow-shaped cursor IC2 is displayed instead of the finger icon IC1.
  • FIG. 8 is a diagram showing another display example of the display 4 in step S7 of FIG. 3 when the user double taps the application icon 74. In FIG. 8, when the user double taps an icon, a finger icon IC3 including a star indicating a double tap is displayed.
  • in the above embodiment, the first display data is a menu screen including the application icons 51 to 61, and the CPU 1 executes the application program corresponding to the tapped application icon when it detects that one of the application icons 71 to 81 has been tapped; however, the present invention is not limited to this.
  • it is sufficient that, when the detection means detects within the display area of the second display data an operation instructing execution of a predetermined process, the CPU 1 executes the process.
  • for example, the first display data may be display data including moving image display data together with a moving image playback button and a stop button; in this case, when the CPU 1 detects that the moving image stop button displayed in the display area 12 has been tapped, it executes processing for stopping the moving image display.
  • the CPU 1 prohibits displaying the second display data when executing the character input application, but the present invention is not limited to this. For example, it may be determined whether to display the second display data according to the type of application to be executed. Further, the CPU 1 may prohibit the display of the second display data when detecting that the user has tapped an area other than the display area 12.
  • in the first embodiment, the CPU 1 combined the first display data with the second display data so that the entire display area 12 hides a part of the display area 11. The present embodiment differs only in that the CPU 1 combines the first display data with the second display data so that the display area 12A of the second display data does not overlap the display area 11A of the first display data; the other configurations and operations are the same as those of the first embodiment.
  • FIG. 9 is a diagram showing a display example of the display areas 11A and 12A according to the second embodiment of the present invention.
  • the CPU 1 reduces the first display data to a similar shape and converts it into second display data, displays the first display data in the display area 11A in the upper-left part of the display 4 and the second display data in the display area 12A in the lower-right corner of the display 4, combines the first display data with the second display data so that the display area 12A does not overlap the display area 11A, and displays the combined display data on the display screen of the display 4.
  • the user can perform the same operation as that performed in the display area 11A by performing an operation in the display area 12A.
  • FIG. 10 is a diagram showing a display example of the display areas 11B and 12B according to the first modification of the second embodiment of the present invention.
  • the present modification differs only in that the CPU 1 combines the first display data with the second display data so that a part of the display area 12B overlaps a part of the display area 11B; the other configurations and operations are the same as those of the second embodiment.
  • the CPU 1 reduces the first display data to a similar shape and converts it into second display data, displays the first display data in the display area 11B in the upper-left part of the display 4 and the second display data in the display area 12B in the lower-right corner of the display 4, combines the first display data with the second display data so that a part of the display area 12B overlaps a part of the display area 11B, and displays the combined display data on the display 4.
  • the user can perform the same operation as when operating in the display area 11B by performing an operation in the display area 12B.
  • FIG. 11 is a diagram showing a display example of the display areas 11C and 12C according to the second modification of the second embodiment of the present invention.
  • compared with the second embodiment, the only difference is that the CPU 1 displays the first and second display data in the display areas 11C and 12C, which have the same size and do not overlap each other; the other configurations and operations are the same as those of the second embodiment.
  • the CPU 1 converts the first display data into second display data having the same shape as the first display data, displays the first display data in the display area 11C in the upper-left part of the display 4 and the second display data in the display area 12C in the lower-right corner of the display 4, combines the first display data with the second display data so that the display area 12C does not overlap the display area 11C, and displays the combined display data on the display 4.
  • the user can perform the same operation as that performed in the display area 11C by performing an operation in the display area 12C.
  • display data having the same display size as the entire display area of the display 4 may be reduced to a similar shape in advance and converted into first display data having the same display size as the display area 11A, 11B, or 11C.
  • in the above embodiments, the CPU 1 reduced the first display data to a similar shape and converted it into second display data, or converted it into second display data having the same shape as the first display data; however, the present invention is not limited to this, and it is sufficient that an operation in the display area of the first display data can be performed by the user operating within the display area of the second display data.
  • FIG. 12 is a flowchart showing step S3A of the control process according to the third embodiment of the present invention.
  • the control process executed by the CPU 1 in the present embodiment is obtained by replacing step S3 of the control process of FIG. 3 with step S3A; the other steps S1 to S2 and S4 to S12 are the same as in FIG. 3.
  • in step S3A, the CPU 1 reduces the first display data to a non-similar shape and converts it into second display data; at this time, the display content of the first display data, such as the application icons 51 to 61, is reduced to a non-similar shape.
  • in step S4, the CPU 1 combines the first display data with the second display data and displays the combined display data on the display 4. Specifically, the CPU 1 combines the first display data with the second display data so that the first display data is displayed in the display area 11D at the top of the display screen of the display 4 and the second display data is displayed in the display area 12D at the bottom of the display screen (the display areas 11D and 12D do not overlap each other), thereby generating the combined display data.
  • FIG. 13 is a diagram showing a display example of the display 4 in step S3A of FIG.
  • application icons 51, 52, 53, 54,... are arranged in a grid pattern in the display area 11D.
  • application icons 71, 72, 73, 74, ..., corresponding to the application icons 51, 52, 53, 54, ..., are rearranged in the display area 12D in a single line.
  • when the finger coordinates (xf, yf) are within one of the application icons 71, 72, 73, ... in the display area 12D, the CPU 1 converts the finger coordinates (xf, yf) into predetermined coordinates within the icon, among the application icons 51, 52, 53, ... in the display area 11D, that corresponds to the icon containing the coordinates (xf, yf), and displays an indicator such as the finger icon IC1 at the converted coordinates.
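  • A sketch of the non-similar rearrangement described above, under the assumption that icons are simply re-indexed from a grid in the display area 11D to a single row in the display area 12D; the geometry helpers and all dimensions are invented for illustration.

```python
def row_index(pf: float, icon_w: float) -> int:
    """Which icon of the single-line display area 12D contains pf."""
    return int(pf // icon_w)

def grid_position(i: int, cols: int,
                  cell_w: float, cell_h: float) -> tuple[float, float]:
    """Predetermined coordinates (here: the cell center) of icon i in
    the grid of the display area 11D."""
    row, col = divmod(i, cols)
    return (col + 0.5) * cell_w, (row + 0.5) * cell_h

# a tap at pf = 130 in a row of 60-px-wide icons falls on icon 2;
# the finger icon IC1 is then drawn at that icon's grid-cell center
x, y = grid_position(row_index(130, 60), cols=4, cell_w=120, cell_h=120)
```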
  • the user can perform the same operation as when operating in the display area 11D by performing an operation in the display area 12D.
  • the display area 12D does not overlap the display area 11D, but the present invention is not limited to this, and at least a part of the display area 12D may overlap the display area 11D.
  • FIG. 14 is a diagram showing a display example of the icon IC2 according to the fourth embodiment of the present invention.
  • in the above embodiments, the actual finger positions in the display areas 12, 12A, 12B, 12C, and 12D corresponded one-to-one to the positions of indicators such as the finger icon IC1 in the display areas 11, 11A, 11B, 11C, and 11D.
  • in the present embodiment, when the CPU 1 detects a finger movement operation within the display area 12 based on the detection signal S2 from the touch panel 2, it displays the cursor IC2 in the display area 11 so that the cursor IC2 moves in the direction of the detected movement operation by a movement amount corresponding to that of the movement operation.
  • the user can use the display area 12 as a touch pad.
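  • A minimal sketch of this touch-pad style mapping: successive finger positions in the display area 12 move the cursor IC2 by the corresponding delta rather than to an absolute position. The class name and the gain parameter are assumptions; the patent only states that the cursor moves in the movement direction by a corresponding movement amount.

```python
class TouchpadCursor:
    """Moves the cursor IC2 by relative deltas (fourth embodiment)."""

    def __init__(self, x: float, y: float, gain: float = 1.0):
        self.x, self.y = x, y  # current IC2 position in the display area 11
        self.gain = gain       # assumed scaling of the movement amount
        self.last = None       # last finger position in the display area 12

    def on_move(self, pf: float, qf: float) -> tuple[float, float]:
        if self.last is not None:
            dx, dy = pf - self.last[0], qf - self.last[1]
            self.x += dx * self.gain  # move in the movement direction
            self.y += dy * self.gain  # by a corresponding movement amount
        self.last = (pf, qf)
        return self.x, self.y

    def on_release(self) -> None:
        self.last = None  # next touch starts a new relative gesture
```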
  • in the present embodiment, the first display data is converted into second display data and displayed in the display area 12; however, the present invention is not limited to this, and display data other than the first and second display data may be displayed in the display area 12.
  • FIG. 15 is a diagram showing a display example of the display areas 11, 12R and 12L according to the fifth embodiment of the present invention.
  • in the above embodiments, the CPU 1 converted the first display data into one piece of second display data.
  • in the present embodiment, the CPU 1 reduces the first display data to a similar shape and converts it into two pieces of second display data. The CPU 1 then combines the first display data with the two pieces of second display data so that the first display data is displayed in the display area 11 occupying the entire display screen of the display 4, one piece of second display data is displayed in the display area 12R in the lower-right corner of the display 4, and the other piece is displayed in the display area 12L in the lower-left corner of the display 4.
  • the display areas 12R and 12L are used for operations with the right hand and the left hand, respectively.
  • in the present embodiment, the touch panel 2 detects pressure in addition to the finger coordinates (xf, yf) and the operation content, and generates and outputs to the CPU 1 a detection signal S2 including the finger coordinates (xf, yf), the operation content, and the pressure. Based on the detection signal S2, the CPU 1 selects whichever of the display areas 12R and 12L is pressed harder and, based on the user's operation within the selected display area, displays the finger icon IC1 in the display area 11 as in the first embodiment (see FIG. 15).
  • since the operation in whichever of the display areas 12R and 12L is pressed harder is adopted, only the operation of one finger becomes active even when the user touches the display 4 with fingers of both the left and right hands. The information terminal device 100 can therefore be operated with a finger of either hand.
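  • A sketch of the pressure-based arbitration between the display areas 12R and 12L: of the simultaneous touches, only the one with the greater pressure is adopted. The representation of touches as (area_id, pressure) pairs is an assumption made for illustration.

```python
def active_area(touches: list[tuple[str, float]]) -> str | None:
    """Adopt only the harder press among the display areas 12R/12L."""
    if not touches:
        return None
    return max(touches, key=lambda t: t[1])[0]

# e.g. the right thumb pressing harder than the left one wins
assert active_area([("12L", 0.4), ("12R", 0.7)]) == "12R"
```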
  • in the present embodiment, the CPU 1 selects whichever of the display areas 12R and 12L is pressed harder and displays the finger icon IC1 based on the user's operation within the selected display area; however, the present invention is not limited to this. For example, the CPU 1 may select, of the display areas 12R and 12L, the display area touched first, or the display area selected by a predetermined operation such as operating the button 9.
  • FIG. 16 is a block diagram showing a configuration of an information terminal device 100A according to the sixth embodiment of the present invention.
  • the information terminal device 100A includes a display instruction unit 200, a display control unit 201, an operation unit 202, a monitor 102, a conversion unit 203, and a processing unit 204.
  • the monitor 102 corresponds to the interface 3 and the display 4 in FIG.
  • the operation unit 202 corresponds to the touch panel 2 in FIG. 1
  • the display instruction unit 200 corresponds to the buttons 7 to 9 and the touch panel 2 in FIG.
  • the display control unit 201, the conversion unit 203, and the processing unit 204 correspond to the CPU 1 in FIG. 1, and execute the control process of the CPU 1 according to each of the above-described embodiments and modifications thereof.
  • when the user operates with a finger within the display area 11, 11A, 11B, 11C, 11D, 12, 12A, 12B, 12C, 12D, 12L, or 12R of the display screen of the monitor 102, the operation means 202 detects the position and movement of the finger and determines the user's operation content from the detected position and movement. For example, when the user taps the application icon 51 displayed in the display area 11 of FIG. 2, the operation means 202 determines that the user's operation content is an instruction to start the application corresponding to the application icon 51.
  • the processing unit 204 executes processing corresponding to the operation content according to the operation content determined by the operation unit 202.
  • for example, when the user's operation content is an instruction to start the application corresponding to the application icon 51, the processing means 204 actually executes the program of that application.
  • when an instruction to display the display area 12 is given, the display instruction means 200 notifies the display control means 201 of the instruction. For example, the display area 12 is displayed when the user touches a button for displaying the display area 12 (for example, the application icon 52 of FIG. 2) shown in the display area 11.
  • when notified by the display instruction means 200 to display the display areas 12, 12A, 12B, 12C, 12D, or 12L and 12R, the display control means 201 displays them on the display screen of the monitor 102. For example, the display area 12 is displayed on the display screen of the monitor 102 so as to hide a part of the display area 11.
  • the conversion unit 203 generates the second display data to be displayed in the display area 12 by reducing the display content of the display area 11 as it is.
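  • Purely as an illustration of the conversion means 203, the sketch below reduces the display content of the display area 11 as-is to produce the second display data; the class name is invented, and the display data is assumed to be a bitmap object with a PIL-style resize method.

```python
class ConversionMeans:
    """Conversion means 203: similar-shape reduction of the display
    content of the display area 11 into the second display data."""

    def __init__(self, scale: float):
        self.scale = scale  # assumed similar-shape reduction ratio

    def convert(self, first_display_data):
        # first_display_data: assumed bitmap with .size and .resize(),
        # e.g. a PIL.Image.Image
        w, h = first_display_data.size
        return first_display_data.resize((int(w * self.scale),
                                          int(h * self.scale)))
```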
  • This embodiment has the same effects as the first to fifth embodiments and their modifications.
  • in the above embodiments, the second display data is displayed in the display areas 12, 12A, 12B, 12C, 12D, 12L, and 12R in the lower part of the display 4; however, the present invention is not limited to this, and the second display data may be displayed in any area of the display 4.
  • preferably, however, the first display data is combined with the second display data so that the display area of the second display data is located at the lower part of the display screen of the display 4; this allows the user to touch the display area of the second display data with the hand holding the device.
  • the present invention has been described by taking the information terminal device 100 as an example, but the present invention is not limited to this.
  • the present invention can be applied to any electronic device provided with detection means, such as the touch panel 2, that detects an operation on a display screen such as the display 4, making it possible to provide an electronic device in which the entire display screen can be operated by operating a sub-screen displayed at the position of the user's fingers. For this reason, according to the present invention, even a tablet-type electronic device that the user is generally assumed to hold can be operated with the holding hand.
  • as described above, according to the present invention, predetermined first display data is converted into at least one piece of second display data, the first display data is combined with each piece of second display data, the combined display data is displayed on the display screen, and control means is provided that executes a predetermined process when the detection means detects, within the display area of one of the at least one piece of second display data, an operation instructing execution of that process. The user can therefore perform the same operation as an operation performed in the display area of the first display data simply by operating within the display area of the second display data, making it possible to provide an electronic device with better operability than the prior art.
  • 1 ... CPU, 2 ... touch panel, 3 ... interface, 4 ... display, 5 ... ROM, 6 ... DRAM, 7 to 9 ... buttons, 10 ... housing, 11, 11A, 11B, 11C, 11D, 12, 12A, 12B, 12C, 12D, 12L, 12R ... display areas, 51-61, 71-81 ... application icons, 100, 100A ... information terminal devices, 102 ... monitor, 200 ... display instruction means, 201 ... display control means, 202 ... operation means, 203 ... conversion means, 204 ... processing means, IC1, IC3 ... finger icons, IC2 ... cursor.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention converts first display data containing application icons (51 to 61) into second display data by reducing its size while keeping a similar shape, and combines the first display data with the second display data so that the first display data is displayed in a display area (11) and the second display data is displayed in a display area (12). When a touch is detected in the display area (12), the touched position is converted into the corresponding position in the display area (11), and a finger-shaped icon (IC1) is displayed at the converted position.
PCT/JP2011/005929 2010-12-07 2011-10-24 Dispositif électronique WO2012077273A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US42046110P 2010-12-07 2010-12-07
US61/420,461 2010-12-07
JP2010-272463 2010-12-07
JP2010272463 2010-12-07

Publications (1)

Publication Number Publication Date
WO2012077273A1 (fr) 2012-06-14

Family

ID=46206789

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/005929 WO2012077273A1 (fr) 2010-12-07 2011-10-24 Dispositif électronique

Country Status (1)

Country Link
WO (1) WO2012077273A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05330289A (ja) * 1992-05-29 1993-12-14 Hitachi Software Eng Co Ltd 電子黒板装置
JPH11259237A (ja) * 1998-03-12 1999-09-24 Ricoh Co Ltd 画像表示装置
JP2006018348A (ja) * 2004-06-30 2006-01-19 Hitachi Ltd 大画面ディスプレイを用いたときの、入力・表示システムとその方法
JP2009064209A (ja) * 2007-09-06 2009-03-26 Sharp Corp 情報表示装置
JP2009122837A (ja) * 2007-11-13 2009-06-04 Sharp Corp 情報表示装置、情報表示方法、プログラム及び記録媒体

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9244544B2 (en) 2011-07-29 2016-01-26 Kddi Corporation User interface device with touch pad enabling original image to be displayed in reduction within touch-input screen, and input-action processing method and program
JP2013030050A (ja) * 2011-07-29 2013-02-07 Kddi Corp スクリーンパッドによる入力が可能なユーザインタフェース装置、入力処理方法及びプログラム
WO2013018480A1 (fr) * 2011-07-29 2013-02-07 Kddi株式会社 Dispositif d'interface utilisateur comprenant un pavé tactile pour réduire et afficher une image source dans un écran permettant une entrée tactile, procédé et programme de traitement d'entrée
KR102019116B1 (ko) 2012-07-04 2019-09-06 엘지전자 주식회사 휴대 단말기 및 그 제어 방법
KR20140005481A (ko) * 2012-07-04 2014-01-15 엘지전자 주식회사 휴대 단말기 및 그 제어 방법
EP2685369A3 (fr) * 2012-07-12 2015-07-01 Samsung Electronics Co., Ltd Procédé et dispositif mobile permettant de régler la taille de la fenêtre d'entrée tactile
CN102915201A (zh) * 2012-09-17 2013-02-06 广东欧珀移动通信有限公司 一种大屏幕触控手机的单手操作方法
EP2752753A3 (fr) * 2013-01-02 2016-11-30 Samsung Display Co., Ltd. Terminal et son procédé de fonctionnement
JP2014153948A (ja) * 2013-02-08 2014-08-25 International Business Maschines Corporation 制御装置及び制御プログラム
JP2014178790A (ja) * 2013-03-14 2014-09-25 Ricoh Co Ltd 投影システム、投影装置、投影プログラム及び投影方法
EP2799971A3 (fr) * 2013-05-03 2014-11-12 Samsung Electronics Co., Ltd. Procédé d'exploitation d'écran tactile et son dispositif électronique
US9652056B2 (en) 2013-05-03 2017-05-16 Samsung Electronics Co., Ltd. Touch-enable cursor control method and electronic device thereof
EP2799971A2 (fr) * 2013-05-03 2014-11-05 Samsung Electronics Co., Ltd. Procédé d'exploitation d'écran tactile et son dispositif électronique
EP2806339A1 (fr) * 2013-05-24 2014-11-26 Samsung Electronics Co., Ltd Procédé et appareil pour afficher une image sur un dispositif portable
US10691291B2 (en) 2013-05-24 2020-06-23 Samsung Electronics Co., Ltd. Method and apparatus for displaying picture on portable device
CN103488419A (zh) * 2013-08-26 2014-01-01 宇龙计算机通信科技(深圳)有限公司 通信终端的操作方法及通信终端
US9696882B2 (en) 2013-08-28 2017-07-04 Lenovo (Beijing) Co., Ltd. Operation processing method, operation processing device, and control method
CN104516654A (zh) * 2013-09-26 2015-04-15 联想(北京)有限公司 操作处理方法和装置
CN103543913A (zh) * 2013-10-25 2014-01-29 小米科技有限责任公司 一种终端设备操作方法、装置和终端设备
JP2015106418A (ja) * 2013-11-29 2015-06-08 株式会社 ハイヂィープ 仮想タッチパッド操作方法及びこれを行う端末機
US10073613B2 (en) 2013-12-03 2018-09-11 Huawei Technologies Co., Ltd. Processing method and apparatus, and terminal
CN104380238A (zh) * 2013-12-03 2015-02-25 华为技术有限公司 一种处理方法、装置及终端
CN103927080A (zh) * 2014-03-27 2014-07-16 小米科技有限责任公司 控制控件操作的方法和装置
CN104898976A (zh) * 2015-06-03 2015-09-09 北京百纳威尔科技有限公司 在移动设备上使用小屏窗口的方法及移动设备
JPWO2017022031A1 (ja) * 2015-07-31 2018-02-22 マクセル株式会社 情報端末装置
JP2019083965A (ja) * 2017-11-06 2019-06-06 株式会社カプコン ゲームプログラム、およびゲームシステム
CN113260426A (zh) * 2018-12-28 2021-08-13 株式会社万代南梦宫娱乐 游戏系统、处理方法以及信息存储介质
WO2021227628A1 (fr) * 2020-05-14 2021-11-18 华为技术有限公司 Dispositif électronique et son procédé d'interaction
WO2024114234A1 (fr) * 2022-11-30 2024-06-06 华为技术有限公司 Procédé de fonctionnement à une seule main et dispositif électronique

Similar Documents

Publication Publication Date Title
WO2012077273A1 (fr) Dispositif électronique
JP4372188B2 (ja) 情報処理装置および表示制御方法
US8638315B2 (en) Virtual touch screen system
JP5718042B2 (ja) タッチ入力処理装置、情報処理装置およびタッチ入力制御方法
JP5691464B2 (ja) 情報処理装置
TWI588734B (zh) 電子裝置及其操作方法
JP2011028524A (ja) 情報処理装置、プログラムおよびポインティング方法
TWI434202B (zh) 具觸控式螢幕的電子裝置及其顯示控制方法
EP2530573B1 (fr) Procédé de contrôle tactile et appareil électronique
JP5848732B2 (ja) 情報処理装置
JP5197533B2 (ja) 情報処理装置および表示制御方法
CA2766528A1 (fr) Processus convivial permettant d'entrer en interaction avec du contenu informationnel sur des dispositifs a ecran tactile
JP5780438B2 (ja) 電子機器、位置指定方法及びプログラム
JP2009276819A (ja) ポインティング装置の制御方法およびポインティング装置、並びにコンピュータプログラム
JP2011253252A (ja) 電子機器、及び入力制御方法
JP3850570B2 (ja) タッチパッド及びタッチパッドによるスクロール制御方法
WO2014006806A1 (fr) Dispositif de traitement d'informations
JP2008257629A (ja) タッチ式入力装置
JP5275429B2 (ja) 情報処理装置、プログラムおよびポインティング方法
KR20160019762A (ko) 터치 스크린 한손 제어 방법
JP5414134B1 (ja) タッチ式入力システムおよび入力制御方法
JP2011081447A (ja) 情報処理方法及び情報処理装置
KR101179584B1 (ko) 터치스크린내 가상마우스 구현방법 및 이를 수행하는 프로그램을 기록한 컴퓨터 판독 가능한 기록매체
JP4856136B2 (ja) 移動制御プログラム
JP5458130B2 (ja) 電子機器、及び入力制御方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11846786

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11846786

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP