WO2013047182A1 - Portable electronic device, touch operation processing method, and program - Google Patents
- Publication number
- WO2013047182A1 (PCT/JP2012/073138)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- touch sensor
- touch
- touch operation
- display
- sensor detects
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
- G06F1/1643—Details related to the display arrangement, including those related to the mounting of the display in the housing the display being associated to a digitizer, e.g. laptops that can be used as penpads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
- G06F1/1647—Details related to the display arrangement, including those related to the mounting of the display in the housing including at least an additional display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/169—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
- G06F1/1692—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes the I/O peripheral being a secondary touch screen used as control interface, e.g. virtual buttons or sliders
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
- G06F3/04855—Interaction with scrollbars
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/02—Constructional features of telephone sets
- H04M1/0202—Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
- H04M1/026—Details of the structure or mounting of specific components
- H04M1/0266—Details of the structure or mounting of specific components for a display module assembly
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/02—Constructional features of telephone sets
- H04M1/23—Construction or mounting of dials or of equivalent devices; Means for facilitating the use thereof
- H04M1/236—Construction or mounting of dials or of equivalent devices; Means for facilitating the use thereof including keys on side or rear faces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/22—Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
Definitions
- the present invention relates to a portable electronic device, a touch operation processing method, and a program.
- the portable electronic device displays a mode setting screen in response to a user operation, and a method of switching modes in accordance with user operations on that mode setting screen is generally used.
- in order to switch modes, the user needs to perform an operation requesting display of the mode setting screen and then a further operation requesting the mode switch itself. Because a plurality of operations is required, the procedure is troublesome for the user and the setting takes time.
- the mobile phone described in Patent Document 1 includes a memory unit that holds input data, a secret mode state holding unit that holds a secret mode state, a determination unit that determines whether or not a secret attribute is given to the data, and a control unit that controls operation.
- when the secret attribute is given to the data, the control unit switches the content of the secret mode state holding unit to the secret mode, and the data to which the secret attribute is given is hidden.
- when the model of a mobile phone is changed and data is transferred from the old model to the new model, data that the user hid on the old model can reportedly be hidden on the new model as easily as on the old one.
- because the mobile phone described in Patent Document 1 determines whether to switch to the secret mode based on whether the secret attribute is given to the data, it may switch to the secret mode against the user's intention. For example, even if the user gave data the secret attribute in advance merely to make switching to the secret mode possible, intending to normally use the phone in the normal mode, the phone nevertheless switches to the secret mode. Furthermore, the method disclosed in Patent Document 1 cannot perform mode switching other than to the secret mode.
- An object of the present invention is to provide a portable electronic device, a touch operation processing method, and a program that can solve the above-described problems.
- a portable electronic device according to the present invention includes a first touch sensor, a second touch sensor, and a processing unit that performs a first process when the first touch sensor detects a touch operation, and performs, in place of the first process, a second process related to the first process when the first touch sensor detects a touch operation and the second touch sensor also detects a touch operation.
- a touch operation processing method according to the present invention is a touch operation processing method for a portable electronic device including a first touch sensor and a second touch sensor, in which a first process is performed when the first touch sensor detects a touch operation, and, when the first touch sensor detects a touch operation and the second touch sensor also detects a touch operation, a second process related to the first process is performed instead of the first process.
- a program according to the present invention causes a computer serving as a portable electronic device including a first touch sensor and a second touch sensor to execute a first process when the first touch sensor detects a touch operation, and to execute, when the first touch sensor detects a touch operation and the second touch sensor also detects a touch operation, a second process related to the first process instead of the first process.
- the present invention can reduce the user's effort while switching modes in a way that reflects the user's intention.
- FIG. 5 is a flowchart illustrating a processing procedure when the mobile terminal device selects and executes a processing mode in the embodiment.
- the present invention can be applied to various mobile terminal devices such as mobile phones and mobile information terminal devices.
- the application range of the present invention is not limited to the portable terminal device.
- the present invention can be applied to various portable information devices such as a stand-alone (that is, not a terminal) game machine and an electronic dictionary.
- FIG. 1 is a schematic block diagram showing a functional configuration of a mobile terminal device according to the first embodiment of the present invention.
- the mobile terminal device 100 includes a first display unit 111, a second display unit 112, a first touch sensor 121, a second touch sensor 122, an audio input unit 131, an audio output unit 132, a wireless communication unit 140, a control unit 180, and a storage unit 190.
- the control unit 180 includes a display control unit 181, an input processing unit 182, an audio processing unit 183, a communication control unit 184, and a processing unit 185.
- the portable terminal device 100 is, for example, a portable information terminal device, and provides various functions such as an Internet browsing function and an electronic mail function according to a user operation.
- the first display unit 111 has a display screen such as a liquid crystal display or an organic EL (Organic Electro-Luminescence) display, and displays various images such as moving images, still images, and text (characters) under the control of the display control unit 181.
- the display screen here is a device that displays an image. In the following, in order to distinguish the screen as a device that displays an image from the screen as an image displayed on that device, the device is referred to as a "display screen" and the image displayed on the display screen is referred to as a "screen image".
- the first display unit 111 is an example of the display unit in the present invention, and the display screen of the first display unit 111 is an example of the display screen in the present invention.
- the second display unit 112 likewise has a display screen such as a liquid crystal display or an organic EL (Organic Electro-Luminescence) display, and displays various images such as moving images, still images, and text (characters) under the control of the display control unit 181.
- the first touch sensor 121 is provided on the display screen of the first display unit 111 and accepts a user's touch operation. That is, the display screen of the first display unit 111 and the first touch sensor 121 constitute a touch panel. The first touch sensor 121 detects a touch operation on the display screen of the first display unit 111. Then, the first touch sensor 121 outputs a signal indicating a touch position (a touched position on the display screen) to the input processing unit 182.
- the second touch sensor 122 is provided on the display screen of the second display unit 112 and accepts a user's touch operation. That is, the display screen of the second display unit 112 and the second touch sensor 122 constitute a touch panel. However, the second touch sensor 122 may be a touch sensor of a touch pad provided on the rear surface of the casing of the mobile terminal device 100. That is, the mobile terminal device 100 may not include the second display unit 112, and thus may not include a display screen on the back surface of the housing, and may include a touch pad on the back surface of the housing.
- the second touch sensor 122 detects a touch operation on the display screen of the second display unit 112. Then, the second touch sensor 122 outputs a signal indicating the touch position to the input processing unit 182.
- the audio input unit 131 has a microphone, collects ambient sounds, converts them into audio signals, and outputs them to the audio processing unit 183.
- the audio output unit 132 includes a speaker, and converts the audio signal output from the audio processing unit 183 as an analog electric signal into audio and outputs the audio.
- the wireless communication unit 140 communicates with a wireless base station and connects to a mobile phone communication network (a wireless communication network for mobile phones provided by a communication carrier). Specifically, the wireless communication unit 140 performs modulation processing on the signal output from the communication control unit 184 and transmits the signal as a wireless signal, and performs demodulation processing on the received wireless signal to perform communication. The data is output to the control unit 184. For example, the wireless communication unit 140 transmits and receives e-mail data using a wireless signal.
- the control unit 180 controls each unit of the mobile terminal device 100 to execute various functions.
- the control unit 180 is realized, for example, by a CPU (Central Processing Unit) provided in the mobile terminal device 100 reading a program from a memory provided in the mobile terminal device 100 and executing it.
- the display control unit 181 controls the first display unit 111 and the second display unit 112 to display various images on each. Specifically, the display control unit 181 generates a screen display signal based on moving image data, still image data, text data, or the like output from the processing unit 185, and outputs the screen display signal to the first display unit 111. A screen image is displayed on the first display unit 111. Similarly, the display control unit 181 generates a screen display signal, outputs it to the second display unit 112, and causes the second display unit 112 to display a screen image.
- the input processing unit 182 outputs a signal corresponding to the operation received by the first touch sensor 121 or the second touch sensor 122 to the processing unit 185. For example, the input processing unit 182 outputs a signal indicating the touch position on the display screen of the first display unit 111 from the first touch sensor 121 in a state where the first display unit 111 displays an image component. It is determined whether the image component is touched. When the input processing unit 182 determines that the image component has been touched, the input processing unit 182 outputs information indicating the touched image component to the processing unit 185.
- the image component here is a partial image as a component constituting the screen image.
- an icon or a moving image or a still image displayed in the screen image corresponds to the image component.
- the icon here is an image symbolizing an object to be selected or designated, such as a file, folder, application program or function.
- the audio processing unit 183 converts the audio data output from the processing unit 185 into an electrical signal and outputs the electric signal to the audio output unit 132, thereby causing the audio output unit 132 to output audio.
- the audio processing unit 183 converts the electrical signal, which the audio input unit 131 outputs after collecting sound, into audio data and outputs the audio data to the processing unit 185.
- the communication control unit 184 performs processing such as encoding on the data output from the processing unit 185, outputs the data to the wireless communication unit 140, modulates the data, and transmits the data as a wireless signal.
- the communication control unit 184 performs processing such as decoding on the signal received and demodulated by the wireless communication unit 140 to extract data, and outputs the data to the processing unit 185.
- for example, the communication control unit 184 encodes e-mail data output from the processing unit 185 and outputs it to the wireless communication unit 140, and also decodes signals received and demodulated by the wireless communication unit 140 to extract data such as e-mail data, which it outputs to the processing unit 185.
- the processing unit 185 executes an application program and provides various functions such as an Internet browsing function and an electronic mail function.
- the processing unit 185 performs the first process when the first touch sensor 121 detects a touch operation, and performs, instead of the first process, a second process related to the first process when the first touch sensor 121 detects a touch operation and the second touch sensor 122 also detects a touch operation.
- for example, when the first display unit 111 displays a screen image containing an image component and the first touch sensor 121 detects a touch operation as an operation for changing the position of the image component, the processing unit 185 performs, as the first process, a process of changing the position of the image component in the screen image.
- when the first touch sensor 121 detects the touch operation as an operation for changing the position of the image component and the second touch sensor 122 also detects a touch operation, the processing unit 185 performs, as the second process, a process of scrolling the screen image.
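The selection between the first and second process described above can be sketched as follows. This is a minimal illustration assuming boolean inputs for whether each sensor currently detects a touch; the function and return-value names (`process_touch`, `"move_icon"`, `"scroll_screen"`) are hypothetical, not taken from the patent.

```python
# Hypothetical sketch of the dispatch performed by the processing unit 185.
def process_touch(front_touched: bool, back_touched: bool) -> str:
    """Select which process to perform from the two sensors' states."""
    if front_touched and back_touched:
        return "scroll_screen"  # second process: scroll the screen image
    if front_touched:
        return "move_icon"      # first process: move the image component
    return "no_op"              # no touch on the first (front) sensor
```

The second process replaces, rather than supplements, the first: a single touch state maps to exactly one process.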
- as another example, the processing unit 185 performs user authentication when the first touch sensor 121 detects a touch operation as a user authentication request operation and, when the authentication succeeds, performs, as the first process, a process of permitting a predetermined program to access first data. When the first touch sensor 121 detects a touch operation as a user authentication request operation and the second touch sensor 122 also detects a touch operation, the processing unit 185 performs user authentication and, when it succeeds, performs, as the second process, a process of permitting the predetermined program to access second data.
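As a hedged sketch, this authentication variant might look like the following; `credentials_ok` stands in for the actual user authentication, and the data-set names are assumptions for illustration.

```python
# Illustrative sketch only: which data a predetermined program may access
# after authentication depends on whether the second (rear) touch sensor
# was also touched during the authentication request operation.
def accessible_data(credentials_ok: bool, back_touched: bool) -> set:
    if not credentials_ok:
        return set()                 # authentication failed: no access
    if back_touched:
        return {"second_data"}       # second process: e.g. secret data
    return {"first_data"}            # first process: e.g. ordinary data
```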
- as yet another example, when the first touch sensor 121 detects a touch operation as a scroll operation, the processing unit 185 performs, as the first process, a process of scrolling the screen image by a set scroll amount. When the first touch sensor 121 detects the touch operation as the scroll operation and the second touch sensor 122 also detects a touch operation, the processing unit 185 performs, as the second process, a process of changing the setting of the scroll amount; thereafter, the first process scrolls the screen image by the newly set scroll amount.
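A minimal sketch of this scroll-amount variant, assuming a fixed cycle of selectable amounts; the class name, the amount values, and the cycling behaviour are all illustrative assumptions.

```python
# Illustrative sketch: touching the rear sensor during a scroll operation
# changes the scroll-amount setting (second process); otherwise the screen
# scrolls by the currently set amount (first process).
class Scroller:
    def __init__(self, amounts=(1, 5, 20)):   # assumed selectable amounts
        self._amounts = amounts
        self._index = 0
        self.position = 0                     # current scroll position

    @property
    def amount(self):
        return self._amounts[self._index]

    def on_scroll(self, back_touched: bool):
        if back_touched:
            # second process: advance to the next scroll-amount setting
            self._index = (self._index + 1) % len(self._amounts)
        else:
            # first process: scroll by the set amount
            self.position += self.amount
```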
- the processing unit 185 performs the first process when the first touch sensor 121 detects a touch operation, performs the second process when the first touch sensor 121 detects the touch operation and the second touch sensor 122 also detects a touch operation, and, when at least one of the first touch sensor 121 and the second touch sensor 122 detects touch operations at a plurality of points, performs a third process in place of, or in addition to, the second process.
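Extending the earlier idea, the third process could be selected from the number of simultaneous touch points each sensor reports; the function name, return values, and the rule that a multi-point touch on either sensor replaces the second process are assumptions for illustration.

```python
# Illustrative sketch: dispatch among first, second, and third process
# based on how many touch points each sensor detects.
def select_process(front_points: int, back_points: int) -> str:
    if front_points == 0:
        return "none"                # the first sensor saw no touch
    if front_points >= 2 or back_points >= 2:
        return "third"               # multiple touch points on a sensor
    if back_points == 1:
        return "second"              # both sensors touched at one point
    return "first"                   # front sensor only
```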
- the storage unit 190 is realized by a storage area of a memory included in the mobile terminal device 100, for example, and stores various data.
- the storage unit 190 stores telephone directory information. This telephone directory information will be described later with reference to FIG.
- the storage unit 190 stores various programs executed by the CPU included in the mobile terminal device 100 in advance.
- FIG. 2 is a perspective view showing an outline of the outer shape of the mobile terminal device 100 as viewed from the front side.
- on the front surface of the casing of the mobile terminal device 100 are provided a touch-panel display screen, in which the display screen of the first display unit 111 is combined with the first touch sensor 121, the microphone of the audio input unit 131, and the speaker of the audio output unit 132.
- FIG. 3 is a perspective view showing an outline of the outer shape of the mobile terminal device 100 as viewed from the back side.
- a touch-panel display screen, in which the display screen of the second display unit 112 is combined with the second touch sensor 122, is provided on the rear surface of the casing of the mobile terminal device 100.
- FIG. 4 is a perspective view of the state in which the mobile terminal device 100 is gripped with the right hand, as viewed from the front side of the mobile terminal device 100.
- the first touch sensor 121 detects the touch operation and outputs the coordinates of the point P101 as the touch position.
- here, a region having a small area is referred to as a "point".
- more precisely, the first touch sensor 121 has mesh-like detection points and outputs the coordinates of the detection points included in the region of the point P101. The same applies to the point P102 and the second touch sensor 122 described below.
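One plausible way to reduce the mesh detection points covered by a fingertip to a single representative touch position is to take their centroid; this is an assumption for illustration, not a method stated in the patent.

```python
# Illustrative sketch: collapse the grid detection points included in the
# touched region into one representative (x, y) touch position.
def touch_position(detection_points):
    if not detection_points:
        return None                  # nothing touched
    xs = [x for x, _ in detection_points]
    ys = [y for _, y in detection_points]
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```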
- FIG. 5 is a perspective view of the back side of the mobile terminal device 100 viewed from the front side of the mobile terminal device 100 in a state where the mobile terminal device 100 is held with the right hand.
- the second touch sensor 122 detects the touch operation and outputs the coordinates of the point P102 as the touch position.
- the user can hold the mobile terminal device 100 with one hand and touch the display screen of the first display unit 111 and the display screen of the second display unit 112, and the first touch sensor 121 and the second touch sensor 122 detect these touch operations.
- the point P101 in FIG. 4 and the point P102 in FIG. 5 may be positions facing each other or may be shifted.
- the first touch sensor 121 and the second touch sensor 122 can detect touch operations performed in various forms on the display screen of the first display unit 111 and the display screen of the second display unit 112, and are not limited to the touch operation described above.
- for example, the user can hold the mobile terminal device 100 with the left hand, perform a touch operation on the display screen of the first display unit 111 with the right index finger, and perform a touch operation on the display screen of the second display unit 112 with the left index finger. In this case as well, the first touch sensor 121 and the second touch sensor 122 detect these touch operations.
- the casing of the mobile terminal device 100 may be foldable, with the display screen of the first display unit 111 and the display screen of the second display unit 112 arranged one above the other on the front side when the casing is open.
- the user performs a touch operation on the display screen of the first display unit 111 with the right hand index finger, holds the mobile terminal device 100 with the left hand, and touches the display screen of the second display unit 112 with the left hand thumb. Touch operation can be performed.
- the first touch sensor 121 and the second touch sensor 122 detect these touch operations.
- the processing mode is a mode indicating which of a plurality of processes (first process and second process) having relevance to each other is performed.
- FIG. 6 is an explanatory diagram illustrating an example of a screen image when the position of the image component is changed.
- FIG. 6A shows a screen image of the first display unit 111 before the position of the image component is changed, and FIG. 6B shows the screen image of the first display unit 111 after the position of the image component is changed.
- an icon C101 which is an example of an image component and a slide bar G101 indicating a scroll position of the screen image (a position of the screen image in the entire display target image) are shown.
- comparing the screen image of FIG. 6B with that of FIG. 6A, the position of the icon C101 in the screen image has changed, while the scroll position of the screen image in FIG. 6B is the same as in FIG. 6A.
- accordingly, the display position of the icon C101 changes, while the position of the slide bar G101 in FIG. 6B is the same as its position in FIG. 6A.
- when the first touch sensor 121 detects a touch operation as an operation for changing the position of the image component and the second touch sensor 122 does not detect a touch operation, the processing unit 185 performs the process illustrated in FIG. 6. More specifically, when the image component C101 is dragged and dropped as the operation for changing its position, the first touch sensor 121 sequentially detects the changing touch position, and the input processing unit 182 sequentially outputs, to the processing unit 185, information indicating that the image component C101 is being dragged and dropped and information indicating the touch position.
- the processing unit 185 sequentially outputs an indication of the position of the image component C101 in the screen image to the display control unit 181 based on the information output from the input processing unit 182. Then, the display control unit 181 sequentially changes the position of the image component C101 in the screen image of the first display unit 111 according to the instruction output from the processing unit 185.
- various operations can be used as the operation of changing the position of the image component C101.
- for example, drag and drop can be used: in the screen image of FIG. 6A, the finger touches the image component C101, moves along the display screen while touching it to the position of the image component C101 in the screen image of FIG. 6B (the position to which the user wants to move it), and is then released from the display screen.
- operations other than drag and drop may also be used, such as flicking the image component: in the screen image of FIG. 6A, the finger touches the image component C101 and flicks it, rubbing the display screen toward the position of the image component C101 in the screen image of FIG. 6B (the direction in which the user wants to move it).
- FIG. 7 is an explanatory diagram illustrating an example of a screen image when the screen image is scrolled.
- the screen image in FIG. 7A is the same as that in FIG. 6A, showing the icon C101, which is an example of an image component, and the slide bar G101, which indicates the scroll position of the screen image.
- in FIG. 7B, the display position of the icon C101 has changed, as in FIG. 6B.
- however, unlike the case of FIG. 6B, the scroll position has also changed from that in FIG. 7A; therefore, the position of the slide bar G101 in FIG. 7B has changed from its position in FIG. 7A.
- the icon moving process shown in FIG. 6 and the scrolling process shown in FIG. 7 are both related in that the icon display position is moved.
- the icon moving process shown in FIG. 6 and the scrolling process shown in FIG. 7 differ from each other in whether only the icon is moved or the entire image displayed on the display screen is moved.
- when the first touch sensor 121 detects a touch operation as an operation for changing the position of the image component and the second touch sensor 122 also detects a touch operation, the processing unit 185 performs the process illustrated in FIG. 7. More specifically, when the image component C101 is dragged and dropped, the first touch sensor 121 sequentially detects the touch position, and the input processing unit 182 sequentially outputs, to the processing unit 185, information indicating that the image component C101 is being dragged and dropped and information indicating the touch position. Meanwhile, while the image component C101 is being dragged and dropped, the display screen of the second display unit 112 is also touched, the second touch sensor 122 sequentially detects that touch position, and the input processing unit 182 sequentially outputs information indicating it to the processing unit 185.
- Based on the information output from the input processing unit 182, the processing unit 185 sequentially outputs to the display control unit 181 an instruction to scroll the screen image (information indicating which portion of the display target image is to be displayed as the screen image).
- the display control unit 181 scrolls the screen image of the first display unit 111 according to the instruction output from the processing unit 185.
- The process in FIG. 6 and the process in FIG. 7 are examples of two processing modes in the process of changing the position of the image component C101. That is, the process in FIG. 6 and the process in FIG. 7 are common (related) in that both change the display position of the image component C101, and differ from each other in whether only the image component C101 is moved or the entire image displayed on the display screen is moved.
- the process for changing the display position of the image component C101 corresponds to the first process
- the process for scrolling the screen image corresponds to the second process.
- FIG. 8 is an explanatory diagram showing an example of phone book information stored in the storage unit 190.
- The phone book information includes information such as names and telephone numbers that the mobile terminal device 100 displays in the phone book display function. The phone book information is also used to display a name in place of a telephone number when the mobile terminal device 100 displays information such as an incoming call history.
- the phone book information has a tabular data structure, and each row corresponds to data for one person.
- the data of Mr. A, Mr. B, and Mr. C are stored in rows L201, L202, and L203, respectively.
- This phone book information includes, in addition to data such as names and telephone numbers, a secret flag, which is a flag indicating whether or not a secret is designated.
- the value “YES” of the secret flag indicates that the secret is designated.
- the value “NO” of the secret flag indicates that the secret is not designated.
- In row L202, Mr. B's data is designated as secret.
- In rows L201 and L203, Mr. A's data and Mr. C's data, respectively, are not designated as secret. Whether each entry is designated as secret is set by the user.
- FIG. 9 is an explanatory diagram illustrating an example of a display image when the mobile terminal device 100 executes the phone book display function in the normal mode.
- The normal mode is a mode that restricts access from application programs to data designated as secret and permits access to data not designated as secret.
- the processing unit 185 that executes the application program for displaying the phone book causes the first display unit 111 to display data that is permitted to be accessed in accordance with the access restriction.
- Access from the phone book display application program to Mr. A's data and Mr. C's data, which are not designated as secret in FIG. 8, is permitted, and these data are displayed in areas A211 and A212, respectively.
- On the other hand, access from the phone book display application program to Mr. B's data, which is designated as secret in FIG. 8, is restricted. For this reason, Mr. B's data is not displayed in the screen image of FIG. 9.
- FIG. 10 is an explanatory diagram illustrating an example of a display screen when the mobile terminal device 100 executes the phone book display function in the secret mode.
- The secret mode is a mode that permits access from application programs both to data designated as secret and to data not designated as secret.
- the processing unit 185 that executes the application program for displaying the phone book causes the first display unit 111 to display both the data designated as secret and the data not designated as secret.
- In the example of FIG. 10, access from the phone book display application program to Mr. A's data and Mr. C's data, which are not designated as secret in FIG. 8, and to Mr. B's data, which is designated as secret in FIG. 8, is all permitted.
- Mr. A's data, Mr. B's data, and Mr. C's data are displayed in areas A221, A222, and A223, respectively.
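The per-entry secret flag of FIG. 8 and the two access modes of FIGS. 9 and 10 can be sketched as a small filtering routine. This is a minimal illustration, assuming a list-of-dicts phone book and the mode names `"normal"` and `"secret"`; the data layout and function name are assumptions, not the patent's actual implementation.

```python
# Minimal sketch of the normal/secret access modes (FIGS. 8-10).
# The data layout and function name are illustrative assumptions.

PHONE_BOOK = [
    {"row": "L201", "name": "Mr. A", "secret": False},
    {"row": "L202", "name": "Mr. B", "secret": True},
    {"row": "L203", "name": "Mr. C", "secret": False},
]

def visible_entries(phone_book, mode):
    """Return the entries an application program may access in the given mode."""
    if mode == "secret":
        # Secret mode: access to both secret and non-secret data is permitted.
        return list(phone_book)
    # Normal mode: access to secret-designated data is restricted.
    return [e for e in phone_book if not e["secret"]]

names_normal = [e["name"] for e in visible_entries(PHONE_BOOK, "normal")]
names_secret = [e["name"] for e in visible_entries(PHONE_BOOK, "secret")]
```

In normal mode only Mr. A and Mr. C remain visible, matching the areas A211 and A212 of FIG. 9; in secret mode all three entries are visible, matching FIG. 10.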
- FIG. 11 is an explanatory diagram illustrating an example of a user authentication screen (screen image when user authentication is performed) displayed by the first display unit 111.
- whether to perform user authentication when using the mobile terminal device 100 is set in advance by the user of the mobile terminal device 100, and the setting content is stored in advance in the storage unit 190.
- When user authentication is set, the mobile terminal device 100 displays a user authentication screen on the first display unit 111 and accepts a password input when it starts up after the power is connected (turned ON), or when it is used again after entering a sleep mode because it has not been used for a fixed period of time (for example, 10 minutes).
- The password input operation is performed by touch operations (push-button pressing operations) on push-button icons such as "0" to "9". When such a touch operation is performed, the input processing unit 182 detects the touched icon, and the processing unit 185 determines the input password from the detected icons. Thereafter, when a touch operation is performed on the "confirm" push-button icon C201 (hereinafter referred to as the "confirm icon" C201), the processing unit 185 determines whether the input password is identical to the password stored in advance in the storage unit 190, and thereby performs user authentication.
- When the user authentication succeeds, the processing unit 185 transitions to the execution mode and executes various functions, such as the phone book display function described above, according to user operations.
- The execution mode includes the normal mode and the secret mode, and when the user authentication succeeds, the processing unit 185 transitions to one of these modes depending on whether the second touch sensor 122 has detected a touch operation. Specifically, if the second touch sensor 122 has detected a touch operation on the display screen of the second display unit 112 when the first touch sensor 121 detects the touch operation on the confirm icon C201, the processing unit 185 transitions to the secret mode; if the second touch sensor 122 has not detected a touch operation on the display screen of the second display unit 112, it transitions to the normal mode.
- On the other hand, when the user authentication fails, the processing unit 185 shifts to the sleep mode and waits for a password input again.
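The authentication flow described above can be sketched as a single decision function: a successful password check enters the secret mode when the rear touch sensor is being touched and the normal mode otherwise, while a failed check returns to sleep. The function name, mode strings, and parameters are illustrative assumptions.

```python
# Sketch of the user-authentication flow of FIG. 11. On a successful password
# check, the touch state of the second touch sensor selects the execution mode.
# Names and mode strings are illustrative assumptions, not the patent's API.

def authenticate(entered_password, stored_password, second_sensor_touched):
    """Return the next mode: 'secret', 'normal', or 'sleep' on failure."""
    if entered_password != stored_password:
        # Authentication failed: return to sleep mode and wait for input again.
        return "sleep"
    # Authentication succeeded: the second touch sensor decides the mode.
    return "secret" if second_sensor_touched else "normal"
```

For example, confirming the correct password while touching the rear display screen would yield the secret mode, and confirming it without the rear touch would yield the normal mode.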
- the transition to the normal mode and the transition to the secret mode are examples of two processing modes when the processing unit 185 executes various functions.
- The transition to the normal mode and the transition to the secret mode are common in that both set the device to accept execution requests for various functions and perform a process of permitting access from the function's application program to data not set as secret.
- The transition to the normal mode and the transition to the secret mode differ in whether a process (setting) for permitting access to the data set as secret is also performed.
- The data not set as secret corresponds to the first data, and the data including both the data not set as secret and the data set as secret corresponds to the second data.
- the process for permitting access to the first data from the application program corresponds to the first process
- the process for permitting access to the second data from the application program corresponds to the second process.
- The first process and the second process are common (related) in that both permit access to data from the application program, and differ from each other in the data to which access is permitted.
- FIG. 12 is an explanatory diagram illustrating an example of screen images before and after scrolling.
- the display screen of the first display unit 111 displays a web page, and a slide bar G301 indicating a scroll position is displayed on the right side of the display screen.
- FIG. 12A shows the screen image of the first display unit 111 before scrolling.
- FIG. 12B shows the screen image of the first display unit 111 after scrolling.
- FIGS. 12A and 12B show screen images when the first display unit 111 displays the same web page, but the scroll positions (the positions of the screen images within the displayed web page) are different.
- FIG. 12B shows a screen image scrolled down by one display screen from the scroll position in FIG. 12A. For this reason, the position of the slide bar G301 in FIG. 12B is lowered from the position of the slide bar G301 in FIG. 12A according to the scroll amount.
- When the first touch sensor 121 detects a touch operation as a scroll operation and the second touch sensor 122 does not detect a touch operation, the processing unit 185 performs the process illustrated in FIG. 12. More specifically, when a touch operation (tap) on the area A301 below the slide bar G301 is performed as the scroll operation, the first touch sensor 121 detects the touch position, and the input processing unit 182 outputs information indicating that a touch operation on the area A301 has been performed and information indicating the touch position to the processing unit 185. Based on the information output from the input processing unit 182, the processing unit 185 outputs to the display control unit 181 an instruction to lower the scroll position of the screen image by one display screen. Then, in accordance with the instruction output from the processing unit 185, the display control unit 181 lowers the scroll position of the screen image on the first display unit 111 by one display screen.
- various operations can be used as the scroll operation.
- an operation of tapping an area above or below the slide bar G301 can be used.
- the first display unit 111 may display an icon of the up scroll button and an icon of the down scroll button, and a touch operation on these icons may be used as the scroll operation.
- FIG. 13 is an explanatory diagram showing another example of screen images before and after scrolling.
- the display screen of the first display unit 111 displays a web page, and a slide bar G301 indicating the scroll position is displayed on the right side of the display screen.
- FIG. 13A shows the screen image of the first display unit 111 before scrolling.
- FIG. 13B shows the screen image of the first display unit 111 after scrolling.
- FIGS. 13A and 13B show screen images when the first display unit 111 displays the same web page, but the scroll positions (the positions of the screen images within the displayed web page) are different.
- In FIG. 13, however, the scroll amount is different from that in FIG. 12. FIG. 13B shows a screen image scrolled down by five display screens from the scroll position in FIG. 13A.
- For this reason, the position of the slide bar G301 in FIG. 13B is lowered from the position of the slide bar G301 in FIG. 13A according to the scroll amount.
- Because the scroll amount in FIG. 13 is larger than in FIG. 12, the amount by which the slide bar G301 is lowered is correspondingly larger.
- When the first touch sensor 121 detects a touch operation as a scroll operation and the second touch sensor 122 also detects a touch operation, the processing unit 185 performs the process illustrated in FIG. 13. More specifically, when a touch operation (tap) on the area A301 below the slide bar G301 is performed as the scroll operation, the first touch sensor 121 detects the touch position, and the input processing unit 182 outputs information indicating that a touch operation on the area A301 has been performed and information indicating the touch position to the processing unit 185. In addition, when the touch operation on the area A301 is performed, the display screen of the second display unit 112 is touched; the second touch sensor 122 detects this touch position, and the input processing unit 182 outputs information indicating the touch position to the processing unit 185.
- Based on the information output from the input processing unit 182, the processing unit 185 outputs to the display control unit 181 an instruction to lower the scroll position of the screen image by five display screens. Then, in accordance with the instruction output from the processing unit 185, the display control unit 181 lowers the scroll position of the screen image on the first display unit 111 by five display screens.
- The process in FIG. 12 and the process in FIG. 13 are examples of two processing modes in the process of scrolling the screen image of the first display unit 111. That is, the process in FIG. 12 and the process in FIG. 13 are common (related) in that both scroll the screen image of the first display unit 111, and differ in scroll amount.
- In this case, the process of scrolling the screen image of the first display unit 111 down by one display screen corresponds to the first process, and the process of scrolling the screen image of the first display unit 111 down by five display screens, a different scroll amount, corresponds to the second process.
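The scroll-amount selection of FIGS. 12 and 13 reduces to a single conditional: a tap below the slide bar scrolls one display screen normally (first process), or five display screens while the rear screen is touched (second process). The screen counts follow the example in the text; the function and parameter names are illustrative assumptions.

```python
# Sketch of the scroll-amount selection of FIGS. 12 and 13.
# Screen counts follow the text's example; names are assumptions.

def scroll_amount(second_sensor_touched, screens_normal=1, screens_extended=5):
    """Return how many display screens to scroll for one tap on area A301."""
    return screens_extended if second_sensor_touched else screens_normal
```

A tap without the rear touch yields a one-screen scroll; the same tap with the rear touch yields a five-screen scroll.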
- FIG. 14 is a flowchart illustrating a processing procedure when the mobile terminal device 100 selects and executes a processing mode.
- the processing unit 185 determines whether or not the first touch sensor 121 has detected a predetermined touch operation on the display screen of the first display unit 111 (step S101).
- When it is determined that the first touch sensor 121 has not detected the predetermined touch operation (step S101: NO), the process of FIG. 14 ends.
- When it is determined that the predetermined touch operation on the display screen of the first display unit 111 has been detected (step S101: YES), the processing unit 185 determines whether the second touch sensor 122 has detected a touch operation on the display screen of the second display unit 112 (step S102). When it is determined that the second touch sensor 122 has detected a touch operation on the display screen of the second display unit 112 (step S102: YES), the processing unit 185 performs the second process (step S103). Thereafter, the process of FIG. 14 ends.
- When it is determined in step S102 that the second touch sensor 122 has not detected a touch operation on the display screen of the second display unit 112 (step S102: NO), the processing unit 185 performs the first process (step S104). Thereafter, the process of FIG. 14 ends.
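The flowchart steps S101 to S104 described above can be sketched as a small dispatch function. The callables standing in for the first and second processes, and the function name itself, are illustrative assumptions.

```python
# Sketch of the flowchart of FIG. 14 (steps S101-S104).
# The process callables and the function name are illustrative assumptions.

def handle_touch(first_sensor_detected, second_sensor_detected,
                 first_process, second_process):
    if not first_sensor_detected:        # step S101: NO
        return None                      # the process ends
    if second_sensor_detected:           # step S102: YES
        return second_process()          # step S103: perform second process
    return first_process()               # step S104: perform first process
```

The same front-screen touch thus dispatches to either process purely according to whether the rear touch sensor reports a touch.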
- As described above, the processing unit 185 determines, based on whether the second touch sensor 122 detects a touch operation, whether to perform the first process or the second process in response to the touch operation detected by the first touch sensor 121. That is, the processing unit 185 selects the processing mode based on whether the second touch sensor 122 detects a touch operation. Because the processing unit 185 selects a mode according to the presence or absence of the user's touch operation, it can switch modes in a way that reflects the user's intention. In addition, because the user can select a mode (instruct the mobile terminal device 100 of the mode) through the simple operation of touching or not touching the display screen of the second display unit 112, the user's effort can be reduced.
- In addition, when the processing unit 185 changes the display position of an image component in the screen image of the first display unit 111 according to the user's touch operation, it determines whether to scroll the screen image of the first display unit 111 based on whether the second touch sensor 122 detects a touch operation. Therefore, the user can choose between changing the position of the component image in the screen image and scrolling the screen image through the simple operation of touching or not touching the display screen of the second display unit 112. That is, the user does not need to separately set whether to move only the component image or to scroll the screen image.
- Further, when the processing unit 185 performs a process of permitting access to data from an application program according to the user's touch operation, it determines whether to also permit access to data set as secret (that is, whether to enter the normal mode or the secret mode) based on whether the second touch sensor 122 detects a touch operation. Accordingly, the user can select either the normal mode or the secret mode by touching or not touching the display screen of the second display unit 112. In particular, the user does not need to set the mode separately.
- Further, when the processing unit 185 scrolls the screen image according to the user's touch operation, it determines whether to change the scroll amount based on whether the second touch sensor 122 detects a touch operation. Therefore, the user can change the scroll amount through the simple operation of touching or not touching the display screen of the second display unit 112. In particular, the user does not need to set the scroll amount separately.
- The number of processes from which the processing unit 185 selects may be three or more. That is, the processing unit 185 may selectively perform three or more processes, for example by determining, according to the number of touch locations detected by the first touch sensor 121 and the second touch sensor 122, whether to perform a third process instead of or in addition to the second process.
- For example, when touch operations at a plurality of locations are detected, the processing unit 185 may perform a process of scrolling the screen image by ten display screens (third process) instead of the process of scrolling the screen image by five display screens (second process).
- In this case, the processing unit 185 selects and executes one of a plurality of processes according to the number of touch locations detected by the first touch sensor 121 and the second touch sensor 122. That is, the processing unit 185 selects one of three or more processing modes according to the number of touch locations detected by the first touch sensor 121 and the second touch sensor 122. Therefore, the user can select one of the three or more processing modes through the simple operation of changing the number of fingers touching the display screen of the first display unit 111 or the display screen of the second display unit 112. In particular, the user does not need to perform a separate operation for setting the processing mode.
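The three-way variation above (one, five, or ten display screens) can be sketched as a mapping from touch-location counts to a scroll amount, following claim 5's condition that the third process applies when either sensor detects touch operations at a plurality of locations. The thresholds, return values, and function name are illustrative assumptions.

```python
# Sketch of selecting among three or more processes by the number of touch
# locations reported by each sensor. Counts and return values are assumptions
# based on the one/five/ten display-screen example in the text.

def select_scroll_screens(first_touches, second_touches):
    """Map per-sensor touch-location counts to a scroll amount (in screens)."""
    if first_touches >= 1 and second_touches == 0:
        return 1    # first process: scroll one display screen
    if first_touches >= 1 and second_touches >= 1:
        if first_touches >= 2 or second_touches >= 2:
            return 10   # third process: a sensor reports multiple locations
        return 5        # second process: scroll five display screens
    return 0            # no scroll operation detected
```

Changing the number of fingers on either screen thus switches among the three scroll amounts without any separate settings operation.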
- A program for realizing all or part of the functions of the control unit 180 may be recorded on a computer-readable recording medium, and the processing described above may be performed by causing a computer system to read and execute the program recorded on the recording medium.
- the “computer system” herein includes an OS (Operating System) and hardware such as peripheral devices. Further, the “computer system” includes a homepage providing environment (or display environment) if a WWW (World Wide Web) system is used.
- The "computer-readable recording medium" refers to a portable medium such as a flexible disk, a magneto-optical disk, a ROM (Read Only Memory), or a CD (Compact Disc)-ROM, or a storage device such as a hard disk built into the computer system.
- The "computer-readable recording medium" also includes a medium that dynamically holds a program for a short time, such as a communication line used when the program is transmitted via a network such as the Internet or a communication line such as a telephone line, and a medium that holds the program for a fixed time, such as a volatile memory inside a computer system serving as a server or a client in that case.
- The program may realize only part of the functions described above, or may realize the functions described above in combination with a program already recorded in the computer system.
- the present invention can be used for various portable electronic devices.
- the present invention can reduce the time and effort of the user while performing mode switching reflecting the user's intention.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Security & Cryptography (AREA)
- Software Systems (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
- Telephone Function (AREA)
Abstract
Description
However, with this method, in order to switch modes the user must perform an operation requesting display of the mode setting screen and then a further operation requesting the mode switch. Because the user must perform multiple operations in this way, it is troublesome for the user, and the setting takes time.
For example, the mobile phone described in Patent Document 1 includes a memory unit that holds input data, a secret mode state holding unit that holds the state of the secret mode, a determination unit that determines whether a secret attribute is assigned to the data, and a control unit that controls the operation of each component. When a secret attribute is assigned to the data and the content of the secret mode state holding unit is the normal mode, this control unit switches the content of the secret mode state holding unit to the secret mode and hides the data to which the secret attribute is assigned.
Thus, in the work of transferring data in bulk from an old-model mobile phone to a new-model mobile phone when changing models, data that the user had hidden on the old-model mobile phone can reportedly be hidden just as easily on the new-model mobile phone, in the same way as on the old one.
Further, the method shown in Patent Document 1 cannot switch any mode other than the secret mode.
The first display unit 111 has a display screen such as a liquid crystal display or an organic EL (Organic Electro-Luminescence) display, and displays various images such as moving images, still images, and text (characters) under the control of the display control unit 181. The display screen here is a device that displays images. In the following, to distinguish between the screen as a device that displays images and the screen as an image displayed on the device, the device that displays images is referred to as the "display screen", and the image displayed on the display screen is referred to as the "screen image".
The first display unit 111 is an example of the display screen in the present invention, and the display screen of the first display unit 111 is an example of the display unit in the present invention.
The first touch sensor 121 detects a touch operation on the display screen of the first display unit 111. The first touch sensor 121 then outputs a signal indicating the touch position (the touched position on the display screen) to the input processing unit 182.
However, the second touch sensor 122 may be a touch sensor of a touch pad provided on the back of the housing of the mobile terminal device 100. That is, the mobile terminal device 100 may not include the second display unit 112, and thus may have no display screen on the back of the housing, instead having a touch pad on the back of the housing.
The second touch sensor 122 detects a touch operation on the display screen of the second display unit 112. The second touch sensor 122 then outputs a signal indicating the touch position to the input processing unit 182.
The audio output unit 132 has a speaker, converts the audio signal output as an analog electrical signal from the audio processing unit 183 into sound, and outputs the sound.
For example, when a signal indicating a touch position on the display screen of the first display unit 111 is output from the first touch sensor 121 while the first display unit 111 is displaying an image component, the input processing unit 182 determines whether the image component has been touched. When the input processing unit 182 determines that the image component has been touched, it outputs information indicating the touched image component to the processing unit 185.
In particular, the processing unit 185 performs the first process when the first touch sensor 121 detects a touch operation, and performs, instead of the first process, a second process related to the first process when the first touch sensor 121 detects a touch operation and the second touch sensor 122 also detects a touch operation.
FIG. 4 is a perspective view of the mobile terminal device 100 held in the right hand, as seen from the front side of the mobile terminal device 100.
For example, when the user touches the point P101 on the display screen of the first display unit 111 with the right thumb, the first touch sensor 121 detects the touch operation and outputs the coordinates of the point P101 as the touch position.
Regarding the point P101, a region having a small area is referred to as a "point". For example, the first touch sensor 121 has detection points arranged in a mesh, and outputs the coordinates of those detection points contained in the point P101 as a region. The same applies to the point P102 and the second touch sensor 122 described next.
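The mesh of detection points described above can be sketched as follows: a touch "point" is a small circular region, and the sensor reports the coordinates of the detection points that fall inside it. The grid pitch, grid size, radius, and function name are illustrative assumptions, not sensor specifications from the patent.

```python
# Sketch of a mesh of detection points: the sensor outputs the coordinates of
# the detection points contained in the small touched region. Grid pitch,
# size, and radius are illustrative assumptions.

def detection_points_in_region(center, radius, pitch=1.0, size=10):
    """Return grid detection points within `radius` of the touch center."""
    cx, cy = center
    points = []
    for ix in range(size):
        for iy in range(size):
            x, y = ix * pitch, iy * pitch
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
                points.append((x, y))
    return points

pts = detection_points_in_region((5.0, 5.0), 1.0)
```

With a unit pitch and a radius of one, a touch centered on a detection point reports that point and its four immediate neighbors.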
For example, when the user touches the point P102 on the display screen of the second display unit 112 with the right index finger, the second touch sensor 122 detects the touch operation and outputs the coordinates of the point P102 as the touch position.
For example, the user can also hold the mobile terminal device 100 in the left hand, perform a touch operation on the display screen of the first display unit 111 with the right index finger, and perform a touch operation on the display screen of the second display unit 112 with the left index finger. In this case as well, the first touch sensor 121 and the second touch sensor 122 detect these touch operations.
FIG. 6 is an explanatory diagram showing an example of screen images when the position of an image component is changed. FIG. 6A shows the screen image of the first display unit 111 before the position of the image component is changed, and FIG. 6B shows the screen image of the first display unit 111 after the position of the image component is changed.
Comparing the screen image in FIG. 6B with the screen image in FIG. 6A, the position of the icon C101 in the screen image has changed. On the other hand, the scroll position of the screen image in FIG. 6B is the same as the scroll position in FIG. 6A. For this reason, the display position of the icon C101 has changed, while the position of the slide bar G101 in FIG. 6B remains the same as the position of the slide bar G101 in FIG. 6A.
More specifically, when a drag-and-drop of the image component C101 is performed as the operation of changing the position of the image component C101, the first touch sensor 121 detects the touch position sequentially (that is, at predetermined time intervals), and the input processing unit 182 sequentially outputs information indicating that the image component C101 has been dragged and dropped and information indicating the touch position to the processing unit 185. Based on the information output from the input processing unit 182, the processing unit 185 sequentially outputs instructions on the position of the image component C101 within the screen image to the display control unit 181. Then, in accordance with the instructions output from the processing unit 185, the display control unit 181 sequentially changes the position of the image component C101 within the screen image of the first display unit 111.
The screen image in FIG. 7A is the same as in the case of FIG. 6A, showing the icon C101, which is an example of an image component, and the slide bar G101, which indicates the scroll position of the screen image.
In the screen image of FIG. 7B, the display position of the icon C101 has changed, as in the case of FIG. 6B. However, in the screen image of FIG. 7B, the scroll position has also changed from the scroll position in FIG. 7A, which differs from the case of FIG. 6B. For this reason, the position of the slide bar G101 in FIG. 7B has changed from the position of the slide bar in FIG. 7A.
That is, the icon moving process shown in FIG. 6 and the scrolling process shown in FIG. 7 are related in that both move the display position of the icon. On the other hand, the icon moving process shown in FIG. 6 and the scrolling process shown in FIG. 7 differ from each other in whether only the icon is moved or the entire image displayed on the display screen is moved.
More specifically, when the image component C101 is dragged and dropped, the first touch sensor 121 sequentially detects the touch position, and the input processing unit 182 sequentially outputs information indicating that the image component C101 has been dragged and dropped and information indicating the touch position to the processing unit 185. In addition, while the image component C101 is being dragged and dropped, the display screen of the second display unit 112 is touched; the second touch sensor 122 sequentially detects the touch position, and the input processing unit 182 sequentially outputs information indicating the touch position to the processing unit 185.
FIG. 8 is an explanatory diagram showing an example of the phone book information stored in the storage unit 190. This phone book information includes information such as names and telephone numbers that the mobile terminal device 100 displays in the phone book display function. The phone book information is also used as information for displaying a name in place of a telephone number when the mobile terminal device 100 displays information such as an incoming call history.
In addition to data such as names and telephone numbers, this phone book information includes a secret flag, which is a flag indicating whether a secret is designated. The value "YES" of the secret flag indicates that a secret is designated, while the value "NO" indicates that no secret is designated. For example, in row L202, Mr. B's data is designated as secret. On the other hand, in rows L201 and L203, Mr. A's data and Mr. C's data, respectively, are not designated as secret. Whether each entry is designated as secret is set by the user.
The normal mode is a mode that restricts access from application programs to data designated as secret and permits access to data not designated as secret.
In the example of FIG. 9, access from the phone book display application program to Mr. A's data and Mr. C's data, which are not designated as secret in FIG. 8, is permitted, and these data are displayed in areas A211 and A212, respectively.
On the other hand, access from the phone book display application program to Mr. B's data, which is designated as secret in FIG. 8, is restricted. For this reason, Mr. B's data is not displayed in the screen image of FIG. 9.
The secret mode is a mode that permits access from application programs both to data designated as secret and to data not designated as secret.
In the example of FIG. 10, access from the phone book display application program to Mr. A's data and Mr. C's data, which are not designated as secret in FIG. 8, and to Mr. B's data, which is designated as secret in FIG. 8, is all permitted. Mr. A's data, Mr. B's data, and Mr. C's data are displayed in areas A221, A222, and A223, respectively.
Here, whether to perform user authentication when the mobile terminal device 100 is used is set in advance by the user of the mobile terminal device 100, and the storage unit 190 stores the setting in advance. When user authentication is set, the mobile terminal device 100 displays the user authentication screen on the first display unit 111 and accepts a password input when it starts up after the power is connected (turned ON), or when it is used again after entering a sleep mode because it has not been used for a fixed period of time (for example, 10 minutes).
Thereafter, when a touch operation is performed on the "confirm" push-button icon C201 (hereinafter referred to as the "confirm icon" C201), the processing unit 185 determines whether the input password is identical to the password stored in advance in the storage unit 190, and thereby performs user authentication.
The execution mode includes the normal mode and the secret mode described above, and when user authentication succeeds, the processing unit 185 transitions to one of these modes depending on whether the second touch sensor 122 has detected a touch operation. Specifically, if the second touch sensor 122 has detected a touch operation on the display screen of the second display unit 112 when the first touch sensor 121 detects the touch operation on the confirm icon C201, the processing unit 185 transitions to the secret mode; if the second touch sensor 122 has not detected a touch operation on the display screen of the second display unit 112, it transitions to the normal mode.
The data not set as secret corresponds to the first data, and the data including both the data not set as secret and the data set as secret corresponds to the second data. The process of permitting access from the application program to the first data corresponds to the first process, and the process of permitting access from the application program to the second data corresponds to the second process.
The first process and the second process are common (related) in that both permit access to data from the application program, and differ from each other in the data to which access is permitted.
FIG. 12 is an explanatory diagram showing an example of screen images before and after scrolling. In FIG. 12, the display screen of the first display unit 111 displays a web page, and a slide bar G301 indicating the scroll position is displayed on the right side of the display screen. FIG. 12A shows the screen image of the first display unit 111 before scrolling, and FIG. 12B shows the screen image of the first display unit 111 after scrolling.
More specifically, when a touch operation (tap) on the area A301 below the slide bar G301 is performed as the scroll operation, the first touch sensor 121 detects the touch position, and the input processing unit 182 outputs information indicating that a touch operation on the area A301 has been performed and information indicating the touch position to the processing unit 185. Based on the information output from the input processing unit 182, the processing unit 185 outputs to the display control unit 181 an instruction to lower the scroll position of the screen image by one display screen. Then, in accordance with the instruction output from the processing unit 185, the display control unit 181 lowers the scroll position of the screen image of the first display unit 111 by one display screen.
In FIG. 13, however, the scroll amount differs from the case of FIG. 12. FIG. 13B shows the screen image scrolled down by five display screens from the scroll position in FIG. 13A. For this reason, the position of the slide bar G301 in FIG. 13B is lowered from the position of the slide bar G301 in FIG. 13A according to the scroll amount. In FIG. 13, because the scroll amount is larger than in FIG. 12, the amount by which the slide bar G301 is lowered is correspondingly larger.
More specifically, when a touch operation (tap) on the area A301 below the slide bar G301 is performed as the scroll operation, the first touch sensor 121 detects the touch position, and the input processing unit 182 outputs information indicating that a touch operation on the area A301 has been performed and information indicating the touch position to the processing unit 185. In addition, when the touch operation on the area A301 is performed, the display screen of the second display unit 112 is touched; the second touch sensor 122 detects the touch position, and the input processing unit 182 outputs information indicating the touch position to the processing unit 185.
Based on the information output from the input processing unit 182, the processing unit 185 outputs to the display control unit 181 an instruction to lower the scroll position of the screen image by five display screens. Then, in accordance with the instruction output from the processing unit 185, the display control unit 181 lowers the scroll position of the screen image of the first display unit 111 by five display screens.
FIG. 14 is a flowchart showing the processing procedure when the mobile terminal device 100 selects and executes a processing mode.
In the process of FIG. 14, the processing unit 185 first determines whether the first touch sensor 121 has detected a predetermined touch operation on the display screen of the first display unit 111 (step S101). When it is determined that the first touch sensor 121 has not detected the predetermined touch operation (step S101: NO), the process of FIG. 14 ends.
In this way, because the processing unit 185 selects a mode according to the presence or absence of the user's touch operation, it can switch modes in a way that reflects the user's intention. In addition, because the user can select a mode (instruct the mobile terminal device 100 of the mode) through the simple operation of touching or not touching the display screen of the second display unit 112, the user's effort can be reduced.
Therefore, the user can choose between changing the position of the component image in the screen image and scrolling the screen image through the simple operation of touching or not touching the display screen of the second display unit 112. That is, the user does not need to separately set whether to move only the component image or to scroll the screen image.
Therefore, the user can select either the normal mode or the secret mode by touching or not touching the display screen of the second display unit 112. In particular, the user does not need to set the mode separately.
Therefore, the user can change the scroll amount through the simple operation of touching or not touching the display screen of the second display unit 112. In particular, the user does not need to set the scroll amount separately.
Therefore, the user can select one of three or more processing modes through the simple operation of changing the number of fingers touching the display screen of the first display unit 111 or the display screen of the second display unit 112. In particular, the user does not need to perform a separate operation for setting the processing mode.
The "computer system" also includes a homepage providing environment (or display environment) when a WWW (World Wide Web) system is used.
The "computer-readable recording medium" refers to a portable medium such as a flexible disk, a magneto-optical disk, a ROM (Read Only Memory), or a CD (Compact Disc)-ROM, or a storage device such as a hard disk built into the computer system. The "computer-readable recording medium" further includes a medium that dynamically holds a program for a short time, such as a communication line used when the program is transmitted via a network such as the Internet or a communication line such as a telephone line, and a medium that holds the program for a fixed time, such as a volatile memory inside a computer system serving as a server or a client in that case. The program may realize only part of the functions described above, or may realize the functions described above in combination with a program already recorded in the computer system.
111 First display unit
112 Second display unit
121 First touch sensor
122 Second touch sensor
131 Audio input unit
132 Audio output unit
140 Wireless communication unit
180 Control unit
181 Display control unit
182 Input processing unit
183 Audio processing unit
184 Communication control unit
185 Processing unit
190 Storage unit
Claims (7)
- A first touch sensor;
a second touch sensor; and
a processing unit that performs a first process when the first touch sensor detects a touch operation, and that performs, instead of the first process, a second process related to the first process when the first touch sensor detects a touch operation and the second touch sensor detects a touch operation,
a portable electronic device comprising the above. - The portable electronic device comprising a display unit having a display screen,
wherein the first touch sensor detects a touch operation on the display screen,
when a screen image, which is the image displayed on the display screen, includes an image component, which is an image serving as a part constituting the screen image, and the first touch sensor detects a touch operation as an operation for changing the position of the image component, the processing unit performs, as the first process, a process of changing the position of the image component in the screen image, and
when the screen image includes the image component, the first touch sensor detects a touch operation as an operation for changing the position of the image component, and the second touch sensor detects a touch operation, the processing unit performs, as the second process, a process of scrolling the screen image,
according to claim 1. - When the first touch sensor detects a touch operation as a user authentication request operation, the processing unit performs user authentication and, when the authentication succeeds, performs, as the first process, a process of permitting access to first data from a predetermined program, and
when the first touch sensor detects a touch operation as a user authentication request operation and the second touch sensor detects a touch operation, the processing unit performs user authentication and, when the authentication succeeds, performs, as the second process, a process of permitting access to second data from the predetermined program,
according to claim 1. - The portable electronic device comprising a display unit having a display screen,
wherein the first touch sensor detects a touch operation on the display screen,
when the first touch sensor detects a touch operation as a scroll operation, the processing unit performs, as the first process, a process of scrolling the screen image, which is the image displayed on the display screen, by a set scroll amount, and
when the first touch sensor detects a touch operation as a scroll operation and the second touch sensor detects a touch operation, the processing unit performs, as the second process, a process of scrolling the screen image, which is the image displayed on the display screen, by a scroll amount different from the scroll amount in the first process,
according to claim 1. - The portable electronic device according to any one of claims 1 to 4, wherein the processing unit performs the first process when the first touch sensor detects a touch operation, and performs a third process instead of or in addition to the second process when the first touch sensor detects a touch operation, the second touch sensor detects a touch operation, and at least one of the first touch sensor and the second touch sensor detects touch operations at a plurality of locations.
- A touch operation processing method for a portable electronic device comprising a first touch sensor and a second touch sensor, the method comprising:
performing a first process when the first touch sensor detects a touch operation; and
performing, instead of the first process, a second process related to the first process when the first touch sensor detects a touch operation and the second touch sensor detects a touch operation. - A program for causing a computer serving as a portable electronic device comprising a first touch sensor and a second touch sensor
to execute a processing step of performing a first process when the first touch sensor detects a touch operation, and performing, instead of the first process, a second process related to the first process when the first touch sensor detects a touch operation and the second touch sensor detects a touch operation.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013536141A JP6115470B2 (ja) | 2011-09-27 | 2012-09-11 | 携帯型電子機器、タッチ操作処理方法およびプログラム |
EP12835820.7A EP2763378B1 (en) | 2011-09-27 | 2012-09-11 | Portable electronic device, touch operation processing method and program |
US14/347,190 US9274632B2 (en) | 2011-09-27 | 2012-09-11 | Portable electronic device, touch operation processing method, and program |
EP18198568.0A EP3457672B1 (en) | 2011-09-27 | 2012-09-11 | Portable electronic device, touch operation processing method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011-210976 | 2011-09-27 | ||
JP2011210976 | 2011-09-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013047182A1 true WO2013047182A1 (ja) | 2013-04-04 |
Family
ID=47995217
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/073138 WO2013047182A1 (ja) | 2011-09-27 | 2012-09-11 | 携帯型電子機器、タッチ操作処理方法およびプログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US9274632B2 (ja) |
EP (2) | EP3457672B1 (ja) |
JP (1) | JP6115470B2 (ja) |
WO (1) | WO2013047182A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014132965A1 (ja) * | 2013-02-27 | 2014-09-04 | Necカシオモバイルコミュニケーションズ株式会社 | 携帯電子機器、その制御方法及びプログラム |
JP2016178612A (ja) * | 2015-03-23 | 2016-10-06 | Necプラットフォームズ株式会社 | 携帯端末、携帯端末保護カバーおよび携帯端末カメラアプリケーション起動方法 |
JP2021036460A (ja) * | 2020-11-16 | 2021-03-04 | マクセル株式会社 | 携帯情報端末の発呼方法 |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6062351B2 (ja) * | 2013-11-28 | 2017-01-18 | 京セラ株式会社 | 電子機器 |
US10225737B1 (en) * | 2017-10-31 | 2019-03-05 | Konica Minolta Laboratory U.S.A., Inc. | Method and system for authenticating a user using a mobile device having plural sensors |
JP7259581B2 (ja) * | 2019-06-18 | 2023-04-18 | Kyocera Document Solutions Inc. | Information processing apparatus |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008263324A (ja) | 2007-04-10 | 2008-10-30 | Fujitsu Ltd | Terminal device having a secret mode function, and secret mode switching control method |
JP2010117842A (ja) * | 2008-11-12 | 2010-05-27 | Sharp Corp | Portable information terminal |
JP2011076233A (ja) * | 2009-09-29 | 2011-04-14 | Fujifilm Corp | Image display device, image display method, and program |
JP2011149749A (ja) * | 2010-01-20 | 2011-08-04 | Yupiteru Corp | Vehicle information display device |
Family Cites Families (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH048060A (ja) * | 1990-04-26 | 1992-01-13 | Ricoh Co Ltd | Facsimile apparatus |
JPH0934579A (ja) * | 1995-07-19 | 1997-02-07 | Casio Comput Co Ltd | Electronic device |
CA2857208C (en) * | 2003-05-30 | 2018-09-04 | Privaris, Inc. | An in-circuit security system and methods for controlling access to and use of sensitive data |
FR2866726B1 (fr) * | 2004-02-23 | 2006-05-26 | Jazzmutant | Controller using manipulation of virtual objects on a multi-contact touch screen |
JP2006163360A (ja) * | 2004-11-11 | 2006-06-22 | Casio Comput Co Ltd | Projection device, projection method, and projection control program |
KR100826532B1 (ko) * | 2006-03-28 | 2008-05-02 | LG Electronics Inc. | Mobile communication terminal and key input detection method thereof |
JP2008059561A (ja) * | 2006-08-04 | 2008-03-13 | Canon Inc | Information processing apparatus, data processing apparatus, and methods therefor |
US7924271B2 (en) * | 2007-01-05 | 2011-04-12 | Apple Inc. | Detecting gestures on multi-event sensitive devices |
KR100891099B1 (ko) * | 2007-01-25 | 2009-03-31 | Samsung Electronics Co., Ltd. | Touch screen with improved usability and method for improving usability of a touch screen |
WO2008103018A1 (en) * | 2007-02-23 | 2008-08-28 | Tp-I Co., Ltd | Virtual keyboard input system using pointing apparatus in digital device |
JP4378494B2 (ja) * | 2007-07-03 | 2009-12-09 | Sharp Corp | Display device |
KR20090014579A (ko) * | 2007-08-06 | 2009-02-11 | Samsung Electronics Co., Ltd. | Display device integrated with touch panel, error correction method thereof, and display system |
US20090109181A1 (en) * | 2007-10-26 | 2009-04-30 | Research In Motion Limited | Touch screen and electronic device |
TWI368161B (en) * | 2007-12-21 | 2012-07-11 | Htc Corp | Electronic apparatus and input interface thereof |
TWI360061B (en) * | 2007-12-31 | 2012-03-11 | Htc Corp | Electronic device and method for operating applica |
US8174503B2 (en) * | 2008-05-17 | 2012-05-08 | David H. Cain | Touch-based authentication of a mobile device through user generated pattern creation |
US8130207B2 (en) | 2008-06-18 | 2012-03-06 | Nokia Corporation | Apparatus, method and computer program product for manipulating a device using dual side input devices |
KR101001824B1 (ko) * | 2008-10-16 | 2010-12-15 | Pantech Co., Ltd. | Method for controlling a portable terminal using touch input, and portable terminal |
TWI412987B (zh) * | 2008-11-28 | 2013-10-21 | Htc Corp | Portable electronic device and method for waking it from sleep mode via the touch screen |
JP5200948B2 (ja) * | 2009-01-15 | 2013-06-05 | JVC Kenwood Corp. | Electronic device, operation control method, and program |
US8125347B2 (en) * | 2009-04-09 | 2012-02-28 | Samsung Electronics Co., Ltd. | Text entry system with depressable keyboard on a dynamic display |
US8633904B2 (en) * | 2009-04-24 | 2014-01-21 | Cypress Semiconductor Corporation | Touch identification for multi-touch technology |
TWI399676B (zh) * | 2009-06-30 | 2013-06-21 | Pixart Imaging Inc | Object detection calibration system for a touch screen and method thereof |
US20110096034A1 (en) * | 2009-10-23 | 2011-04-28 | Sonix Technology Co., Ltd. | Optical touch-sensing display |
KR101615964B1 (ko) * | 2009-11-09 | 2016-05-12 | LG Electronics Inc. | Mobile terminal and display method thereof |
EP2341414A1 (en) | 2009-12-31 | 2011-07-06 | Sony Computer Entertainment Europe Limited | Portable electronic device and method of controlling a portable electronic device |
EP2341417A1 (en) * | 2009-12-31 | 2011-07-06 | Sony Computer Entertainment Europe Limited | Device and method of control |
KR20110081040A (ko) * | 2010-01-06 | 2011-07-13 | Samsung Electronics Co., Ltd. | Method and apparatus for operating content in a mobile terminal having a transparent display |
US20110193791A1 (en) * | 2010-02-11 | 2011-08-11 | Research In Motion Limited | Capacitive touch sensitive overlay including touch sensor and electronic device including same |
US8345073B1 (en) * | 2010-03-24 | 2013-01-01 | Amazon Technologies, Inc. | Touch screen layer reduction |
US20110248928A1 (en) * | 2010-04-08 | 2011-10-13 | Motorola, Inc. | Device and method for gestural operation of context menus on a touch-sensitive display |
JP5010714B2 (ja) * | 2010-05-21 | 2012-08-29 | Toshiba Corp | Electronic device, input control program, and input control method |
US8669946B2 (en) * | 2010-05-28 | 2014-03-11 | Blackberry Limited | Electronic device including touch-sensitive display and method of controlling same |
US20120113458A1 (en) * | 2010-11-10 | 2012-05-10 | Flextronics Id, Llc | Mobile printing framework |
US8115749B1 (en) * | 2010-12-05 | 2012-02-14 | Dilluvah Corp. | Dual touch pad interface |
KR20120091975A (ko) * | 2011-02-10 | 2012-08-20 | Samsung Electronics Co., Ltd. | Information display device including at least two touch screens, and information display method thereof |
US8955046B2 (en) * | 2011-02-22 | 2015-02-10 | Fedex Corporate Services, Inc. | Systems and methods for authenticating devices in a sensor-web network |
JP5091338B1 (ja) * | 2011-05-26 | 2012-12-05 | Konami Digital Entertainment Co., Ltd. | Information display device, information display method, and program |
2012
- 2012-09-11 EP EP18198568.0A patent/EP3457672B1/en active Active
- 2012-09-11 US US14/347,190 patent/US9274632B2/en active Active
- 2012-09-11 WO PCT/JP2012/073138 patent/WO2013047182A1/ja active Application Filing
- 2012-09-11 JP JP2013536141A patent/JP6115470B2/ja active Active
- 2012-09-11 EP EP12835820.7A patent/EP2763378B1/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP2763378A4 |
Also Published As
Publication number | Publication date |
---|---|
EP3457672B1 (en) | 2019-11-20 |
EP2763378A1 (en) | 2014-08-06 |
JP6115470B2 (ja) | 2017-04-19 |
EP3457672A1 (en) | 2019-03-20 |
US9274632B2 (en) | 2016-03-01 |
EP2763378B1 (en) | 2019-07-24 |
EP2763378A4 (en) | 2015-09-16 |
US20140235297A1 (en) | 2014-08-21 |
JPWO2013047182A1 (ja) | 2015-03-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220137765A1 (en) | Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display | |
US10430078B2 (en) | Touch screen device, and graphical user interface for inserting a character from an alternate keyboard | |
US10198178B2 (en) | Electronic apparatus with split display areas and split display method | |
US8739053B2 (en) | Electronic device capable of transferring object between two display units and controlling method thereof | |
US9329770B2 (en) | Portable device, method, and graphical user interface for scrolling to display the top of an electronic document | |
US8589823B2 (en) | Application user interface with navigation bar showing current and prior application contexts | |
US9753607B2 (en) | Electronic device, control method, and control program | |
US20080098331A1 (en) | Portable Multifunction Device with Soft Keyboards | |
US20130082824A1 (en) | Feedback response | |
US20100214218A1 (en) | Virtual mouse | |
JP5296795B2 (ja) | Electronic device and display method in electronic device | |
JP6115470B2 (ja) | Portable electronic device, touch operation processing method, and program | |
US20130111346A1 | Dual function scroll wheel input | |
WO2013047271A1 (ja) | Portable electronic device, touch area setting method, and touch area setting program | |
EP2735951A1 | Method for processing documents by terminal having touch screen and terminal having touch screen | |
WO2014129581A1 (ja) | Electronic device, control program, and operation method of electronic device | |
JP2014222379A (ja) | Information terminal, touch operation processing method, and program | |
WO2014003012A1 (ja) | Terminal device, display control method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12835820 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2013536141 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14347190 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012835820 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |