EP2071436B1 - Portable terminal and method for controlling the same - Google Patents

Portable terminal and method for controlling the same

Info

Publication number
EP2071436B1
EP2071436B1 (application EP07828604.4A)
Authority
EP
European Patent Office
Prior art keywords
input
application
screen
portable terminal
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
EP07828604.4A
Other languages
English (en)
French (fr)
Other versions
EP2071436A4 (de)
EP2071436A1 (de)
Inventor
Kenta Kinoshita
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kyocera Corp
Original Assignee
Kyocera Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2006264866A (external priority; granted as JP5080773B2)
Priority claimed from JP2007106200A (external priority; granted as JP4927633B2)
Application filed by Kyocera Corp filed Critical Kyocera Corp
Publication of EP2071436A1
Publication of EP2071436A4
Application granted
Publication of EP2071436B1
Legal status: Expired - Fee Related (current)
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842: Selection of displayed objects or displayed text elements
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F3/04886: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048: Indexing scheme relating to G06F3/048
    • G06F2203/04808: Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen

Definitions

  • The present invention relates to a portable terminal that detects an input via a display unit when information is input, and to a control method therefor.
  • In order to prevent malfunction of soft keys displayed in a narrow space, a method has also been proposed of calculating the frequency of use of the soft keys by a user and increasing the size of frequently used soft keys, thereby preventing erroneous operation by the user (e.g., Patent Document 3).
  • In order to perform character input operation quickly using as few keys as possible, there have also been proposed a method of displaying a soft key character string (e.g., Patent Document 4) and a method of bringing a touch pen into contact with soft keys when characters are input and inputting a large number of characters to the soft key display according to the directions in which the touch pen is moved (e.g., Patent Document 5).
  • EP 1457870 A2 discloses a touch system comprising a touch surface on which an image of an application is displayed.
  • The touch system is configured to automatically determine the type of pointer used to contact the touch surface. This enables the touch system to differentiate between contacts made on the touch surface with a finger and with a pen tool.
  • A user is thus provided with the ability to write, draw, or annotate with a pen tool and to use a finger to generate mouse events, e.g., to control execution of the displayed application.
  • This provides the user with a seamless ability to write into and control application programs without purposefully having to change the mode of operation of the touch system before initiating touch events with different pointers.
  • A user may contact the touch surface simultaneously with a pen tool and a finger.
  • WO 99/28811 discloses a touch system comprising a touch-sensitive display configured to display a document and a navigation tool for navigating around the displayed document, wherein the navigation tool transparently overlies the document.
  • The touch system is configured to accept stylus input to access control buttons in the underlying document, while a finger touch on any part of the navigation tool invokes the corresponding navigation function.
  • WO 03/065192 A1 discloses a touch system comprising a touch panel placed on top of a display.
  • A touch on an area of an image displayed on the display can trigger the execution of different operations based on the type of pointing means used for touching the area. For example, a user can perform a selection operation by using a first pointing means (e.g. a harder pointing means or a fingernail) and a scrolling operation by using a second pointing means (e.g. a softer pointing means or a fingertip).
  • US 6611258 B1 discloses a touch system configured to discriminate whether a touch input has been performed by a pen or a finger, and to perform the proper operation according to the detected touch input.
  • EP 1517228 A2 discloses a touch system comprising a touch surface on which an image of an application is displayed.
  • The touch system is configured to differentiate between the types of pointers used to contact the touch surface, so that different functionality can be assigned to similar gestures performed with different pointers. For example, a rotate gesture performed by contacting the touch surface over an object displayed within an application window with one finger and then contacting the touch surface with another finger that moves in an arc while the first finger maintains contact results in a rotation of the object. If the same gesture is carried out using a finger to initially contact the object within the application window and a pen tool to describe the arc, a pattern fill operation rather than a rotate operation is executed.
  • The present invention provides a portable terminal according to claim 1 and a control method according to claim 8. Further embodiments of the present invention are described in the dependent claims.
  • According to the present invention, it is determined whether an input to the display unit is an input by the first input means or an input by the second input means, and functions other than those of the displayed objects are realized according to the input means. Therefore, in realizing a UI employing the display unit, multifunctional operations can be realized in a limited display space using a general input device, without increasing the number of kinds of operation performed until a purpose is attained.
  • Figure 1 is a block diagram of a portable terminal according to a first example useful for understanding the present invention.
  • A portable terminal 1 shown in Figure 1 is a portable communication terminal such as a PDA (personal digital assistant).
  • A storing unit 2 stores various applications 3, and a display unit 4 displays various objects (e.g., soft keys) and the like that form the screens of the applications 3.
  • A touch panel 5 is disposed on the front surface of the display unit 4 in association with the display unit 4. Inputs to the objects are detected by an input detecting unit 6.
  • A control unit 7 includes an input-means determining unit 8, an object control unit 9, and an input-position determining unit 11, and controls operations of the applications 3 on the basis of inputs from the touch panel 5.
  • The input-means determining unit 8 includes a contact-area calculating unit 10 and, when an input to the touch panel 5 is detected by the input detecting unit 6, calculates the contact area of the input and discriminates the input means. For example, the input-means determining unit 8 discriminates that the input is a pen input when the area calculated by the contact-area calculating unit 10 is smaller than a predetermined value, and that the input is a finger input when the calculated area is equal to or larger than the predetermined value, as in the sketch below.
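This threshold rule can be summarized in a few lines. The following is a minimal sketch, assuming illustrative names and a hypothetical threshold value; the patent does not disclose a concrete threshold.

```python
# Minimal sketch of pen/finger discrimination by contact area.
# The enum, function name, and threshold are illustrative assumptions.
from enum import Enum

class InputMeans(Enum):
    PEN = "pen (main input)"
    FINGER = "finger (sub-input)"

AREA_THRESHOLD_MM2 = 20.0  # hypothetical "predetermined value"

def discriminate_input_means(contact_area_mm2: float) -> InputMeans:
    """Pen if the contact area is below the threshold, finger otherwise."""
    if contact_area_mm2 < AREA_THRESHOLD_MM2:
        return InputMeans.PEN
    return InputMeans.FINGER

if __name__ == "__main__":
    print(discriminate_input_means(3.0))   # small area -> pen
    print(discriminate_input_means(80.0))  # large area -> finger
```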
  • The input-position determining unit 11 determines, when an input to the touch panel 5 is detected by the input detecting unit 6, the input coordinate of the input.
  • The object control unit 9 executes, according to the input means discriminated by the input-means determining unit 8 and the input coordinate determined by the input-position determining unit 11, the function of the corresponding object. For example, when an input to the touch panel 5 is detected and the input-means determining unit 8 determines that it is a pen input, the object control unit 9 executes the function of the object displayed at the position on the display unit 4 corresponding to the input coordinate determined by the input-position determining unit 11. When the input-means determining unit 8 determines that the input is a finger input, the object control unit 9 prohibits the functions of all objects displayed on the display unit 4 and executes the function allocated to the finger input.
  • Figure 2 is a diagram for explaining an input-means determining method in the case in which a panel of the surface acoustic wave type is used as the touch panel. It is determined whether an input is a pen input (a main input) or a finger input (a sub-input).
  • With a touch panel of the surface acoustic wave type, an input position can be detected by measuring the time from the application of vibration until the reflected vibration comes back. The input absorbs surface acoustic waves, causing a change in the signal value, and the contact area can be calculated by integrating the time during which the change occurs.
  • The input-means determining unit 8 discriminates that an input is the main input, i.e., a pen input, when its contact area on the touch panel 5 is smaller than a predetermined value, and that it is the sub-input, i.e., a finger input, when its contact area is equal to or larger than the predetermined value.
  • When there are plural inputs, the input-means determining unit 8 calculates a contact area for each of the inputs and specifies the input means of each, as in the sketch below.
  • Figure 3A is a flowchart for determining an input state.
  • Figure 3B is a flowchart for determining input means.
  • A method of discriminating the input means in step S106 is shown in Figure 3B.
  • When the input to the touch panel 5 is released, i.e., when the pen or the finger is lifted from the touch panel ("contact release" in step S102), the control unit 7 erases the input device information Ln stored in association with the released input (step S108). Since the number of input detection points decreases by one, the control unit 7 decrements the counter "n" (step S109). A sketch of this bookkeeping follows.
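The following is a minimal sketch of the contact/release bookkeeping described for Figure 3A; the InputTracker class, its method names, and the threshold are assumptions.

```python
# Sketch: store the discriminated input means per detection point (the
# information Ln) on contact, erase it and decrement the count on release.
class InputTracker:
    def __init__(self):
        self.inputs = {}  # detection point id -> "pen" or "finger"

    @property
    def n(self) -> int:
        return len(self.inputs)  # the counter "n" in the flowchart

    def on_contact(self, point_id: int, contact_area: float,
                   threshold: float = 20.0) -> None:
        # Discriminate and store the input device information for this point.
        self.inputs[point_id] = "pen" if contact_area < threshold else "finger"

    def on_release(self, point_id: int) -> None:
        # Contact release: erase the stored information; n drops by one.
        self.inputs.pop(point_id, None)

if __name__ == "__main__":
    tracker = InputTracker()
    tracker.on_contact(0, 3.0)
    tracker.on_contact(1, 90.0)
    print(tracker.n, tracker.inputs)  # 2 {0: 'pen', 1: 'finger'}
    tracker.on_release(0)
    print(tracker.n, tracker.inputs)  # 1 {1: 'finger'}
```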
  • When the input is the main input, the object control unit 9 executes, on the basis of the main input, a function (an object) of the application allocated in advance to the area on the screen of the application specified by the input coordinate of the main input.
  • When the input is the sub-input, the object control unit 9 prohibits execution of the functions (the objects) of the application allocated in advance on the screen of the application and executes, on the basis of the sub-input, a predetermined function not allocated to any area on the screen of the application. The dispatch rule is sketched below.
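The dispatch rule can be sketched as follows. The function signature and the hit-test representation are assumptions for illustration; only the main-input versus sub-input branching reflects the description above.

```python
# Sketch of the object control dispatch: a main input triggers the on-screen
# object under the input coordinate, a sub-input suppresses all on-screen
# objects and runs a function of its own.
def handle_input(means, coord, objects, sub_input_function):
    """objects: list of (hit_test, action) pairs for the current screen."""
    if means == "pen":  # main input
        for hit_test, action in objects:
            if hit_test(coord):
                return action(coord)
        return None
    # Sub-input: functions allocated on the screen are prohibited.
    return sub_input_function(coord)

if __name__ == "__main__":
    link = (lambda c: c == (1, 1), lambda c: "open link")
    print(handle_input("pen", (1, 1), [link], lambda c: "scroll"))     # open link
    print(handle_input("finger", (1, 1), [link], lambda c: "scroll"))  # scroll
```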
  • Figures 4 to 6 are diagrams for explaining an example in the case in which the present invention is applied to a WEB browser.
  • A scroll function is allocated to the sub-input, and a range designation function is allocated to the main input performed while the sub-input continues.
  • When the pen as the main input touches the touch panel, a link displayed at the contact position is activated and the WEB screen jumps to the link destination.
  • While a finger touches the touch panel, the WEB screen changes to a range designation mode.
  • In this mode, the functions displayed on the application screen are invalidated; for example, the WEB screen does not jump to a link destination even if the pen touches a link.
  • Instead, a function allocated to the pen input in the state in which the finger touches the WEB browser, i.e., in this example, a function such as copying on the WEB screen by range designation, works. A sketch of this behaviour follows.
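The following is a minimal sketch of the WEB browser behaviour under these rules; the class and event handler names are hypothetical. Pen alone follows links, while a pen tap during a continuing finger contact starts range designation instead.

```python
# Sketch: finger contact switches the pen's role from link activation to
# range designation, as described for Figures 4 to 6.
class BrowserInputState:
    def __init__(self):
        self.finger_down = False

    def on_finger_down(self): self.finger_down = True
    def on_finger_up(self): self.finger_down = False

    def on_pen_tap(self, coord, link_at):
        """link_at: maps a coordinate to a link target, or None."""
        if self.finger_down:
            # Range designation mode: on-screen link functions are invalidated.
            return f"start range designation at {coord}"
        target = link_at(coord)
        return f"jump to {target}" if target else "no-op"

if __name__ == "__main__":
    state = BrowserInputState()
    link_at = {(5, 5): "http://example.com"}.get
    print(state.on_pen_tap((5, 5), link_at))  # jump to the link
    state.on_finger_down()
    print(state.on_pen_tap((5, 5), link_at))  # range designation instead
```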
  • Figures 7 and 8 are diagrams for explaining a case in which the present invention is applied to a manual character input application.
  • Operations based on the sub-input by a finger are character erasing and character shifting.
  • Character erasing is performed by moving the finger up and down on the touch panel 5 after a character has been input with the pen, as shown in Figure 7.
  • The character erasing may be performed according to the sub-input irrespective of the position of the displayed character, or the character may be erased according to the position of the sub-input.
  • A character shift operation is performed when the user strokes the touch panel 5 with the finger from right to left on the screen after inputting a character with the pen, as shown in Figure 8.
  • In this way, character input is performed by the main input to the touch panel 5 with the pen, and character erasing or character shifting is performed by the sub-input to the touch panel 5 with the finger, as sketched below.
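A sketch of how the direction of the finger stroke could be mapped to the two sub-input operations; the direction test is an assumption made for illustration, not a disclosed algorithm.

```python
# Sketch: classify a finger stroke for the handwriting application.
# Up-and-down movement erases, a right-to-left stroke shifts characters.
def classify_finger_stroke(start, end):
    dx, dy = end[0] - start[0], end[1] - start[1]
    if abs(dy) > abs(dx):
        return "erase character"        # predominantly vertical movement
    if dx < 0:
        return "shift characters"       # right-to-left movement
    return "ignore"

if __name__ == "__main__":
    print(classify_finger_stroke((10, 10), (12, 60)))  # erase character
    print(classify_finger_stroke((60, 10), (10, 12)))  # shift characters
```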
  • Figures 9 to 11 are diagrams for explaining a case in which the present invention is applied to a paint tool.
  • An operation based on the sub-input by a finger is scrolling of the screen.
  • Range designation and clipping operations for an image object are performed by encircling the image object with the pen.
  • While executing image editing based on the pen input in a rendering area, when the portable terminal 1 detects a finger input on the application screen, it invalidates the image editing function allocated on the application screen and scrolls the application screen on the basis of the input track of the finger. As shown in Figure 11, while the finger as the sub-input touches the touch panel 5, the range of the portion to be clipped is designated with the pen as the main input and the clipping operation is performed.
  • In this state, the portable terminal 1 prohibits execution of the functions of the application allocated in advance on the screen of the application.
  • The portable terminal 1 instead executes, on the basis of the input by the pen as the main input, a predetermined function (e.g., an area setting and clipping function) not allocated to the area on the screen of the application.
  • In other words, while the finger touches the screen, the image editing function on the screen is invalidated: even if the pen touches the rendering area, rendering is not executed. An area on the image is determined by the pen input in the state in which the finger touches the screen, and the image in that area is clipped. A sketch follows.
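The paint tool behaviour can be sketched as below; the PaintTool class and its method names are hypothetical, and the internal state is a toy stand-in for a real canvas.

```python
# Sketch: pen alone draws, finger alone scrolls (drawing suppressed), and a
# pen stroke while the finger is held down designates an area to clip.
class PaintTool:
    def __init__(self):
        self.finger_down = False
        self.strokes, self.scroll, self.clips = [], [0, 0], []

    def on_finger_drag(self, dx, dy):
        self.finger_down = True
        self.scroll[0] += dx
        self.scroll[1] += dy            # scrolls; rendering is not executed

    def on_finger_up(self):
        self.finger_down = False

    def on_pen_stroke(self, points):
        if self.finger_down:
            self.clips.append(points)   # encircled area -> clipping operation
        else:
            self.strokes.append(points) # normal rendering

if __name__ == "__main__":
    tool = PaintTool()
    tool.on_pen_stroke([(0, 0), (1, 1)])           # draws
    tool.on_finger_drag(0, -30)                    # scrolls, does not draw
    tool.on_pen_stroke([(2, 2), (3, 2), (2, 3)])   # clips while finger held
    print(tool.strokes, tool.scroll, tool.clips)
```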
  • Figures 12 and 13 are diagrams for explaining an example in which the present invention is applied when a mail application runs in the foreground and a music application runs in the background.
  • A sub-operation by a finger on the mail application screen is a simple operation of the music application, such as volume adjustment or music forward and reverse.
  • A function of the main application allocated in advance to an area on the main application screen is executed on the basis of the main input by the pen.
  • When the sub-input by a finger is detected, execution of the function of the main application allocated in advance on the screen of the main application is prohibited, and a predetermined function of the background application is executed on the basis of the sub-input.
  • In this way, the input means (the pen or the finger) used for an input to the touch panel is determined, and the functions to be executed are distinguished according to the main input, the sub-input, and combinations of these inputs. Since special hardware is unnecessary and no switching operation for the input device is performed, the user can operate intuitively, with fewer erroneous operations, in a limited space.
  • Moreover, an application in the background can be operated from the screen of the application running in the foreground, so it is unnecessary to bring the background application to the foreground every time it is operated. A sketch of this routing follows.
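A minimal sketch of routing inputs between the foreground mailer and the background music player; the gesture names and the command set are assumptions chosen to match the volume and track operations mentioned above.

```python
# Sketch: pen input keeps operating the foreground application, finger
# gestures are translated into simple background-application commands.
def route_input(means, gesture, foreground, background):
    if means == "pen":
        return foreground(gesture)  # normal operation of the mailer
    commands = {"drag_up": "volume up", "drag_down": "volume down",
                "drag_right": "next track", "drag_left": "previous track"}
    return background(commands.get(gesture, "ignore"))

if __name__ == "__main__":
    mailer = lambda g: f"mailer handles {g}"
    player = lambda c: f"music player: {c}"
    print(route_input("pen", "tap_body", mailer, player))
    print(route_input("finger", "drag_up", mailer, player))
```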
  • Figure 14 is a block diagram showing the configuration of a portable terminal according to an embodiment of the present invention.
  • A portable terminal 20 is, for example, a portable communication terminal apparatus such as a PDA (personal digital assistant).
  • The portable terminal 20 has the same components and operations as the portable terminal 1 (see Figure 1) according to the first example, except that it includes, instead of the display unit 4 and the touch panel 5, a touch panel 21 serving as both a display unit and an operation unit, and a control unit 25 whose input-means determining unit 24 includes a contact detecting unit 22 and a characteristic determining unit 23 instead of the contact-area calculating unit 10.
  • The touch panel 21 functions as a display unit that displays soft keys (objects) and, at the same time, as an operation unit that detects inputs to the objects.
  • The control unit 25 performs operations associated with the touch panel according to an application program stored in the storing unit 2 and includes the input-means determining unit 24 and the object control unit 9.
  • The contact detecting unit 22 detects (monitors) contact with the touch panel 21.
  • The characteristic determining unit 23 determines a characteristic of the state of contact with the touch panel 21 from the detection result.
  • On the basis of this determination result, the input-means determining unit 24 determines the type of the input means, for example, a pen or a finger. In other words, the input-means determining unit 24 determines the type of the input means from the contact state.
  • The object control unit 9 executes, on the basis of the type of the input means determined by the input-means determining unit 24 and the input position determined by the input-position determining unit 11, a function of an object according to an application program stored in the storing unit 2.
  • The storing unit 2 stores the application programs 3.
  • The object control unit 9 variously controls operations of an application according to the main input means and the sub-input means, and sometimes controls operations of an application in the background.
  • The control unit 25 of the portable terminal 20 includes the input-means determining unit 24, which includes the contact detecting unit 22 and the characteristic determining unit 23.
  • An application that changes its operations according to the input means is stored in the storing unit 2.
  • The portable terminal 20 determines the input means applying an input operation to the touch panel 21 by using an image sensor.
  • Figure 15 conceptually shows a determination method in the case in which the determination of input means is performed by using the image sensor, wherein (a) is an explanatory plan view and (b) is an explanatory side view. For simplicity of explanation, it is determined whether the input means is a finger or a pen.
  • An image sensor panel, which is used as the touch panel 21 and is integrally configured by adding an image sensor function to a liquid crystal display, can reproduce an image of the target when an input device (input means) 26 such as a finger or a pen touches its surface (see (a) and (b)), by converting the amount of light received at each pixel into charges and reading out the pixels over the entire area.
  • On the basis of a characteristic part of the input device 26 detected by the image sensor panel, the control unit 25 can determine whether the input device 26 is a finger or a pen. When the image sensor panel is used, plural inputs can be determined simultaneously, as in the sketch below.
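As an illustration of blob-based determination on an image sensor panel, the sketch below flood-fills dark pixel regions and labels small blobs as pen contacts and large ones as finger contacts. The grid, the darkness test, and the size threshold are all toy assumptions; the patent does not specify this particular algorithm.

```python
# Sketch: classify every dark blob in the sensor image at once, so plural
# simultaneous inputs are determined in a single pass.
def classify_blobs(image, dark=0, pen_max_pixels=3):
    """image: rectangular 2D list of pixel values; returns a label per blob."""
    seen, labels = set(), []
    height, width = len(image), len(image[0])
    for y in range(height):
        for x in range(width):
            if image[y][x] == dark and (x, y) not in seen:
                stack, size = [(x, y)], 0  # flood-fill one contact blob
                while stack:
                    cx, cy = stack.pop()
                    if (cx, cy) in seen:
                        continue
                    if not (0 <= cx < width and 0 <= cy < height):
                        continue
                    if image[cy][cx] != dark:
                        continue
                    seen.add((cx, cy))
                    size += 1
                    stack += [(cx+1, cy), (cx-1, cy), (cx, cy+1), (cx, cy-1)]
                labels.append("pen" if size <= pen_max_pixels else "finger")
    return labels

if __name__ == "__main__":
    img = [[1, 1, 1, 1, 1],
           [1, 0, 1, 0, 0],   # a 1-pixel blob (pen) and a 4-pixel blob
           [1, 1, 1, 0, 0]]
    print(classify_blobs(img))  # ['pen', 'finger']
```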
  • The determination of the input means applying an input operation to the touch panel 21 may also be performed by using cameras instead of the image sensor.
  • When the determination of input means is performed by using cameras as imaging units, plural (in this example, two) cameras 27 are set in the portable terminal 20 together with the liquid crystal display.
  • In this case, the input-means determining unit 24 includes an image analyzing unit having an image analyzing function instead of the contact detecting unit 22 and the characteristic determining unit 23.
  • Figure 16 conceptually shows a determination method in the case in which the determination of input means is performed by using the cameras, wherein (a) is an explanatory plan view and (b) is an explanatory side view.
  • No special detecting apparatus is mounted on the touch panel 21 itself (i.e., the touch panel 21 only has to be a simple display).
  • The two cameras 27 are set at the peripheral edge of the touch panel 21, from where the entire surface of the touch panel 21 can be photographed, such that the input device 26 touching the touch panel 21 can be photographed anywhere on the touch panel 21.
  • The portable terminal 20 detects, using the two cameras 27, the target, i.e., the input device 26 that touches the touch panel 21.
  • The two cameras 27 pick up images at or near the contact position of the input device 26 touching the surface of the touch panel 21.
  • The image analyzing unit subjects the images picked up by the cameras 27 to image analysis processing and detects the type of the input device (the input means) touching the touch panel 21 and the contact position coordinate.
  • For example, the image analyzing unit can discriminate whether the input device 26 is the pen or the finger with the thickness of the picked-up detection target set as a threshold, as in the sketch below. In this case, the input device 26 does not always have to touch the display unit, i.e., the touch panel 21.
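A sketch of the two-camera variant: thickness thresholding for the device type, and a textbook two-ray triangulation for the contact coordinate. The threshold value, the camera geometry, and the function names are assumptions; only the thickness criterion itself comes from the description above.

```python
# Sketch: discriminate by apparent thickness and locate the contact by
# intersecting the viewing rays of two cameras on one edge of the panel.
import math

def classify_by_thickness(thickness_px: int, threshold_px: int = 12) -> str:
    return "pen" if thickness_px < threshold_px else "finger"

def triangulate(angle_left_rad: float, angle_right_rad: float,
                baseline_mm: float):
    """Cameras at (0, 0) and (baseline, 0); angles measured from the baseline."""
    tan_l, tan_r = math.tan(angle_left_rad), math.tan(angle_right_rad)
    x = baseline_mm * tan_r / (tan_l + tan_r)  # ray intersection
    y = x * tan_l
    return x, y

if __name__ == "__main__":
    print(classify_by_thickness(5))    # pen
    print(classify_by_thickness(30))   # finger
    print(triangulate(math.radians(45), math.radians(45), 100.0))  # (50.0, 50.0)
```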
  • Figure 17 is a flowchart showing the flow of processing for determining an input state.
  • Figure 18 is a flowchart showing the flow of processing for determining the input type in Figure 17.
  • The control unit 25 reads out the stored image data In (step S401).
  • The control unit 25 then reads out reference images A and B (step S402).
  • The reference image A is an image pattern of a finger, and the reference image B is an image pattern of a pen. A sketch of the matching step follows.
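The following is a minimal sketch of matching the stored image In against the two reference patterns. The patent does not specify the matching method, so the plain sum of absolute pixel differences used here is an assumption.

```python
# Sketch: compare the stored contact image against reference image A (finger
# pattern) and reference image B (pen pattern); the closer match wins.
def difference(img_a, img_b) -> int:
    return sum(abs(p - q)
               for row_a, row_b in zip(img_a, img_b)
               for p, q in zip(row_a, row_b))

def determine_input_type(stored_image, reference_a, reference_b) -> str:
    return ("finger" if difference(stored_image, reference_a)
            <= difference(stored_image, reference_b) else "pen")

if __name__ == "__main__":
    ref_finger = [[1, 1], [1, 1]]  # broad dark patch
    ref_pen = [[1, 0], [0, 0]]     # single dark point
    print(determine_input_type([[1, 1], [1, 0]], ref_finger, ref_pen))  # finger
    print(determine_input_type([[1, 0], [0, 0]], ref_finger, ref_pen))  # pen
```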
  • Figure 19 is a flowchart showing the flow of processing for controlling an object.
  • Figure 20 is a flowchart showing the flow of processing 1 in Figure 19.
  • For the main input, the control unit 25 executes the function of the object S1 (step S603).
  • For the sub-input, the control unit 25 prohibits the execution of the function of the object S1 and executes, on the basis of the sub-input, a predetermined function not allocated in advance on the application screen (step S604).
  • Figure 21 is a flowchart showing the flow of processing 2 in Figure 19.
  • Processing 2 is performed when there are two or more detected inputs (i.e., i > 1).
  • In step S705, a predetermined function not allocated in advance on the application screen is executed.
  • When contact is detected, the control unit 25 stores the coordinate of the position where the touch panel 21 is touched and stores, using the image sensor, image data of the body touching the touch panel 21.
  • The control unit 25 matches the stored image data against the reference data (the image patterns of the finger and the pen) stored in advance and determines the input means.
  • The control unit 25 stores the result of the determination in a list Ln together with the input position.
  • On the basis of the result of the input-means determination, the control unit 25 controls the operation of the displayed object according to the input means.
  • The control unit 25 switches the processing of an application between the case in which there are plural inputs and the case in which there is one input.
  • When the input detecting unit (the touch panel 21) detects that an input to the display unit is the first input by the first input means (the main input means (the pen)), the control unit 25 executes, on the basis of the first input, a function of the application allocated in advance to the area on the screen of the application specified by the first input.
  • When the input detecting unit detects that the input to the display unit is the second input by the second input means (the sub-input means (the finger)), the control unit 25 prohibits the execution of the function of the application allocated in advance on the screen of the application and executes, on the basis of the second input, a predetermined function not allocated to the area on the screen of the application.
  • When the first input is detected during a period in which the second input continues, the control unit 25 executes, on the basis of the first input, the predetermined function not allocated to the area on the screen of the application.
  • When the input detecting unit detects that the input to the display unit is the first input by the first input means, the control unit 25 executes, on the basis of the first input, a function of the main application allocated in advance to the area on the screen of the main application specified by the first input.
  • When the input detecting unit detects that the input to the display unit is the second input by the second input means, the control unit 25 prohibits the execution of the function of the main application allocated in advance on the screen of the main application and executes, on the basis of the second input, a predetermined function of the background application.
  • The input detecting unit includes the cameras (imaging units) 27, which pick up images of the periphery of the display unit, and the image analyzing unit for analyzing the images picked up by the cameras, or includes the image sensor incorporated in the display unit and the image analyzing unit for analyzing images output from the image sensor.
  • Since screen operations such as scrolling can be applied to the touch panel by the sub-input means, it is, for example, unnecessary to switch the display to view a reception mail, unnecessary to provide a display area for soft keys, possible to use a narrow space effectively, and the number of kinds of operation required of the user is reduced.
  • In summary, the portable terminal includes the storing unit that stores various applications, the display unit that displays a screen of each of the applications, the input detecting unit that detects each of plural inputs to the display unit, and the control unit. When the input detecting unit detects that an input to the display unit is the first input by the first input means (the main input means (the pen)), the control unit executes, on the basis of the first input, a function of the application allocated in advance to the area on the screen of the application specified by the first input; when it detects that an input is the second input by the second input means (the sub-input means (the finger)), the control unit prohibits the execution of the function of the application allocated in advance on the screen of the application and executes, on the basis of the second input, a predetermined function not allocated to the area on the screen of the application.
  • Figure 22 is an explanatory diagram showing an operation (No. 1) of the present invention in the mail application.
  • Figure 23 is an explanatory diagram showing an operation (No. 2) of the present invention in the mail application.
  • Figure 24 shows an operation (No. 3) of the present invention in the mail application, wherein (a) is an explanatory diagram of a mail input screen and (b) is an explanatory diagram of a reception mail screen.
  • Figure 25 is an explanatory diagram of screen scroll showing an operation (No. 4) of the present invention in the mail application.
  • Figure 26 shows an operation (No. 5) of the present invention in the mail application, wherein (a) is an explanatory diagram of a reception mail viewing screen and (b) is an explanatory diagram of a calling mode screen.
  • Figure 27 is an explanatory diagram of screen transition states showing an operation (No. 6) of the present invention in the mail application.
  • Figure 28 is an explanatory diagram of screen transition states (a) to (f) showing an operation (No. 7) of the present invention in the mail application.
  • Figure 29 shows an operation (No. 8) of the present invention in the mail application, wherein (a) is an explanatory diagram of an attachment file pasting screen and (b) is an explanatory diagram of an attachment file selection screen.
  • Figure 30 shows an operation (No. 9) of the present invention in the mail application, wherein (a) is an explanatory diagram of a pen input screen and (b) is an explanatory diagram of a file reproduction screen.
  • A switching function between a reception mail and the return mail corresponding to it, and a scroll function for the mail screen, are allocated to the sub-input.
  • In Figure 22, a reception mail from Mr. A is shown on the left side, and a creation screen for a return mail responding to the reception mail is shown on the right side.
  • Figure 23 shows a state in which mail creation is performed on the creation screen for a return mail shown in Figure 22.
  • Mail creation employing the functions of the objects displayed on the screen of the mailer, i.e., normal mail creation, is performed by the pen input as the main input.
  • A user can switch the mail creation screen (see (a)) to the reception mail from Mr. A (see (b)) by depressing the screen with a finger during mail creation and releasing the finger at substantially the same position.
  • The portable terminal 20 performs the switching of the screen by using the information that the input position of the finger input to the touch panel 21 and its release position are substantially the same.
  • A scene in which the user scrolls the screen by moving the finger while keeping the screen depressed is shown in Figure 25.
  • By using the information that the input position of the finger input to the touch panel 21 and the release position are different, the portable terminal 20 performs a scroll operation without switching the screen.
  • The switching of the screen is not performed even if the user releases the finger after the scroll. This rule is sketched below.
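A minimal sketch of the press-position versus release-position rule used in Figures 24 and 25; the tolerance standing in for "substantially the same position" is an assumption.

```python
# Sketch: release near the press position switches the screen; movement is
# treated as scrolling and suppresses the switch on release.
def finger_gesture(down_pos, up_pos, tolerance: int = 8) -> str:
    dx, dy = up_pos[0] - down_pos[0], up_pos[1] - down_pos[1]
    if dx * dx + dy * dy <= tolerance * tolerance:
        return "switch between reception mail and return mail"
    return f"scroll by ({dx}, {dy})"

if __name__ == "__main__":
    print(finger_gesture((100, 200), (102, 203)))  # switch
    print(finger_gesture((100, 200), (100, 80)))   # scroll
```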
  • A scene in which the user performs copying in order to cite a character string of a reception mail is shown in Figure 28.
  • The start position of the pen input is the start position of the range designation, and the position where the pen is released is the end position of the range designation.
  • A character string range-designated by the pen input is stored until a paste operation is performed. After the range is designated as shown in Figures 28(a) to 28(c) and the user copies the designated range, the user switches to the mail creation screen with the finger input, as shown in Figures 28(d) and 28(e). As shown in Figure 28(f), the user pastes, with the pen input, the copied contents onto the mail creation screen at the switching destination.
  • The input position of the pen input is set as the paste start position, and the character string is pasted there. After the paste, the portable terminal 20 returns to normal operation.
  • A scene in which the user is about to attach a file to a mail is shown in Figure 29.
  • The user invokes a data folder and selects the file to be attached (see (b)).
  • The attached file is reproduced (see (b)).
  • As described above, the input means (the pen or the finger) used for an input to the touch panel is determined, and the functions to be executed are distinguished according to the main input, the sub-input, and combinations of these inputs. Since special hardware is unnecessary and no switching operation for the input device is performed, the user can operate intuitively, with fewer erroneous operations, in a limited space.
  • When the mail application is used as in the present invention, a reception mail viewing operation and a scroll operation, which in the past were realized by allocating special keys or by plural menu operations, can be realized without requiring any special apparatus.
  • The present invention is not limited to the above embodiments, and various alterations and modifications are possible without departing from the scope of the present invention as defined by the appended claims.
  • For example, the present invention can also be realized by using a touch sensor of another sensor type.
  • The embodiments have been explained using a PDA as an example.
  • However, the present invention can be widely applied to portable radio terminals such as cellular phones, and to portable terminals such as portable game machines, portable audio players, portable video players, portable electronic dictionaries, and portable electronic book viewers.
  • In the embodiments, the case in which the main input is the pen and the sub-input is the finger has been explained.
  • However, the present invention can also be applied to the case in which the main input is the finger and the sub-input is the pen.

Claims (14)

  1. A portable terminal (20), comprising:
    a storing unit (2) for storing various applications (3),
    a display unit for displaying a screen of each of the applications (3),
    an input detecting unit for detecting each of plural inputs to the display unit, and
    a control unit (25) configured to,
    when it is detected by the input detecting unit that an input to the display unit is a first input by a first type of input means, execute, on the basis of the first input, a function of the application (3) allocated in advance to an area on the screen of the application (3), the area being specified by the first input, and
    when it is detected by the input detecting unit that an input to the display unit is a second input by a second type of input means different from the first type of input means, prohibit the execution of the function of the application (3) allocated in advance to the area on the screen of the application (3) and execute, on the basis of the second input, a predetermined function not allocated to the area on the screen of the application (3),
    characterized in that the control unit (25) is further configured to,
    when the first input is detected during a period in which the second input continues, execute, on the basis of the first input, another predetermined function not allocated to the area on the screen of the application (3).
  2. The portable terminal according to claim 1, further characterized in that, when the screen of the application can be viewed with a scroll function, the portable terminal scrolls the screen of the application on the basis of the second input.
  3. The portable terminal according to claim 1, further characterized in that, when the application has a handwritten-character input function, the portable terminal performs character input with the first input and performs character erasing and character shifting with the second input.
  4. The portable terminal according to claim 1, further characterized in that the input detecting unit comprises: an imaging unit for picking up an image of the periphery of the display unit, and an image analyzing unit for analyzing the image picked up by the imaging unit.
  5. The portable terminal according to claim 1, further characterized in that the input detecting unit comprises an image sensor incorporated in the display unit and an image analyzing unit for analyzing an image output from the image sensor.
  6. The portable terminal according to claim 1, further characterized in that the first input is a main input and the second input is a sub-input, and the input detecting unit is configured to detect an input to the display unit, based on a contact area of the input with the display unit, as the main input when the contact area is smaller than a predetermined value and as the sub-input when the contact area is equal to or larger than the predetermined value.
  7. The portable terminal according to claim 6, wherein the main input is an input by a pen and the sub-input is an input by a finger.
  8. A control method for a portable terminal (20) comprising:
    a storing unit (2) for storing various applications (3),
    a display unit for displaying a screen of each of the applications (3), and
    an input detecting unit for detecting each of plural inputs to the display unit,
    the control method comprising:
    executing, on the basis of a first input, a function of the application (3) allocated in advance to an area on the screen of the application (3), the area being specified by the first input, when it is detected by the input detecting unit that an input to the display unit is the first input by a first type of input means, and
    prohibiting the execution of the function of the application (3) allocated in advance to the area on the screen of the application (3) and executing, on the basis of a second input, a predetermined function not allocated to the area on the screen of the application (3), when it is detected by the input detecting unit that an input to the display unit is the second input by a second type of input means different from the first type of input means,
    characterized by
    executing, on the basis of the first input, another predetermined function not allocated to the area on the screen of the application (3), when the first input is detected during a period in which the second input continues.
  9. The control method for a portable terminal according to claim 8, further characterized by scrolling the screen of the application on the basis of the second input when the screen of the application can be viewed with a scroll function.
  10. The control method for a portable terminal according to claim 8, further characterized by performing character input with the first input and performing character erasing and character shifting with the second input when the application has a handwritten-character input function.
  11. The control method for a portable terminal according to claim 8, wherein the input detecting unit comprises: an imaging unit for picking up an image of the periphery of the display unit, and an image analyzing unit for analyzing the image picked up by the imaging unit.
  12. The control method for a portable terminal according to claim 8, wherein the input detecting unit comprises an image sensor incorporated in the display unit and an image analyzing unit for analyzing an image output from the image sensor.
  13. The control method for a portable terminal according to claim 8, wherein the first input is a main input whose contact area with the display unit is smaller than a predetermined value, and the second input is a sub-input whose contact area with the display unit is equal to or larger than the predetermined value.
  14. The control method for a portable terminal according to claim 13, wherein the main input is an input by a pen and the sub-input is an input by a finger.
EP07828604.4A 2006-09-28 2007-09-27 Portable terminal and method for controlling the same Expired - Fee Related EP2071436B1 (de)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2006265060 2006-09-28
JP2006264866A JP5080773B2 (ja) 2006-09-28 2006-09-28 Portable terminal and method for controlling the same
JP2007106200A JP4927633B2 (ja) 2006-09-28 2007-04-13 Portable terminal and method for controlling the same
PCT/JP2007/068858 WO2008047552A1 (en) 2006-09-28 2007-09-27 Portable terminal and method for controlling the same

Publications (3)

Publication Number Publication Date
EP2071436A1 (de) 2009-06-17
EP2071436A4 (de) 2014-07-30
EP2071436B1 (de) 2019-01-09

Family

ID=39313799

Family Applications (1)

Application Number Title Priority Date Filing Date
EP07828604.4A Expired - Fee Related EP2071436B1 (de) 2006-09-28 2007-09-27 Portable terminal and method for controlling the same

Country Status (5)

Country Link
US (2) US8997015B2 (de)
EP (1) EP2071436B1 (de)
KR (2) KR101058297B1 (de)
CN (1) CN101523331B (de)
WO (1) WO2008047552A1 (de)

Families Citing this family (71)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008047552A1 (en) * 2006-09-28 2008-04-24 Kyocera Corporation Portable terminal and method for controlling the same
CN101743529B (zh) * 2007-07-11 2012-06-13 Access Co., Ltd. Portable information terminal and control method thereof
KR101264315B1 (ko) 2008-06-02 2013-05-22 SK Planet Co., Ltd. Method and apparatus for interworking between applications on a mobile platform, and recording medium therefor
DE102008039194A1 (de) * 2008-08-22 2010-02-25 Brose Fahrzeugteile GmbH & Co. Kommanditgesellschaft, Würzburg Electric motor
US20100251112A1 (en) * 2009-03-24 2010-09-30 Microsoft Corporation Bimodal touch sensitive digital notebook
KR101699735B1 (ko) * 2009-10-26 2017-01-26 LG Electronics Inc. Mobile terminal capable of receiving nail-touch input and operation control method thereof
TWI520030B (zh) * 2009-05-21 2016-02-01 PixArt Imaging Inc. Complementary metal-oxide-semiconductor image sensor and operating method thereof
JP2011014044A (ja) * 2009-07-03 2011-01-20 Sony Corp Operation control apparatus, operation control method, and computer program
WO2011023225A1 (en) * 2009-08-25 2011-03-03 Promethean Ltd Interactive surface with a plurality of input detection technologies
JP5310403B2 (ja) * 2009-09-02 2013-10-09 Sony Corp Information processing apparatus, information processing method, and program
WO2011037558A1 (en) * 2009-09-22 2011-03-31 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US9310907B2 (en) 2009-09-25 2016-04-12 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US8766928B2 (en) * 2009-09-25 2014-07-01 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US8799826B2 (en) * 2009-09-25 2014-08-05 Apple Inc. Device, method, and graphical user interface for moving a calendar entry in a calendar application
US8832585B2 (en) 2009-09-25 2014-09-09 Apple Inc. Device, method, and graphical user interface for manipulating workspace views
WO2011048840A1 (ja) * 2009-10-19 2011-04-28 Sharp Corp Input motion analysis method and information processing apparatus
JP5532300B2 (ja) * 2009-12-24 2014-06-25 Sony Corp Touch panel device, touch panel control method, program, and recording medium
US8612884B2 (en) * 2010-01-26 2013-12-17 Apple Inc. Device, method, and graphical user interface for resizing objects
US8539385B2 (en) * 2010-01-26 2013-09-17 Apple Inc. Device, method, and graphical user interface for precise positioning of objects
US8539386B2 (en) * 2010-01-26 2013-09-17 Apple Inc. Device, method, and graphical user interface for selecting and moving objects
US20120299856A1 (en) * 2010-02-19 2012-11-29 Nec Corporation Mobile terminal and control method thereof
JP5237980B2 (ja) * 2010-03-04 2013-07-17 Lenovo (Singapore) Pte. Ltd. Coordinate input device, coordinate input method, and computer-executable program
JP5413673B2 (ja) * 2010-03-08 2014-02-12 Sony Corp Information processing apparatus and method, and program
WO2011122627A1 (ja) 2010-03-29 2011-10-06 Kyocera Corp Information processing apparatus and character input method
JP5615583B2 (ja) 2010-04-08 2014-10-29 Kyocera Corp Character input device, character input method, and character input program
KR101997034B1 (ko) 2010-04-19 2019-10-18 Samsung Electronics Co., Ltd. Interface method and apparatus
JP4950321B2 (ja) 2010-04-26 2012-06-13 Kyocera Corp Character input device, character input method, and character input program
US8638303B2 (en) * 2010-06-22 2014-01-28 Microsoft Corporation Stylus settings
GB2481607A (en) * 2010-06-29 2012-01-04 Promethean Ltd A shared control panel
US9081494B2 (en) 2010-07-30 2015-07-14 Apple Inc. Device, method, and graphical user interface for copying formatting attributes
US8972879B2 (en) 2010-07-30 2015-03-03 Apple Inc. Device, method, and graphical user interface for reordering the front-to-back positions of objects
US9098182B2 (en) * 2010-07-30 2015-08-04 Apple Inc. Device, method, and graphical user interface for copying user interface objects between content regions
JP5666239B2 (ja) * 2010-10-15 2015-02-12 Sharp Corp Information processing apparatus, control method for information processing apparatus, program, and recording medium
KR101802759B1 (ko) * 2011-05-30 2017-11-29 LG Electronics Inc. Mobile terminal and display control method thereof
KR101924835B1 (ko) 2011-10-10 2018-12-05 Samsung Electronics Co., Ltd. Method and apparatus for operating functions of a touch device
EP2634672A1 (de) * 2012-02-28 2013-09-04 Alcatel Lucent System and method for inputting symbols
CN103376995A (zh) * 2012-04-19 2013-10-30 Fu Tai Hua Industry (Shenzhen) Co., Ltd. Touch-control electronic device and page content storage method thereof
JP5349642B2 (ja) 2012-04-27 2013-11-20 Toshiba Corp Electronic device, control method, and program
KR101928914B1 (ko) * 2012-06-08 2018-12-13 LG Electronics Inc. Mobile terminal
KR20140019206A (ko) * 2012-07-13 2014-02-14 Samsung Electronics Co., Ltd. User interface apparatus and method in a user terminal
KR20140008985A (ko) * 2012-07-13 2014-01-22 Samsung Electronics Co., Ltd. User interface apparatus and method in a user terminal
KR101383589B1 (ko) * 2012-07-19 2014-04-09 Melfas Inc. Touch sensing method and apparatus
JP6127401B2 (ja) * 2012-07-24 2017-05-17 Casio Computer Co., Ltd. Information processing apparatus, program, and information processing method
US8826128B2 (en) * 2012-07-26 2014-09-02 Cerner Innovation, Inc. Multi-action rows with incremental gestures
CN104685452A (zh) * 2012-09-26 2015-06-03 Panasonic Intellectual Property Management Co., Ltd. Display device and pen-input erasing method
US9921687B2 (en) * 2012-10-02 2018-03-20 Autodesk, Inc. Always-available input through finger instrumentation
KR102102663B1 (ko) * 2012-10-05 2020-04-22 Samsung Electronics Co., Ltd. Method and apparatus for using a portable terminal
KR20140046557A (ko) * 2012-10-05 2014-04-21 Samsung Electronics Co., Ltd. Multi-point input recognition method and terminal therefor
KR20140055880A (ko) * 2012-11-01 2014-05-09 Samsung Electronics Co., Ltd. Method and apparatus for controlling a virtual screen
JP6202345B2 (ja) * 2012-11-02 2017-09-27 Sony Corp Display control apparatus, display control method, and program
US9367186B2 (en) * 2012-12-18 2016-06-14 Logitech Europe S.A. Method and system for discriminating stylus and touch interactions
US9836154B2 (en) * 2013-01-24 2017-12-05 Nook Digital, Llc Selective touch scan area and reporting techniques
KR101439855B1 (ko) * 2013-01-25 2014-09-17 HiDeep Inc. Touch screen control apparatus and control method thereof
US20140253462A1 (en) * 2013-03-11 2014-09-11 Barnesandnoble.Com Llc Sync system for storing/restoring stylus customizations
KR102157270B1 (ko) * 2013-04-26 2020-10-23 Samsung Electronics Co., Ltd. User terminal device using a pen and control method thereof
BR112015027580B1 (pt) * 2013-07-22 2021-04-13 Hewlett Packard Enterprise Development Lp Método e sistema para apresentar dados em um formato escalonável
KR102138913B1 (ko) * 2013-07-25 2020-07-28 Samsung Electronics Co., Ltd. Input processing method and electronic device therefor
KR102189787B1 (ko) * 2013-11-13 2020-12-11 Samsung Electronics Co., Ltd. Electronic device having a touch screen and input processing method thereof
CN104866213A (zh) * 2014-02-20 2015-08-26 Lenovo (Beijing) Co., Ltd. Information processing method and electronic device
JP6381240B2 (ja) * 2014-03-14 2018-08-29 Canon Inc Electronic device, tactile-sensation control method, and program
WO2015141091A1 (ja) * 2014-03-20 2015-09-24 NEC Corp Information processing apparatus, information processing method, and information processing program
EP2930604B1 (de) * 2014-04-11 2018-11-07 Nokia Technologies OY Auslösung von Rückmeldung auf eine Benutzereingabe
CN104133633B (zh) * 2014-06-03 2017-12-29 Guangzhou Samsung Telecommunication Technology Research Co., Ltd. Method and apparatus for controlling input on a touch screen of an electronic device
CN104267812B (zh) * 2014-09-22 2017-08-29 Lenovo (Beijing) Co., Ltd. Information processing method and electronic device
CN105589648A (zh) * 2014-10-24 2016-05-18 Shenzhen Futaihong Precision Industry Co., Ltd. Rapid copy-and-paste system and method
KR102386480B1 (ko) * 2015-10-05 2022-04-15 Samsung Electronics Co., Ltd. Method and apparatus for identifying input to an electronic device
CN107024981B (zh) 2016-10-26 2020-03-20 Alibaba Group Holding Ltd. Virtual-reality-based interaction method and apparatus
EP3593236A1 (de) * 2017-06-02 2020-01-15 Apple Inc. Device, method, and graphical user interface for annotating content
WO2019047234A1 (zh) * 2017-09-11 2019-03-14 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Touch operation response method and apparatus
EP3671412A4 (de) 2017-09-11 2020-08-05 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Touch operation response method and apparatus
WO2019047231A1 (zh) 2017-09-11 2019-03-14 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Touch operation response method and apparatus


Family Cites Families (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH028923A (ja) 1988-06-28 1990-01-12 Canon Inc Coordinate input device
JP2599019B2 (ja) 1990-06-28 1997-04-09 Sanyo Electric Co., Ltd. Pen input device
JPH07160398A (ja) 1993-12-08 1995-06-23 Canon Inc Pen input device
US5956020A (en) * 1995-07-27 1999-09-21 Microtouch Systems, Inc. Touchscreen controller with pen and/or finger inputs
JPH09190268A (ja) * 1996-01-11 1997-07-22 Canon Inc Information processing apparatus and method
JP3834871B2 (ja) 1996-05-21 2006-10-18 Sony Corp Coordinate input apparatus and method
JPH10155038A (ja) 1996-11-20 1998-06-09 Casio Comput Co Ltd Information communication terminal
JPH10228350A (ja) 1997-02-18 1998-08-25 Sharp Corp Input device
JP3944949B2 (ja) 1997-06-05 2007-07-18 Sony Corp Information processing apparatus, information processing method, and medium
WO1999028811A1 (en) * 1997-12-04 1999-06-10 Northern Telecom Limited Contextual gesture interface
JPH11265240A (ja) 1998-03-18 1999-09-28 Canon Inc Apparatus and method for inputting use conditions of a device
JPH11272423A (ja) 1998-03-19 1999-10-08 Ricoh Co Ltd Computer input device
US8089470B1 (en) * 1998-10-20 2012-01-03 Synaptics Incorporated Finger/stylus touch pad
JP4006872B2 (ja) 1999-02-26 2007-11-14 Pentel Co., Ltd. Touch panel device
JP2000284912A (ja) 1999-03-30 2000-10-13 Matsushita Electric Ind Co Ltd Touch panel input computer
CN1348559A (zh) 1999-07-07 2002-05-08 Fujitsu Ltd Portable character input device
JP2001344062A (ja) 2000-05-31 2001-12-14 Sharp Corp Coordinate input device
US7030861B1 (en) * 2001-02-10 2006-04-18 Wayne Carl Westerman System and method for packing multi-touch gestures onto a hand
CN2489392Y (zh) * 2001-05-21 2002-05-01 Mitac Computer Corp Mobile electronic data processing device
JP3920067B2 (ja) * 2001-10-09 2007-05-30 EIT Co Ltd Coordinate input device
JP2003157144A (ja) 2001-11-20 2003-05-30 Sony Corp Character input device, character input method, storage medium storing character input program, and character input program
WO2003065192A1 (en) 2002-01-31 2003-08-07 Nokia Corporation Method, system and device for distinguishing pointing means
US7532206B2 (en) * 2003-03-11 2009-05-12 Smart Technologies Ulc System and method for differentiating between pointers used to contact touch surface
JP2005115714A (ja) 2003-10-09 2005-04-28 Noritz Corp Display operation device
JP4405335B2 (ja) * 2004-07-27 2010-01-27 Wacom Co Ltd Position detecting device and input system
US7847789B2 (en) * 2004-11-23 2010-12-07 Microsoft Corporation Reducing accidental touch-sensitive device activation
JP2006264866A (ja) 2005-03-23 2006-10-05 Mitsubishi Electric Corp Elevator operating device
JP4668657B2 (ja) 2005-03-25 2011-04-13 Hoya Corp Mold press molding apparatus and method for manufacturing molded body
JP2007106200A (ja) 2005-10-12 2007-04-26 Mitsuba Corp Vehicle periphery monitoring device
US7620901B2 (en) * 2006-03-21 2009-11-17 Microsoft Corporation Simultaneous input across multiple applications
WO2008047552A1 (en) * 2006-09-28 2008-04-24 Kyocera Corporation Portable terminal and method for controlling the same

Also Published As

Publication number Publication date
EP2071436A4 (de) 2014-07-30
US20100095205A1 (en) 2010-04-15
US20150220267A1 (en) 2015-08-06
KR20090057078A (ko) 2009-06-03
KR20110007237A (ko) 2011-01-21
US8997015B2 (en) 2015-03-31
CN101523331A (zh) 2009-09-02
WO2008047552A1 (en) 2008-04-24
EP2071436A1 (de) 2009-06-17
KR101058297B1 (ko) 2011-08-22
US9836214B2 (en) 2017-12-05
CN101523331B (zh) 2015-12-09

Similar Documents

Publication Publication Date Title
EP2071436B1 (de) Portable terminal and method for controlling the same
JP4927633B2 (ja) Portable terminal and control method thereof
US11029827B2 (en) Text selection using a touch sensitive screen of a handheld mobile communication device
JP5080773B2 (ja) Portable terminal and control method thereof
US8775966B2 (en) Electronic device and method with dual mode rear TouchPad
US8739053B2 (en) Electronic device capable of transferring object between two display units and controlling method thereof
CA2640785C (en) Electronic device and method of controlling the same
US7336263B2 (en) Method and apparatus for integrating a wide keyboard in a small device
US8963842B2 (en) Integrated hardware and software user interface
JP3143477U (ja) Electronic device
EP2098947A2 (de) Selecting of text using gestures
US20070263014A1 (en) Multi-function key with scrolling in electronic devices
TWI389015B (zh) Operation method of software keyboard
US20130100061A1 (en) Mobile terminal and controlling method thereof
EP1408400A2 (de) Touch-sensitive tablet for use in a portable electronic device
JP5085780B2 (ja) Portable terminal and control method thereof
KR101678213B1 (ko) User interface device based on detecting an increase or decrease in touch area, and control method thereof
JP4027937B2 (ja) Portable electronic device
KR20090046189A (ko) Method and apparatus for controlling operation using a progress bar
KR20100053001A (ko) Information input method for a touch screen
JP2011198382A (ja) Method of controlling wireless terminal with touch panel, and wireless terminal with touch panel
JP2006100998A (ja) Electronic apparatus and imaging device
TW201019206A (en) Handheld electronic device and program display switching method thereof

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20090331

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): DE FR GB

AX Request for extension of the european patent

Extension state: AL BA HR MK RS

DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20140702

RIC1 Information provided on ipc code assigned before grant

Ipc: H04M 1/00 20060101ALI20140626BHEP

Ipc: G06F 3/048 20130101ALI20140626BHEP

Ipc: G06F 3/041 20060101AFI20140626BHEP

17Q First examination report despatched

Effective date: 20170809

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Ref document number: 602007057393

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: G06F0003041000

Ipc: G06F0003048800

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

RIC1 Information provided on ipc code assigned before grant

Ipc: G06F 3/041 20060101ALI20180129BHEP

Ipc: G06F 3/0488 20130101AFI20180129BHEP

INTG Intention to grant announced

Effective date: 20180223

GRAJ Information related to disapproval of communication of intention to grant by the applicant or resumption of examination proceedings by the epo deleted

Free format text: ORIGINAL CODE: EPIDOSDIGR1

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

INTC Intention to grant announced (deleted)
INTG Intention to grant announced

Effective date: 20180802

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): DE FR GB

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602007057393

Country of ref document: DE

RIC2 Information provided on ipc code assigned after grant

Ipc: G06F 3/0488 20130101AFI20180129BHEP

Ipc: G06F 3/041 20060101ALI20180129BHEP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602007057393

Country of ref document: DE

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20190814

Year of fee payment: 13

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20191010

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20190927

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190927

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190930

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 602007057393

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210401