WO2016129772A1 - Apparatus and method for multi-point input - Google Patents

Apparatus and method for multi-point input

Info

Publication number
WO2016129772A1
WO2016129772A1 (application PCT/KR2015/010753)
Authority
WO
WIPO (PCT)
Prior art keywords
touch input
location
input
module
touchscreen
Prior art date
Application number
PCT/KR2015/010753
Other languages
English (en)
Inventor
Lynn Andrew WARNER
Original Assignee
Samsung Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 14/621,898 (granted as US9965173B2)
Application filed by Samsung Electronics Co., Ltd.
Priority to CN201580076118.5A (published as CN107223226B)
Priority to EP15882135.5A (published as EP3256935A4)
Publication of WO2016129772A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • the present disclosure relates to an apparatus and method for touch input. More particularly, the present disclosure relates to an apparatus and method for performing precise multi-touch input.
  • Mobile terminals are developed to provide wireless communication between users. As technology has advanced, mobile terminals now provide many additional features beyond simple telephone conversation. For example, mobile terminals are now able to provide additional functions such as an alarm, a Short Messaging Service (SMS), a Multimedia Message Service (MMS), E-mail, games, remote control of short range communication, an image capturing function using a mounted digital camera, a multimedia function for providing audio and video content, a scheduling function, and many more. With the plurality of features now provided, a mobile terminal has effectively become a necessity of daily life.
  • Mobile terminals are often provided with a touchscreen for user input.
  • Touchscreens allow a user to select and manipulate user interface elements by touching, tapping, dragging, or other touch input functions. In most situations, these gestures are a good substitute for other input devices, such as a mouse and keyboard.
  • touchscreens are not well equipped for applications needing precise input, such as photo editing, Computer Assisted Drafting (CAD) programs, and the like.
  • Most touchscreens use a finger or a stylus as an input device, and these input mechanisms lack the precision of, for example, a computer mouse.
  • devices employing touchscreens provide the user with the ability to zoom in and out of a diagram or image to permit greater accuracy.
  • this process of zooming in and out can be time-consuming and cumbersome. Accordingly, there is a need for a more accurate user input technique for touchscreens.
  • an aspect of the present disclosure is to provide an apparatus and method for precise multi-touch input.
  • a method for precise multi-touch input includes detecting a first touch input at a first location on a touchscreen, while the first touch input is maintained, detecting a second touch input at a second location on the touchscreen, detecting removal of the first touch input at the first location while the second touch input is maintained, and adjusting the first location according to movement of the second touch input, such that movement of the adjusted location is less than the movement of the second touch input.
  • an apparatus configured to provide a precise multi-touch input.
  • the apparatus includes a display unit, a touchscreen operatively coupled to the display unit and configured to detect a touch, and a processor configured to detect a first touch input at a first location on the touchscreen, to detect a second touch input at a second location on the touchscreen while the first touch input is maintained, to detect removal of the first touch input at the first location while the second touch input is maintained, and to adjust the first location according to movement of the second touch input, such that movement of the adjusted location is less than the movement of the second touch input.
  • a method for precise multi-touch input includes detecting a user input, determining a first location on a touchscreen of the electronic device based on the user input, detecting a touch input at a second location; and adjusting the first location according to movement of the second touch input, such that movement of the adjusted location is less than the movement of the second touch input.
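The adjustment common to the summaries above, moving the recorded first location by only a fraction of the second touch's movement, can be sketched as follows. This is an illustrative sketch, not the claimed implementation; the function name, the coordinate tuples, and the 10% factor (borrowed from the example given later in the disclosure) are assumptions.

```python
# Illustrative sketch of the scaled adjustment described in the summary.
# The 10% factor and all names here are assumptions for illustration.

SCALE = 0.1  # adjusted location moves 10% as far as the second touch


def adjust_location(first_location, second_start, second_current, scale=SCALE):
    """Return the first location shifted by a scaled-down copy of the
    second touch's displacement, so a large movement at the second
    location yields a small, precise movement of the first location."""
    dx = second_current[0] - second_start[0]
    dy = second_current[1] - second_start[1]
    return (first_location[0] + dx * scale, first_location[1] + dy * scale)
```

For example, moving the second touch 10 pixels right and 20 pixels down nudges the adjusted location by only 1 and 2 pixels, respectively.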
  • FIG. 1 illustrates a network environment including an electronic device according to various embodiments of the present disclosure
  • FIG. 2 illustrates components of an electronic device according to an embodiment of the present disclosure
  • FIGS. 3A, 3B, and 3C illustrate a method of precise multi-touch input according to an embodiment of the present disclosure
  • FIG. 4 illustrates a method of precise multi-touch input according to an embodiment of the present disclosure
  • FIG. 5 illustrates a method of precise multi-touch input according to another embodiment of the present disclosure
  • FIG. 6 illustrates a method of rotating an object through a multi-touch input according to an embodiment of the present disclosure
  • FIG. 7 illustrates a scaling method of an object selected through a multi-touch input according to an embodiment of the present disclosure
  • FIG. 8 illustrates a method of selecting a text through a multi-touch input according to an embodiment of the present disclosure
  • FIG. 9 illustrates a block diagram of hardware according to an embodiment of the present disclosure.
  • an electronic device may include communication functionality.
  • an electronic device may be a smart phone, a tablet Personal Computer (PC), a mobile phone, a video phone, an e-book reader, a desktop PC, a laptop PC, a netbook PC, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), an MP3 player, a mobile medical device, a camera, a wearable device (e.g., a Head-Mounted Device (HMD), electronic clothes, electronic braces, an electronic necklace, an electronic appcessory, an electronic tattoo, or a smart watch), and/or the like.
  • an electronic device may be a smart home appliance with communication functionality.
  • a smart home appliance may be, for example, a television, a Digital Video Disk (DVD) player, an audio player, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washer, a dryer, an air purifier, a set-top box, a TV box (e.g., Samsung HomeSync TM , Apple TV TM , or Google TV TM ), a gaming console, an electronic dictionary, an electronic key, a camcorder, an electronic picture frame, and/or the like.
  • an electronic device may be a medical device (e.g., Magnetic Resonance Angiography (MRA) device, a Magnetic Resonance Imaging (MRI) device, Computed Tomography (CT) device, an imaging device, or an ultrasonic device), a navigation device, a Global Positioning System (GPS) receiver, an Event Data Recorder (EDR), a Flight Data Recorder (FDR), an automotive infotainment device, a naval electronic device (e.g., naval navigation device, gyroscope, or compass), an avionic electronic device, a security device, an industrial or consumer robot, and/or the like.
  • an electronic device may be furniture, part of a building/structure, an electronic board, electronic signature receiving device, a projector, various measuring devices (e.g., water, electricity, gas or electro-magnetic wave measuring devices), and/or the like that include communication functionality.
  • an electronic device may be any combination of the foregoing devices.
  • an electronic device according to various embodiments of the present disclosure is not limited to the foregoing devices.
  • FIG. 1 illustrates a network environment including an electronic device according to various embodiments of the present disclosure.
  • a network environment 100 includes an electronic device 101.
  • the electronic device 101 may include a bus 110, a processor 120, a memory 130, an Input/Output (I/O) interface 140, a display 150, a communication interface 160, a vernier input module 170, and/or the like.
  • the bus 110 may be circuitry that connects the foregoing components and allows communication between the foregoing components.
  • the bus 110 may connect components of the electronic device 101 so as to allow control messages and/or other information to be communicated between the connected components.
  • the processor 120 may receive instructions from other components (e.g., the memory 130, the I/O interface 140, the display 150, the communication interface 160, the vernier input module 170, and/or the like), interpret the received instructions, and execute computation or data processing according to the interpreted instructions.
  • the memory 130 may store instructions and/or data that are received from, and/or generated by, other components (e.g., the I/O interface 140, the display 150, the communication interface 160, the vernier input module 170, and/or the like).
  • the memory 130 may include programming modules such as a kernel 131, a middleware 132, an Application Programming Interface (API) 133, an application 134, and/or the like.
  • Each of the foregoing programming modules may include a combination of at least two of software, firmware, or hardware.
  • the kernel 131 may control or manage system resources (e.g., the bus 110, the processor 120, the memory 130, and/or the like) that may be used in executing operations or functions implemented in other programming modules such as, for example, the middleware 132, the API 133, the application 134, and/or the like.
  • the kernel 131 may provide an interface for allowing or otherwise facilitating the middleware 132, the API 133, the application 134, and/or the like, to access individual components of the electronic device 101.
  • the middleware 132 may be a medium through which the kernel 131 may communicate with the API 133, the application 134, and/or the like to send and receive data.
  • the middleware 132 may control (e.g., scheduling, load balancing, and/or the like) work requests by one or more applications 134.
  • the middleware 132 may control work requests by the one or more applications 134 by assigning priorities for using system resources (e.g., the bus 110, the processor 120, the memory 130, and/or the like) of the electronic device 101 to the one or more applications 134.
  • the API 133 may be an interface that may control functions that the application 134 may provide at the kernel 131, the middleware 132, and/or the like.
  • the API 133 may include at least an interface or a function (e.g., command) for file control, window control, video processing, character control, and/or the like.
  • the application 134 may include a Short Message Service (SMS) application, a Multimedia Messaging Service (MMS) application, an email application, a calendar application, an alarm application, a health care application (e.g., an exercise amount application, a blood sugar level measuring application, and/or the like), an environmental information application (e.g., an application that may provide atmospheric pressure, humidity, temperature information, and/or the like), an instant messaging application, a call application, an internet browsing application, a gaming application, a media playback application, an image/video capture application, a file management application, and/or the like.
  • the application 134 may be an application that is associated with information exchange between the electronic device 101 and an external electronic device (e.g., the electronic device 104).
  • the application 134 that is associated with the information exchange may include a notification relay application that may provide the external electronic device with a certain type of information, a device management application that may manage the external electronic device, and/or the like.
  • the notification relay application may include a functionality that provides notification generated by other applications at the electronic device 101 (e.g., the SMS/MMS application, the email application, the health care application, the environmental information application, the instant messaging application, the call application, the internet browsing application, the gaming application, the media playback application, the image/video capture application, the file management application, and/or the like) to an external electronic device (e.g., the electronic device 104).
  • the notification relay application may, for example, receive a notification from an external electronic device (e.g., the electronic device 104) and provide the notification to a user.
  • the device management application may manage enabling or disabling of functions associated with at least a portion of an external electronic device (e.g., the external electronic device itself, or one or more components of the external electronic device) in communication with electronic device 101, controlling of brightness (or resolution) of a display of the external electronic device, an application operated at, or a service (e.g., a voice call service, a messaging service, and/or the like) provided by, the external electronic device, and/or the like.
  • the application 134 may include one or more applications that are determined according to a property (e.g., type of electronic device, and/or the like) of the external electronic device (e.g., the electronic device 104). For example, if the external electronic device is an MP3 player, the application 134 may include one or more applications related to music playback. As another example, if the external electronic device is a mobile medical device, the application 134 may be a health care-related application. According to various embodiments of the present disclosure, the application 134 may include at least one of an application that is preloaded at the electronic device 101, an application that is received from an external electronic device (e.g., the electronic device 104, a server 106, and/or the like), and/or the like.
  • the I/O interface 140 may receive instruction and/or data from a user.
  • the I/O interface 140 may send the instruction and/or the data, via the bus 110, to the processor 120, the memory 130, the communication interface 160, the vernier input module 170, and/or the like.
  • the I/O interface 140 may provide data associated with user input received via a touch screen to the processor 120.
  • the I/O interface 140 may, for example, output instructions and/or data received via the bus 110 from the processor 120, the memory 130, the communication interface 160, the vernier input module 170, and/or the like, via an I/O device (e.g., a speaker, a display, and/or the like).
  • the I/O interface 140 may output voice data (e.g., processed using the processor 120) via a speaker.
  • the display 150 may display various types of information (e.g., multimedia, text data, and/or the like) to the user.
  • the display 150 may display a Graphical User Interface (GUI) with which a user may interact with the electronic device 101.
  • the display may also include a touchscreen module 155, as described below with respect to FIG. 2.
  • the communication interface 160 may provide communication between the electronic device 101 and one or more external electronic devices (e.g., the electronic device 104, the server 106, and/or the like). For example, the communication interface 160 may communicate with the external electronic devices by establishing a connection with a network 162 using wireless or wired communication.
  • the wireless communication with which the communication interface 160 may communicate may be at least one of Wi-Fi, Bluetooth, Near Field Communication (NFC), Global Positioning System (GPS), cellular communication (e.g., Long Term Evolution (LTE), LTE Advanced (LTE-A), Code Division Multiple Access (CDMA), Wideband-CDMA (WCDMA), Universal Mobile Telecommunications System (UMTS), Wireless Broadband (WiBro), Global System for Mobile Communications (GSM), and/or the like), Infrared Data Association (IrDA) technology, and/or the like.
  • the wired communication with which the communication interface 160 may communicate may be at least one of Universal Serial Bus (USB), High Definition Multimedia Interface (HDMI), Recommended Standard 232 (RS-232), Plain Old Telephone Service (POTS), Ethernet, and/or the like.
  • the network 162 may be a telecommunications network.
  • the telecommunications network may include at least one of a computer network, the Internet, the Internet of Things, a telephone network, and/or the like.
  • a protocol (e.g., a transport layer protocol, a data link layer protocol, a physical layer protocol, and/or the like) for communication between the electronic device 101 and an external electronic device may be supported by, for example, at least one of the application 134, the API 133, the middleware 132, the kernel 131, the communication interface 160, and/or the like.
  • the vernier input module 170 provides a mechanism for precise input on a touch-screen.
  • the vernier input module 170 is described below in more detail with respect to FIG. 2.
  • FIG. 2 illustrates components of an electronic device according to various embodiments of the present disclosure.
  • the electronic device 101 may include the display 150, the touchscreen module 155, and the vernier input module 170, in addition to other components such as those shown in FIG. 1.
  • the touchscreen module 155 detects a touch input from the user and provides the input to the vernier input module 170 as well as other components of the electronic device, such as the processor 120.
  • the touchscreen module 155 may detect touch input using capacitive, resistive, infrared, ultrasonic methods, and/or the like.
  • the touchscreen module 155 may be formed as a capacitive touchscreen, resistive touchscreen, infrared touchscreen, and the like.
  • the touchscreen module 155 may be integrated with the display 150 or may be provided as a separate component arranged above or below the display 150.
  • the touchscreen module 155 may take up all or only a part of the display 150.
  • the vernier input module 170 allows the user to perform more precise touch input using a multi-touch input.
  • the user touches a general area of the screen where the user wishes to perform the input with one finger, and then touches another area of the screen with a second finger.
  • the vernier input module 170 records the location of the first finger, and then adjusts the first location according to the touch input from the second finger.
  • the movement of the first location is scaled such that moving the second finger results in a smaller, more precise adjustment of the first location.
  • the first location could be adjusted by 10% of the movement of the second finger.
  • other scaling factors may also be employed.
  • although a finger is described as an input mechanism in the present disclosure, embodiments of the present disclosure are not limited to fingers as a touch input mechanism.
  • a stylus may also be employed instead of or in addition to a finger.
  • the precise touch input could be performed by a combination of a stylus and a finger or by two styli.
  • An example of how the vernier input module 170 enables a precise touch input is described below with respect to FIGS. 3A-3C.
  • FIGS. 3A, 3B, and 3C illustrate a method of a precise multi-touch input according to an embodiment of the present disclosure.
  • Referring to FIGS. 3A, 3B, and 3C, a figure is drawn as an example of utilizing a touch input according to an embodiment of the present disclosure.
  • a process of adding a line 305 which connects a circle 301 and a rectangle 303 displayed on the display 150 through a finger touch input of a user is exemplified in FIGS. 3A, 3B, and 3C.
  • the user taps the circle 301 to generate a touch input.
  • the user moves the touch input toward the rectangle 303.
  • as the touch input approaches the rectangle 303, the line 305 is drawn from the circle 301 toward the rectangle 303.
  • the touch input of the user is placed at a first location 310.
  • the user wishes to finish the line 305 at the boundary of the rectangle 303.
  • the first location 310 may not be the precise position the user wishes to select, that is, a precise position on the boundary of the rectangle 303, but the user is unable to place the touch input more precisely. This is because the boundary of the rectangle 303 may be covered by the user's finger, and precise control based on a finger touch input may not be easy.
  • the second location 330 may be a predefined location on a touchscreen display, or may be any location sufficiently distant from the first location to prevent the movement of the user’s finger at the second location 330 from affecting the user’s view of the first location.
  • the second location may be located inside a vernier rectangle 307.
  • the vernier input module 170 records the first location 310 as the position touched by a first finger when the second touch input was detected. This allows the user to determine an approximate position for the first location 310, and then adjust the touched position more precisely through the second touch input and the vernier input module 170. According to another embodiment of the present disclosure, the vernier input module 170 may detect an object or interface element at the first location 310 and select the object or interface element.
  • the user moves the second touch input at the second location 330 to control precise adjustment of the first location 310.
  • the vernier input module 170 may control the precise movement of the first location 310 according to the second touch input at the second location 330.
  • the user may set this position by removing the touch input at the second location 330 as shown in FIG. 3C, or, according to another embodiment of the present disclosure, by touching the touchscreen module 155 at another location.
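The event sequence illustrated in FIGS. 3A-3C (record the first location when the first touch is removed, adjust it by scaled movement of the second touch, and commit when the second touch is removed) can be sketched as a small state holder. All class and method names are hypothetical; only the event order and the scaling follow the description above.

```python
# Hypothetical sketch of the vernier event flow; names are invented
# for illustration and do not come from the disclosure.

class VernierSession:
    """Tracks an anchor (first) touch and a vernier (second) touch,
    scaling the second touch's movement into fine adjustments."""

    def __init__(self, scale=0.1):
        self.scale = scale
        self.anchor = None         # recorded first location
        self.vernier_start = None  # where the second touch began

    def second_touch_down(self, location):
        # Second touch detected while the first touch is maintained.
        self.vernier_start = location

    def first_touch_up(self, location):
        # The position where the first touch was removed is recorded
        # as the location to be fine-tuned.
        self.anchor = location

    def second_touch_move(self, location):
        # Movement at the second location maps to a smaller movement
        # of the anchor.
        dx = (location[0] - self.vernier_start[0]) * self.scale
        dy = (location[1] - self.vernier_start[1]) * self.scale
        return (self.anchor[0] + dx, self.anchor[1] + dy)

    def second_touch_up(self, location):
        # Removing the second touch commits the adjusted location.
        self.anchor = self.second_touch_move(location)
        return self.anchor
```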
  • the vernier input module 170 may be used for a variety of operations and applications in which precise input is desired. Examples of applications where precise input may be useful include Computer Assisted Drafting (CAD) applications, photo editing applications, and the like. However, the vernier input module 170 and associated method for precise touch input are not limited to these applications; nearly all applications may have a use for the precise input techniques described herein. For example, the vernier input module 170 may be used to select, scale, rotate, or copy an object. In a drawing application, the vernier input module 170 may be used to draw a line or a curve, or to create a rectangle or other shape. In a text application, browser, or E-book reader, the vernier input module 170 may be used to select a portion of text displayed on the screen. In a game, the vernier input module 170 may be used as part of an on-screen game controller, or in conjunction with an external game controller. The operations of the vernier input module 170 are described below with respect to FIGS. 4 and 5.
  • FIG. 4 illustrates a method of precise multi-touch input according to an embodiment of the present disclosure.
  • the vernier input module 170 detects the user’s touch input at a first location in operation 410.
  • the touch input may be any type of touch input, such as a touch and hold or a touch and drag.
  • the touch input may be via any type of input device, such as a finger or a stylus, but is not limited thereto.
  • the touch input may be a multi-touch input, such as a pinch or zoom.
  • this input may be a non-touch input, such as a gesture, voice command, or action detected by a sensor of the electronic device 101 (such as an accelerometer). Examples of sensors that may be included in the electronic device 101 and used for this purpose are described below with respect to FIG. 9.
  • the vernier input module detects a touch at a second location.
  • the second location may be within a ‘vernier rectangle’ displayed on the display 150 and provided to allow precise multi-touch input.
  • the vernier rectangle is not required and may be omitted. If the vernier rectangle is displayed, the vernier rectangle may be displayed on the display 150 in response to detecting the first touch input, when a particular user interface element is selected, or when a particular touch gesture is detected.
  • the vernier rectangle may be displayed in a variety of fashions.
  • the vernier rectangle may always be displayed in a fixed position, or could be placed by the user.
  • the display of the vernier rectangle may also be controlled by user-configured settings, or may be changed by the user after the vernier rectangle is initially displayed.
  • hint boxes may be arranged to allow the user to adjust the size and position of the rectangle after the rectangle is initially displayed.
  • the vernier rectangle may be displayed on a touchscreen of a connected external device, such as a touchpad, game controller, or other device having a touchscreen.
  • the vernier rectangle may make up the entire display area of the external device, or may make up a portion of the screen.
  • the various techniques for creating the second location may be applied similarly to the touchscreen of the external device.
  • the vernier rectangle may be displayed in an area of the display that does not conflict with the first location, or with the selected object or other interface element.
  • the vernier rectangle may be displayed in an area of the screen that does not overlap with the first location, selected object, or interface element.
  • the size of the vernier rectangle may be determined based on the size of the target and features of the display, such as the display’s pixel density.
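As one hedged illustration of how the rectangle's size could follow the display's pixel density, a target physical size might be converted to pixels as below; the millimetre-based rule and the function name are assumptions, not specified by the disclosure.

```python
# Assumed sizing rule: pick a comfortable physical size for the vernier
# rectangle and convert it to pixels using the display's pixel density.

def vernier_rect_size_px(width_mm, height_mm, dpi):
    """Convert a target physical size in millimetres to pixels,
    given the display's dots per inch (25.4 mm per inch)."""
    px_per_mm = dpi / 25.4
    return (round(width_mm * px_per_mm), round(height_mm * px_per_mm))
```

On a 300 dpi display, for example, a 25.4 mm (one-inch) square rectangle would span 300 x 300 pixels.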
  • if no touch input is detected at a second location, the vernier input module 170 may determine that no precise multi-touch input is necessary and process the first input according to normal operations.
  • the vernier input module 170 detects the removal of the first touch input.
  • the vernier input module records the location where the touch input was removed as the first location.
  • the vernier input module 170 adjusts the first location according to the touch input at the second location.
  • a large movement at the second location will result in a comparatively smaller adjustment to the first location.
  • the adjustment of the first location may be 10% of the movement of the touch input at the second location.
  • Larger or smaller scales may also be employed.
  • the scale may also be non-linear.
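As a non-authoritative illustration of the damped adjustment described above: the 10% linear scale and the quadratic variant below are example choices, and the function names are invented for this sketch rather than taken from the disclosure.

```python
def adjust_first_location(first, second_start, second_end, scale=0.10):
    """Move `first` by a scaled-down copy of the second touch's movement."""
    dx = second_end[0] - second_start[0]
    dy = second_end[1] - second_start[1]
    return (first[0] + dx * scale, first[1] + dy * scale)

def adjust_first_location_nonlinear(first, second_start, second_end, k=0.02):
    """Non-linear variant: the damping factor itself grows with the movement,
    so slow, short strokes give an even finer adjustment than the linear scale."""
    dx = second_end[0] - second_start[0]
    dy = second_end[1] - second_start[1]
    fx = k * dx * abs(dx) ** 0.5   # sign of dx/dy is preserved
    fy = k * dy * abs(dy) ** 0.5
    return (first[0] + fx, first[1] + fy)
```

For example, a 50-pixel stroke of the second touch moves the first location by only 5 pixels under the 10% linear scale.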
  • the vernier input module 170 determines the final adjusted location of the first location.
  • the vernier input module 170 passes the final adjusted location (e.g., coordinates of the final adjusted location) to the processor for further processing.
  • the display 150 is updated according to the final adjusted location and any further processing by the processor and executing applications. At this time, the vernier input rectangle may be removed from the screen.
  • the vernier input module 170 may determine the final adjusted location of the first location when another touch input is detected, in addition to the second touch input. For example, if a vernier rectangle is displayed, the additional touch input may be detected outside the vernier rectangle. In this situation, when the additional touch is detected, the vernier input module 170 determines the final adjusted location of the first location as being the adjusted location at the time when the additional input was detected. This may be useful when the user wants to perform a rapid series of edits or precise adjustments. While FIGS. 3B and 3C show the vernier input box as a rectangle, the shape of the vernier input box is not limited thereto, and may be any other shape or format.
  • the vernier input module 170 determines the final adjusted location
  • the vernier input module 170 passes the final adjusted location to the processor for further processing. This may include forwarding the final adjusted location to the application controlling the first location (e.g., the application displaying the selected object, or the application responsible for the window or display area that includes the first location).
  • the action taken by the processor may depend on the application currently running. For example, if the user is selecting a portion of text, the final adjusted location may be the final position of the cursor for selecting the text. If the user is drawing a line or curve in a drawing application, the final adjusted location may be the final position of the line or curve.
  • this further processing may include status updates or error messages, depending on the nature of the application.
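The sequence of operations above (detect the first touch, detect the second touch, release the first touch, adjust while the second touch moves, then finalize) can be sketched as a small event handler. The class and method names are assumptions made for illustration only, and a fixed 10% scale stands in for whatever mapping an implementation chooses.

```python
SCALE = 0.10  # illustrative damping factor, not a value fixed by the disclosure

class VernierInput:
    def __init__(self):
        self.first = None     # coarse location recorded from the first touch
        self.second = None    # last known position of the second touch
        self.active = False   # True once fine-adjustment mode is armed

    def on_first_touch(self, pos):
        self.first = pos

    def on_second_touch(self, pos):
        self.second = pos

    def on_first_release(self):
        # Removing the first touch while the second is held arms fine mode.
        self.active = self.second is not None

    def on_second_move(self, pos):
        if self.active:
            dx = pos[0] - self.second[0]
            dy = pos[1] - self.second[1]
            # The adjustment is a fraction of the second touch's movement.
            self.first = (self.first[0] + dx * SCALE,
                          self.first[1] + dy * SCALE)
        self.second = pos

    def on_finalize(self):
        # Second touch lifted, or an additional touch detected: report the
        # final adjusted location for further processing.
        self.active = False
        return self.first
```

A 50-pixel drag of the second touch after the first touch is released nudges the recorded first location by only 5 pixels.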
  • FIG. 5 illustrates a method of precise multi-touch input according to another embodiment of the present disclosure.
  • the vernier input module 170 may be employed to select and move an object to a precise location determined by the user.
  • the vernier input module 170 detects a touch input.
  • the vernier input module 170 identifies an object at the first location and selects the identified object.
  • the vernier input module 170 detects a touch at a second location.
  • the second location may be within a vernier rectangle, as described above, or may be at any location on the touchscreen module 155 other than the object selected at operation 520.
  • the vernier input module 170 detects the removal of the touch input at the first location.
  • the vernier input module 170 adjusts the location of the selected object according to the movement of the second touch input at the second location.
  • the adjustment of the selected object is less than the movement of the touch input at the second location; for example, the location of the selected object may be adjusted by 10% of the movement of the touch input at the second location.
  • the user may employ the vernier input module 170 to rotate the selected object.
  • the vernier input module 170 may be used to scale or copy the object, or to perform any other action on the object that may require precise positioning.
  • FIG. 6 illustrates a method of rotating an object through a multi-touch input according to an embodiment of the present disclosure.
  • an object rotation is illustrated in FIG. 6.
  • a user rotates an inclined straight line 601 through a multi-touch input so that it is parallel with a straight line 603.
  • the user may touch a first location 610 of the inclined straight line 601 through a first touch input.
  • the inclined straight line 601 may be selected through the first touch input. While the first touch input is maintained, the user may touch a second location 630 inside a vernier rectangle 605 through a second touch input. After the second touch input is achieved, the user may remove the first touch input. After removing the first touch input, the user may rotate the second touch input in a direction of an arrow 607 at the second location 630.
  • the inclined straight line 601 selected by the first touch input may be rotated in a direction of an arrow 609.
  • the rotation of the first location (that is, the rotation of the inclined straight line 601) may be relatively less than the rotation of the second touch input at the second location, so that the selected line can be rotated precisely.
  • a rotation direction and an object to be rotated may differ according to an embodiment.
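In the spirit of FIG. 6, a rotation of the second touch input around the vernier area could be damped into a smaller rotation of the selected object. This is a hedged sketch only; the 10% factor and the helper names are illustrative assumptions.

```python
import math

def touch_rotation_delta(center, start, end):
    """Angle (radians) swept by the second touch around the vernier center."""
    a0 = math.atan2(start[1] - center[1], start[0] - center[0])
    a1 = math.atan2(end[1] - center[1], end[0] - center[0])
    return a1 - a0

def adjust_rotation(object_angle, second_rotation, scale=0.10):
    """Rotate the selected object by a fraction of the second touch's rotation."""
    return object_angle + second_rotation * scale
```

A quarter-turn of the finger would then rotate the selected line by only 9 degrees, making it easy to bring line 601 exactly parallel to line 603.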
  • FIG. 7 illustrates a scaling method of an object selected through a multi-touch input according to an embodiment of the present disclosure.
  • scaling of an object is illustrated in FIG. 7.
  • a user performs scaling by increasing or decreasing a size of a star shape 701 through a multi-touch input.
  • the user may touch a first location 710 of the star shape 701 through a first touch input.
  • the star shape 701 may be selected through the first touch input.
  • the user may touch second locations 730 and 750 inside a vernier rectangle 705 through a second touch input.
  • the second touch input may be a multi-touch input (e.g., one touch point at each of the second locations 730 and 750).
  • the user may remove the first touch input.
  • the user may move the second touch input in both directions of arrows at the second locations 730 and 750. The movement of the second touch input in the arrow direction may result in scaling of increasing a size of the selected object 701.
  • the movement of the second touch input in a direction opposite to the arrow direction may result in scaling of decreasing the size of the selected object 701.
  • the scaling of the object 701 selected by the first touch input may be relatively less than the movement of the second touch input in the arrow direction or in the opposite direction. In this manner, precise scaling of the selected object 701 is possible.
  • although the second touch input is achieved through a multi-touch input in the embodiment described with reference to FIG. 7, the second touch input may be a single touch input as shown in FIG. 3 or FIG. 6.
  • the second touch input may be moved in a specific direction to increase or decrease a size of the selected object.
  • the second touch input may be rotated in a specific direction to adjust the size of the selected object.
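One way to realize the FIG. 7 behaviour is to damp the change in pinch distance between the two second-touch points before applying it to the object's size. This is a sketch under that assumption; the 10% factor and function name are not from the disclosure.

```python
import math

def adjust_scale(size, start_points, end_points, scale=0.10):
    """Scale the object by a damped version of the pinch-distance change.

    `start_points` / `end_points` are pairs of (x, y) touch positions for the
    two fingers of the second (multi-)touch input.
    """
    d0 = math.dist(start_points[0], start_points[1])  # initial finger spread
    d1 = math.dist(end_points[0], end_points[1])      # final finger spread
    ratio = d1 / d0                                   # raw pinch ratio
    damped = 1.0 + (ratio - 1.0) * scale              # keep only 10% of it
    return size * damped
```

Doubling the finger spread thus enlarges the object by only 10%, rather than 100%, which is what makes the fine sizing of the star shape possible.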
  • FIG. 8 illustrates a method of selecting a text through a multi-touch input according to an embodiment of the present disclosure.
  • a precise selection of a text is illustrated in FIG. 8.
  • a user precisely adjusts a text 801 approximately selected through a multi-touch input.
  • the reference numeral 801 indicates a text "disclosure related to" selected by the first touch input.
  • An area of the selected text 801 may be selected through the first touch input for a user's action such as copy, cut, etc.
  • the selected text 801 may not exactly correspond to an area the user wishes to select.
  • the user may touch a second location 830 inside a vernier rectangle 803 through a second touch input.
  • the reference numeral 810 indicates the first location, that is, the position at which the first touch input was removed.
  • the user may move the second touch input to the left and the right so that the first location 810 is moved to the left and the right in a direction of an arrow 807.
  • the selected word or sentence 801 may differ depending on the movement of the first location. For example, the user may move the second touch input to the left at the second location to exclude the word "to", which was wrongly included by the first touch input, from the selected area.
  • the movement of the first location may be less than the movement of the second touch input, and as a result, the first location may be precisely adjusted. Accordingly, the user can select the word or sentence 801 at an exactly desired position.
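Applied to the text selection of FIG. 8, the damped movement can be converted into a character-level shift of the selection boundary. The pixels-per-character figure and the 10% scale below are invented for this sketch.

```python
def adjust_selection_end(end_index, second_dx, px_per_char=8, scale=0.10):
    """Shift the selection boundary by a damped number of characters.

    `second_dx` is the horizontal movement of the second touch in pixels;
    only 10% of it is applied, then converted to whole characters.
    """
    chars = round(second_dx * scale / px_per_char)
    return max(0, end_index + chars)

text = "disclosure related to"
sel_end = len(text)                    # coarse selection made by the first touch
# Moving the second touch 160 px to the left is damped to -16 px, i.e. two
# characters, trimming the wrongly selected "to" from the end.
sel_end = adjust_selection_end(sel_end, -160)
```

With these assumed constants, the 160-pixel stroke moves the boundary back by exactly two characters.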
  • FIG. 9 illustrates a block diagram of hardware according to an embodiment of the present disclosure.
  • an electronic device 901 may be, for example, a part or all of the electronic device 101.
  • the electronic device 901 may include one or more Application Processors (AP) 910, a communication interface module 920, a Subscriber Identification Module (SIM) card 924, a memory 930, a sensor module 940, an input module 950, a display module 960, an interface 970, an audio module 980, a camera module 991, a power management module 995, a battery 996, an indicator 997, a motor 998, and/or the like.
  • the AP 910 may control one or more hardware or software components that are connected to AP 910, perform processing or computation of data (including multimedia data), and/or the like.
  • the AP 910 may be implemented as a System-on-Chip (SoC).
  • the AP 910 may include a Graphics Processing Unit (GPU) (not shown).
  • the communication interface module 920 may transmit and receive data in communications between the electronic device 101 and other electronic devices (e.g., the electronic device 104, the server 106, and/or the like).
  • the communication interface module 920 may include one or more of a cellular module 921, a Wi-Fi module 923, a Bluetooth module 925, a GPS module 927, a NFC module 928, a Radio Frequency (RF) module 929, and/or the like.
  • the cellular module 921 may provide services such as, for example, a voice call, a video call, a Short Messaging Service (SMS), internet service, and/or the like, via a communication network (e.g., LTE, LTE-A, CDMA, WCDMA, UMTS, WiBro, GSM, and/or the like).
  • the cellular module 921 may differentiate and authorize electronic devices within a communication network using a Subscriber Identification Module (SIM) card (e.g., the SIM card 924).
  • the cellular module 921 may perform at least a part of the functionalities of the AP 910.
  • the cellular module 921 may perform at least a part of multimedia control functionality.
  • the communication interface module 920 and/or the cellular module 921 may include a Communication Processor (CP).
  • the cellular module 921 may be implemented as SoC.
  • although FIG. 9 illustrates components such as the cellular module 921 (e.g., the CP), the memory 930, and the power management module 995 as components separate from the AP 910, according to various embodiments of the present disclosure, the AP 910 may include, or be integrated with, one or more of the foregoing components (e.g., the cellular module 921).
  • the AP 910 and/or the cellular module 921 may process instructions or data received from at least one of non-volatile memory or other components by loading the instructions or data into volatile memory.
  • the AP 910, the cellular module 921, the communication interface module 920, and/or the like may store, in non-volatile memory, at least one of data received from at least one of the other components or data generated by at least one of the other components.
  • the Wi-Fi module 923, the Bluetooth module 925, the GPS module 927, the NFC module 928, and/or the like may each include one or more processors that may process data received or transmitted by the respective modules.
  • although FIG. 9 illustrates the cellular module 921, the Wi-Fi module 923, the Bluetooth module 925, the GPS module 927, and the NFC module 928 as separate blocks, according to various embodiments of the present disclosure, any combination (e.g., two or more) of the cellular module 921, the Wi-Fi module 923, the Bluetooth module 925, the GPS module 927, the NFC module 928, and/or the like may be included in an Integrated Chip (IC) or an IC package.
  • processors corresponding to the respective the cellular module 921, the Wi-Fi module 923, the Bluetooth module 925, the GPS module 927, the NFC module 928, and/or the like may be implemented as a single SoC.
  • a CP corresponding to the cellular module 921 and a Wi-Fi processor corresponding to Wi-Fi module 923 may be implemented as a single SoC.
  • the RF module 929 may, for example, transmit and receive RF signals.
  • the RF module 929 may include a transceiver, a Power Amp Module (PAM), a frequency filter, a Low Noise Amplifier (LNA), and/or the like.
  • the RF module 929 may include one or more components for transmitting and receiving Electro-Magnetic (EM) waves (e.g., in free space or the like) such as, for example, conductors or conductive wires.
  • although FIG. 9 illustrates the cellular module 921, the Wi-Fi module 923, the Bluetooth module 925, the GPS module 927, and the NFC module 928 as sharing one RF module 929, according to various embodiments of the present disclosure, at least one of these modules may transmit and receive RF signals via a separate RF module.
  • the SIM card 924 may be a card implementing a SIM, and may be configured to be inserted into a slot disposed at a specified location of the electronic device.
  • the SIM card 924 may include a unique identifier (e.g., an Integrated Circuit Card Identifier (ICCID)), subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)), and/or the like.
  • the memory 930 may include an internal memory 932, an external memory 934, or a combination thereof.
  • the internal memory 932 may be, for example, at least one of volatile memory (e.g., Dynamic Random Access Memory (DRAM), Static Random Access Memory (SRAM) or Synchronous Dynamic Random Access Memory (SDRAM)), non-volatile memory (e.g., One Time Programmable Read Only Memory (OTPROM), Programmable Read Only Memory (PROM), Erasable and Programmable Read Only Memory (EPROM), Electrically Erasable and Programmable Read Only Memory (EEPROM), mask Read Only Memory (ROM), flash ROM, NAND flash memory, NOR flash memory), and/or the like.
  • the internal memory 932 may be a Solid State Drive (SSD).
  • the external memory 934 may be a flash drive (e.g., Compact Flash (CF), Secure Digital (SD), micro Secure Digital (micro-SD), mini Secure Digital (mini-SD), extreme Digital (xD), Memory Stick, and/or the like).
  • the external memory 934 may be operatively coupled to electronic device 901 via various interfaces.
  • the electronic device 901 may include recording devices (or recording media) such as, for example, Hard Disk Drives (HDD), and/or the like.
  • the sensor module 940 may measure physical/environmental properties, detect operational states associated with the electronic device 901, and/or the like, and convert the measured and/or detected information into signals such as, for example, electric signals or electromagnetic signals.
  • the sensor module 940 may include at least one of a gesture sensor 940A, a gyro sensor 940B, an atmospheric pressure sensor 940C, a magnetic sensor 940D, an accelerometer 940E, a grip sensor 940F, a proximity sensor 940G, an RGB sensor 940H, a biometric sensor 940I, a temperature/humidity sensor 940J, a luminosity sensor 940K, an Ultraviolet (UV) sensor 940M, and/or the like.
  • the sensor module 940 may detect the operation state of the electronic device and/or measure physical properties, and convert the detected or measured information into electrical signals. Additionally or alternatively, the sensor module 940 may also include, for example, an electrical-nose sensor (not shown), an electromyography (EMG) sensor (not shown), an electroencephalogram (EEG) sensor (not shown), an infrared (IR) sensor (not shown), an eye-scanning sensor (e.g., iris sensor) (not shown), a fingerprint sensor, and/or the like. The sensor module 940 may also include control circuitry for controlling one or more sensors included therein.
  • the input module 950 may include a touch panel 952, a (digital) pen sensor 954, a key 956, an ultrasonic input device 958, and/or the like.
  • the touch panel 952 may detect touch input using capacitive, resistive, infrared, ultrasonic methods, and/or the like.
  • the touch panel 952 may also include a touch panel controller (not shown).
  • a capacitive-type touch panel may detect proximity inputs (e.g. hovering input) in addition to, or as an alternative to, physical touch inputs.
  • the touch panel 952 may also include a tactile layer. According to various embodiments of the present disclosure, the touch panel 952 may provide haptic (or other) feedback to the user using the tactile layer.
  • the (digital) pen sensor 954 may be implemented using methods identical or similar to those used to receive a touch input from a user, or using a separate detection sheet (e.g., a digitizer).
  • the key 956 may be a keypad, a touch key, and/or the like.
  • the ultrasonic input device 958 may be a device configured to identify data by detecting, using a microphone (e.g., microphone 988), ultrasonic signals generated by a device capable of generating the ultrasonic signal.
  • the ultrasonic input device 958 may detect data wirelessly.
  • the electronic device 901 may receive user input from an external device (e.g., a network, computer or server) connected to the electronic device 901 using the communication interface module 920.
  • the display module 960 may include a panel 962, a hologram device 964, a projector 966, and/or the like.
  • the panel 962 may be, for example, a Liquid-Crystal Display (LCD), an Active-Matrix Organic Light-Emitting Diode (AM-OLED) display, and/or the like.
  • the panel 962 may be configured to be flexible, transparent, and/or wearable.
  • the panel 962 and the touch panel 952 may be implemented as a single module.
  • the hologram device 964 may provide a three-dimensional image.
  • the hologram device 964 may utilize the interference of light waves to provide a three-dimensional image in empty space.
  • the projector 966 may provide an image by projecting light onto a surface (e.g., a wall, a screen, and/or the like).
  • the surface may be positioned internal or external to electronic device 901.
  • the display module 960 may also include a control circuitry for controlling the panel 962, the hologram device 964, the projector 966, and/or the like.
  • the interface 970 may include, for example, one or more interfaces for a High-Definition Multimedia Interface (HDMI) 972, a Universal Serial Bus (USB) 974, a projector 976, or a D-subminiature (D-sub) 978, and/or the like.
  • the interface 970 may be part of the communication interface module 920. Additionally or alternatively, the interface 970 may include one or more interfaces for Mobile High-definition Link (MHL), Secure Digital (SD)/MultiMedia Card (MMC), Infrared Data Association (IrDA), and/or the like.
  • the audio module 980 may encode/decode sound into electrical signal, and vice versa. According to various embodiments of the present disclosure, at least a portion of the audio module 980 may be part of the I/O interface 140. As an example, the audio module 980 may encode/decode voice information that is input into, or output from, a speaker 982, a receiver 984, an earphone 986, the microphone 988, and/or the like.
  • the camera module 991 may capture still images and/or video.
  • the camera module 991 may include one or more image sensors (e.g., front sensor module, rear sensor module, and/or the like) (not shown), an Image Signal Processor (ISP) (not shown), or a flash (e.g., Light-Emitting Diode (flash LED), xenon lamp, and/or the like) (not shown).
  • the power management module 995 may manage electrical power of the electronic device 901.
  • the power management module 995 may include, for example, a Power Management Integrated Circuit (PMIC), a charger Integrated Circuit (charger IC), a battery gauge, a fuel gauge, and/or the like.
  • the PMIC may be disposed in an integrated circuit or an SoC semiconductor.
  • the charging method for the electronic device 901 may include wired or wireless charging.
  • the charger IC may charge a battery, may prevent excessive voltage or excessive current from a charger from entering the electronic device 901, and/or the like.
  • the charger IC may include at least one of a wired charger IC or a wireless charger IC.
  • the wireless charger IC may be a magnetic resonance type, a magnetic induction type, an electromagnetic wave type, and/or the like.
  • the wireless charger IC may include circuits such as a coil loop, a resonance circuit, a rectifier, and/or the like.
  • the battery gauge may measure a charge level, a voltage while charging, a temperature of the battery 996, and/or the like.
  • the battery 996 may supply power to the electronic device 901.
  • the battery 996 may be a rechargeable battery, a solar battery, and/or the like.
  • the indicator 997 may indicate one or more states (e.g., boot status, message status, charge status, and/or the like) of the electronic device 901 or a portion thereof (e.g., the AP 910).
  • the motor 998 may convert an electrical signal into a mechanical vibration.
  • the electronic device 901 may include one or more devices for supporting mobile television (mobile TV) (e.g., a Graphics Processing Unit (GPU)), and/or the like.
  • the devices for supporting mobile TV may support processing of media data compliant with, for example, Digital Multimedia Broadcasting (DMB), Digital Video Broadcasting (DVB), MediaFLO, and/or the like.
  • An embodiment of a method for precise multi-touch input may comprise detecting a first touch input at a first location on a touchscreen, while the first touch input is maintained, detecting a second touch input at a second location on the touchscreen, detecting removal of the first touch input at the first location while the second touch input is maintained, and adjusting the first location according to movement of the second touch input, such that movement of the adjusted location is less than the movement of the second touch input.
  • the method may further comprise displaying an input area on the touchscreen wherein the second touch input is detected within the displayed input area.
  • the method may further comprise determining a final location of the first location when a third touch input is detected at a location of the touchscreen outside of the input area.
  • the method may further comprise determining a final location of the first location when the second touch input is no longer detected in the input area.
  • the method may further comprise stopping the display of the input area after the adjusting of the first location is completed.
  • a size of the input area may be determined according to a pixel density of the touchscreen.
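Sizing the input area by pixel density could mean, for instance, targeting a fixed physical size and converting it to pixels with the display's dots-per-inch value. The target dimensions below are arbitrary illustrative defaults, not values stated in the disclosure.

```python
def vernier_rect_size_px(width_mm=30.0, height_mm=20.0, dpi=400):
    """Convert a target physical size for the input area to pixels.

    A denser display (higher dpi) yields a larger pixel rectangle, so the
    input area keeps roughly the same physical size under a finger.
    """
    mm_per_inch = 25.4
    return (round(width_mm / mm_per_inch * dpi),
            round(height_mm / mm_per_inch * dpi))
```

For example, one inch (25.4 mm) on a 400 dpi panel maps to 400 pixels regardless of the panel's resolution class.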
  • the displaying of the input area may comprise receiving a user selection of an area on the touchscreen; and displaying the input area as enclosing the area selected by the user.
  • the method may further comprise detecting an object displayed on the touchscreen at the first location; and selecting the detected object.
  • the adjusting of the first location may comprise changing a size of the object according to a movement of the second touch input, such that the change in size of the object is less than the movement of the second touch input.
  • the adjusting of the first location may comprise changing a selected area for at least one text according to the movement of the second touch input, such that the change in the selected area is less than the movement of the second touch input.
  • the adjusting of the first location may comprise adjusting a position of the object according to the movement of the second touch input, such that the change in position of the object is less than the movement of the second touch input.
  • the adjusting of the first location may comprise rotating the object according to a rotation of the second touch input, such that the adjusted rotation of the object is less than the rotation of the second touch input.
  • the method may further comprise, when a final adjusted location of the first location is determined, displaying a copy of the selected object at the final adjusted location.
  • the method may further comprise determining a final adjusted location of the first location when the second touch input is no longer detected.
  • the method may further comprise determining a final adjusted location of the first location when a third input is detected on the touchscreen.
  • An embodiment of a method for precise multi-touch input in an electronic device comprises detecting a user input, determining a first location on a touchscreen of the electronic device based on the user input, detecting a touch input at a second location and adjusting the first location according to movement of the second touch input wherein the movement of the adjusted location is less than the movement of the second touch input.
  • the user input is at least one of a gesture, a voice input, or an action sensed by a sensor of the electronic device.
  • the second location is a location on the touchscreen.
  • the second location is a location on a touchscreen of an external device connected to the electronic device.
  • Any such software may be stored in a non-transitory computer readable storage medium.
  • the non-transitory computer readable storage medium stores one or more programs (software modules), the one or more programs comprising instructions, which when executed by one or more processors in an electronic device, cause the electronic device to perform a method of the present disclosure.
  • Any such software may be stored in the form of volatile or non-volatile storage such as, for example, a storage device like a Read Only Memory (ROM), whether erasable or rewritable or not, or in the form of memory such as, for example, Random Access Memory (RAM), memory chips, device or integrated circuits or on an optically or magnetically readable medium such as, for example, a Compact Disk (CD), Digital Versatile Disc (DVD), magnetic disk or magnetic tape or the like.
  • the storage devices and storage media are various embodiments of non-transitory machine-readable storage that are suitable for storing a program or programs comprising instructions that, when executed, implement various embodiments of the present disclosure. Accordingly, various embodiments provide a program comprising code for implementing apparatus or a method as claimed in any one of the claims of this specification and a non-transitory machine-readable storage storing such a program.

Abstract

An apparatus and method for precise multi-touch input are provided. The method includes detecting a first touch input at a first location on a touchscreen; while the first touch input is maintained, detecting a second touch input at a second location on the touchscreen; detecting removal of the first touch input at the first location while the second touch input is maintained; and adjusting the first location according to movement of the second touch input, such that movement of the adjusted location is less than the movement of the second touch input.
PCT/KR2015/010753 2015-02-13 2015-10-13 Apparatus and method for multi-touch input WO2016129772A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201580076118.5A CN107223226B (zh) 2015-02-13 2015-10-13 Apparatus and method for multi-touch input
EP15882135.5A EP3256935A4 (fr) 2015-02-13 2015-10-13 Apparatus and method for multi-touch input

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US14/621,898 US9965173B2 (en) 2015-02-13 2015-02-13 Apparatus and method for precise multi-touch input
US14/621,898 2015-02-13
KR1020150080097A KR102399529B1 (ko) 2015-02-13 2015-06-05 멀티 터치 입력을 위한 방법 및 장치
KR10-2015-0080097 2015-06-05

Publications (1)

Publication Number Publication Date
WO2016129772A1 true WO2016129772A1 (fr) 2016-08-18

Family

ID=56615458

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/010753 WO2016129772A1 (fr) Apparatus and method for multi-touch input

Country Status (1)

Country Link
WO (1) WO2016129772A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100245267A1 (en) 2009-03-31 2010-09-30 Lg Electronics Inc. Mobile terminal and method of controlling the same
US20110018806A1 (en) * 2009-07-24 2011-01-27 Kabushiki Kaisha Toshiba Information processing apparatus, computer readable medium, and pointing method
KR20120007574A (ko) * 2010-07-12 2012-01-25 엘지전자 주식회사 이동 단말기 및 그 제어 방법
US20120151401A1 (en) * 2010-12-14 2012-06-14 Samsung Electronics Co. Ltd. Method and apparatus for controlling touch screen using timeline bar, recording medium with program for the same recorded therein, and user terminal having the same
US20140078092A1 (en) * 2011-02-14 2014-03-20 Blackberry Limited Portable electronic device including touch-sensitive display and method of controlling same
US20140109017A1 (en) * 2006-04-19 2014-04-17 Microsoft Corporation Precise Selection Techniques For Multi-Touch Screens


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3256935A4

Similar Documents

Publication Publication Date Title
KR102311221B1 (ko) Method for operating objects and electronic device supporting the same
WO2015194920A1 (fr) Electronic device and method for controlling display
WO2015182964A1 (fr) Electronic device having a foldable display and method of operating the same
WO2015186925A1 (fr) Wearable device and method for producing augmented reality information
WO2017179905A1 (fr) Flexible device and method of operating the same
WO2015108288A1 (fr) Method and apparatus for processing an input by means of a touchscreen
WO2015126208A1 (fr) Method and system for remote control of an electronic device
WO2016036135A1 (fr) Method and apparatus for processing a touch input
WO2015111835A1 (fr) Apparatus and method for providing a service
KR20150127989A (ko) Method and apparatus for providing a user interface
US20160018954A1 Data processing method and electronic device thereof
EP2950188A1 (fr) Method and electronic device for controlling a display
WO2016052983A1 (fr) Data sharing method and electronic device therefor
CN106575197B (zh) Apparatus and method for handling drag and drop
WO2018074798A1 (fr) Electronic device and method for controlling its display
EP2983074A1 (fr) Method and apparatus for displaying a screen in electronic devices
US20180307387A1 Electronic device and method for operating the electronic device
WO2015178661A1 (fr) Method and apparatus for processing an input signal by means of a display device
WO2015099300A1 (fr) Method and apparatus for processing an object provided via a display unit
WO2018084684A1 (fr) Method for controlling execution of an application on an electronic device using a touchscreen, and electronic device therefor
KR20160035865A (ko) Electronic device and method for identifying an object
WO2016039532A1 (fr) Method for controlling the screen of an electronic device, and corresponding electronic device
EP3256935A1 (fr) Multi-point input apparatus and method
KR102305114B1 (ko) Data processing method and electronic device therefor
US10289290B2 Apparatus and method for displaying a portion of a plurality of background applications

Legal Events

Date Code Title Description
121 Ep: The EPO has been informed by WIPO that EP was designated in this application (Ref document number: 15882135; Country of ref document: EP; Kind code of ref document: A1)
REEP Request for entry into the European phase (Ref document number: 2015882135; Country of ref document: EP)
NENP Non-entry into the national phase (Ref country code: DE)