WO2014148661A1 - Terminal and method of operating the same - Google Patents

Terminal and method of operating the same

Info

Publication number
WO2014148661A1
WO2014148661A1 (PCT/KR2013/002409)
Authority
WO
WIPO (PCT)
Prior art keywords
screen
user input
terminal
touch
moving
Application number
PCT/KR2013/002409
Other languages
English (en)
Korean (ko)
Inventor
김재운
이성준
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by 엘지전자 주식회사
Priority to KR1020157021609A (patent KR102097079B1)
Priority to PCT/KR2013/002409 (WO2014148661A1)
Publication of WO2014148661A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • the present invention relates to a terminal and a method of operating the same, and more particularly, to a method of selecting a part of text displayed on a screen.
  • Terminals may be divided into mobile/portable terminals and stationary terminals according to their mobility.
  • the mobile terminal may be further classified into a handheld terminal and a vehicle mount terminal according to whether a user can directly carry it.
  • Such terminals have multimedia functions, such as capturing photos or videos, playing music or video files, playing games, receiving broadcasts, and displaying newspaper articles or magazines, and are thus implemented in the form of multimedia players.
  • As the terminal is implemented in the form of a multimedia device, a user may need to select a part of the text of a text medium such as a newspaper article or a magazine, copy the selected text, and paste it into another application.
  • In general, a terminal having a touch screen provides a text selection function through a touch input. For example, when a specific area of the screen on which text is displayed is touch-dragged, the terminal displays the touch-dragged area so that it is distinguished from the area that is not, so that the user can check the selected text.
  • However, because the user selects text with the tip of a finger, it is difficult to select text precisely in the edge area of the touch screen. In this case, the terminal is likely to release the text selection mode, causing inconvenience.
  • An embodiment of the present invention provides a terminal that allows a user to select text precisely through touch.
  • According to an embodiment of the present invention, a method of operating a terminal may include displaying an electronic document on a screen, detecting a user input for text selection in a first area of the screen, and, when the user input is detected, moving the electronic document to a second area of the screen.
  • According to an embodiment of the present invention, a terminal includes a touch screen, which includes a touch sensor and a display unit, and a control unit. The control unit displays an electronic document on the screen and, if a user input for selecting text is detected in a first area of the screen, moves the electronic document to a second area of the screen.
  • According to another embodiment of the present invention, a method of operating a terminal includes displaying a page including a plurality of icons on a screen, detecting a touch drag for selecting at least one icon in an edge area of the screen, and, if the touch drag is detected, moving the page in a direction opposite to the moving direction of the touch drag.
  • According to embodiments of the present invention, even when text is selected through a capacitive touch, the text may be selected precisely in character units without inadvertently releasing the text selection mode.
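The selection-triggered screen move claimed above can be sketched as a small model. The following Python sketch is illustrative only: the function name, the 40-pixel edge margin, and the 80-pixel shift are assumptions, not values taken from the patent.

```python
# Sketch of the claimed behaviour: when a text-selection touch lands in an
# edge region of the screen, return a shift (dx, dy) that moves the displayed
# document toward the screen interior, i.e. away from that edge.
# EDGE_MARGIN and the default offset are illustrative assumptions.

EDGE_MARGIN = 40  # px; width of the edge region where precise touch is hard

def move_for_edge_selection(touch_x, touch_y, screen_w, screen_h, offset=80):
    """Return the (dx, dy) shift to apply to the document, or (0, 0)."""
    dx = dy = 0
    if touch_x < EDGE_MARGIN:                # left edge: move document right
        dx = offset
    elif touch_x > screen_w - EDGE_MARGIN:   # right edge: move document left
        dx = -offset
    if touch_y < EDGE_MARGIN:                # top edge: move document down
        dy = offset
    elif touch_y > screen_h - EDGE_MARGIN:   # bottom edge: move document up
        dy = -offset
    return dx, dy
```

For a touch at (10, 300) on a 720x1280 screen this yields (80, 0): the document shifts right, so the text under the finger moves away from the left bezel.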
  • FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
  • FIG. 2 is a flowchart illustrating an object selection operation method of a terminal according to an exemplary embodiment of the present invention.
  • FIG. 3 is a flowchart illustrating a method of operating a terminal in which a screen moving mode for object selection is performed according to an exemplary embodiment of the present invention.
  • FIG. 4 is a diagram for describing a screen moving mode of a terminal according to an exemplary embodiment.
  • FIG. 5 is a diagram for describing a screen moving mode of a terminal according to another embodiment of the present invention.
  • FIG. 6 is a flowchart illustrating a method of operating a terminal in which a screen reduction mode for object selection is performed according to an exemplary embodiment of the present invention.
  • FIG. 7 is a diagram for describing a screen reduction mode of a terminal according to an exemplary embodiment.
  • FIG. 8 is a diagram illustrating a screen reduction mode of a terminal according to another embodiment of the present invention.
  • FIG. 9 is a flowchart illustrating a method of entering a screen moving mode of a terminal according to an embodiment of the present invention.
  • FIG. 10 is a flowchart illustrating a method of entering a screen moving mode of a terminal according to another embodiment of the present invention.
  • FIG. 11 is a diagram illustrating an edge region of a terminal screen in which a user input for entering a screen moving mode is sensed according to an embodiment of the present disclosure.
  • FIG. 12 is a flowchart illustrating a method of entering a screen moving mode of a terminal according to another embodiment of the present invention.
  • FIG. 13 is a diagram illustrating an indicator for guiding screen movement mode entry according to an embodiment of the present invention.
  • FIG. 14 is a diagram illustrating a pop-up window for guiding screen movement mode entry according to an embodiment of the present invention.
  • FIG. 15 is a diagram illustrating a guide window for guiding the screen moving mode according to an embodiment of the present invention.
  • FIG. 16 is a diagram illustrating a user input for entering the screen moving mode and a screen moving method corresponding to the user input according to another embodiment of the present invention.
  • FIG. 17 is a diagram illustrating a user input for entering the screen moving mode and a screen moving method corresponding to the user input according to an exemplary embodiment.
  • FIG. 18 is a diagram illustrating a screen on which a screen movement mode entry is executed according to an embodiment of the present invention.
  • the mobile terminal described herein may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), navigation, and the like.
  • the configuration according to the embodiments described herein may also be applied to fixed terminals such as digital TVs, desktop computers, etc., except when applicable only to mobile terminals.
  • FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
  • The mobile terminal 100 may include a wireless communication unit 110, an A/V input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply 190.
  • the components shown in FIG. 1 are not essential, so that a mobile terminal having more or fewer components may be implemented.
  • the wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and the wireless communication system or between the mobile terminal 100 and a network in which the mobile terminal 100 is located.
  • The wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short range communication module 114, a location information module 115, and the like.
  • the broadcast receiving module 111 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.
  • the broadcast channel may include a satellite channel and a terrestrial channel.
  • the broadcast management server may mean a server that generates and transmits a broadcast signal and / or broadcast related information or a server that receives a previously generated broadcast signal and / or broadcast related information and transmits the same to a terminal.
  • the broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal having a data broadcast signal combined with a TV broadcast signal or a radio broadcast signal.
  • the broadcast related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider.
  • the broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.
  • the broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).
  • The broadcast receiving module 111 may receive a digital broadcast signal using a digital broadcast system such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), or Integrated Services Digital Broadcast-Terrestrial (ISDB-T).
  • the broadcast receiving module 111 may be configured to be suitable for not only the above-described digital broadcast system but also other broadcast systems.
  • the broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.
  • the mobile communication module 112 transmits and receives a wireless signal with at least one of a base station, an external terminal, and a server on a mobile communication network.
  • the wireless signal may include various types of data according to transmission and reception of a voice call signal, a video call call signal, or a text / multimedia message.
  • the wireless internet module 113 refers to a module for wireless internet access and may be embedded or external to the mobile terminal 100.
  • Wireless Internet technologies may include wireless LAN (WLAN/Wi-Fi), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like.
  • the short range communication module 114 refers to a module for short range communication.
  • As short range communication technologies, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and the like may be used.
  • The location information module 115 is a module for obtaining the location of the mobile terminal 100, and a representative example thereof is a Global Positioning System (GPS) module.
  • the A / V input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122.
  • the camera 121 processes image frames such as still images or moving images obtained by the image sensor in the video call mode or the photographing mode.
  • the processed image frame may be displayed on the display unit 151.
  • the image frame processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided according to the use environment.
  • the microphone 122 receives an external sound signal by a microphone in a call mode, a recording mode, a voice recognition mode, etc., and processes the external sound signal into electrical voice data.
  • the processed voice data may be converted into a form transmittable to the mobile communication base station through the mobile communication module 112 and output in the call mode.
  • the microphone 122 may implement various noise removing algorithms for removing noise generated in the process of receiving an external sound signal.
  • the user input unit 130 generates input data for the user to control the operation of the terminal.
  • The user input unit 130 may include a key pad, a dome switch, a touch pad (static pressure/capacitance), a jog wheel, a jog switch, and the like.
  • The sensing unit 140 detects the current state of the mobile terminal 100, such as the open/closed state of the mobile terminal 100, the location of the mobile terminal 100, the presence or absence of user contact, the orientation of the mobile terminal 100, and the acceleration/deceleration of the mobile terminal 100, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is in the form of a slide phone, it may sense whether the slide phone is opened or closed. In addition, whether the power supply unit 190 is supplying power and whether the interface unit 170 is coupled to an external device may be sensed.
  • the sensing unit 140 may include a proximity sensor 141.
  • The output unit 150 generates output related to the visual, auditory, or tactile senses, and may include a display unit 151, a sound output module 152, an alarm unit 153, and a haptic module 154.
  • the display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, when the mobile terminal 100 is in a call mode, the mobile terminal 100 displays a user interface (UI) or a graphic user interface (GUI) related to the call. When the mobile terminal 100 is in a video call mode or a photographing mode, the mobile terminal 100 displays a photographed and / or received image, a UI, and a GUI.
  • The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.
  • Some of these displays may be configured to be transparent or light-transmissive so that the outside can be seen through them. Such a display may be referred to as a transparent display.
  • A representative example of the transparent display is the TOLED (Transparent OLED).
  • the rear structure of the display unit 151 may also be configured as a light transmissive structure. With this structure, the user can see the object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.
  • two or more display units 151 may exist.
  • a plurality of display units may be spaced apart or integrally disposed on one surface of the mobile terminal 100, or may be disposed on different surfaces, respectively.
  • When the display unit 151 and a sensor for detecting a touch operation (hereinafter referred to as a 'touch sensor') form a mutual layer structure (hereinafter referred to as a 'touch screen'), the display unit 151 may be used as an input device in addition to an output device.
  • the touch sensor may have, for example, a form of a touch film, a touch sheet, a touch pad, or the like.
  • the touch sensor may be configured to convert a change in pressure applied to a specific portion of the display unit 151 or capacitance generated in a specific portion of the display unit 151 into an electrical input signal.
  • the touch sensor may be configured to detect not only the position and area of the touch but also the pressure at the touch.
  • When there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the controller 180. As a result, the controller 180 can know which area of the display unit 151 has been touched.
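As a hedged illustration of the last step above (the controller learning which area of the display was touched), the sketch below classifies a reported touch coordinate into a named region. The region names, the margin value, and the function name are assumptions for illustration, not from the patent.

```python
# Minimal sketch: map a touch coordinate reported by the touch controller to a
# named region of the display area. Geometry and names are illustrative.

def resolve_touch_region(x, y, screen_w, screen_h, edge=40):
    """Classify a touch point as 'edge', 'interior', or 'outside' the display."""
    if not (0 <= x < screen_w and 0 <= y < screen_h):
        return "outside"
    near_edge = (x < edge or x >= screen_w - edge or
                 y < edge or y >= screen_h - edge)
    return "edge" if near_edge else "interior"
```

A controller could use such a classification to decide whether a selection touch warrants entering the screen extension mode described later.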
  • a proximity sensor 141 may be disposed in an inner region of the mobile terminal 100 surrounded by the touch screen or near the touch screen.
  • The proximity sensor 141 refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface, or an object present nearby, by using electromagnetic force or infrared rays without mechanical contact.
  • The proximity sensor 141 has a longer lifespan and higher utilization than a contact sensor.
  • Examples of the proximity sensor 141 include a transmission photoelectric sensor, a direct reflection photoelectric sensor, a mirror reflection photoelectric sensor, a high frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.
  • When the touch screen is capacitive, it is configured to detect the proximity of the pointer through the change in the electric field caused by the approach of the pointer.
  • the touch screen may be classified as a proximity sensor.
  • The act of bringing a pointer close to the touch screen without contact so that the pointer is recognized as being located on the touch screen is referred to as a 'proximity touch', and the act of actually bringing the pointer into contact with the touch screen is referred to as a 'contact touch'. The position of a proximity touch on the touch screen refers to the position at which the pointer is perpendicular to the touch screen when the pointer makes the proximity touch.
  • the proximity sensor detects a proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch movement state).
  • Information corresponding to the sensed proximity touch operation and proximity touch pattern may be output on the touch screen.
  • the sound output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like.
  • the sound output module 152 may also output a sound signal related to a function (eg, a call signal reception sound, a message reception sound, etc.) performed in the mobile terminal 100.
  • the sound output module 152 may include a receiver, a speaker, a buzzer, and the like.
  • the alarm unit 153 outputs a signal for notifying occurrence of an event of the mobile terminal 100. Examples of events generated in the mobile terminal 100 include call signal reception, message reception, key signal input, and touch input.
  • the alarm unit 153 may output a signal for notifying occurrence of an event in a form other than a video signal or an audio signal, for example, vibration.
  • The video signal or the audio signal may also be output through the display unit 151 or the sound output module 152, so that the display unit 151 and the sound output module 152 may be classified as part of the alarm unit 153.
  • the haptic module 154 generates various haptic effects that a user can feel. Vibration is a representative example of the haptic effect generated by the haptic module 154.
  • the intensity and pattern of vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be synthesized and output or may be sequentially output.
  • The haptic module 154 may generate various tactile effects, such as a pin array moving vertically with respect to the contacted skin surface, a jetting or suction force of air through a jetting or suction port, grazing of the skin surface, contact of an electrode, an electrostatic force, and the reproduction of a sense of cold or warmth using an element capable of absorbing or generating heat.
  • The haptic module 154 may not only deliver a haptic effect through direct contact, but may also be implemented so that the user can feel the haptic effect through a muscle sense such as that of a finger or an arm. Two or more haptic modules 154 may be provided according to the configuration of the mobile terminal 100.
  • the memory 160 may store a program for the operation of the controller 180 and may temporarily store input / output data (for example, a phone book, a message, a still image, a video, etc.).
  • the memory 160 may store data regarding vibration and sound of various patterns output when a touch input on the touch screen is performed.
  • The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk.
  • the mobile terminal 100 may operate in association with a web storage that performs a storage function of the memory 160 on the Internet.
  • the interface unit 170 serves as a path with all external devices connected to the mobile terminal 100.
  • the interface unit 170 receives data from an external device, receives power, transfers the power to each component inside the mobile terminal 100, or transmits data inside the mobile terminal 100 to an external device.
  • The interface unit 170 may include wired/wireless headset ports, external charger ports, wired/wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video input/output (I/O) ports, earphone ports, and the like.
  • the identification module is a chip that stores various types of information for authenticating the use authority of the mobile terminal 100.
  • The identification module may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like.
  • a device equipped with an identification module (hereinafter referred to as an 'identification device') may be manufactured in the form of a smart card. Therefore, the identification device may be connected to the terminal 100 through a port.
  • When the mobile terminal 100 is connected to an external cradle, the interface unit 170 may serve as a path through which power from the cradle is supplied to the mobile terminal 100, or as a path through which various command signals input to the cradle by the user are delivered to the mobile terminal. The various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal is correctly mounted on the cradle.
  • The controller 180 typically controls the overall operation of the mobile terminal 100. For example, it performs control and processing related to voice calls, data communications, video calls, and the like.
  • the controller 180 may include a multimedia module 181 for playing multimedia.
  • the multimedia module 181 may be implemented in the controller 180 or may be implemented separately from the controller 180.
  • the controller 180 may perform a pattern recognition process for recognizing a writing input or a drawing input performed on the touch screen as text and an image, respectively.
  • the power supply unit 190 receives an external power source and an internal power source under the control of the controller 180 to supply power for operation of each component.
  • Various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware or a combination thereof.
  • For hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. These may be implemented by the controller 180.
  • embodiments such as procedures or functions may be implemented with separate software modules that allow at least one function or operation to be performed.
  • the software code may be implemented by a software application written in a suitable programming language.
  • the software code may be stored in the memory 160 and executed by the controller 180.
  • FIG. 2 is a flowchart illustrating an object selection operation method of a terminal according to an exemplary embodiment of the present invention.
  • the controller 180 receives a user input for entering the object selection mode (S101).
  • the user input for entering the object selection mode may be a long touch at the object position.
  • the user input for entering the object selection mode may be a pressing of the object selection menu among a plurality of menus displayed when the menu button is pressed.
  • the object may include an icon, text, an image, etc. displayed on the display unit 151 and selectable by a user input, but is not limited thereto.
  • the icon may indicate a function of the mobile terminal 100 or a program or the like that can operate on the mobile terminal 100.
  • the text or image may mean an object included in the electronic document.
  • When a user input for entering the object selection mode is received, the controller 180 enters the object selection mode (S103). In the object selection mode, when a user input for selecting one object is received, the controller 180 may display the selected object so that it is distinguished from unselected objects. In addition, in the object selection mode, when a user input for continuously selecting a plurality of objects is received, the controller 180 may display the user input path so that the selected area is distinguished from the area that is not selected.
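The object selection mode described in S101 to S103 can be modeled as a small state holder: selection inputs are ignored until the mode is entered, and objects along the input path are recorded in order so they can later be rendered as selected. The class and method names below are illustrative, not taken from the patent.

```python
# Hedged sketch of the object selection mode: enter() corresponds to S103
# (e.g. triggered by a long touch on an object), select() is called for each
# object under the user's drag path, and exit() returns the selection.

class ObjectSelectionMode:
    def __init__(self):
        self.active = False
        self.selected = []

    def enter(self):
        """Enter the object selection mode and start a fresh selection."""
        self.active = True
        self.selected = []

    def select(self, obj):
        """Record an object along the input path; ignored outside the mode."""
        if self.active and obj not in self.selected:
            self.selected.append(obj)

    def exit(self):
        """Leave the mode and return the objects to display as selected."""
        self.active = False
        return self.selected
```

The selected list is what a renderer would use to display selected objects distinguishably from unselected ones.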
  • the controller 180 receives a user input for entering the screen extension mode (S105).
  • the user input for entering the screen extension mode may be a long touch at the object position.
  • the user input for entering the screen extension mode may be touch, long touch, or touch drag at an edge position of the touch screen.
  • the user input for entering the screen extension mode may be a user input corresponding to a notification for entering the screen extension mode.
  • the long touch at the object position may be a user input for entering the object selection mode and a user input for entering the screen extension mode. That is, when a user input for long touching a specific object is received, the controller 180 may enter the object selection mode and enter the screen extension mode.
  • the edge position of the touch screen may include a boundary between a touch screen on which the touch sensor is disposed and a bezel on which the touch sensor is not disposed.
  • the position of the edge of the touch screen may include a specific point or a specific area.
  • the touch or long touch at the edge position of the touch screen may include a touch or long touch for selecting an object located at the edge of the touch screen.
  • A touch drag at the edge position of the touch screen may include a touch drag for selecting a plurality of objects located in the edge area of the touch screen, or a touch drag for selecting a plurality of objects ranging from an object located in a non-edge area of the touch screen to an object located in the edge area of the touch screen.
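The edge-drag condition just described (a drag that selects objects in the edge area, whether it starts there or reaches it from a non-edge area) can be sketched as a simple hit test. The margin value and names are assumptions for illustration.

```python
# Sketch of the edge-drag condition: a drag qualifies for screen extension
# mode entry if either endpoint falls inside the edge margin of the screen.

def drag_triggers_extension(start, end, screen_w, screen_h, edge=40):
    """Return True if a drag from start to end touches the edge area."""
    def in_edge(point):
        x, y = point
        return (x < edge or x >= screen_w - edge or
                y < edge or y >= screen_h - edge)
    return in_edge(start) or in_edge(end)
```

A real implementation would test every sampled point of the drag path, not just its endpoints; the two-point check keeps the sketch minimal.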
  • the notification for entering the screen extension mode may be output by the controller 180 when a user input for entering the object selection mode or a user input for entering the screen extension mode is received.
  • the notification for entering the screen extension mode may include a visual notification, an audio notification, a tactile notification, and a combination of the same.
  • the visual notification for entering the screen extension mode may include shaking of a screen displayed on the display unit 151, an indicator display in the form of a bar or an icon, a display of a guide window, or a pop-up window display providing a user interface.
  • the auditory notification for entering the screen extension mode may include a notification sound output.
  • the tactile notification for entering the screen extension mode may include a notification vibration output.
  • the user input corresponding to the notification for entering the screen extension mode may include, after the notification for entering the screen extension mode occurs, a user input through a user interface provided for guiding the screen extension mode entry, or a separate user input.
  • the user interface for guiding the screen extension mode entry may be provided by a pop-up window displayed for that purpose.
  • the separate user input may include a user input for tilting the mobile terminal 100 or a long touch at an edge position of the touch screen after a notification for entering the screen extension mode is generated.
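The entry triggers enumerated above (a long touch at an object position, a touch or touch drag at an edge position, or a response to an entry notification) can be summarized in a small sketch. This is an illustrative reading of the description, not the patent's implementation; the event representation and the 500 ms long-touch threshold are assumptions.

```python
# Hypothetical classifier for the screen-extension-mode entry triggers
# described above: a long touch at an object, a touch / long touch / drag
# at a screen edge, or a response to an entry notification.

LONG_TOUCH_MS = 500  # assumed long-touch threshold

def should_enter_extension_mode(event):
    """event: dict with 'kind', 'duration_ms', 'on_object', 'at_edge',
    'responds_to_notification' keys (hypothetical representation)."""
    if event.get("responds_to_notification"):
        return True
    long_touch = event["kind"] == "touch" and event["duration_ms"] >= LONG_TOUCH_MS
    if long_touch and event.get("on_object"):
        return True  # long touch at an object position
    if event.get("at_edge") and event["kind"] in ("touch", "drag"):
        return True  # touch, long touch, or touch drag at an edge position
    return False
```

Any real controller would of course work from raw touch samples rather than such a pre-digested event, but the decision structure is the same.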
  • the controller 180 enters the screen extension mode (S107).
  • the screen extension mode may mean a state in which the currently displayed screen is moved in the direction opposite to an edge, or is reduced in size, so that an object located at the edge of the touch screen is moved to a position other than the edge of the touch screen.
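As a rough sketch of this definition, the two variants of the screen extension mode can be modeled as a transform of the displayed screen. All names, the 40 px shift, and the 0.9 factor are illustrative assumptions; coordinates are taken with y growing downward.

```python
# Illustrative geometry for the screen extension mode: either move the
# screen in the direction opposite to the touched edge, or reduce it,
# so that objects at that edge land away from the bezel.

def extend_screen(x, y, scale, edge, shift=40, factor=0.9, mode="move"):
    """Return a new (x, y, scale) for the displayed screen; y grows downward."""
    if mode == "move":
        dx = {"left": shift, "right": -shift, "top": 0, "bottom": 0}[edge]
        dy = {"top": shift, "bottom": -shift, "left": 0, "right": 0}[edge]
        return x + dx, y + dy, scale
    # mode == "reduce": shrink the whole screen, freeing space at every edge
    return x, y, scale * factor
```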
  • an object located near the bezel may be selected more precisely.
  • the screen moving mode, which is an embodiment of the screen extension mode, will be described in detail with reference to FIGS. 3 to 5, and the same parts as described above will be briefly described or omitted.
  • FIG. 3 is a flowchart illustrating a method of operating a terminal in which a screen moving mode for object selection is performed according to an exemplary embodiment of the present invention.
  • When the controller 180 receives a user input for entering the object selection mode (S301), it enters the object selection mode (S303).
  • the controller 180 receives a user input for entering the screen moving mode while entering the object selection mode (S305).
  • a user input for entering the screen moving mode will be described with reference to FIGS. 4 and 5.
  • FIG. 4 is a diagram for describing a screen moving mode of a terminal according to an exemplary embodiment.
  • the user input for entering the screen moving mode may be long touch at a text position included in the electronic document.
  • the long touch at the text position may be a user input for entering the text selection mode and a user input for entering the screen moving mode. That is, when the controller 180 receives the long touch at the text position, the controller 180 enters the text selection mode and simultaneously enters the screen moving mode.
  • the user input for entering the screen moving mode may be touch dragging at the edge position of the touch screen.
  • the touch drag at the edge position of the touch screen may refer to a case in which the controller 180 receives a touch drag, which is a user input for text selection, at the edge position of the touch screen.
  • when the controller 180 receives, at the right edge of the touch screen, a touch drag that continuously selects text toward the right side of the electronic document displayed on the display unit 151, it may recognize that a user input for entering the screen moving mode has been received.
  • FIG. 5 is a diagram for describing a screen moving mode of a terminal according to another embodiment of the present invention.
  • the user input for entering the screen moving mode may be a touch or a touch drag at the icon position displayed on the background screen.
  • the touch at the icon position may be a user input for entering the icon selection mode and a user input for entering the screen moving mode.
  • the user input for entering the screen moving mode may be touch dragging at the edge position of the touch screen.
  • when the controller 180 receives a touch drag that continuously selects icons toward the right edge of the background screen in order to select a plurality of icons displayed on the background screen at once, it may recognize this as a user input for entering the screen moving mode.
  • an edge of the touch screen that receives a user input for entering the screen moving mode may include a specific point or a specific area.
  • the edge position may indicate a coordinate of a specific point or a size of a specific area on the display unit 151.
  • the edge position may be predetermined according to the finger size of the user or may be determined according to the user input.
  • the edge position predetermined according to the size of the finger of the user may include a boundary between the touch screen and the bezel.
  • the edge position predetermined according to the size of the finger of the user may include a predetermined area including a boundary between the touch screen and the bezel.
  • the edge position determined according to the user input may include a position selected by the user input from an edge position list provided by the mobile terminal 100.
  • the screen moving mode may refer to a state in which the currently displayed screen is moved in the direction opposite to the edge position where the user input for entering the screen moving mode is detected. In the screen moving mode, additional space is displayed at that edge so that text or icons located near the edge can be selected more accurately.
  • the screen moving mode will be described with reference to FIGS. 4 and 5.
  • the controller 180 may move the screen to the side opposite the right edge, that is, to the left of that edge, and display it.
  • the mobile terminal 100 may move the screen in a direction opposite to touch drag, thereby providing the user with space for selecting text located at the edge of the touch screen.
  • the controller 180 when the controller 180 receives a touch drag for selecting an icon from the right side of the edge of the touch screen, the controller 180 may move the screen to the left side of the edge and display the screen.
  • the controller 180 may move and display the screen in the area where a touch sensor exists, in response to a user input for entering the screen moving mode. That is, the screen currently displayed on the touch screen may span both the region where the touch sensor exists and the region where it does not.
  • in response to the user input, the controller 180 may move the screen by a certain distance so that the portion displayed in the region without a touch sensor is displayed in the region where the touch sensor exists.
  • the screen movement distance may be determined in advance according to the size of the region where the touch sensor does not exist, or may be determined according to a user input, but is not limited thereto.
  • the screen movement distance determined according to the user input may include a distance selected by the user input from the screen movement distance list provided by the mobile terminal 100.
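A minimal sketch of how the movement distance described in the last two bullets could be resolved; the function name, the default rule, and the candidate distance list are assumptions for illustration only.

```python
# Hypothetical resolution of the screen movement distance: a default tied
# to the width of the region without a touch sensor, optionally replaced
# by a value the user picked from a distance list offered by the terminal.

def movement_distance(sensorless_width, user_choice=None, choices=(20, 40, 60)):
    if user_choice is not None:
        if user_choice not in choices:
            raise ValueError("distance must be one of the offered choices")
        return user_choice
    return sensorless_width  # default: size of the touch-sensor-free region
```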
  • the screen reduction mode, which is an embodiment of the screen extension mode, will be described in detail with reference to FIGS. 6 to 8, and the same parts as described above will be briefly described or omitted.
  • FIG. 6 is a flowchart illustrating a method of operating a terminal in which a screen reduction mode for object selection is performed according to an exemplary embodiment of the present invention.
  • When the controller 180 receives a user input for entering the object selection mode (S501), it enters the object selection mode (S503).
  • the controller 180 receives a user input for entering the screen reduction mode while entering the object selection mode (S505).
  • a user input for entering the screen reduction mode will be described with reference to FIGS. 7 and 8.
  • FIG. 7 is a diagram for describing a screen reduction mode of a terminal according to an exemplary embodiment.
  • the user input for entering the screen reduction mode may include a user input for entering the screen moving mode described with reference to FIG. 4A.
  • FIG. 8 is a diagram illustrating a screen reduction mode of a terminal according to another embodiment of the present invention.
  • the user input for entering the screen reduction mode may include a user input for entering the screen moving mode described with reference to FIG. 5A.
  • the screen reduction mode may refer to a state in which the screen currently displayed on the display unit 151 is reduced in size. In the screen reduction mode, additional space is displayed at every edge of the touch screen, so that text or icons located near an edge can be selected more accurately.
  • the screen reduction mode will be described with reference to FIGS. 7 and 8.
  • the controller 180 may reduce and display the entire currently displayed screen.
  • the mobile terminal 100 may reduce the screen as a whole to provide a space for selecting text located at the edge of the touch screen.
  • the controller 180 when the controller 180 receives a touch drag for selecting an icon from the right side of the edge of the touch screen, the controller 180 may reduce and display the entire currently displayed screen.
  • the controller 180 may reduce and display the screen in the area where a touch sensor exists, in response to a user input for entering the screen reduction mode. That is, the screen currently displayed on the touch screen may span both the region where the touch sensor exists and the region where it does not.
  • in response to the user input, the controller 180 may reduce the screen to a predetermined size so that the portion displayed in the region without a touch sensor is displayed in the region where the touch sensor exists.
  • the screen reduction size may be determined in advance according to the size of the area where the touch sensor exists, or may be determined according to a user input, but is not limited thereto.
  • the screen reduction size determined according to the user input may include a size selected by the user input from a screen reduction size list provided by the mobile terminal 100.
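Analogously, the reduction size could be resolved as follows; this sketch assumes the default scale fits the screen into the touch-sensor region, and the candidate list values are hypothetical.

```python
# Hypothetical computation of the screen reduction size: by default scale
# the screen so it fits inside the region where the touch sensor exists,
# unless the user picked a scale from a list offered by the terminal.

def reduction_scale(display_px, sensor_px, user_scale=None, choices=(0.8, 0.9)):
    if user_scale is not None:
        if user_scale not in choices:
            raise ValueError("scale must be one of the offered choices")
        return user_scale
    return min(1.0, sensor_px / display_px)  # never enlarge the screen
```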
  • FIG. 9 is a flowchart illustrating a method of entering a screen moving mode of a terminal according to an embodiment of the present invention.
  • When the controller 180 receives a user input for entering the text selection mode (S701), it enters the text selection mode (S703).
  • the user input for entering the text selection mode may be a long touch at the text position.
  • the controller 180 receives a user input for text selection located at an edge area of the screen in the state of entering the text selection mode (S705).
  • a user input for selecting text located at an edge area of the screen will be described with reference to FIGS. 10 to 15.
  • FIG. 10 is a flowchart illustrating a method of entering a screen moving mode of a terminal according to another embodiment of the present invention.
  • the controller 180 receives a user input for entering an edge region of the screen (S901).
  • the user input for entering the edge area of the screen may be a touch at the edge position.
  • the user input for entering the edge area of the screen may be a touch drag held from the non-edge area to the edge area, but is not limited thereto.
  • the controller 180 receives a user input maintained in an edge area of the screen (S903).
  • the user input maintained in the edge area of the screen means that the user input received in step S901 is maintained. Therefore, the user input maintained at the edge region of the screen may be a touch held at the edge position for a predetermined time, that is, a long touch at the edge position. In addition, the user input maintained in the edge region of the screen may be a touch drag maintained from the non-edge region to the edge region for a certain time.
  • Steps S901 and S903 subdivide step S705 described above with reference to FIG. 9; a user input for selecting text located at an edge area of the screen may be received through at least one of steps S901 and S903.
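One way to detect the "maintained" input of step S903 is to track how long consecutive touch samples stay inside the edge area. The sample format and the 300 ms hold threshold below are assumptions, not from the patent.

```python
# Illustrative check for a user input maintained in the edge area: a touch
# or drag whose samples have stayed inside the edge region for at least a
# hold threshold (covers both a long touch at the edge and a drag held
# from the non-edge area into the edge area).

HOLD_MS = 300  # assumed hold threshold

def held_at_edge(samples, hold_ms=HOLD_MS):
    """samples: chronological (timestamp_ms, in_edge_area) tuples."""
    held_since = None
    for t, in_edge in samples:
        if not in_edge:
            held_since = None      # leaving the edge area resets the timer
        elif held_since is None:
            held_since = t         # first sample inside the edge area
        elif t - held_since >= hold_ms:
            return True            # stayed in the edge area long enough
    return False
```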
  • FIG. 11 is a diagram illustrating an edge region of a terminal screen in which a user input for entering a screen moving mode is sensed according to an embodiment of the present disclosure.
  • the front surface of the mobile terminal 100 may include a display unit 151 including a touch screen, and a bezel 159.
  • the edge area 155 of the touch screen may include the boundary between the touch screen 151 and the bezel 159.
  • when a user input is detected in the edge area 155, the controller 180 may recognize that a user input for entering the screen moving mode has been sensed.
  • the edge area 155 is not limited to the right edge area of the touch screen illustrated in FIG. 11, and may include all edge areas of the left, top, and bottom of the touch screen.
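A hit test covering all four edge areas mentioned above might look like the following sketch; the 24 px band width stands in for the finger-size-based or user-selected value and is purely illustrative.

```python
# Hypothetical edge-region hit test for the left, right, top, and bottom
# edge areas of the touch screen. The band width would in practice come
# from a finger-size default or a user setting, as the description notes.

def edge_hit(x, y, width, height, band=24):
    """Return which edge band (if any) the point (x, y) falls in."""
    if x < band:
        return "left"
    if x >= width - band:
        return "right"
    if y < band:
        return "top"
    if y >= height - band:
        return "bottom"
    return None  # non-edge area
```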
  • FIG. 12 is a flowchart illustrating a method of entering a screen moving mode of a terminal according to another embodiment of the present invention.
  • When the controller 180 receives a user input for text selection located at an edge of the screen (S705), it outputs a notification for mode entry guidance (S1101). The notification for mode entry guidance will be described with reference to FIGS. 13 to 15.
  • FIG. 13 is a diagram illustrating an indicator for guiding the screen moving mode according to an embodiment of the present invention.
  • When the controller 180 detects a user input for text selection in the edge area 155 of the screen, it may display the indicator 201 on the display unit 151.
  • the indicator 201 may be displayed in the form of a bar or an icon. As shown in FIG. 13, the bar-shaped indicator 201 may be displayed at the position of the edge region 155 with the same size as the edge region 155. In this case, the bar-shaped indicator 201 may include a border displayed so as to be distinguished from the screen and an opaque interior. Although not shown in the drawing, the controller 180 may display an icon-shaped indicator having a distinctive shape and color on a part of the edge region 155 in which the user input for text selection is detected, but is not limited thereto.
  • Indicator-related control information such as whether an indicator is displayed, a shape, a size, a position, and the like may be predetermined or determined according to a user input.
  • the indicator related control information determined according to the user input may include information selected by the user input from the indicator related control information menu provided by the mobile terminal 100.
  • FIG. 14 is a diagram illustrating a pop-up window for guiding the screen moving mode and providing a user interface according to an embodiment of the present invention.
  • when the controller 180 detects a user input for text selection in the edge area 155 of the screen, it may display the pop-up window 203 on the display unit 151.
  • the pop-up window 203 may include a user interface and information for guiding the screen moving mode.
  • the information for guiding the screen movement mode may include query information, “Do you want to move the screen to the left?”.
  • the user interface may include a "confirm" button for receiving response information to the query information.
  • when a user input selecting the "Confirm" button is received, the controller 180 may enter the screen moving mode.
  • the pop-up window 203 may display the information and the user interface for guiding the screen moving mode together in one window, or in separate windows, but is not limited thereto.
  • the pop-up window 203 may be displayed at a location where a user input such as touch drag is received, or may be displayed at another location.
  • Control information related to the pop-up window such as whether the pop-up window is displayed, a location, a size, and information included in the pop-up window, may be predetermined or determined according to a user input.
  • the pop-up window related control information determined according to the user input may include information selected by the user input from the pop-up window related control information menu provided by the mobile terminal 100.
  • FIG. 15 is a diagram illustrating a guide window for guiding the screen moving mode according to an embodiment of the present invention.
  • When the controller 180 detects a user input for text selection in the edge region 155 of the screen, it may display a guide window 205 for guiding the screen moving mode entry on the display unit 151.
  • the guide window 205 may be displayed in the form of a chat window and may include information for guiding the screen moving mode entry.
  • the guide window 205 may include conditional screen moving mode entry guide information, such as "when the terminal is tilted to the left side, the screen moves to the left side". Subsequently, when receiving a user input of tilting the terminal to the left, the controller 180 may enter the screen moving mode.
  • the guide window 205 may include unconditional screen moving mode entry guide information, such as "the screen moves to the left" or "the screen moving mode is started", but is not limited thereto.
  • the controller 180 may enter the screen moving mode immediately after displaying the guide window 205.
  • the conditional screen moving mode entry guide information may present a second user input as a condition, in addition to the first user input for entering the screen moving mode described above.
  • the second user input may include, for example, a user input of tilting the terminal as described above, a user input of selecting a separately displayed "Confirm" button, or a user input of maintaining a touch drag for a predetermined time, but is not limited thereto.
  • Guide window related control information, such as whether the guide window is displayed, its shape, size, and position, and the type of the second user input corresponding to the condition, may be predetermined or determined according to a user input.
  • the guide window related control information determined according to the user input may include information selected by the user input from the guide window related control information menu provided by the mobile terminal 100.
  • the notification for entering the screen movement mode described with reference to FIGS. 13 to 15 corresponds to the visual notification.
  • the controller 180 may execute visual notification by displaying separate notification information on the display unit 151.
  • the visual notification may also be executed by moving the entire screen displayed on the display unit 151 to the left, right, up, down, or in a diagonal direction, but is not limited thereto.
  • the notification for entering the screen moving mode may include an auditory notification, a tactile notification, and a complex notification combining them.
  • the auditory notification may mean a voice output of the information displayed by the visual notification, such as "Do you want to move the screen to the left?", or of guide information such as "tilting the terminal to the left moves the screen to the left". Alternatively, the auditory notification may mean an electronic sound output such as a beep, but is not limited thereto.
  • the tactile notification may mean a tactile effect that can be felt by the user through the haptic module 154.
  • the haptic module 154 may generate a tactile notification by generating a vibration.
  • the complex notification may mean mixing two or three of the visual notification, the audio notification, and the tactile notification.
  • the notification for entering the screen moving mode may be determined in advance or determined according to a user input.
  • the notification for entering the screen moving mode determined according to the user input may include a notification selected by the user input from among the visual, auditory, tactile, and complex notifications.
  • the controller 180 receives a user input for mode entry corresponding to the notification for mode entry guidance (S1103).
  • the user input for entering the mode may be received through a user interface included in the popup window 203.
  • the controller 180 may recognize a user input of selecting a "confirmation" button included in the popup window 203 as a user input for entering a mode.
  • the user input for mode entry may be a user input that satisfies a condition among the conditional screen movement mode entry guide information displayed on the guide window 205.
  • for example, after displaying the guide window 205 containing "when the terminal is tilted to the left, the screen moves to the left", the controller 180 may recognize a user input of tilting the terminal to the left as a user input for entering the mode.
  • the user input for entering the mode may be a user input responding, in various ways, to a query with which the mobile terminal 100 proposes mode entry, but is not limited thereto.
  • the controller 180 determines whether the user input for text selection is continuous (S707).
  • the case where the user input for the text selection is continuous may include, for example, a touch drag starting from the text position of the region other than the edge region of the screen to the text position of the edge region.
  • the case where the user input for text selection is not continuous may include, for example, a case where a touch or long touch for selecting text located at an edge region of the screen is received.
  • when the user input for text selection is continuous, the controller 180 detects the progress direction of the continuous user input (S709) and moves the electronic document in a direction opposite to the detected progress direction (S711). The detection of the progress direction of a continuous user input will be described with reference to FIG. 16.
  • FIG. 16 is a flowchart illustrating a user input for entering the screen moving mode and a screen moving method corresponding to the user input according to another embodiment of the present invention.
  • when the progress direction of the continuous user input is upward, the controller 180 moves the electronic document currently displayed on the display unit 151 downward and displays it (S1311). As a result, additional space is displayed at the upper edge of the touch screen.
  • when the progress direction is downward, the controller 180 moves the electronic document upward and displays it (S1313). As a result, additional space is displayed at the lower edge of the touch screen.
  • when the progress direction is leftward, the controller 180 moves the electronic document to the right and displays it (S1315). As a result, additional space is displayed at the left edge of the touch screen.
  • when the progress direction is rightward, the controller 180 moves the electronic document to the left and displays it (S1317). As a result, additional space is displayed at the right edge of the touch screen.
  • the controller 180 may display the additional space at the edge by moving the electronic document in a direction opposite to the direction in which the user input proceeds. That is, the moving direction of the electronic document is not limited to the up, down, left and right directions.
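The direction mapping of steps S1311 to S1317 can be sketched as a simple opposite-direction lookup; the names are hypothetical and coordinates are taken with y growing downward.

```python
# Illustrative mapping from the progress direction of a continuous drag to
# the document movement: the document moves opposite to the drag, exposing
# additional space at the edge the drag approaches (y grows downward).

OPPOSITE = {"up": "down", "down": "up", "left": "right", "right": "left"}

def move_for_drag(direction):
    """Return a (dx, dy) unit movement of the document for a drag direction."""
    step = {"down": (0, 1), "up": (0, -1), "right": (1, 0), "left": (-1, 0)}
    return step[OPPOSITE[direction]]
```

As the text notes, a real implementation need not be limited to the four axis-aligned directions; an arbitrary drag vector could simply be negated.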
  • when the user input for text selection is not continuous, the controller 180 detects the edge position where the discontinuous user input is sensed (S713) and moves the electronic document in a direction opposite to the detected position (S715). The detection of the edge position of a discontinuous user input will be described with reference to FIG. 17.
  • FIG. 17 is a flowchart illustrating a user input for entering the screen moving mode and a screen moving method corresponding to the user input according to an exemplary embodiment.
  • when the discontinuous user input is detected at the upper edge, the controller 180 moves the electronic document currently displayed on the display unit 151 downward and displays it (S1511). As a result, additional space is displayed at the upper edge of the touch screen.
  • when the input is detected at the lower edge, the controller 180 moves the electronic document upward and displays it (S1513). As a result, additional space is displayed at the lower edge of the touch screen.
  • when the input is detected at the left edge, the controller 180 moves the electronic document to the right and displays it (S1515). As a result, additional space is displayed at the left edge of the touch screen.
  • when the input is detected at the right edge, the controller 180 moves the electronic document to the left and displays it (S1517). As a result, additional space is displayed at the right edge of the touch screen.
  • the controller 180 may display an additional space at the edge by moving the electronic document in a direction opposite to the position where the user input is detected. That is, the moving direction of the electronic document is not limited to the up, down, left and right directions.
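The position-based variant of steps S1511 to S1517 reduces to a similar lookup, keyed by the edge where the discontinuous input was detected (again with y growing downward; names are illustrative).

```python
# Illustrative mapping for a discontinuous input: the touched edge
# determines the document movement, opposite to that edge's direction,
# so additional space opens at the touched edge (y grows downward).

def move_for_edge(edge):
    """Return a (dx, dy) unit movement of the document for a touched edge."""
    return {"top": (0, 1), "bottom": (0, -1), "left": (1, 0), "right": (-1, 0)}[edge]
```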
  • FIG. 18 is a diagram illustrating a screen on which a screen movement mode entry is executed according to an embodiment of the present invention.
  • When the controller 180 detects a user input for entering the screen moving mode at the right edge, it moves the screen displayed on the display unit 151 to the left. As a result, a predetermined space 157 is formed between the right edge of the display unit 151, that is, between the right side of the page displayed on the current screen and the bezel 159, so that text located at the right edge of the page can be selected more precisely.
  • the above-described method may be implemented as code that can be read by a processor in a medium in which a program is recorded.
  • examples of processor-readable media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and also include implementations in the form of a carrier wave (for example, transmission over the Internet).
  • the above-described mobile terminal is not limited to the configuration and method of the embodiments described above; all or some of the embodiments may be selectively combined so that various modifications can be made.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a terminal and a method of operating the same, and in particular to a method for selecting part of the text displayed on a screen. A method of operating a terminal according to an embodiment of the present invention comprises the steps of: displaying an electronic document on a screen; detecting a user input for text selection in a first region of the screen; and moving the electronic document toward a second region of the screen when the user input is detected.
PCT/KR2013/002409 2013-03-22 2013-03-22 Terminal et son procédé d'utilisation WO2014148661A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020157021609A KR102097079B1 (ko) 2013-03-22 2013-03-22 단말기 및 그 동작 방법
PCT/KR2013/002409 WO2014148661A1 (fr) 2013-03-22 2013-03-22 Terminal et son procédé d'utilisation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2013/002409 WO2014148661A1 (fr) 2013-03-22 2013-03-22 Terminal et son procédé d'utilisation

Publications (1)

Publication Number Publication Date
WO2014148661A1 true WO2014148661A1 (fr) 2014-09-25

Family

ID=51580324

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2013/002409 WO2014148661A1 (fr) 2013-03-22 2013-03-22 Terminal et son procédé d'utilisation

Country Status (2)

Country Link
KR (1) KR102097079B1 (fr)
WO (1) WO2014148661A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090042740A (ko) * 2007-10-26 2009-04-30 리서치 인 모션 리미티드 핸드헬드 이동 통신 장치의 접촉 감지 스크린을 사용하는 텍스트 선택
KR20100088315A (ko) * 2009-01-30 2010-08-09 주식회사 팬택 이동통신 단말기의 화면출력 제어장치
EP2469398A2 (fr) * 2008-03-04 2012-06-27 Apple Inc. Sélection de texte à l'aide de gestes
US20130007612A1 (en) * 2011-06-28 2013-01-03 International Business Machines Corporation Manipulating Display Of Document Pages On A Touchscreen Computing Device
US20130042199A1 (en) * 2011-08-10 2013-02-14 Microsoft Corporation Automatic zooming for text selection/cursor placement

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101788052B1 (ko) * 2010-12-21 2017-10-19 엘지전자 주식회사 이동 단말기 및 이것의 모드 전환 제어 방법

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090042740A (ko) * 2007-10-26 2009-04-30 리서치 인 모션 리미티드 핸드헬드 이동 통신 장치의 접촉 감지 스크린을 사용하는 텍스트 선택
EP2469398A2 (fr) * 2008-03-04 2012-06-27 Apple Inc. Sélection de texte à l'aide de gestes
KR20100088315A (ko) * 2009-01-30 2010-08-09 주식회사 팬택 이동통신 단말기의 화면출력 제어장치
US20130007612A1 (en) * 2011-06-28 2013-01-03 International Business Machines Corporation Manipulating Display Of Document Pages On A Touchscreen Computing Device
US20130042199A1 (en) * 2011-08-10 2013-02-14 Microsoft Corporation Automatic zooming for text selection/cursor placement

Also Published As

Publication number Publication date
KR20150142669A (ko) 2015-12-22
KR102097079B1 (ko) 2020-04-03

Similar Documents

Publication Publication Date Title
WO2015088123A1 (fr) Dispositif électronique et son procédé de commande
WO2012050248A1 (fr) Équipement mobile et son procédé de commande
WO2015056844A1 (fr) Terminal mobile et son procédé de commande
WO2012036324A1 (fr) Terminal mobile et procédé permettant de commander son fonctionnement
WO2012008628A1 (fr) Terminal mobile et procédé de configuration pour écran de veille associé
WO2015012449A1 (fr) Dispositif électronique et son procédé de commande
WO2016114444A1 (fr) Terminal mobile et son procédé de commande
WO2012108729A2 (fr) Dispositif comprenant une pluralité d'écrans tactiles et procédé de changement d'écran pour le dispositif
WO2012133983A1 (fr) Traitement d'image dans un dispositif d'affichage d'image monté sur véhicule
WO2014137074A1 (fr) Terminal mobile et procédé de commande du terminal mobile
WO2016010221A1 (fr) Terminal mobile et son procédé de commande
WO2012148242A2 (fr) Terminal mobile et procédé de commande dudit terminal
WO2012046891A1 (fr) Terminal mobile, dispositif afficheur, et procédé de commande correspondant
WO2011087204A2 (fr) Appareil de signalisation numérique et procédé l'utilisant
WO2015088166A1 (fr) Terminal mobile, et procédé de commande d'une unité d'entrée de face arrière du terminal
WO2016076474A1 (fr) Terminal mobile et son procédé de commande
WO2015068872A1 (fr) Dispositif électronique et procédé de commande
WO2017104941A1 (fr) Terminal mobile et son procédé de commande
WO2014208783A1 (fr) Terminal mobile et procédé pour commander un terminal mobile
WO2017175942A1 (fr) Terminal mobile et son procédé de commande
WO2016108547A1 (fr) Appareil d'affichage et procédé d'affichage
WO2015060501A1 (fr) Appareil et procédé de commande de terminal mobile
WO2015178520A1 (fr) Terminal mobile et son procédé de commande
WO2016056723A1 (fr) Terminal mobile et son procédé de commande
WO2011002238A2 (fr) Terminal mobile équipé d'écrans virtuels multiples et procédé de commande de celui-ci

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13878724

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20157021609

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13878724

Country of ref document: EP

Kind code of ref document: A1