WO2018030568A1 - Terminal mobile et procédé de commande associé (Mobile terminal and control method therefor)


Info

Publication number
WO2018030568A1
Authority
WO
WIPO (PCT)
Prior art keywords
key
output
mobile terminal
input signal
navigation keys
Prior art date
Application number
PCT/KR2016/009128
Other languages
English (en)
Korean (ko)
Inventor
김은혜
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Publication of WO2018030568A1

Classifications

    • G — PHYSICS; G06 — COMPUTING; CALCULATING OR COUNTING; G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 … based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons
    • G06F3/04817 … using icons
    • G06F3/0484 … for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0487 … using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements
    • G06F3/0488 … using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 … for inputting data by handwriting, e.g. gesture or text
    • G06F3/04886 … by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • H — ELECTRICITY; H04 — ELECTRIC COMMUNICATION TECHNIQUE; H04M — TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/725 Cordless telephones
    • H04M2201/34 Microprocessors
    • H04M2201/38 Displays
    • H04M2250/22 Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Definitions

  • The present invention relates to a mobile terminal and a control method thereof. More specifically, when the mobile terminal detects a screen-movement interaction on the output content, it controls the display unit, based on the detected interaction, to output either up and down navigation keys or left and right navigation keys, each including at least one key.
  • Terminals may be divided into mobile / portable terminals and stationary terminals according to their mobility.
  • the mobile terminal may be further classified into a handheld terminal and a vehicle mounted terminal according to whether a user can directly carry it.
  • The functions of mobile terminals are diversifying. Examples include data and voice communication, taking pictures and video with a camera, recording voice, playing music files through a speaker system, and outputting images or video to a display unit.
  • Some terminals have an electronic game play function or a multimedia player function.
  • In addition, recent mobile terminals can receive multicast signals that provide visual content such as broadcasts, video, and television programs.
  • As its functions diversify, such a terminal is implemented in the form of a multimedia player with complex functions such as taking pictures or video, playing music or video files, playing games, and receiving broadcasts.
  • Content output on recent mobile terminals often consists of long vertical pages or a large number of menus.
  • To reach content at the top or bottom, the user must move the scroll position through multiple touches.
  • Current mobile terminals provide no way to move easily to the beginning or end of the content using a software key.
  • An object of the present invention is to solve the above and other problems. According to the present invention, when the mobile terminal detects a screen-movement interaction on the output content, it outputs up and down navigation keys or left and right navigation keys, each including at least one key, on the display unit based on the detected interaction.
  • An object of the present invention is to provide such a mobile terminal and a control method thereof.
  • To achieve this, a mobile terminal comprises: a sensing unit; a display unit for outputting content; and a control unit. The control unit senses a screen-movement interaction on the output content through the sensing unit and, based on the detected interaction, controls the display unit to output up and down navigation keys or left and right navigation keys including at least one key.
  • The control unit may output a default key on the display unit and, based on the screen-movement interaction, may control the default key to change into the up and down navigation keys or the left and right navigation keys.
  • The up and down navigation keys or the left and right navigation keys may include a first key, a second key, and a third key.
  • The screen-movement interaction includes an up-and-down movement interaction and a left-and-right movement interaction. The control unit outputs the up and down navigation keys when the up-and-down movement interaction is detected, and outputs the left and right navigation keys when the left-and-right movement interaction is detected.
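The direction-classification behaviour above can be sketched as follows. This is an illustrative reading, not the patent's implementation; the function name, the pixel threshold, and the string labels are all assumptions.

```python
def select_navigation_keys(dx, dy, threshold=10):
    """Classify a drag of (dx, dy) pixels as a screen-movement interaction.

    A predominantly vertical drag yields the up/down navigation keys, a
    predominantly horizontal drag yields the left/right navigation keys,
    and a movement below the threshold keeps the default key on screen.
    The threshold value is illustrative, not from the patent.
    """
    if abs(dx) < threshold and abs(dy) < threshold:
        return "default"      # too small to count as a movement interaction
    if abs(dy) >= abs(dx):
        return "up_down"      # up-and-down movement interaction
    return "left_right"       # left-and-right movement interaction
```

A mostly-vertical drag such as `select_navigation_keys(5, 120)` would select the up/down key set, while `select_navigation_keys(80, 10)` would select the left/right set.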
  • When the control unit senses an input signal for selecting the first key while the up and down navigation keys or the left and right navigation keys are output, it may perform a first function related to the output content; when it senses an input signal for selecting the third key in the same state, it may perform a second function related to the output content.
  • The controller may determine the content to be controlled by the up and down navigation keys or the left and right navigation keys based on the point at which the up-and-down or left-and-right movement interaction is input.
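One plausible reading of the first and third keys, consistent with the stated goal of jumping to the beginning or end of long content, is sketched below. The class, function, and key names are hypothetical.

```python
class ScrollableContent:
    """Minimal stand-in for a scrollable content area on the display unit."""
    def __init__(self, length, position=0):
        self.length = length      # total scrollable extent
        self.position = position  # current scroll offset

def handle_navigation_key(content, key):
    """Dispatch a navigation-key press against the content area under the
    point where the original movement interaction was input.

    Assumed mapping (illustrative): the first key performs the "first
    function" (jump to the beginning), the third key the "second function"
    (jump to the end).
    """
    if key == "first":
        content.position = 0
    elif key == "third":
        content.position = content.length
    return content.position
```

For example, pressing the first key on a page scrolled to offset 400 would return the view to offset 0.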
  • The controller may execute an icon edit mode upon sensing a first input signal for selecting a first icon, the first icon being an icon included in a first home screen. When the icon edit mode is executed, the controller may control the display unit to output the up and down navigation keys or the left and right navigation keys.
  • When the control unit senses an input signal for selecting the first key while the up and down navigation keys or the left and right navigation keys are output, it controls the first icon to move to a second home screen among the home screens; when it senses an input signal for selecting the third key in the same state, it controls the first icon to move to a third home screen.
  • The controller executes a home screen edit mode in response to sensing a second input signal for selecting a first area, in which no icon is displayed, on one of the plurality of home screens. When the home screen edit mode is executed, the controller may control the display unit to output the up and down navigation keys or the left and right navigation keys.
  • When the control unit senses an input signal for selecting the first key while the up and down navigation keys or the left and right navigation keys are output, it controls the home screen including the first area to move to the front of the plurality of home screens; when it senses an input signal for selecting the third key in the same state, it controls that home screen to move to the back of the plurality of home screens.
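The home-screen edit behaviour above amounts to a reorder of the home-screen list. A minimal sketch, with illustrative names and again assuming the first key means "front" and the third key "back":

```python
def move_home_screen(screens, index, key):
    """Return a new home-screen order after a navigation-key press in
    home screen edit mode.

    screens: ordered list of home screens; index: the screen being edited;
    key: "first" moves it to the front, "third" moves it to the back.
    """
    screens = list(screens)          # do not mutate the caller's list
    screen = screens.pop(index)
    if key == "first":
        screens.insert(0, screen)    # move to the front of the plurality
    else:
        screens.append(screen)       # move to the back of the plurality
    return screens
```

So with three home screens `["A", "B", "C"]`, pressing the first key while editing `"B"` would yield the order `["B", "A", "C"]`.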
  • When the output content is a widget, the controller outputs the up and down navigation keys or the left and right navigation keys upon the screen-movement interaction of touching and dragging the widget. When an input signal for selecting the first key is sensed while the keys are output, the first option associated with the widget is executed; when an input signal for selecting the third key is sensed in the same state, the last option associated with the widget is executed.
  • When the content has a plurality of layers, the controller outputs the up and down navigation keys or the left and right navigation keys as the screen-movement interaction is detected. When an input signal for selecting the first key is sensed while the keys are output, the mobile terminal moves to the top layer of the content; when an input signal for selecting the third key is sensed in the same state, it may be controlled to move to the lowest layer of the content.
  • When the content has a plurality of menus, the controller outputs the up and down navigation keys or the left and right navigation keys upon sensing the screen-movement interaction. When an input signal for selecting the first key is sensed while the keys are output, the terminal moves to the top menu of the content; when an input signal for selecting the third key is sensed in the same state, it may be controlled to move to the lowest menu of the content.
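For content with an ordered set of layers or menus, both behaviours above reduce to a jump to the first or last element. A one-line sketch under the same assumed key mapping:

```python
def jump_layer(layers, key):
    """Return the layer/menu to display after a navigation-key press.

    layers: content layers ordered from top (highest) to bottom (lowest).
    The first key jumps to the top layer, the third key to the lowest.
    """
    return layers[0] if key == "first" else layers[-1]
```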
  • When a plurality of contents are executed simultaneously, the controller outputs the up and down navigation keys or the left and right navigation keys according to the screen-movement interaction; upon sensing an input signal for selecting a key while the keys are output, the controller may control the display to output the last executed content among the plurality of contents.
  • When a third input signal for selecting the second key is sensed while the up and down navigation keys or the left and right navigation keys are output, the controller maps the first content area output on the display unit at the time of sensing to the third input signal; when a fourth input signal is sensed while a second content area different from the first content area is being output, the first content area may be output again.
  • When a fifth input signal for selecting the second key is sensed after the third input signal, the controller may map the content area output at the time the fifth input signal is sensed; when a sixth input signal is subsequently sensed, the second content area may be output.
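The second key thus behaves like a scroll-position bookmark: one press records the content area currently on screen, a later press returns to it while re-recording the position being left, so repeated presses toggle between the two areas. A hypothetical sketch of that toggle (class and method names are illustrative):

```python
class BookmarkKey:
    """Models the second navigation key as a content-area bookmark."""
    def __init__(self):
        self.saved = None  # content-area position mapped to the key

    def press(self, current_position):
        """Handle one press; return the position to display next."""
        if self.saved is None or self.saved == current_position:
            self.saved = current_position
            return current_position          # record only; nothing to jump to
        # Jump to the mapped area, re-recording the area we are leaving,
        # so the next press jumps back.
        self.saved, target = current_position, self.saved
        return target
```

Pressing at position 100, scrolling to 500, and pressing again returns the view to 100; a further press jumps back to 500.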
  • When the content includes one of a vertical menu and a horizontal menu, and the controller senses an input signal for selecting the vertical menu, the up and down navigation keys are output without detecting the screen-movement interaction; when an input signal for selecting the horizontal menu is sensed, the left and right navigation keys are output without detecting the screen-movement interaction.
  • When no input signal is received for a preset time after the up and down navigation keys or the left and right navigation keys are output, the controller may control them to change back to the default key.
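The timeout behaviour can be modeled with explicit timestamps rather than a real timer. The preset interval below is an illustrative value, not one stated in the patent:

```python
PRESET_TIMEOUT = 3.0  # seconds; assumed value for illustration

def current_key_set(shown_at, last_input_at, now, active="up_down"):
    """Decide which key set is on screen at time `now`.

    shown_at: when the navigation keys were output; last_input_at: time of
    the most recent key input, or None if none has arrived. If the preset
    time elapses with no input, the keys revert to the default key.
    """
    reference = last_input_at if last_input_at is not None else shown_at
    return "default" if now - reference >= PRESET_TIMEOUT else active
```

A key input resets the clock: keys shown at t=0 with an input at t=2.0 are still active at t=4.0, but would have reverted by then with no input.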
  • The controller controls the display to output the up and down navigation keys or the left and right navigation keys upon an input in a first direction, and changes the output navigation keys back to the default key upon an input in a second direction, where the second direction may be opposite to the first direction.
  • According to the present invention, a user may directly access a menu at the top or bottom of the content by using a navigation key output on the mobile terminal.
  • In addition, the user may easily control the content corresponding to the point at which the screen interaction is input by using the navigation key.
  • FIG. 1A is a block diagram illustrating a mobile terminal related to the present invention.
  • FIGS. 1B and 1C are conceptual views of one example of a mobile terminal, viewed from different directions.
  • FIG. 2 is a conceptual diagram illustrating another example of a deformable mobile terminal according to the present invention.
  • FIG. 7 illustrates an embodiment of performing a function by sensing an input signal for selecting a navigation key according to an embodiment of the present invention.
  • FIG. 11 is a view for explaining an embodiment of moving a position of a home screen using a navigation key according to an embodiment of the present invention.
  • FIG. 12 is a diagram for explaining an embodiment of controlling a widget using a navigation key according to an embodiment of the present invention.
  • FIG. 14 is a diagram illustrating an embodiment of moving a menu of contents including a plurality of menus using a navigation key according to an embodiment of the present invention.
  • FIG. 17 is a diagram for explaining an embodiment of adjusting a volume of sound output using a navigation key according to an embodiment of the present invention.
  • FIG. 18 is a diagram for explaining an embodiment of adjusting a volume of sound output using a navigation key according to another embodiment of the present invention.
  • FIG. 20 is a view for explaining an embodiment of automatically outputting a navigation key based on the attributes of content according to another embodiment of the present invention.
  • FIG. 21 is a diagram for describing the output of a different navigation key based on screen interaction according to another embodiment of the present invention.
  • FIG. 22 illustrates an example of moving a hierarchy of content including a plurality of hierarchies using a navigation key according to another exemplary embodiment of the present invention.
  • the broadcast receiving module 111 of the wireless communication unit 110 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.
  • the broadcast channel may include a satellite channel and a terrestrial channel.
  • Two or more broadcast receiving modules may be provided in the mobile terminal 100 for simultaneous reception of at least two broadcast channels or for switching between broadcast channels.
  • The touch sensor detects a touch (or touch input) applied to the touch screen (or the display unit 151) using at least one of various touch methods, such as a resistive-film method, a capacitive method, an infrared method, an ultrasonic method, and a magnetic-field method.
  • The touch controller processes the signal(s) and then transmits the corresponding data to the controller 180.
  • the controller 180 can know which area of the display unit 151 is touched.
  • the touch controller may be a separate component from the controller 180 or may be the controller 180 itself.
  • the controller 180 may perform different control or perform the same control according to the type of touch object that touches the touch screen (or a touch key provided in addition to the touch screen). Whether to perform different control or the same control according to the type of touch object may be determined according to the operation state of the mobile terminal 100 or an application program being executed.
  • The camera 121, described above as part of the input unit 120, includes at least one of a camera sensor (e.g., CCD, CMOS), a photo sensor (or image sensor), and a laser sensor.
  • the camera 121 and the laser sensor may be combined with each other to detect a touch of a sensing object with respect to a 3D stereoscopic image.
  • The photo sensor may be stacked on the display element and is configured to scan the movement of a sensing object in proximity to the touch screen. More specifically, the photo sensor mounts photo diodes and transistors (TR) in rows and columns, and scans the object placed over it using the electrical signal that varies with the amount of light applied to each photo diode. That is, the photo sensor calculates the coordinates of the sensing object from the change in the amount of light, and position information of the sensing object can thereby be obtained.
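One simple way to turn per-photodiode light changes into object coordinates, consistent with the description above, is a weighted centroid over the row/column grid. This is a hypothetical sketch of that computation, not the sensor's actual algorithm:

```python
def touch_coordinates(light_delta):
    """Estimate the sensing object's position from a photo-sensor readout.

    light_delta: 2D grid (list of rows) giving the change in light level
    at each photodiode. Returns the (row, col) centroid weighted by the
    change amount, or None if no change was sensed.
    """
    total = sum(sum(row) for row in light_delta)
    if total == 0:
        return None                                  # nothing detected
    r = sum(i * sum(row) for i, row in enumerate(light_delta)) / total
    c = sum(j * v for row in light_delta for j, v in enumerate(row)) / total
    return (r, c)
```

A single bright change at grid cell (1, 2) yields exactly that coordinate; a spread of changes yields their weighted center.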
  • the display unit 151 may be configured as a stereoscopic display unit for displaying a stereoscopic image.
  • The stereoscopic display unit may use a three-dimensional display method such as a stereoscopic method (glasses type), an autostereoscopic method (glasses-free type), or a projection method (holographic type).
  • The sound output unit 152 may output audio data received from the wireless communication unit 110 or stored in the memory 170 during call-signal reception, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like.
  • the sound output unit 152 may also output a sound signal related to a function (for example, a call signal reception sound or a message reception sound) performed in the mobile terminal 100.
  • the sound output unit 152 may include a receiver, a speaker, a buzzer, and the like.
  • The haptic module 153 can generate various tactile effects, such as an arrangement of pins moving vertically against the skin surface, a jetting or suction force of air through a jet or suction port, grazing of the skin surface, contact of an electrode, electrostatic force, and the reproduction of a sense of cold or warmth using an element capable of absorbing or generating heat.
  • the haptic module 153 may not only deliver a tactile effect through direct contact, but also may allow a user to feel the tactile effect through a muscle sense such as a finger or an arm. Two or more haptic modules 153 may be provided according to a configuration aspect of the mobile terminal 100.
  • the interface unit 160 serves as a path to all external devices connected to the mobile terminal 100.
  • the interface unit 160 receives data from an external device, receives power, transfers the power to each component inside the mobile terminal 100, or transmits data inside the mobile terminal 100 to an external device.
  • Ports such as an audio input/output (I/O) port, a video input/output (I/O) port, and an earphone port may be included in the interface unit 160.
  • The interface unit 160 may serve as a passage for supplying power from a cradle to the mobile terminal 100, or as a passage through which various command signals input by the user at the cradle are transmitted to the mobile terminal 100. Such command signals or power input from the cradle may operate as signals for recognizing that the mobile terminal 100 is correctly mounted on the cradle.
  • the memory 170 may store a program for the operation of the controller 180 and may temporarily store input / output data (for example, a phone book, a message, a still image, a video, etc.).
  • the memory 170 may store data regarding vibration and sound of various patterns output when a touch input on the touch screen is performed.
  • The controller 180 controls operations related to application programs and, typically, the overall operation of the mobile terminal 100. For example, if the state of the mobile terminal satisfies a set condition, the controller 180 may set or release a lock state that restricts input of user control commands to applications.
  • the power supply unit 190 receives an external power source and an internal power source under the control of the controller 180 to supply power for operation of each component.
  • The power supply unit 190 includes a battery; the battery may be a built-in rechargeable battery, and may be detachably coupled to the terminal body for charging.
  • The power supply unit 190 may be provided with a connection port, and the connection port may be configured as an example of the interface unit 160 to which an external charger supplying power for charging the battery is electrically connected.
  • the power supply unit 190 may be configured to charge the battery in a wireless manner without using the connection port.
  • The power supply unit 190 may receive power from an external wireless power transmitter using one or more of an inductive coupling method based on magnetic induction and a magnetic resonance coupling method based on electromagnetic resonance.
  • various embodiments of the present disclosure may be implemented in a recording medium readable by a computer or a similar device using, for example, software, hardware, or a combination thereof.
  • the disclosed mobile terminal 100 includes a terminal body in the form of a bar.
  • However, the present invention is not limited thereto, and can be applied to various structures such as watch, clip, glasses, folder, flip, slide, swing, and swivel types, in which two or more bodies are coupled to be movable relative to each other.
  • a description of a particular type of mobile terminal may generally apply to other types of mobile terminals.
  • the display unit 151 may be disposed in front of the terminal body to output information. As shown, the window 151a of the display unit 151 may be mounted to the front case 101 to form a front surface of the terminal body together with the front case 101.
  • When the rear cover 103 is coupled to the rear case 102, a portion of the side surface of the rear case 102 may be exposed. In some cases, the rear case 102 may be completely covered by the rear cover 103. Meanwhile, the rear cover 103 may be provided with an opening for exposing the camera 121b or the sound output unit 152b to the outside.
  • The cases 101, 102, and 103 may be formed by injection-molding a synthetic resin, or may be formed of a metal such as stainless steel (STS), aluminum (Al), or titanium (Ti).
  • Unlike the above example, in which a plurality of cases provide an internal space for accommodating various electronic components, the mobile terminal 100 may be configured so that a single case provides the internal space. In this case, a unibody mobile terminal 100 in which a synthetic resin or metal extends from the side surface to the rear surface may be implemented.
  • the mobile terminal 100 may be provided with a waterproof portion (not shown) to prevent water from seeping into the terminal body.
  • The waterproof portion is provided between the window 151a and the front case 101, between the front case 101 and the rear case 102, or between the rear case 102 and the rear cover 103, and may include a waterproof member for sealing the internal space when these are coupled.
  • In the following, the mobile terminal 100 in which the display unit 151, the first sound output unit 152a, the proximity sensor 141, the illuminance sensor 142, the light output unit, the first camera 121a, and the first manipulation unit 123a are disposed on the front surface of the terminal body, the second manipulation unit 123b, the microphone 122, and the interface unit 160 are disposed on the side surface, and the second sound output unit 152b and the second camera 121b are disposed on the rear surface will be described as an example.
  • the display unit 151 displays (outputs) information processed by the mobile terminal 100.
  • The display unit 151 may display execution screen information of an application program running in the mobile terminal 100, or user interface (UI) and graphical user interface (GUI) information according to such execution screen information.
  • the display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and an e-ink display.
  • two or more display units 151 may exist according to an implementation form of the mobile terminal 100.
  • the plurality of display units may be spaced apart or integrally disposed on one surface of the mobile terminal 100, or may be disposed on different surfaces.
  • the display unit 151 may include a touch sensor that senses a touch on the display unit 151 so as to receive a control command by a touch method.
  • the touch sensor may sense the touch, and the controller 180 may generate a control command corresponding to the touch based on the touch sensor.
  • the content input by the touch method may be letters or numbers or menu items that can be indicated or designated in various modes.
  • the touch sensor may be formed of a film having a touch pattern and disposed between the window 151a and a display (not shown) on the rear surface of the window 151a, or may be directly patterned on the rear surface of the window 151a. Alternatively, the touch sensor may be integrally formed with the display. For example, the touch sensor may be disposed on a substrate of the display or provided inside the display.
  • the display unit 151 may form a touch screen together with the touch sensor.
  • the touch screen may function as the user input unit 123 (see FIG. 1A).
  • the touch screen may replace at least some functions of the first manipulation unit 123a.
  • the first sound output unit 152a may be implemented as a receiver that transmits a call sound to the user's ear, and the second sound output unit 152b may be implemented in the form of a loud speaker that outputs various alarm sounds or multimedia reproduction sounds.
  • a sound hole for emitting sound generated from the first sound output unit 152a may be formed in the window 151a of the display unit 151.
  • the present invention is not limited thereto, and the sound may be configured to be emitted along an assembly gap between the structures (for example, a gap between the window 151a and the front case 101).
  • in this case, the hole independently formed for sound output is invisible or hidden in appearance, so that the appearance of the mobile terminal 100 may be simpler.
  • the light output unit 154 is configured to output light for notifying when an event occurs. Examples of the event may include message reception, call signal reception, missed call, alarm, schedule notification, email reception, information reception through an application, and the like.
  • when the user's confirmation of the event is detected, the controller 180 may control the light output unit 154 to end the light output.
  • the first camera 121a processes an image frame of a still image or a moving image obtained by the image sensor in a shooting mode or a video call mode.
  • the processed image frame may be displayed on the display unit 151 and stored in the memory 170.
  • the first and second manipulation units 123a and 123b are examples of the user input unit 123, which is manipulated to receive commands for controlling the operation of the mobile terminal 100, and may be collectively referred to as a manipulating portion.
  • the first and second manipulation units 123a and 123b may adopt any manner in which the user operates them while receiving a tactile feeling, such as touch, push, and scroll.
  • the first and second manipulation units 123a and 123b may also be operated in a manner that involves no tactile feeling, such as a proximity touch or a hovering touch.
  • the first operation unit 123a is illustrated as being a touch key, but the present invention is not limited thereto.
  • the first manipulation unit 123a may be a mechanical key or a combination of a touch key and a push key.
  • the contents input by the first and second manipulation units 123a and 123b may be variously set.
  • the first manipulation unit 123a may receive commands such as menu, home key, cancel, and search.
  • the second manipulation unit 123b may receive commands such as adjusting the volume of the sound output from the first or second sound output units 152a and 152b and switching to the touch recognition mode of the display unit 151.
  • a rear input unit (not shown) may be provided on the rear surface of the terminal body.
  • the rear input unit is manipulated to receive commands for controlling the operation of the mobile terminal 100, and the input contents may be variously set. For example, it may receive commands such as power on/off, start, end, and scroll, commands for adjusting the volume of the sound output from the first and second sound output units 152a and 152b, and commands for switching to the touch recognition mode of the display unit 151.
  • the rear input unit may be implemented in a form capable of input by touch input, push input, or a combination thereof.
  • the rear input unit may be disposed to overlap the front display unit 151 in the thickness direction of the terminal body.
  • the rear input unit may be disposed at the rear upper end of the terminal body so that the user can easily manipulate it with the index finger when gripping the terminal body with one hand.
  • the present invention is not necessarily limited thereto, and the position of the rear input unit may be changed.
  • when the rear input unit is provided at the rear of the terminal body, a new type of user interface using the rear input unit may be implemented.
  • when the touch screen or the rear input unit described above replaces at least some functions of the first manipulation unit 123a provided on the front of the terminal body, so that the first manipulation unit 123a is not disposed on the front of the terminal body, the display unit 151 may be configured with a larger screen.
  • the mobile terminal 100 may be provided with a fingerprint recognition sensor for recognizing a user's fingerprint, and the controller 180 may use fingerprint information detected through the fingerprint recognition sensor as an authentication means.
  • the fingerprint recognition sensor may be embedded in the display unit 151 or the user input unit 123.
  • the interface unit 160 serves as a path for connecting the mobile terminal 100 to an external device.
  • the interface unit 160 may be at least one of a connection terminal for connection with another device (for example, an earphone or an external speaker), a port for short-range communication (for example, an infrared port (IrDA port), a Bluetooth port, or a wireless LAN port), and a power supply terminal for supplying power to the mobile terminal 100.
  • the interface unit 160 may be implemented in the form of a socket for receiving an external card, such as a subscriber identification module (SIM), a user identity module (UIM), or a memory card for information storage.
  • the second camera 121b may include a plurality of lenses arranged along at least one line.
  • the plurality of lenses may be arranged in a matrix format.
  • Such a camera may be referred to as an 'array camera'.
  • when the second camera 121b is configured as an array camera, images may be photographed in various ways using the plurality of lenses, and images of better quality may be obtained.
  • the flash 124 may be disposed adjacent to the second camera 121b.
  • the flash 124 shines light toward the subject when the subject is photographed by the second camera 121b.
  • the second sound output unit 152b may be additionally disposed on the terminal body.
  • the second sound output unit 152b may implement a stereo function together with the first sound output unit 152a and may be used to implement a speakerphone mode during a call.
  • the terminal body may be provided with at least one antenna for wireless communication.
  • the antenna may be built in the terminal body or formed in the case.
  • an antenna that forms part of the broadcast receiving module 111 (refer to FIG. 1A) may be configured to be pulled out from the terminal body.
  • the antenna may be formed in a film type and attached to the inner side of the rear cover 103, or may be configured such that a case including a conductive material functions as an antenna.
  • the terminal body is provided with a power supply unit 190 (see FIG. 1A) for supplying power to the mobile terminal 100.
  • the power supply unit 190 may include a battery 191 embedded in the terminal body or detachably configured from the outside of the terminal body.
  • the battery 191 may be configured to receive power through a power cable connected to the interface unit 160.
  • the battery 191 may be configured to enable wireless charging through a wireless charger.
  • the wireless charging may be implemented by a magnetic induction method or a resonance method (magnetic resonance method).
  • the rear cover 103 is coupled to the rear case 102 to cover the battery 191 to limit the detachment of the battery 191 and to protect the battery 191 from external shock and foreign matter.
  • the rear cover 103 may be detachably coupled to the rear case 102.
  • An accessory may be added to the mobile terminal 100 to protect the appearance or to assist or expand the function of the mobile terminal 100.
  • An example of such an accessory may be a cover or pouch that covers or accommodates at least one surface of the mobile terminal 100.
  • the cover or pouch may be configured to be linked with the display unit 151 to expand the function of the mobile terminal 100.
  • Another example of the accessory may be a touch pen for assisting or extending a touch input to a touch screen.
  • the information processed by the mobile terminal can be displayed using a flexible display.
  • this will be described in more detail with reference to the accompanying drawings.
  • FIG. 2 is a conceptual diagram illustrating another example of a deformable mobile terminal 200 according to the present invention.
  • the display unit 251 may be configured to be deformable by an external force.
  • the deformation may be at least one of warping, bending, folding, twisting, and curling of the display unit 251.
  • the deformable display unit 251 may be referred to as a 'flexible display unit'.
  • the flexible display unit 251 may include a general flexible display, electronic paper, and combinations thereof.
  • the mobile terminal 200 may include the features of the mobile terminal 100 of FIGS. 1A-1C or similar features.
  • a general flexible display is a light and durable display that maintains the characteristics of a conventional flat panel display while being fabricated on a thin and flexible substrate that can be warped, bent, folded, twisted, or curled like paper.
  • electronic paper is a display technology to which characteristics of general ink are applied, and differs from a conventional flat panel display in that it uses reflected light.
  • Electronic paper can change information using twist balls or electrophoresis using capsules.
  • in the first state (for example, a state in which the flexible display unit 251 is not deformed), the display area of the flexible display unit 251 is flat.
  • in the second state (for example, a state in which the flexible display unit 251 is deformed by an external force), the display area may be a curved surface.
  • the information displayed in the second state may be visual information output on the curved surface.
  • such visual information is implemented by independently controlling the light emission of unit pixels (sub-pixels) arranged in a matrix form.
  • the unit pixel refers to a minimum unit for implementing one color.
  • the flexible display unit 251 may be placed in a bent state (eg, bent vertically or horizontally) instead of being flat in the first state. In this case, when an external force is applied to the flexible display unit 251, the flexible display unit 251 may be deformed into a flat state (or less curved state) or more curved state.
  • the flexible display unit 251 may be combined with a touch sensor to implement a flexible touch screen.
  • the controller 180 (refer to FIG. 1A) may perform control corresponding to the touch input.
  • the flexible touch screen may be configured to detect a touch input not only in the first state but also in the second state.
  • the mobile terminal 200 may be provided with deformation detection means for detecting the deformation of the flexible display unit 251.
  • deformation detection means may be included in the sensing unit 140 (see FIG. 1A).
  • the deformation detecting means may be provided in the flexible display unit 251 or the case 201 to sense information related to deformation of the flexible display unit 251.
  • the information related to the deformation may include the direction in which the flexible display unit 251 is deformed, the degree of deformation, the deformation position, the deformation time, and the acceleration at which the deformed flexible display unit 251 is restored.
  • in addition, the information may include various other information that can be detected due to the bending of the flexible display unit 251.
  • based on the information related to the deformation of the flexible display unit 251 detected by the deformation detecting means, the controller 180 may change the information displayed on the flexible display unit 251 or generate a control signal for controlling a function of the mobile terminal 200.
  • the mobile terminal 200 may include a case 201 for accommodating the flexible display unit 251.
  • the case 201 may be configured to be deformable together with the flexible display unit 251 by an external force in consideration of characteristics of the flexible display unit 251.
  • the battery (not shown) included in the mobile terminal 200 may also be configured to be deformable together with the flexible display unit 251 by an external force in consideration of characteristics of the flexible display unit 251.
  • a stack and folding method in which battery cells are stacked up may be applied.
  • the state deformation of the flexible display unit 251 is not limited only by external force.
  • the flexible display unit 251 may be transformed into the second state by a command of a user or an application.
  • the mobile terminal can be extended to a wearable device that can be worn on the body, beyond being a device that the user mainly holds in the hand.
  • wearable devices include a smart watch, a smart glass, a head mounted display (HMD), and the like.
  • the wearable device may be configured to exchange (or interlock) data with another mobile terminal 100.
  • the short range communication module 114 may detect (or recognize) a wearable device that can communicate around the mobile terminal 100.
  • the controller 180 may transmit at least a portion of the data processed by the mobile terminal 100 to the wearable device through the short range communication module 114. Therefore, the user may use the data processed by the mobile terminal 100 through the wearable device. For example, when a call is received by the mobile terminal 100, a phone call may be performed through the wearable device, and when a message is received by the mobile terminal 100, the received message may be checked through the wearable device.
  • FIG. 3 is a perspective view illustrating an example of a watch type mobile terminal 300 according to another embodiment of the present invention.
  • the watch-type mobile terminal 300 includes a main body 301 having a display unit 351 and a band 302 connected to the main body 301 to be worn on a wrist.
  • the mobile terminal 300 may include the features of the mobile terminal 100 of FIGS. 1A to 1C or similar features.
  • the main body 301 includes a case forming an external appearance.
  • the case may include a first case 301a and a second case 301b that provide an interior space for accommodating various electronic components.
  • the present invention is not limited thereto, and one case may be configured to provide the internal space so that the mobile terminal 300 of the unibody may be implemented.
  • the watch type mobile terminal 300 is configured to enable wireless communication, and the main body 301 may be provided with an antenna for wireless communication.
  • the performance of the antenna may be extended using the case.
  • a case containing a conductive material may be configured to be electrically connected with the antenna to extend the ground area or the radiation area.
  • the display unit 351 may be disposed on the front surface of the main body 301 to output information, and the display unit 351 may be provided with a touch sensor to implement a touch screen. As illustrated, the window 351a of the display unit 351 may be mounted on the first case 301a to form the front surface of the terminal body together with the first case 301a.
  • the main body 301 may include a sound output unit 352, a camera 321, a microphone 322, a user input unit 323, and the like.
  • when the display unit 351 is implemented as a touch screen, it may function as the user input unit 323, and thus a separate key may not be provided on the main body 301.
  • the band 302 is made to be worn on the wrist to surround the wrist, and may be formed of a flexible material to facilitate wearing.
  • the band 302 may be formed of leather, rubber, silicone, synthetic resin, or the like.
  • the band 302 may be configured to be detachable from the main body 301 so that the user can replace it with various types of bands according to taste.
  • the band 302 can be used to extend the performance of the antenna.
  • the band may include a ground extension (not shown) electrically connected to the antenna to extend the ground area.
  • the band 302 may be provided with a fastener 302a.
  • the fastener 302a may be implemented by a buckle, a snap-fit hook structure, velcro (trade name), or the like, and may include an elastic section or material. In this figure, an example in which the fastener 302a is implemented in the form of a buckle is shown.
  • the mobile terminal described below with reference to FIGS. 4 to 25 may be implemented as one of the mobile terminals 100, 200, and 300 illustrated in FIGS. 1 to 3.
  • FIG. 4 is a block diagram illustrating a configuration module of a mobile terminal according to an embodiment of the present invention.
  • a mobile terminal may include a sensing unit 410, a display unit 420, and a controller 430.
  • the sensing unit 410 may sense various inputs of the user to the mobile terminal and the environment of the mobile terminal, and may transmit a sensing result so that the control unit 430 may perform an operation accordingly.
  • the sensing unit 410 may be provided in the display unit 420 and implemented as a touch screen.
  • the sensing unit 410 may be implemented by the sensing unit 140 of FIG. 1A.
  • the mobile terminal 400 may detect a screen movement interaction according to the content output to the display unit 420 through the sensing unit 410.
  • the screen movement interaction may include various touch operations.
  • the screen movement interaction may include a vertical movement interaction and a horizontal movement interaction.
  • the vertical movement interaction may include touch operations such as up and down dragging and up and down flicking.
  • the horizontal movement interaction may include touch operations such as left and right dragging and left and right flicking.
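Although the patent describes this classification only in prose, the distinction between a vertical and a horizontal movement interaction can be sketched as picking the dominant displacement axis of a drag or flick. All names and the touch-slop threshold below are illustrative assumptions, not from the patent:

```kotlin
import kotlin.math.abs

enum class MoveInteraction { VERTICAL, HORIZONTAL, NONE }

// Classify a touch movement by its dominant displacement axis.
// `threshold` is an assumed touch-slop value in pixels.
fun classifyMove(dx: Float, dy: Float, threshold: Float = 24f): MoveInteraction {
    if (maxOf(abs(dx), abs(dy)) < threshold) return MoveInteraction.NONE
    return if (abs(dy) >= abs(dx)) MoveInteraction.VERTICAL else MoveInteraction.HORIZONTAL
}

fun main() {
    check(classifyMove(dx = 5f, dy = -120f) == MoveInteraction.VERTICAL)   // upward drag
    check(classifyMove(dx = 90f, dy = 10f) == MoveInteraction.HORIZONTAL)  // rightward drag
    check(classifyMove(dx = 3f, dy = 4f) == MoveInteraction.NONE)          // below touch slop
    println("ok")
}
```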
  • the display unit 420 may display visual information.
  • the visual information may include text, an indicator, an icon, content, an application, an image, a video, and the like.
  • the display unit 420 may output visual information to the screen based on the control command of the controller 430.
  • the display unit 420 may be implemented as the display unit 151 of FIG. 1A or the display unit 351 of FIG. 3.
  • the display unit 420 may output up and down navigation keys or left and right navigation keys including at least one key based on the screen interaction detected by the sensing unit 410.
  • the up and down navigation keys or the left and right navigation keys may include a first key, a second key and a third key.
  • the up and down navigation keys may include an up key as a first key, a bookmark key as a second key, and a down key as a third key.
  • the left and right navigation keys may include a left key as a first key, a bookmark key as a second key, and a right key as a third key.
  • although the up and down navigation keys or the left and right navigation keys are shown as including three keys, they may include at least one key. That is, in FIGS. 5 to 20, the up and down navigation keys or the left and right navigation keys include three keys, and an example in which they include five keys will also be described.
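The key bars just described can be modeled as a simple data structure holding the first, second, and third keys. This is a minimal illustrative sketch with assumed names (the patent does not specify an implementation):

```kotlin
// Illustrative model of the key bars described above.
enum class KeyType { UP, DOWN, LEFT, RIGHT, BOOKMARK, BACK, HOME, RECENT_APPS }

data class KeyBar(val keys: List<KeyType>)

// Up/down navigation keys: up key (first), bookmark key (second), down key (third).
val upDownNav = KeyBar(listOf(KeyType.UP, KeyType.BOOKMARK, KeyType.DOWN))
// Left/right navigation keys: left key (first), bookmark key (second), right key (third).
val leftRightNav = KeyBar(listOf(KeyType.LEFT, KeyType.BOOKMARK, KeyType.RIGHT))
// Default key bar: back key, home key, recent app list key.
val defaultKeys = KeyBar(listOf(KeyType.BACK, KeyType.HOME, KeyType.RECENT_APPS))

fun main() {
    check(upDownNav.keys[1] == KeyType.BOOKMARK)  // bookmark key sits in the middle
    check(leftRightNav.keys.size == 3)            // three-key variant; a five-key bar is also possible
    println("ok")
}
```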
  • the controller 430 may process data, control the units of the above-described mobile terminal, and control data transmission / reception between the units.
  • the controller 430 may be implemented as the controller 180 of FIG. 1A.
  • operations performed by the mobile terminal may be controlled by the controller 430.
  • for convenience, the drawings and the following description collectively describe these operations as being performed or controlled by the mobile terminal.
  • the mobile terminal may output a navigation key in response to detecting a screen movement interaction.
  • the mobile terminal may output the up and down navigation keys when detecting the up and down movement interaction, and may output the left and right navigation keys when detecting the left and right movement interaction.
  • the functions of the navigation key output according to the screen movement interaction will be described with reference to FIGS. 5 to 7.
  • FIG. 5 is a diagram for explaining an embodiment in which a navigation key is output based on screen interaction according to an embodiment of the present invention.
  • a navigation key is output based on screen interaction according to an embodiment of the present invention.
  • descriptions overlapping with those of FIG. 4 will be omitted in the exemplary embodiment of FIG. 5.
  • the mobile terminal can output the default key 510 on the display.
  • the default key 510 may include a back key 511, a home key 512, and a recent app list key 513.
  • the user may move to the previous screen through the back key 511, or exit an executed keypad or pop-up window.
  • the user may move to the home screen through the home key 512.
  • the user may check or execute a list of recently used apps through the recent app list key 513.
  • the functions included in the default key 510 may be preset at the time of manufacture of the mobile terminal and may be changed by the user.
  • the arrangement of the keys included in the default key 510 may be changed.
  • the mobile terminal can output a default key 510 on the display unit.
  • the mobile terminal can sense the left and right movement interaction 520 through the sensing unit.
  • the left and right movement interaction 520 may correspond to a touch operation of touch dragging or touch flicking from left to right or from right to left.
  • the mobile terminal can sense the left and right movement interaction 520 without the default key 510 being output.
  • the mobile terminal may change the default key 510 into the left and right navigation keys 530 and output the same.
  • the left and right navigation keys 530 may include a left key 531, a bookmark key 532, and a right key 533.
  • the mobile terminal may output the left and right navigation keys 530 as the left and right movement interaction 520 is sensed while the default key 510 is not output.
  • the mobile terminal can output a default key 510 on the display unit.
  • the mobile terminal can sense the vertical movement interaction 540 through the sensing unit.
  • the vertical movement interaction 540 may correspond to a touch operation of touch dragging or touch flicking from bottom to top or from top to bottom.
  • the mobile terminal can sense the vertical movement interaction 540 without the default key 510 being output.
  • the mobile terminal may change the default key 510 to the vertical navigation key 550.
  • the up and down navigation keys 550 may include an up key 551, a bookmark key 552, and a down key 553.
  • the mobile terminal may output the up and down navigation keys 550 as the up and down movement interaction 540 is sensed while the default key 510 is not output.
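The behavior of FIG. 5 amounts to a small mapping from the sensed interaction to the key bar that is output. A minimal sketch, with all names assumed:

```kotlin
// Which key bar the display unit outputs, per the FIG. 5 behavior.
enum class Bar { DEFAULT, UP_DOWN_NAV, LEFT_RIGHT_NAV }
enum class Interaction { VERTICAL, HORIZONTAL }

fun barFor(interaction: Interaction): Bar = when (interaction) {
    Interaction.VERTICAL -> Bar.UP_DOWN_NAV       // up/down drag or flick -> up/down navigation keys
    Interaction.HORIZONTAL -> Bar.LEFT_RIGHT_NAV  // left/right drag or flick -> left/right navigation keys
}

fun main() {
    check(barFor(Interaction.HORIZONTAL) == Bar.LEFT_RIGHT_NAV)
    check(barFor(Interaction.VERTICAL) == Bar.UP_DOWN_NAV)
    println("ok")
}
```

Either the default key bar or no bar at all may precede the interaction; in both cases the same navigation key bar results.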
  • although the left and right movement interaction 520 and the up and down movement interaction 540 are illustrated as being sensed on the default key 510 in the embodiment of FIG. 5, they may also be sensed on other regions of the display unit. This will be described in detail with reference to FIG. 7.
  • although the default key 510, the left and right navigation keys 530, and the up and down navigation keys 550 are shown at the bottom of the display unit in FIG. 5, the default key 510, the left and right navigation keys 530, and the up and down navigation keys 550 may be output to any area on the display unit.
  • FIG. 6 is a diagram illustrating an embodiment in which a default key is output based on screen interaction according to an embodiment of the present invention.
  • a default key is output based on screen interaction according to an embodiment of the present invention.
  • FIGS. 4 and 5 descriptions overlapping with FIGS. 4 and 5 will be omitted in the embodiment of FIG. 6.
  • the mobile terminal can output the content 610 on the display unit.
  • the content 610 may correspond to a music playback application.
  • the mobile terminal can further output the default key 620 while the content 610 is output.
  • the mobile terminal can sense the first left and right movement interaction 630.
  • the mobile terminal may change the default key 620 to the left and right navigation keys 640 and output the same.
  • the left and right navigation keys 640 may include a left key 641, a bookmark key 642, and a right key 643.
  • the mobile terminal can sense an input signal 650 for selecting the right key 643.
  • the mobile terminal may perform a first function related to the output content 610.
  • the first function may correspond to a function of executing the last menu of the output content 610.
  • the first function may correspond to a function of playing the last song of the music playlist included in the music playback application.
  • when sensing an input signal for selecting the left key 641, the mobile terminal may perform another function related to the output content 610 (for example, executing the first menu).
  • the mobile terminal can sense the second left and right movement interaction 670.
  • the second horizontal movement interaction 670 may correspond to an interaction in a direction opposite to the first horizontal movement interaction 630. More specifically, when the first horizontal movement interaction 630 is a touch input signal dragging in a first direction, the second horizontal movement interaction 670 is a touch input signal dragging in a second direction, and the second direction may be opposite to the first direction. For example, when the first horizontal movement interaction 630 is an input signal of touch dragging from right to left, the second horizontal movement interaction 670 may correspond to an input signal of touch dragging from left to right.
  • the mobile terminal may change the left and right navigation keys 640 back to the default keys 620 and output them. That is, the mobile terminal may change the default key 620 to the left and right navigation keys 640 based on the first left and right movement interaction 630, and may change the left and right navigation keys 640 back to the default keys 620 based on the second left and right movement interaction 670, which has a direction opposite to the first left and right movement interaction 630.
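The toggle in FIG. 6 can be sketched as a tiny controller that remembers which drag direction brought up the navigation keys and reverts to the default keys on a drag in the opposite direction. Class and member names are illustrative assumptions:

```kotlin
enum class Dir { LEFT_TO_RIGHT, RIGHT_TO_LEFT }

class KeyBarController {
    var navKeysShown = false
        private set
    private var shownBy: Dir? = null

    fun onHorizontalDrag(dir: Dir) {
        if (!navKeysShown) {
            navKeysShown = true      // default keys -> left/right navigation keys
            shownBy = dir
        } else if (dir != shownBy) {
            navKeysShown = false     // opposite direction -> back to default keys
            shownBy = null
        }
    }
}

fun main() {
    val c = KeyBarController()
    c.onHorizontalDrag(Dir.RIGHT_TO_LEFT)   // first interaction (630)
    check(c.navKeysShown)
    c.onHorizontalDrag(Dir.LEFT_TO_RIGHT)   // opposite interaction (670)
    check(!c.navKeysShown)
    println("ok")
}
```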
  • FIG. 7 illustrates an embodiment of performing a function by sensing an input signal for selecting a navigation key according to an embodiment of the present invention.
  • FIGS. 4 to 6 descriptions overlapping with those of FIGS. 4 to 6 will be omitted in the embodiment of FIG. 7.
  • the mobile terminal can output the first content 710.
  • the first content 710 may correspond to a music reproduction application.
  • the mobile terminal can further output the default key 720 in a state where the first content 710 is output.
  • the mobile terminal can sense the left and right movement interaction 730.
  • the left and right movement interaction 730 may be sensed on the first content 710 instead of being sensed on the default key 720. That is, unlike FIGS. 5 and 6, the mobile terminal may sense an input signal of dragging from left to right on the first content 710 instead of the output default key 720.
  • the mobile terminal may change the default key 720 to the left and right navigation keys 740 and output the same.
  • the left and right navigation keys 740 may include a left key 741, a bookmark key 742 and a right key 743.
  • the mobile terminal may perform a first function related to the first content 710 by sensing the input signal 750 for selecting the right key 743.
  • the first function may correspond to a function of executing the last menu of the first content 710.
  • the first function may correspond to a function of playing the last song of the music playlist included in the music playback application.
  • thereafter, the mobile terminal may change the left and right navigation keys 740 back to the default keys 720 and output them.
  • for example, the mobile terminal may change the left and right navigation keys 740 to the default keys 720 immediately after performing the first function, or when no input signal is sensed for a preset time.
  • the mobile terminal can output the second content 711.
  • the second content 711 may correspond to a music list application.
  • the mobile terminal can further output the default key 720 in a state where the second content 711 is output.
  • the mobile terminal can sense the vertical movement interaction 731.
  • the vertical movement interaction 731 may correspond to the touch input signal dragged from the bottom to the top on the second content 711.
  • the mobile terminal may change the default key 720 to the up and down navigation keys 770 as the up and down movement interaction 731 is sensed, and output the same.
  • the up and down navigation keys 770 may include an up key 771, a bookmark key 772, and a down key 773.
  • the mobile terminal may perform a first function related to the second content 711 by sensing the input signal 780 for selecting the down key 773.
  • the first function may correspond to a function of outputting the lowest end of the second content 711.
  • since the second content 711 is a music list application, the first function may correspond to a function of outputting the last music of the music list included in the music list application.
  • when no input signal is input for a preset time after the up and down navigation keys 770 are output, the mobile terminal may change the up and down navigation keys 770 to the default key 720 and output them.
  • the mobile terminal when the screen movement interaction is not detected for a preset time, the mobile terminal may change the vertical navigation key 770 to the default key 720 and output the same.
  • the mobile terminal may change the navigation keys 740 and 770 to the default keys 720 when the user's input signal is not detected. .
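The revert-to-default behavior described in the bullets above can be sketched as a small idle-timeout state machine. This is a minimal illustration, not taken from the patent: the class name, method names, and the 3-second default are all assumptions.

```python
import time

class NavigationKeyBar:
    """Minimal sketch: navigation keys revert to the default key after a
    preset idle time with no input signal (timeout value is assumed)."""
    DEFAULT, NAV = "default", "navigation"

    def __init__(self, timeout_s=3.0):
        self.timeout_s = timeout_s          # preset time before reverting
        self.state = self.DEFAULT
        self.last_input = time.monotonic()

    def on_screen_move_interaction(self):
        # A drag interaction changes the default key to navigation keys.
        self.state = self.NAV
        self.last_input = time.monotonic()

    def on_input_signal(self):
        # Any key selection counts as input and resets the idle timer.
        self.last_input = time.monotonic()

    def tick(self, now=None):
        # Called periodically; reverts to the default key when no input
        # has been sensed for the preset time.
        now = time.monotonic() if now is None else now
        if self.state == self.NAV and now - self.last_input >= self.timeout_s:
            self.state = self.DEFAULT
        return self.state
```

Passing an explicit `now` to `tick` keeps the revert logic testable without real waiting; a production implementation would more likely schedule a one-shot timer than poll.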
  • the mobile terminal may output a navigation key in response to detecting a screen movement interaction.
  • the mobile terminal may determine the content controlled through the navigation key based on the point at which the screen moving interaction is input.
  • FIGS. 8 and 9 illustrate a case in which the landscape mode of the mobile terminal is activated, unlike FIGS. 5 to 7 described above.
  • FIG. 8 is a diagram for explaining an embodiment in which content to be controlled is determined based on a point at which a screen movement interaction is input according to an embodiment of the present invention.
  • descriptions overlapping with those of FIGS. 4 to 7 will be omitted in the embodiment of FIG. 8.
  • the mobile terminal can output the content 810.
  • the content 810 may correspond to a gallery application.
  • the gallery application may include at least one photo.
  • the mobile terminal may further output the default key 820.
  • the mobile terminal can sense the vertical movement interaction 830 on the content 810.
  • since the landscape mode is activated, the mobile terminal may sense an operation of the user dragging from the bottom to the top as the vertical movement interaction 830.
  • the mobile terminal can further output an indicator 840 for indicating the area to be controlled based on the point where the vertical movement interaction 830 is input. That is, when the vertical movement interaction 830 is detected on the photo output in the gallery application, the mobile terminal may output the indicator 840 in the frame of the photo. Accordingly, the user may know that the vertical navigation key 850 output based on the vertical movement interaction 830 controls the picture in the gallery application.
  • the mobile terminal may change the default key 820 into the vertical navigation key 850 and output the same.
  • the up and down navigation keys 850 may include an up key 851, a bookmark key 852, and a down key 853.
  • the mobile terminal can sense an input signal 860 that selects the down key 853.
  • the mobile terminal may perform the last function related to the controlled area.
  • since the controlled area corresponds to the pictures output in the gallery application, the mobile terminal may output the last (e.g., most recently taken) picture 870 among the output pictures.
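Determining the controlled area from the point where the interaction begins, as in the FIG. 8 embodiment, amounts to a hit test against the regions currently on screen. The sketch below is illustrative only; the region names and rectangles are assumptions, not from the patent.

```python
def controlled_region(point, regions):
    """Return the name of the region containing the touch point, so the
    navigation keys know which area to control (and where to draw the
    indicator frame).

    regions: list of (name, (x, y, width, height)) tuples, assumed layout.
    """
    px, py = point
    for name, (x, y, w, h) in regions:
        if x <= px < x + w and y <= py < y + h:
            return name
    return None
```

The same lookup serves both FIG. 8 (interaction on the photo) and FIG. 9 (interaction on the effect setting menu): whichever region contains the drag's starting point becomes the target of the navigation keys.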
  • FIG. 9 is a diagram for explaining an embodiment in which content to be controlled is determined based on a point at which a screen movement interaction is input, according to another exemplary embodiment.
  • FIG. 9 illustrates a case in which the vertical movement interaction is sensed in the effect setting menu of the picture.
  • the mobile terminal can output the content 910.
  • the content 910 may correspond to a gallery application.
  • the gallery application may include at least one photo, and may perform various functions for editing the photo.
  • the mobile terminal may further output the default key 920.
  • the mobile terminal can sense the vertical movement interaction 930 on the content 910.
  • the mobile terminal may further output an indicator 940 for indicating the area to be controlled based on the point where the vertical movement interaction 930 is input. That is, when the vertical movement interaction 930 is detected on the effect setting menu output in the gallery application, the mobile terminal may output the indicator 940 in the frame of the effect setting menu. Accordingly, the user may know that the vertical navigation key 960 output based on the vertical movement interaction 930 controls the effect setting menu in the gallery application.
  • the effect setting menu may mean a menu for setting an effect on a photo output in the gallery application.
  • the effect setting menu may include menus for setting a black and white effect, a light effect, and a blur effect on the photo output in the gallery application. In the embodiment of FIG. 9, it is assumed that the first menu 950 is selected in the effect setting menu.
  • the mobile terminal may change the default key 920 to the vertical navigation key 960 and output the same.
  • the up and down navigation keys 960 may include an up key, a bookmark key, and a down key.
  • the mobile terminal can sense an input signal 970 for selecting the down key.
  • the mobile terminal may perform the last function related to the controlled area. That is, in the embodiment of FIG. 9, since the controlled area corresponds to the effect setting menu in the gallery application, the mobile terminal may select the last menu in the effect setting menu. More specifically, when the mobile terminal senses the input signal 970 for selecting the down key while the first menu 950 is selected, the mobile terminal may select the second menu 980 located at the end of the effect setting menu. Accordingly, the mobile terminal may set the effect of the second menu 980 on the picture output in the gallery application according to the input signal 970 for selecting the down key.
  • the home screen output to the mobile terminal may include at least one icon.
  • in order to move the position of an icon, the user needs to execute the icon edit mode and then drag the icon to a new home screen.
  • an embodiment of easily moving an icon or a home screen using a navigation key will be described.
  • FIG. 10 is a view for explaining an embodiment of moving the position of an icon using a navigation key according to an embodiment of the present invention.
  • descriptions overlapping with those of FIGS. 4 to 9 will be omitted in the embodiment of FIG. 10.
  • the mobile terminal may output a plurality of home screens including at least one icon. Referring to the first drawing of FIG. 10, the mobile terminal can output the first home screen 1010 including the first icon 1011. In addition, the mobile terminal may output the default key 1020 while the first home screen 1010 is output.
  • the mobile terminal can sense the first input signal 1030 that selects the first icon 1011.
  • the first input signal 1030 may correspond to the long press touch input signal.
  • the mobile terminal may execute the icon edit mode 1040 as sensing the first input signal 1030.
  • the icon editing mode 1040 may correspond to a mode for moving the position of the icon.
  • the mobile terminal can sense the left and right movement interaction 1050.
  • the mobile terminal may change the default key 1020 to the left and right navigation keys 1060 and output the same.
  • the left and right navigation keys 1060 may include a left key, a bookmark key and a right key.
  • the mobile terminal can sense the second input signal 1070 for selecting the right key.
  • when the mobile terminal senses the vertical movement interaction, it may of course output the up and down navigation keys instead.
  • the mobile terminal may control to move the first icon to the second home screen 1080 of the plurality of home screens.
  • the second home screen 1080 may correspond to the last home screen located among the plurality of home screens.
  • when the mobile terminal senses a third input signal for selecting the left key, the mobile terminal may of course move the first icon 1011 to the third home screen located first among the plurality of home screens.
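The FIG. 10 behavior, where the right key moves the icon to the last home screen and the left key to the first, can be sketched as a list operation. The data layout (a list of lists of icon names) and the function name are assumptions for illustration.

```python
def move_icon(home_screens, icon, key):
    """Move `icon` to the last home screen on the right key, or to the
    first home screen on the left key.

    home_screens: list of lists of icon names (assumed representation).
    """
    for screen in home_screens:
        if icon in screen:
            screen.remove(icon)
            break
    target = home_screens[0] if key == "left" else home_screens[-1]
    target.append(icon)
    return home_screens
```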
  • FIG. 11 is a view for explaining an embodiment of moving a position of a home screen using a navigation key according to an embodiment of the present invention.
  • descriptions overlapping with those of FIG. 10 will be omitted in the embodiment of FIG. 11.
  • the mobile terminal may output a plurality of home screens including at least one icon.
  • the mobile terminal can output a first home screen 1110 including the first area.
  • the first area may correspond to an area where the icon is not output.
  • the mobile terminal may output the default key 1120 while the first home screen 1110 is output.
  • the mobile terminal can sense the first input signal 1130 for selecting the first area.
  • the first input signal 1130 may correspond to the long press touch input signal.
  • the mobile terminal may execute the home screen edit mode 1140 as sensing the first input signal 1130.
  • the home screen editing mode 1140 may correspond to a mode for changing the order of the home screen.
  • the mobile terminal may sense the left and right interactions 1150.
  • the mobile terminal may change the default key 1120 to the left and right navigation keys 1160 and output the same as sensing the left and right interactions 1150.
  • the left and right navigation keys 1160 may include a left key, a bookmark key and a right key.
  • the mobile terminal can sense the second input signal 1170 for selecting the right key.
  • when sensing the second input signal 1170, the mobile terminal may control the first home screen 1110 to move to the rear of the plurality of home screens.
  • when the mobile terminal senses a third input signal for selecting the left key, it can control the first home screen 1110 to move to the front of the plurality of home screens.
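Reordering the home screens themselves, as in FIG. 11, is the same idea applied one level up: the selected screen jumps to the front on the left key or to the rear on the right key. A minimal sketch under assumed names:

```python
def move_home_screen(screens, screen, key):
    """Move a home screen to the front of the order on the left key,
    or to the rear on the right key (list representation assumed)."""
    screens.remove(screen)
    if key == "left":
        screens.insert(0, screen)
    else:
        screens.append(screen)
    return screens
```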
  • the mobile terminal can output content including a plurality of menus or hierarchies. Referring to FIGS. 12 to 14, an embodiment of moving to the top or bottom menu of content by using a navigation key while content including a plurality of menus or hierarchies is output will be described.
  • FIG. 12 is a diagram for explaining an embodiment of controlling a widget using a navigation key according to an embodiment of the present invention.
  • descriptions overlapping with those of FIGS. 4 to 11 will be omitted in the embodiment of FIG. 12.
  • the mobile terminal can output the widget 1210 on the display unit.
  • the widget 1210 may correspond to a mini application made to directly use functions such as weather, calendar, calculator, news, games, stock information, etc., without using a web browser.
  • the widget 1210 may correspond to a mini music application that can directly play music on a home screen.
  • the mobile terminal may further output the default key 1220 while the widget 1210 is output.
  • the mobile terminal can detect the left and right movement interaction 1230 on the widget 1210.
  • the left and right movement interactions 1230 may correspond to an input signal for dragging and touching the area where the widget 1210 is output from right to left.
  • the mobile terminal may change the default key 1220 into the left and right navigation keys 1240 and output the same.
  • the left and right navigation keys 1240 may include a left key, a bookmark key and a right key.
  • the mobile terminal can sense the input signal 1250 for selecting the right key.
  • the mobile terminal may control to execute the last option related to the widget 1210 by sensing the input signal 1250 for selecting the right key. For example, when the user touches the right key of the left and right navigation keys while the mini music application is output, the mobile terminal may play the last song of the music playlist included in the mini music application.
  • FIG. 13 is a diagram illustrating an embodiment of moving a hierarchy of content including a plurality of hierarchies using a navigation key according to an exemplary embodiment of the present invention.
  • descriptions overlapping with those of FIG. 12 will be omitted in the exemplary embodiment of FIG. 13.
  • the mobile terminal can output the first content 1310 having a plurality of hierarchies.
  • the first content 1310 having a plurality of hierarchies may correspond to content in which a plurality of menus are composed of a plurality of hierarchies (upper hierarchies, lower hierarchies, etc.) in one application.
  • the first content 1310 having a plurality of hierarchies may include a top bar 1311 indicating a currently output hierarchical layer.
  • the first content 1310 having a plurality of layers may correspond to a setting application.
  • the setting application includes a menu for setting various functions of the mobile terminal in a plurality of layers.
  • the mobile terminal is outputting the middle layer of the first content 1310 having a plurality of layers.
  • the mobile terminal may further output a default key 1320.
  • the mobile terminal can sense the left and right movement interaction 1330 on the top bar 1311.
  • the left and right movement interaction 1330 may correspond to an input signal that the user drags and touches the top bar 1311 from right to left.
  • the mobile terminal may change the default key 1320 to the left and right navigation keys 1340 and output the same.
  • the left and right navigation keys 1340 may include a left key, a bookmark key and a right key.
  • the mobile terminal can sense an input signal 1350 for selecting a left key.
  • the mobile terminal can control to move to the uppermost layer 1360 of the first content 1310 having a plurality of layers.
  • more specifically, when the mobile terminal senses the input signal 1350 for selecting the left key while outputting an intermediate layer of the first content 1310 having a plurality of hierarchies, it may control to output the top layer 1360 of the first content 1310.
  • for example, when the mobile terminal senses the input signal 1350 for selecting the left key while outputting the 'language selection' menu under 'keyboard setting' in the setting application, the initial screen (top layer) of the setting application may be output.
  • the mobile terminal can output second content 1370 having a plurality of folders.
  • the second content 1370 having a plurality of folders may correspond to a folder application in which a plurality of data is divided into a plurality of folder structures.
  • the folder application may correspond to an application that stores various data stored in the mobile terminal in a hierarchical folder structure.
  • the mobile terminal is outputting an intermediate folder of the second content 1370 having a plurality of folders.
  • the mobile terminal may further output a default key 1320.
  • the mobile terminal can sense the left and right movement interaction 1330.
  • the mobile terminal may change the default key 1320 to the left and right navigation keys 1340 and output the same.
  • the left and right navigation keys 1340 may include a left key, a bookmark key and a right key.
  • the mobile terminal can sense the input signal 1380 for selecting the right key.
  • the mobile terminal may control to move to the lowest hierarchy of the second content 1370 having a plurality of folders as the input signal 1380 is sensed. More specifically, when the mobile terminal senses the input signal 1380 for selecting the right key while outputting an intermediate folder of the second content 1370 having a plurality of folders, the lowest folder in the second content 1370 may be output.
  • FIG. 13A illustrates an embodiment of controlling content having a plurality of hierarchies using a navigation key
  • FIG. 13B illustrates an embodiment of controlling content having a plurality of folders.
  • the first or the last executed contents of the plurality of contents may be output by using a navigation key.
  • the mobile terminal can control various contents by using a navigation key.
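The layer jumps of FIG. 13 (left key to the top layer, right key to the lowest layer) can be sketched over a nested-dict hierarchy. This is an illustrative model only: the tree shape, the "first child" descent rule, and the function names are assumptions; the patent only states that the top or lowest layer is output.

```python
def top_layer(path):
    """Left key: jump from the current layer back to the top layer.

    path: list of layer names from the top layer down to the one output.
    """
    return path[:1]

def lowest_layer(tree, path):
    """Right key: descend from the current layer to the lowest layer,
    following the first child at each level (a simplifying assumption)."""
    node = tree
    for name in path:            # walk down to the current layer
        node = node[name]
    while isinstance(node, dict) and node:
        name = next(iter(node))  # take the first sub-layer/folder
        path = path + [name]
        node = node[name]
    return path
```

The same pair of operations covers both the settings hierarchy of FIG. 13A and the folder structure of FIG. 13B.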
  • FIG. 14 is a diagram illustrating an embodiment of moving a menu of contents including a plurality of menus using a navigation key according to an embodiment of the present invention.
  • descriptions overlapping with those of FIGS. 12 and 13 will be omitted in the exemplary embodiment of FIG. 14.
  • the mobile terminal can output the content 1410 having a plurality of menus.
  • the mobile terminal may further output the default key 1420 while outputting the content 1410 having a plurality of menus.
  • the mobile terminal may sense the vertical movement interaction 1440 with the top menu 1430 selected in the content 1410.
  • the vertical movement interaction 1440 may correspond to an input signal of dragging from bottom to top on the content 1410 having a plurality of menus.
  • the mobile terminal may change the default key 1420 to the vertical navigation key 1450 and output the same.
  • the up and down navigation keys 1450 may include an up key, a bookmark key, and a down key.
  • the mobile terminal can sense an input signal 1460 for selecting the down key.
  • the mobile terminal may control to select the lowest menu 1470 in the content 1410 as it senses the input signal 1460 for selecting the down key.
  • the mobile terminal may control the lowest menu 1470 to be selected by sensing the input signal 1460 while the highest menu 1430 is selected in the content 1410.
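Selecting the highest or lowest menu with the up/down keys, as in FIG. 14, reduces to picking the first or last item of the menu list. A one-line sketch (function and key names assumed):

```python
def select_menu(menu_items, key):
    """Up key selects the highest (first) menu; down key selects the
    lowest (last) menu, as in the FIG. 14 embodiment."""
    return menu_items[-1] if key == "down" else menu_items[0]
```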
  • the navigation key may include a bookmark key.
  • when the mobile terminal maps a currently output content area using the bookmark key and then senses an input signal for selecting the bookmark key while outputting other content, it can output the mapped content area.
  • embodiments of the bookmark key will be described with reference to FIGS. 15 and 16. In FIGS. 15 and 16, it is assumed that the up and down navigation keys are output.
  • FIG. 15 illustrates an example of outputting an area of mapped content using a navigation key according to an embodiment of the present invention.
  • descriptions overlapping with those of FIGS. 4 to 14 will be omitted in the embodiment of FIG. 15.
  • the mobile terminal can output the vertically long content 1510.
  • the content 1510 has a large amount of content and may not be output on one screen. Therefore, the content 1510 may include a plurality of content areas output on the display unit.
  • the mobile terminal may sense the first input signal 1520 for selecting a bookmark key included in the up and down navigation keys while the first content area 1530 is output on the display unit.
  • the first input signal 1520 may correspond to an input signal for long press touching the bookmark key.
  • the mobile terminal may map the first content area 1530 and the first input signal 1520 output on the display unit when the first input signal 1520 is sensed.
  • the mobile terminal may output a second content area 1540 that is different from the first content area 1530.
  • the mobile terminal can sense the second input signal 1550.
  • the second input signal 1550 may correspond to an input signal for short touching the bookmark key. That is, the second input signal 1550 may correspond to an input signal that touches the same bookmark key as the first input signal 1520 but in a different manner.
  • the mobile terminal may output the first content area 1530 which is pre-mapped as the second input signal 1550 is sensed.
  • that is, after the mobile terminal maps the first content area 1530 through the first input signal 1520 and outputs the second content area 1540, when it senses the second input signal 1550, it may move from the second content area 1540 to the first content area 1530.
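The FIG. 15 bookmark behavior (a long press maps the area currently output, a short touch jumps back to it) can be sketched as a tiny class. The press kinds and names are assumptions for illustration, not the patent's terminology.

```python
class BookmarkKey:
    """Sketch of the bookmark key: a long press maps the content area
    currently output; a short touch returns to the mapped area."""

    def __init__(self):
        self._mapped = None

    def press(self, kind, current_area):
        if kind == "long":           # map the currently output area
            self._mapped = current_area
            return current_area
        # short touch: jump back to the mapped area, if any was stored
        return self._mapped if self._mapped is not None else current_area
```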
  • FIG. 16 illustrates an example of outputting an area of mapped content using a navigation key according to another exemplary embodiment of the present invention.
  • descriptions overlapping with those of FIG. 15 will be omitted in the embodiment of FIG. 16.
  • the first drawing of FIG. 16 illustrates an embodiment following the third drawing of FIG. 15. That is, in the first drawing of FIG. 16, it is assumed that the mobile terminal has mapped the first content area 1530 according to the first input signal 1520.
  • the mobile terminal can output vertically long content 1510.
  • the mobile terminal may sense a third input signal 1620 for selecting a bookmark key included in the up and down navigation keys.
  • the third content area 1630 may correspond to a content area different from the first content area 1530.
  • the third input signal 1620 may correspond to an input signal for long press touching the bookmark key.
  • the third input signal 1620 may correspond to an input signal for one long press touch of the bookmark key after the first input signal 1520.
  • in another embodiment, the third input signal 1620 may correspond to an input signal of two long presses of the bookmark key after the first input signal 1520.
  • the mobile terminal may map the third content area 1630 and the third input signal 1620 output on the display unit when the third input signal 1620 is sensed.
  • the mobile terminal may output a fourth content area 1640 that is different from the first content area 1530 and the third content area 1630.
  • the mobile terminal can sense the second input signal 1550.
  • the mobile terminal may output the first content area 1530 which is pre-mapped as the second input signal 1550 is sensed.
  • the mobile terminal can sense the fourth input signal 1660.
  • the fourth input signal 1660 may correspond to an input signal for short-touching the bookmark key once. That is, the fourth input signal 1660 may correspond to an input signal that touches the same bookmark key as the third input signal 1620 but in a different manner. Also, in another embodiment, the fourth input signal 1660 may correspond to an input signal of two short touches of the bookmark key.
  • the mobile terminal may output a pre-mapped third content area 1630 as the fourth input signal 1660 is sensed.
  • the mobile terminal may map the first content area 1530 and the third content area 1630 through the first input signal 1520 and the third input signal 1620 for selecting the bookmark key.
  • when the mobile terminal senses the second input signal 1550 or the fourth input signal 1660 while the second content area 1540 or the fourth content area 1640 is output, it may output the first content area 1530 or the third content area 1630, respectively.
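The FIG. 16 variant extends the bookmark key to several slots: one long press maps the first slot, two long presses map the second, and the matching number of short touches restores each slot. The slot-numbering scheme below is an assumption for illustration.

```python
class MultiBookmark:
    """Multi-slot bookmark sketch: slots are keyed by the number of
    presses used to create them (1 press -> slot 1, 2 presses -> slot 2)."""

    def __init__(self):
        self._slots = {}

    def map_area(self, press_count, area):
        # Long press(es): map the currently output area to a slot.
        self._slots[press_count] = area

    def restore(self, press_count, current_area):
        # Short touch(es): return the matching slot, or stay put.
        return self._slots.get(press_count, current_area)
```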
  • FIG. 17 is a diagram for explaining an embodiment of adjusting a volume of sound output using a navigation key according to an embodiment of the present invention.
  • descriptions overlapping with those of FIGS. 4 to 16 will be omitted in the embodiment of FIG. 17.
  • the mobile terminal can output the content 1710.
  • the content 1710 may correspond to an audio application. That is, the content 1710 may include a menu for adjusting the volume of the output sound.
  • the mobile terminal may further output a default key 1720 in a state in which the content 1710 is output.
  • the mobile terminal can sense the left and right movement interaction 1730 on the menu for adjusting the volume of the output sound.
  • the left and right movement interactions 1730 may correspond to an input signal for dragging and touching a menu for adjusting the volume of the output sound from left to right.
  • the mobile terminal may change the default key 1720 into the left and right navigation keys 1740 and output the same.
  • the left and right navigation keys 1740 may include a left key, a bookmark key and a right key.
  • the volume of sound output on the current audio application may correspond to the first volume 1750.
  • the mobile terminal can sense the input signal 1760 for selecting the right key.
  • the mobile terminal may change the volume of the output sound to the second volume 1770 by sensing the input signal 1760 for selecting the right key.
  • the second volume 1770 may correspond to the loudest sound output from the mobile terminal.
  • conversely, when an input signal for selecting the left key is sensed, the volume of the output sound may be changed to the smallest sound.
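The FIG. 17 behavior treats the navigation keys as jumps to the ends of the volume range: the right key goes straight to the loudest volume, the left key to the smallest. A one-line sketch; the 0-15 range is an assumed example, not from the patent.

```python
def jump_volume(key, vol_min=0, vol_max=15):
    """Right key jumps straight to the loudest volume; left key jumps
    to the smallest (range endpoints are assumptions)."""
    return vol_max if key == "right" else vol_min
```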
  • FIG. 18 is a diagram for explaining an embodiment of adjusting a volume of sound output using a navigation key according to another embodiment of the present invention. Hereinafter, descriptions overlapping with those of FIG. 17 will be omitted in the embodiment of FIG. 18.
  • the mobile terminal may output an audio application including a menu for adjusting the volume of sound.
  • the volume of sound output on the audio application may correspond to the first volume 1810. In FIG. 18, it is assumed that the left and right navigation keys are output.
  • the mobile terminal can sense the first input signal 1830 for selecting a bookmark key included in the left and right navigation keys 1820.
  • the first input signal 1830 may correspond to an input signal for long press touching the bookmark key.
  • the mobile terminal may map the first volume 1810, which is the volume of the currently output sound, with the first input signal 1830.
  • the mobile terminal can freely adjust the volume of sound output on the audio application while outputting a default key.
  • the mobile terminal can sense the left and right movement interaction 1840 on the menu for adjusting the volume of the output sound.
  • the mobile terminal may output left and right navigation keys 1850 as the left and right movement interactions 1840 are sensed.
  • the volume of sound output on the current audio application may correspond to the second volume 1860.
  • the second volume 1860 may correspond to a different value from the first volume 1810.
  • the mobile terminal can sense the second input signal 1870 for selecting the right key.
  • when the mobile terminal senses the second input signal 1870, it may change the volume of the sound output from the audio application from the second volume 1860 to the previously mapped first volume 1810.
  • in FIGS. 17 and 18, an embodiment of adjusting the volume of sound output from an audio application is described as an example, but an embodiment of setting a radio frequency may also be applied.
  • the mobile terminal may automatically output the up and down navigation keys or the left and right navigation keys according to the attribute of the output content. Referring to FIGS. 19 and 20, an embodiment in which a navigation key is automatically output according to an attribute of the content will be described.
  • FIG. 19 is a diagram for explaining an embodiment of automatically outputting a navigation key based on an attribute of content according to an embodiment of the present invention.
  • descriptions overlapping with those of FIGS. 4 to 18 will be omitted in the embodiment of FIG. 19.
  • the mobile terminal can output the content 1910.
  • the content 1910 may include one of a vertical menu and a horizontal menu.
  • the content 1910 may correspond to a calendar application.
  • the mobile terminal may further output the default key 1920.
  • the mobile terminal may sense an input signal 1930 for selecting a vertical menu included in the content 1910.
  • the vertical menu may correspond to a year.
  • the mobile terminal may output the vertical menu list 1940 as the mobile terminal senses an input signal 1930 for selecting the vertical menu.
  • the vertical menu list 1940 may correspond to a list of items included in the vertical menu. For example, when the vertical menu is a year, the vertical menu list 1940 may include 2016, 2017, 2018, 2019, and 2020. In addition, the vertical menu list 1940 may list the items vertically.
  • the mobile terminal may change the default key 1920 to the up and down navigation keys 1950 and output the same as sensing the input signal 1930 for selecting the vertical menu.
  • the mobile terminal may output the up and down navigation keys 1950 without sensing the up and down movement interaction.
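Choosing which navigation keys to output from the content's attribute, as in FIG. 19, can be sketched as a simple mapping from menu orientation to key set. The orientation strings and key names are assumptions for illustration.

```python
def navigation_keys_for(menu_orientation):
    """Output up/down keys for a vertically listed menu and left/right
    keys for a horizontal one, without requiring a movement interaction
    first (key names assumed)."""
    if menu_orientation == "vertical":
        return ("up", "bookmark", "down")
    return ("left", "bookmark", "right")
```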
  • FIG. 20 is a view for explaining an embodiment of automatically outputting a navigation key based on the attributes of content according to another embodiment of the present invention.
  • descriptions overlapping with those of FIG. 19 will be omitted in the embodiment of FIG. 20.
  • FIG. 20A illustrates an embodiment in which the content includes a vertical menu
  • FIG. 20B illustrates an embodiment in which the content includes a horizontal menu.
  • the first drawing of FIG. 20A illustrates an embodiment following the second drawing of FIG. 19.
  • in a state in which the vertical menu list 2010 is output, the mobile terminal can sense an input signal 2020 for selecting the down key included in the up and down navigation keys.
  • the mobile terminal may output the last item 2030 of the vertical menu list 2010 as it senses the input signal 2020 for selecting the down key.
  • the mobile terminal can output content including a horizontal menu.
  • the mobile terminal may output left and right navigation keys according to sensing an input signal for selecting a horizontal menu.
  • the mobile terminal can sense an input signal 2050 for selecting the right key.
  • the mobile terminal may be in a state where the first item 2040 is selected from the horizontal menu list.
  • the mobile terminal may select the last item 2060 of the horizontal menu list by sensing the input signal 2050 for selecting the right key.
  • in the embodiments described above, the up and down navigation keys or the left and right navigation keys include three keys.
  • in the embodiments described hereinafter, the up and down navigation keys or the left and right navigation keys include five keys.
  • FIG. 21 is a diagram for describing another navigation key being output based on screen interaction according to another embodiment of the present invention.
  • Hereinafter, descriptions overlapping with those of FIGS. 4 to 20 will be omitted in the embodiment of FIG. 21.
  • the mobile terminal can output a default key 2110 on the display unit.
  • the mobile terminal can sense the left and right movement interaction 2120.
  • the left and right movement interaction 2120 may correspond to a touch operation of dragging from left to right on the default key 2110.
  • the left and right movement interaction 2120 may correspond to a touch operation different from the left and right movement interaction 520 of FIG. 5.
  • the mobile terminal may change the default key 2110 into the left and right navigation keys 2130 and output the same.
  • the left and right navigation keys 2130 may include a left key 2131, a left folder key 2132, a bookmark key 2133, a right folder key 2134, and a right key 2135.
  • the mobile terminal can output a default key 2110 on the display unit.
  • the mobile terminal may sense the vertical movement interaction 2140.
  • the vertical movement interaction 2140 may correspond to a touch operation of dragging from the bottom to the top on the default key 2110.
  • the vertical movement interaction 2140 may correspond to a touch operation different from the vertical movement interaction 540 of FIG. 5.
  • the mobile terminal may change the default key 2110 into the vertical navigation key 2150 and output the same.
  • the up and down navigation keys 2150 may include an up key 2151, an up folder key 2152, a bookmark key 2153, a down folder key 2154, and a down key 2155.
  • FIG. 22 illustrates an example of moving a hierarchy of content including a plurality of hierarchies using a navigation key according to another exemplary embodiment of the present invention.
  • descriptions overlapping with those of FIG. 21 will be omitted in the exemplary embodiment of FIG. 22.
  • the mobile terminal can output content 2210 having a plurality of folders.
  • the mobile terminal is outputting an intermediate folder of the content 2210 having a plurality of folders.
  • the mobile terminal may further output a default key 2220.
  • the mobile terminal can sense the vertical movement interaction 2230.
  • the mobile terminal may change the default key 2220 into the vertical navigation key 2240 and output the same.
  • the mobile terminal can sense an input signal 2250 for selecting the right folder key.
  • the mobile terminal may output the lowest folder 2260 located in the content 2210 having a plurality of folders.
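The folder-hierarchy movement of FIG. 22 — where the right folder key jumps from an intermediate folder directly to the lowest folder — can be sketched as follows. This is a hypothetical model under the assumption that the folders form an ordered list; the function name and key labels are illustrative:

```python
# Hypothetical sketch of the hierarchy navigation of FIG. 22: the folder
# keys jump to the first or last folder, skipping intermediate folders,
# while the plain left/right keys move one step at a time.

def navigate(folders, current, key):
    """Return the new folder index after pressing a navigation key."""
    last = len(folders) - 1
    if key == "right_folder":     # jump to the lowest (last) folder
        return last
    if key == "left_folder":      # jump to the highest (first) folder
        return 0
    if key == "right":            # move one folder down, clamped at the end
        return min(current + 1, last)
    if key == "left":             # move one folder up, clamped at the start
        return max(current - 1, 0)
    return current                # e.g. bookmark key: handled elsewhere
```

Under this sketch, selecting the right folder key from an intermediate folder (as with input signal 2250) lands directly on the lowest folder, matching the output of folder 2260.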
  • FIG. 23 is a view for explaining an embodiment of moving an icon position using a navigation key according to another embodiment of the present invention.
  • descriptions overlapping with those of FIG. 22 will be omitted in the embodiment of FIG. 23.
  • the mobile terminal may output a plurality of home screens including at least one icon.
  • a plurality of icons may form a folder. That is, the plurality of home screens may include individual icons as well as a folder including at least one icon.
  • the mobile terminal may output the first home screen 2310 including the first icon 2311.
  • the mobile terminal may output a default key 2320 in a state where the first home screen 2310 is output.
  • the mobile terminal can sense the first input signal 2330 that selects the first icon 2311.
  • the mobile terminal may execute the icon edit mode 2340 upon sensing the first input signal 2330.
  • the mobile terminal can sense the left and right movement interaction 2350.
  • upon sensing the left and right movement interaction 2350, the mobile terminal may change the default key 2320 into the left and right navigation keys 2360 and output the same.
  • the left and right navigation keys 2360 may include a left key, a left folder key, a bookmark key, a right folder key, and a right key.
  • the mobile terminal can sense the second input signal 2370 for selecting the right folder key.
  • upon sensing the second input signal 2370, the mobile terminal may control the first icon 2311 to move to the folder 2380 located last among the plurality of home screens.
  • when the mobile terminal senses a third input signal for selecting the left folder key, it may move the first icon 2311 to the first folder among the plurality of home screens.
  • FIG. 24 is a diagram for explaining an embodiment of moving a location of a home screen using a navigation key according to another embodiment of the present invention.
  • descriptions overlapping with those of FIG. 23 will be omitted in the embodiment of FIG. 24.
  • the mobile terminal can output a first folder 2410 including a plurality of icons.
  • it may be assumed that the mobile terminal is in a state in which the left and right navigation keys 2420 are output.
  • the mobile terminal may sense the first input signal 2430 for selecting a bookmark key included in the left and right navigation keys 2420 while the first folder 2410 is output.
  • the first input signal 2430 may correspond to an input signal for long press-touching the bookmark key while the first folder 2410 is output.
  • the mobile terminal may map the first input signal 2430 and the first folder 2410.
  • the mobile terminal can output a first home screen 2440 including the first icon 2441.
  • the mobile terminal may sense an input signal of long press-touching the first icon 2441 and execute the icon edit mode.
  • the mobile terminal can sense the left and right movement interaction 2460 on the first home screen 2440.
  • upon sensing the left and right movement interaction 2460, the mobile terminal may change the default key 2420 into the left and right navigation keys 2470 and output the same.
  • the mobile terminal can sense the second input signal 2480 for selecting the bookmark key.
  • the second input signal 2480 may correspond to an input signal of short-touching the bookmark key. That is, the second input signal 2480 may touch the same bookmark key as the first input signal 2430, but in a different manner.
  • upon sensing the second input signal 2480, the mobile terminal may control the first icon 2441 to move to the previously mapped first folder 2410.
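The two-step bookmark behavior of FIG. 24 — a long press that maps the currently output area to the bookmark key, followed by a short press that moves a selected icon into the mapped area — can be sketched as follows. This is a hypothetical model; the class and method names are illustrative, not from the patent:

```python
# Hypothetical sketch of the bookmark key of FIG. 24: a long press maps
# the currently output area (e.g. folder 2410) to the key; a later short
# press moves the selected icon (e.g. icon 2441) into the mapped area.

class BookmarkKey:
    def __init__(self):
        self.mapped_area = None  # no area mapped until the first long press

    def long_press(self, current_area):
        """First input signal: map the currently output area to the key."""
        self.mapped_area = current_area

    def short_press(self, icon):
        """Second input signal: move the icon into the mapped area.

        Returns True if an area was mapped and the move happened.
        """
        if self.mapped_area is None:
            return False
        self.mapped_area.append(icon)
        return True
```

The same physical key thus performs different functions depending on the touch manner, mirroring how the first input signal 2430 (long press) and second input signal 2480 (short touch) are distinguished.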
  • FIG. 25 is a flowchart illustrating an embodiment of outputting a navigation key in response to detecting a screen movement interaction according to an embodiment of the present invention. Each step of FIG. 25 described below may be controlled by the controller of FIG. 1A.
  • the mobile terminal may output content on the display unit.
  • the content may include an icon, a home screen, a widget, and the like.
  • the mobile terminal may detect a screen movement interaction based on the content output through the sensing unit.
  • the screen movement interaction may include a vertical movement interaction and a left and right movement interaction.
  • the mobile terminal may determine the content controlled by the up and down navigation keys or the left and right navigation keys based on the point where the up and down movement interaction or the left and right movement interaction is input.
  • the mobile terminal may output up and down navigation keys or left and right navigation keys including at least one key based on the detected screen movement interaction.
  • the mobile terminal may change the default key into an up-down navigation key or a left-right navigation key based on the screen movement interaction.
  • the up and down navigation keys or the left and right navigation keys may include a first key, a second key and a third key.
  • the functions of the first key, the second key and the third key are as follows.
  • when sensing an input signal for selecting the first key, the mobile terminal may perform a first function related to the content.
  • the first function may correspond to a function of executing the first menu of the content.
  • when sensing the first input signal for selecting the second key, the mobile terminal may map the first content area output on the display unit with the first input signal. Subsequently, when sensing the second input signal while a second content area different from the first content area is being output, the mobile terminal may output the first content area on the display unit.
  • when sensing an input signal for selecting the third key, the mobile terminal may perform a third function related to the content.
  • the third function may correspond to a function of performing the last menu of the content.
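The three key functions described above for FIG. 25 — first key runs the content's first menu, third key its last menu, and the second (bookmark) key either maps the current content area or recalls a previously mapped one — can be gathered into one dispatch sketch. This is a hypothetical model under the assumption that menus form an ordered list; the function signature is illustrative:

```python
# Hypothetical dispatch for the three navigation keys of FIG. 25.
# `menus` is the ordered list of the content's menus; `bookmark` is a
# dict holding the mapped content area (None until first mapped).

def handle_key(key, menus, bookmark, current_area=None):
    """Return the result of pressing a navigation key over some content."""
    if key == "first":
        return menus[0]       # first function: first menu of the content
    if key == "third":
        return menus[-1]      # third function: last menu of the content
    if key == "second":
        if bookmark["area"] is None:
            bookmark["area"] = current_area  # first input: map the area
            return None
        return bookmark["area"]              # second input: recall the area
    raise ValueError(f"unknown key: {key}")
```

A short usage trace: pressing the second key once over area A stores A; pressing it again later, over a different area, returns A for output on the display.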
  • the present invention described above can be embodied as computer readable codes on a medium in which a program is recorded.
  • the computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of computer-readable media include hard disk drives (HDDs), solid state disks (SSDs), silicon disk drives (SDDs), ROMs, RAMs, CD-ROMs, magnetic tapes, floppy disks, and optical data storage devices. The computer-readable medium also includes implementations in the form of carrier waves (e.g., transmission over the Internet).
  • the computer may include the controller 180 of the terminal. Accordingly, the above detailed description should not be construed as limiting in all aspects and should be considered as illustrative. The scope of the invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the invention are included in the scope of the invention.
  • the present invention has industrial applicability in a mobile terminal, and can be repeatedly applied.


Abstract

The present invention relates to a mobile terminal and a control method therefor and, more particularly, to a mobile terminal comprising: a sensing unit; a display unit for outputting content; and a controller, wherein the controller controls the sensing unit to detect a screen movement interaction based on the output content, and controls the display unit to output thereon an up/down navigation key or a left/right navigation key, which includes at least one key, based on the detected screen movement interaction.
PCT/KR2016/009128 2016-08-12 2016-08-18 Mobile terminal and control method therefor WO2018030568A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020160102853A KR20180018015A (ko) 2016-08-12 2016-08-12 Mobile terminal and control method therefor
KR10-2016-0102853 2016-08-12

Publications (1)

Publication Number Publication Date
WO2018030568A1 true WO2018030568A1 (fr) 2018-02-15

Family

ID=61162800

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2016/009128 WO2018030568A1 (fr) 2016-08-18 Mobile terminal and control method therefor

Country Status (2)

Country Link
KR (1) KR20180018015A (fr)
WO (1) WO2018030568A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109032473A (zh) * 2018-08-09 2018-12-18 广州市箭冠网络科技有限公司 一种基于Android设备的列表顶部返回方法
CN109819102A (zh) * 2018-12-19 2019-05-28 努比亚技术有限公司 一种导航栏控制方法及移动终端、计算机可读存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120011462A1 (en) * 2007-06-22 2012-01-12 Wayne Carl Westerman Swipe Gestures for Touch Screen Keyboards
US20120127207A1 (en) * 2006-09-06 2012-05-24 Michael Matas Portable Electronic Device for Photo Management
US8365074B1 (en) * 2010-02-23 2013-01-29 Google Inc. Navigation control for an electronic device
US20140068424A1 (en) * 2012-08-31 2014-03-06 Adil Dhanani Gesture-based navigation using visual page indicators
KR101570368B1 (ko) * 2008-08-22 2015-11-20 엘지전자 주식회사 이동 단말기 및 그 메뉴 표시 방법


Also Published As

Publication number Publication date
KR20180018015A (ko) 2018-02-21

Similar Documents

Publication Publication Date Title
WO2017057803A1 (fr) Terminal mobile et son procédé de commande
WO2017099276A1 (fr) Terminal mobile enroulable et son procédé de commande
WO2018105908A1 (fr) Terminal mobile et procédé de commande associé
WO2018034402A1 (fr) Terminal mobile et son procédé de commande
WO2016182132A1 (fr) Terminal mobile et son procédé de commande
WO2017131319A1 (fr) Terminal mobile doté d'un mode de fonctionnement à une seule main pour commander un dispositif jumelé, notification et application
WO2017082508A1 (fr) Terminal de type montre, et procédé de commande associé
WO2017104860A1 (fr) Terminal mobile enroulable
WO2020171287A1 (fr) Terminal mobile et dispositif électronique comportant un terminal mobile
WO2016032045A1 (fr) Terminal mobile et son procédé de commande
WO2015199270A1 (fr) Terminal mobile, et procédé de commande correspondant
WO2017030223A1 (fr) Terminal mobile à unité de carte et son procédé de commande
WO2017003055A1 (fr) Appareil d'affichage et procédé de commande
WO2017119529A1 (fr) Terminal mobile
WO2017047854A1 (fr) Terminal mobile et son procédé de commande
WO2017034126A1 (fr) Terminal mobile
WO2018084351A1 (fr) Terminal mobile, et procédé de commande associé
WO2017007064A1 (fr) Terminal mobile, et son procédé de commande
WO2017039051A1 (fr) Terminal mobile de type montre et son procédé de commande
WO2018043844A1 (fr) Terminal mobile
WO2015199381A1 (fr) Terminal mobile et son procédé de commande
WO2017051959A1 (fr) Appareil de terminal et procédé de commande pour appareil de terminal
WO2016039498A1 (fr) Terminal mobile et son procédé de commande
WO2021182692A1 (fr) Terminal mobile, dispositif électronique ayant un terminal mobile, et procédé de commande du dispositif électronique
WO2017086538A1 (fr) Terminal mobile et son procédé de commande

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16912767

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16912767

Country of ref document: EP

Kind code of ref document: A1