KR102018547B1 - Mobile terminal and control method for the mobile terminal - Google Patents

Mobile terminal and control method for the mobile terminal

Info

Publication number
KR102018547B1
Authority
KR
South Korea
Prior art keywords
mobile terminal
image object
screen
lock
display unit
Prior art date
Application number
KR1020120129721A
Other languages
Korean (ko)
Other versions
KR20140062853A (en)
Inventor
김미영
세르게이 고어릭
쿠젠스토프 바씰리
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to KR1020120129721A
Publication of KR20140062853A
Application granted
Publication of KR102018547B1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/014 Force feedback applied to GUI

Abstract

The present invention relates to a mobile terminal capable of executing a locked state that restricts the input of control commands to applications, and to a control method thereof. According to an embodiment of the present invention, a mobile terminal includes a terminal main body; a display unit disposed on the main body, which outputs a lock screen in the locked state in which the input of control commands to applications is restricted; and a controller configured to output on the lock screen an image object whose position changes according to an external physical force applied to the main body, and to switch the locked state to a released state in response to the image object being positioned at a preset target point.

Description

MOBILE TERMINAL AND CONTROL METHOD FOR THE MOBILE TERMINAL

The present invention relates to a mobile terminal capable of executing a locked state that restricts the input of control commands to applications, and to a control method thereof.

Terminals may be divided into mobile (portable) terminals and stationary terminals according to whether they are movable. Mobile terminals may be further classified into handheld terminals and vehicle-mounted terminals according to whether the user can carry them directly.

As their functions diversify, terminals are implemented as multimedia players with complex functions such as taking pictures or videos, playing music or video files, playing games, and receiving broadcasts. To support and expand these functions, improvements to both the structural and software parts of the terminal may be considered.

In addition, a mobile terminal that receives control commands from the user through its touch screen may execute a locked state that restricts the input of the user's control commands in order to prevent unintended touch inputs.

An object of the present invention is to provide a mobile terminal, and a control method thereof, that offer an entertaining user interface on the lock screen.

According to an embodiment of the present invention, a mobile terminal includes a terminal main body; a display unit disposed on the main body, which outputs a lock screen in a locked state in which the input of control commands to applications is restricted; and a controller configured to output on the lock screen an image object whose position changes according to an external physical force applied to the main body, and to switch the locked state to a released state in response to the image object being positioned at a preset target point.

In one embodiment, when the display unit is activated, the controller outputs the image object at a predetermined start point on the display unit together with the lock screen, changes the position of the image object in dependence on the external physical force, and switches to the released state when the position of the image object has changed from the predetermined start point to the predetermined target point.

In one embodiment, the external physical force may be a touch input applied to the display unit or a motion of the main body detected through a motion sensor provided in the main body.

In one embodiment, when the image object is located at the predetermined target point, the controller executes a preset application and outputs an execution screen of the application on the display unit.

In one embodiment, a lock icon for switching the locked state to the released state is output on the lock screen, and when a touch input of a preset type is applied to the lock icon, the controller switches the locked state to the released state and outputs a home screen page on the display unit.

A mobile terminal and a control method thereof according to an embodiment of the present invention can move an image object on the lock screen displayed in the locked state based on user input. The position of the image object may change in response to movement of the main body of the mobile terminal or to a touch input applied to the mobile terminal, and when the image object reaches the target point, the locked state may be switched to the released state. The user can therefore enjoy a game-like experience by continuously applying user input to the mobile terminal in order to move the image object to the target point in the locked state.
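The unlock flow described above can be illustrated with a small sketch. This is not the patent's implementation; the class name, the `applyForce` method, and the distance tolerance are all illustrative assumptions:

```java
// Hypothetical sketch of the lock-screen behavior described above: an image
// object starts at a preset start point, moves in response to external input
// (a touch drag or sensed terminal motion), and the locked state is released
// when the object reaches a preset target point. All names are illustrative.
public class LockScreenSketch {
    enum State { LOCKED, RELEASED }

    private double x, y;                 // current image-object position
    private final double targetX, targetY;
    private final double tolerance;      // how close counts as "at the target"
    private State state = State.LOCKED;

    public LockScreenSketch(double startX, double startY,
                            double targetX, double targetY, double tolerance) {
        this.x = startX; this.y = startY;
        this.targetX = targetX; this.targetY = targetY;
        this.tolerance = tolerance;
    }

    // An external physical force (touch drag or sensed motion) moves the object.
    public void applyForce(double dx, double dy) {
        if (state == State.RELEASED) return;   // nothing to do once released
        x += dx; y += dy;
        if (Math.hypot(x - targetX, y - targetY) <= tolerance) {
            state = State.RELEASED;            // image object reached the target point
        }
    }

    public State state() { return state; }
}
```

A single `applyForce` call here stands in for either kind of "external physical force" the embodiments mention: a touch-drag delta or a displacement derived from the motion sensor.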

FIG. 1 is a block diagram illustrating a mobile terminal according to an embodiment of the present invention.
FIGS. 2A and 2B are front and rear perspective views of an example of a mobile terminal according to an embodiment of the present invention.
FIG. 3 is a flowchart illustrating a control method of releasing a locked state in a mobile terminal according to one embodiment disclosed herein.
FIG. 4 is a conceptual diagram illustrating the control method described with reference to FIG. 3.
FIGS. 5A and 5B are conceptual views illustrating a method of moving an image object displayed on a lock screen in a mobile terminal according to one embodiment of the present invention.
FIGS. 6A, 6B, and 6C are conceptual views illustrating methods of releasing a locked state in different ways in a mobile terminal according to one embodiment of the present invention.
FIG. 7 is a conceptual diagram illustrating a method of providing different lock screens in a mobile terminal according to one embodiment of the present invention.

Hereinafter, embodiments of the present disclosure will be described in detail with reference to the accompanying drawings, in which the same or similar components are denoted by the same reference numerals and redundant descriptions are omitted. The suffixes "module" and "unit" for components used in the following description are given or used merely for ease of description and do not by themselves carry distinct meanings or roles. In describing the embodiments disclosed herein, detailed descriptions of related known technologies are omitted where they might obscure the gist of the embodiments. The accompanying drawings are intended only to facilitate understanding of the embodiments disclosed in this specification and are not to be construed as limiting the technical spirit disclosed herein.

The mobile terminal described herein may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and the like. However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as digital TVs and desktop computers, except where a configuration is applicable only to mobile terminals.

FIG. 1 is a block diagram illustrating a mobile terminal according to an exemplary embodiment disclosed herein.

The mobile terminal 100 may include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190. The components shown in FIG. 1 are not essential, so a mobile terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system, or between the mobile terminal 100 and a network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short range communication module 114, a location information module 115, and the like.

The broadcast receiving module 111 receives a broadcast signal and/or broadcast related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may mean a server that generates and transmits a broadcast signal and/or broadcast related information, or a server that receives a previously generated broadcast signal and/or broadcast related information and transmits it to a terminal. The broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.

The broadcast related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.

The broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).

The broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may be configured to be suitable not only for the digital broadcasting systems described above but also for other broadcasting systems.

The broadcast signal and/or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The wireless signals may include various types of data according to the transmission and reception of voice call signals, video call signals, or text/multimedia messages.

The mobile communication module 112 is configured to implement a video call mode and a voice call mode. The video call mode refers to a state of making a call while viewing the other party's image, and the voice call mode refers to a state of making a call without viewing the other party's image. To implement the video call mode and the voice call mode, the mobile communication module 112 is configured to transmit and receive at least one of audio and video.

The wireless internet module 113 refers to a module for wireless internet access and may be embedded or external to the mobile terminal 100. Wireless Internet technologies may include Wireless LAN (Wi-Fi), Wireless Broadband (Wibro), World Interoperability for Microwave Access (Wimax), High Speed Downlink Packet Access (HSDPA), and the like.

The short range communication module 114 refers to a module for short range communication. As a short range communication technology, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, etc. may be used.

The location information module 115 is a module for obtaining the location of the mobile terminal, and a representative example is the Global Positioning System (GPS) module.

Referring to FIG. 1, the A / V input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122. The camera 121 processes image frames such as still images or moving images obtained by the image sensor in a video call mode or a photographing mode. The processed image frame may be displayed on the display unit 151.

The image frame processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras may be provided according to the use environment.

The microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, and the like, and processes it into electrical voice data. In the call mode, the processed voice data may be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and then output. The microphone 122 may implement various noise removal algorithms for removing noise generated in the process of receiving an external sound signal.

The user input unit 130 generates input data for the user to control the operation of the terminal. The user input unit 130 may include a key pad, a dome switch, a touch pad (constant voltage / capacitance), a jog wheel, a jog switch, and the like.

The sensing unit 140 detects the current state of the mobile terminal 100, such as its open/closed state, its location, the presence or absence of user contact, its orientation, and its acceleration or deceleration, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is a slide phone, it may sense whether the slide phone is open or closed. It may also sense whether the power supply unit 190 is supplying power and whether the interface unit 170 is coupled to an external device.

The output unit 150 generates output related to sight, hearing, or touch, and may include a display unit 151, a sound output module 152, an alarm unit 153, and a haptic module 154.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, when the mobile terminal is in a call mode, it displays a user interface (UI) or a graphic user interface (GUI) related to the call. When the mobile terminal 100 is in a video call mode or a photographing mode, it displays photographed and/or received images, a UI, or a GUI.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and an e-ink display.

Some of these displays may be configured as transparent or light-transmissive so that the outside can be seen through them. These may be called transparent displays, a representative example of which is the transparent OLED (TOLED). The rear structure of the display unit 151 may also be configured as a light-transmissive structure. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151.

According to an implementation form of the mobile terminal 100, two or more display units 151 may exist. For example, a plurality of display units may be spaced apart or integrally disposed on one surface of the mobile terminal 100, or may be disposed on different surfaces, respectively.

When the display unit 151 and a sensor for detecting a touch operation (hereinafter, a "touch sensor") form a mutual layer structure (hereinafter, a "touch screen"), the display unit 151 may be used as an input device in addition to an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in pressure applied to a specific portion of the display unit 151 or capacitance generated in a specific portion of the display unit 151 into an electrical input signal. The touch sensor may be configured to detect not only the position and area of the touch but also the pressure at the touch.

If there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the controller 180. As a result, the controller 180 can know which area of the display unit 151 has been touched.
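The touch path just described (touch sensor to touch controller to controller 180) can be sketched roughly as follows; the event record, the `process` method, and the region labels are hypothetical, not from the patent:

```java
// Rough sketch of the touch pipeline: the touch sensor converts a pressure
// or capacitance change at a display position into an input event, a touch
// controller processes it, and the resulting data tells the main controller
// which area of the display unit was touched. All names are illustrative.
public class TouchPipelineSketch {
    // Raw event produced by the touch sensor (position plus sensed pressure).
    record TouchEvent(int x, int y, double pressure) {}

    // The "touch controller": turns the raw signal into data for controller 180.
    static String process(TouchEvent e, int displayWidth, int displayHeight) {
        // Very coarse area classification, just to show the idea.
        String vertical = (e.y < displayHeight / 2) ? "upper" : "lower";
        String horizontal = (e.x < displayWidth / 2) ? "left" : "right";
        return vertical + "-" + horizontal + " at (" + e.x + ", " + e.y + ")";
    }
}
```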

Referring to FIG. 1, a proximity sensor may be disposed in an inner region of a mobile terminal surrounded by the touch screen or near the touch screen. The proximity sensor refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface or an object present in the vicinity without using a mechanical contact by using an electromagnetic force or infrared rays. Proximity sensors have a longer life and higher utilization than touch sensors.

Examples of the proximity sensor include a transmission photoelectric sensor, a direct reflection photoelectric sensor, a mirror reflection photoelectric sensor, a high frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, the touch screen is configured to detect the proximity of the pointer by the change of the electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of bringing a pointer close to the touch screen without contact so that its presence over the touch screen is recognized is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position at which a proximity touch is made by the pointer on the touch screen is the position at which the pointer is perpendicular to the touch screen when the proximity touch occurs.

The proximity sensor detects a proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch movement state). Information corresponding to the sensed proximity touch operation and proximity touch pattern may be output on the touch screen.
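The proximity-touch convention above, in which a hovering pointer is reported at the point on the screen directly beneath it, could be sketched like this; the detection range and all names are assumptions for illustration:

```java
// Sketch of proximity- vs contact-touch classification. A pointer at height 0
// makes a contact touch; a pointer hovering within the sensor's detection
// range makes a proximity touch, reported at the position perpendicularly
// below it on the screen plane. The range value is purely illustrative.
public class ProximitySketch {
    static final double DETECTION_RANGE_MM = 30.0; // assumed sensor range

    // Classify a pointer by its height above the touch screen.
    static String classify(double heightMm) {
        if (heightMm <= 0) return "contact touch";
        if (heightMm <= DETECTION_RANGE_MM) return "proximity touch";
        return "not detected";
    }

    // The reported proximity-touch position is the perpendicular projection
    // of the pointer onto the screen: simply its (x, y) coordinates.
    static double[] reportedPosition(double x, double y) {
        return new double[] { x, y };
    }
}
```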

The sound output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output module 152 may also output a sound signal related to a function (eg, a call signal reception sound, a message reception sound, etc.) performed in the mobile terminal 100. The sound output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal notifying the occurrence of an event in the mobile terminal 100. Examples of events occurring in the mobile terminal include call signal reception, message reception, key signal input, and touch input. The alarm unit 153 may output a signal notifying the occurrence of an event in a form other than a video or audio signal, for example, by vibration. Since the video or audio signal may also be output through the display unit 151 or the sound output module 152, the display unit 151 and the sound output module 152 may be classified as part of the alarm unit 153.

The haptic module 154 generates various haptic effects that a user can feel. Vibration is a representative example of the haptic effect generated by the haptic module 154. The intensity and pattern of vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be synthesized and output or may be sequentially output.

In addition to vibration, the haptic module 154 may generate various tactile effects, such as the effect of a pin array moving vertically against the contacted skin surface, the jetting or suction force of air through a jet or suction port, grazing of the skin surface, contact with an electrode, electrostatic force, and the reproduction of a sensation of cold or warmth using an element capable of absorbing or generating heat.

The haptic module 154 may not only deliver haptic effects through direct contact, but may also be implemented so that the user can feel haptic effects through a muscle sense of a finger or an arm. Two or more haptic modules 154 may be provided depending on the configuration of the mobile terminal 100.

The memory 160 may store a program for the operation of the controller 180 and may temporarily store input / output data (for example, a phone book, a message, a still image, a video, etc.). The memory 160 may store data regarding vibration and sound of various patterns output when a touch input on the touch screen is performed.

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 100 may also operate in connection with web storage that performs the storage function of the memory 160 over the Internet.

The interface unit 170 serves as a passage to all external devices connected to the mobile terminal 100. The interface unit 170 receives data from an external device, receives power and transfers it to each component inside the mobile terminal 100, or transmits internal data of the mobile terminal 100 to an external device. For example, the interface unit 170 may include wired/wireless headset ports, external charger ports, wired/wireless data ports, memory card ports, ports for connecting devices equipped with identification modules, audio input/output (I/O) ports, video input/output (I/O) ports, and earphone ports.

The identification module is a chip that stores various kinds of information for authenticating the usage authority of the mobile terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device equipped with an identification module (hereinafter, an "identification device") may be manufactured in the form of a smart card. Therefore, the identification device may be connected to the terminal 100 through a port.

When the mobile terminal 100 is connected to an external cradle, the interface unit 170 may serve as a passage through which power from the cradle is supplied to the mobile terminal 100, or as a passage through which various command signals input by the user at the cradle are transmitted to the mobile terminal. Such command signals or power input from the cradle may serve as signals for recognizing that the mobile terminal has been correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the mobile terminal. For example, it performs control and processing related to voice calls, data communication, video calls, and the like. The controller 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented within the controller 180 or separately from it.

 In addition, the controller 180 may perform a pattern recognition process for recognizing a writing input or a drawing input performed on the touch screen as text and an image, respectively.

In addition, if the state of the mobile terminal satisfies a set condition, the controller 180 may execute a locked state that restricts the input of the user's control commands to applications. The controller 180 may also control the lock screen displayed in the locked state based on a touch input detected through the display unit (hereinafter, the "touch screen" 151) in the locked state.
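A minimal sketch of this gating behavior, in which application control commands are blocked while locked but the lock screen itself still responds to touch, might look as follows (all names are hypothetical):

```java
// Sketch: in the locked state, control commands addressed to applications
// are dropped, while touch input is still routed to the lock screen so the
// lock screen can be controlled. Names and routing strings are illustrative.
public class LockGateSketch {
    private boolean locked = true;

    // Route an input depending on the lock state and the input's target.
    String handleInput(String input, boolean isAppCommand) {
        if (locked && isAppCommand) return "blocked";   // restricted while locked
        if (locked) return "lock-screen: " + input;     // lock screen still reacts
        return "app: " + input;                         // released state
    }

    void release() { locked = false; }
}
```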

The power supply unit 190 receives external and internal power under the control of the controller 180 and supplies the power required for the operation of each component.

Various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. In some cases, the embodiments described herein may be implemented by the controller 180 itself.

According to the software implementation, embodiments such as the procedures and functions described herein may be implemented as separate software modules. Each of the software modules may perform one or more functions and operations described herein.

The software code may be implemented as a software application written in a suitable programming language. The software code may be stored in the memory 160 and executed by the controller 180.

Hereinafter, the structure of a mobile terminal in which the components of the mobile terminal according to an embodiment of the present invention described with reference to FIG. 1 are disposed will be described.

FIG. 2A is a front perspective view of an example of a mobile terminal according to the present invention, and FIG. 2B is a rear perspective view of the mobile terminal shown in FIG. 2A.

The disclosed mobile terminal 100 has a bar-shaped terminal body. However, the present invention is not limited thereto and may be applied to various structures, such as a slide type, folder type, swing type, or swivel type, in which two or more bodies are coupled so as to be movable relative to each other.

As illustrated, the terminal body 100 (hereinafter, the "body") has a front surface, side surfaces, and a rear surface. The body also has two ends formed along its longitudinal direction.

The body 100 includes a case (casing, housing, cover, etc.) forming an appearance. In the present embodiment, the case may be divided into a front side (hereinafter referred to as 'front case' 101) and a rear side (hereinafter referred to as 'rear case' 102). Various electronic components are built in the space formed between the front case 101 and the rear case 102. At least one intermediate case may be further disposed between the front case 101 and the rear case 102.

The cases may be injection-molded from synthetic resin or formed of a metal material, for example, stainless steel (STS) or titanium (Ti).

The display unit 151, the audio output unit 152, the camera 121, the user input units 130 (131, 132), the microphone 122, the interface 170, and the like may be disposed in the front case 101 of the terminal body 100.

The display unit 151 occupies most of the main surface of the front case 101. The sound output unit 152 and the camera 121 are disposed in regions adjacent to one of the two ends of the display unit 151, and the user input unit 131 and the microphone 122 are disposed in regions adjacent to the other end. The user input unit 132 and the interface 170 may be disposed on the side surfaces of the front case 101 and the rear case 102.

The user input unit 130 is manipulated to receive commands for controlling the operation of the mobile terminal and may include a plurality of manipulation units 131 and 132. The manipulation units 131 and 132 may be collectively referred to as a manipulating portion, and any tactile manner in which the user operates them while experiencing a tactile feeling may be employed.

Content input by the first or second manipulation units 131 and 132 may be variously set. For example, the first manipulation unit 131 may receive commands such as start, end, and scroll, and the second manipulation unit 132 may receive commands such as adjusting the volume of the sound output from the sound output unit 152 or switching the display unit 151 to a touch recognition mode.

Referring to FIG. 2B, a sound output unit 152 ′ may be additionally disposed on the rear of the terminal body, that is, the rear case 102. The sound output unit 152 ′ may implement a stereo function together with the sound output unit 152 (see FIG. 2A), and may be used to implement a speakerphone mode during a call.

The terminal body is equipped with a power supply unit 190 for supplying power to the mobile terminal 100. The power supply unit 190 may be embedded in the terminal body or may be directly detachable from the outside of the terminal body.

In addition, the rear case 102 may further be equipped with a touch pad 135 for sensing a touch. Like the display unit 151, the touch pad 135 may also be configured to have a light transmission type. In this case, if the display unit 151 is configured to output visual information from both sides, the visual information may be recognized through the touch pad 135. The information output on both surfaces may be controlled by the touch pad 135. Alternatively, a display may be additionally mounted on the touch pad 135, and a touch screen may also be disposed on the rear case 102.

In addition, a camera 121' may be additionally mounted on the rear case 102 of the terminal body. The camera 121' may have a photographing direction substantially opposite to that of the camera 121 mounted on the front case, and may have a pixel count different from that of the camera 121.

For example, the camera 121 may have a low pixel count so that the user's face can be photographed and transmitted to a counterpart during a video call, while the camera 121' may have a high pixel count since it photographs a general subject and does not immediately transmit the image. The camera 121' may be installed in the terminal body 100 so as to be rotatable or to pop up.

A flash 123 and a mirror 124 are further disposed adjacent to the camera 121'. The flash 123 shines light toward the subject when the subject is photographed by the camera 121'. The mirror 124 allows the user to see his or her face when photographing himself or herself (self-photographing) using the camera 121'.

The touch pad 135 operates in association with the display unit 151 of the front case 101. The touch pad 135 may be disposed in parallel to the rear of the display unit 151. The touch pad 135 may have the same or smaller size as the display unit 151.

In addition, the controller 180 of the mobile terminal according to an embodiment of the present invention, which may include at least one of the components described above, may execute a locked state for restricting input of control commands for applications. In the locked state, a lock screen may be output, and the lock screen may include an image object whose display position is changed in response to an external physical force.

The lock screen may be output to the display unit 151 when the display unit 151 is in an on state, and the controller 180 may output the image object together with the screen information set as the lock screen.

Here, the locked state is a state in which input of the user's control commands for the applications included in the mobile terminal is limited. This is to prevent control commands not intended by the user from being input on a mobile terminal having a touch screen (or a display unit 151 formed as a touch screen), thereby activating or deactivating functions and applications of the mobile terminal. Therefore, in the locked state, input of the user's control commands through the touch screen (or the display unit 151) and the other user input units 130 may be limited within a set range.

On the other hand, in the locked state, input of the user's control commands is limited, but the functions of the mobile terminal and the applications that were operating before the locked state was entered may continue to run.

The release state is a state in which a user's input of a control command to the mobile terminal is not restricted. Therefore, in the released state, the functions and applications of the mobile terminal are activated or deactivated according to the control command input by the user through the touch screen (or the display unit 151 or the user input unit 130).

The locked state may be executed when no user input is detected for a set time on the touch screen (or the display unit 151) and the other user input units 130 provided in the mobile terminal. The set time may be changed according to the user's setting. In addition, the locked state may be executed when the user presses a special key (for example, a hold key) provided in the mobile terminal and set in advance for the locked state.
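The locked/released behavior described above — entering the locked state after a set idle time or when a hold key is pressed — can be sketched as a small state machine. This is an illustrative sketch only; the class and names below (e.g. `LockStateMachine`, `idle_timeout`) are hypothetical and not part of the disclosed embodiment.

```python
class LockStateMachine:
    """Hypothetical sketch of the locked/released states described above."""
    LOCKED, RELEASED = "locked", "released"

    def __init__(self, idle_timeout=30.0):
        self.state = self.RELEASED
        self.idle_timeout = idle_timeout  # the set time; user-configurable per the description
        self.last_input_time = 0.0

    def on_user_input(self, now):
        # Any user input resets the idle timer.
        self.last_input_time = now

    def on_tick(self, now):
        # Enter the locked state when no input is detected for the set time.
        if self.state == self.RELEASED and now - self.last_input_time >= self.idle_timeout:
            self.state = self.LOCKED

    def on_hold_key(self):
        # A special key (e.g. a hold key) locks the terminal immediately.
        self.state = self.LOCKED
```

With a 30-second timeout, the machine stays released while input keeps arriving and locks once the idle period elapses or the hold key is pressed.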

Hereinafter, a method of switching the locked state to the released state by moving the position of the image object included in the lock screen will be described in more detail with reference to the accompanying drawings. FIG. 3 is a flowchart illustrating a control method of releasing a locked state in a mobile terminal according to an exemplary embodiment of the present disclosure, and FIG. 4 is a conceptual diagram illustrating the control method described with reference to FIG. 3.

In the mobile terminal according to an embodiment of the present invention, the controller 180 outputs an image object on the lock screen (S310).

For example, referring to FIG. 4A, the image object 411 may have a specific shape. The visual appearance (shape, color, size, transparency, etc.) of the image object 411 may be variously changed.

In addition, the image object 411 may be displayed in a region 410 having a predetermined size occupying a predetermined portion of the lock screen. The image object 411 may be located anywhere in the one area 410.

As illustrated, the one area 410 may be smaller than the screen display area of the display unit 151, or, although not shown, may be the same size as the screen display area.

In addition, the one area 410 may be displayed separately from the screen information displayed on the lock screen 400 so that the user can identify it.

Meanwhile, the position where the image object 411 is initially output may be specified in advance. That is, when the lighting of the display unit 151 is turned from the off state to the on state in the locked state, the controller 180 outputs the lock screen on the display unit 151. When the lock screen is output on the display unit 151, the image object 411 included in the one area 410 may be output at a preset start point.

While the image object 411 is output on the lock screen, the controller 180 detects an external physical force applied to the mobile terminal main body 100 (refer to FIGS. 1, 2A, and 2B) (S320). Here, the external physical force may mean any of various types of user inputs that can be applied to the mobile terminal to move the position of the image object 411.

For example, the external physical force may be a touch on the image object 411. That is, when the user touches the display unit 151 of the mobile terminal using a finger or a touch pen (or stylus pen), the controller 180 can detect the touch input to the display unit 151.

As another example, the external physical force may be a motion input, such as the user tilting or rotating the mobile terminal main body 100 (refer to FIGS. 1, 2A, and 2B). Such a motion input may be sensed through the sensing unit 140 (see FIG. 1).

The sensing unit 140 may be mounted inside the mobile terminal and may recognize movement or rotation of the mobile terminal. The sensing unit 140 may include at least one of a geomagnetic sensor, a gyro sensor, and an acceleration sensor.

The geomagnetic sensor detects the direction and magnitude of the earth's magnetic field and generates an electrical signal using it. The gyro sensor detects the rotational speed of the body and generates an electrical signal using it. The acceleration sensor measures the direction of gravitational acceleration, detects acceleration changes in a given direction, and generates an electrical signal using them. In this way, the sensing unit 140 may detect movement of the mobile terminal body using at least one of the geomagnetic sensor, the gyro sensor, and the acceleration sensor.
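As an illustration of how a sensed tilt could be turned into movement of the image object, the sketch below maps pitch and roll angles (as might be derived from the acceleration sensor) to a 2D on-screen displacement. The mapping and the `gain` constant are assumptions made for illustration; the embodiment states only that movement is sensed through the sensing unit 140.

```python
import math

def displacement_from_tilt(pitch_deg, roll_deg, dt, gain=100.0):
    """Map device tilt to a 2D displacement of the image object.

    A hypothetical mapping: the farther the body is tilted, the larger the
    per-frame displacement. `gain` (pixels/second at full tilt) is invented.
    """
    dx = gain * math.sin(math.radians(roll_deg)) * dt   # left/right tilt
    dy = gain * math.sin(math.radians(pitch_deg)) * dt  # forward/back tilt
    return dx, dy
```

A flat device produces no movement, while tilting the body sideways slides the object in that direction, consistent with the behavior described for FIG. 5B.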

Meanwhile, when an external physical force applied to the mobile terminal main body is sensed, the controller 180 changes the position of the image object 411 in response to the external physical force (S330).

For example, referring to FIG. 4, the controller 180 changes the position of the image object 411 in response to an external physical force (411a → 411b → 411c), as illustrated in (a) and (b) of FIG. 4. Here, the position of the image object 411 may be changed depending on the degree and direction of the applied external physical force.

As described above, when the position of the image object 411 is changed in response to an external physical force and the image object 411 reaches (or is positioned at) a preset target point, the controller 180 may switch the locked state to the released state (S340). That is, the controller 180 can recognize the image object 411 reaching the preset target point as a control command for switching the state of the mobile terminal from the locked state to the released state.

For example, as shown in (b) and (c) of FIG. 4, the position of the image object 411 is changed in response to an external physical force (411a → 411b → 411c → 411d), and when the image object 411 reaches the target point 412, the controller 180 may switch the locked state to the released state as shown in (d) of FIG. 4.
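Steps S330 and S340 — moving the object within its area in response to a force and unlocking once it reaches the target point — can be illustrated as follows. The clamping region and the 10-pixel tolerance are hypothetical values, not taken from the disclosure.

```python
import math

def apply_force(pos, delta, area):
    """Move the image object by `delta`, clamped to the one area (cf. 410).

    `area` is the (width, height) of the region holding the object; the
    clamping behavior is an assumption for illustration.
    """
    (w, h) = area
    x, y = pos[0] + delta[0], pos[1] + delta[1]
    return (min(max(x, 0.0), w), min(max(y, 0.0), h))

def reaches_target(pos, target, radius=10.0):
    # The lock is released (S340) once the object is within `radius` pixels
    # of the target point; the tolerance value is invented.
    return math.dist(pos, target) <= radius
```

Repeatedly applying forces and testing `reaches_target` models the loop of sensing (S320), moving (S330), and releasing the lock (S340).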

Meanwhile, as described above, in order for the image object 411 to reach the target point, the user may have to move the image object 411 past an obstacle image object 413 (see (a), (b), and (c) of FIG. 4) so that it reaches the target point. That is, the user may enjoy a game of moving the image object 411 in order to switch the locked state to the released state. Here, the method of making the image object 411 reach the target point may be implemented in various ways besides those shown or described herein.
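The obstacle image object described above implies some collision test while the image object moves. A minimal, hypothetical version (the disclosure does not specify any collision geometry) could be:

```python
def blocked_by_obstacle(pos, obstacle_rect):
    """True if the image object's position lies inside the obstacle's
    bounding box. A simplified, illustrative collision test only."""
    x, y = pos
    ox, oy, ow, oh = obstacle_rect  # top-left corner plus width/height
    return ox <= x <= ox + ow and oy <= y <= oy + oh
```

A movement step would then be rejected (or deflected) whenever `blocked_by_obstacle` returns True, forcing the user to steer around the obstacle 413.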

Meanwhile, when the locked state is switched to the released state, a home screen page (or standby screen) may be output on the display unit 151. Alternatively, the screen information that was last output on the display unit 151 before the locked state was executed may be output on the display unit 151. The type of screen information output to the display unit 151 when the locked state is switched to the released state may be variously changed based on the user's selection or the control of the controller.

Hereinafter, a method of moving an image object will be described in more detail with reference to the accompanying drawings. 5A and 5B are conceptual views illustrating a method of moving an image object displayed on a lock screen in a mobile terminal according to one embodiment of the present invention.

As one example, the external physical force (user input) for moving the image object may be a touch on the image object. As shown in (a), (b), and (c) of FIG. 5A, the position of the image object 411 may be changed in response to a touch input (including a proximity touch) applied to the image object 411. Here, the touch input to the image object 411 may be a drag touch input applied continuously to the image object 411.

Meanwhile, in response to the touch input, when the image object 411 reaches the target point 412, the controller 180 may switch the locked state to the released state as illustrated in (d) of FIG. 5A.

As another example, the external physical force (or user input) for moving the image object may be a movement such as the user tilting or rotating the mobile terminal body 100 (refer to FIGS. 1, 2A, and 2B).

As illustrated in (a), (b), and (c) of FIG. 5B, the controller 180 may move the image object 411 along the direction in which the main body 100 is inclined.

In addition, although not shown, the controller 180 may adjust the speed at which the image object 411 moves based on at least one of the angle and the speed at which the main body 100 is tilted. When the image object 411 reaches the target point 412 through movement of the main body 100, the controller 180 may switch the locked state to the released state as shown in (d) of FIG. 5B.
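The speed adjustment based on tilt angle and tilt speed could, for illustration, be a weighted combination of the two. The linear form and all coefficients below are assumptions, not values from the disclosure:

```python
def object_speed(tilt_angle_deg, tilt_rate_dps, base=1.0, k_angle=0.1, k_rate=0.05):
    """Hypothetical speed law: the object moves faster the farther and the
    faster the body is tilted. `base`, `k_angle`, and `k_rate` are invented
    weighting constants (pixels per frame, per degree, per deg/s)."""
    return base + k_angle * abs(tilt_angle_deg) + k_rate * abs(tilt_rate_dps)
```

A steeper or quicker tilt thus yields a larger per-frame displacement, matching the behavior described for FIG. 5B.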

As described above, the mobile terminal and its control method according to an embodiment of the present invention can switch the locked state to the released state based on the movement of the image object to the target point. Therefore, the user can obtain an effect such as playing a game on the lock screen.

Hereinafter, methods of releasing the locked state will be described in more detail with reference to the accompanying drawings. FIGS. 6A, 6B, and 6C are conceptual views illustrating methods of releasing the locked state in different ways in a mobile terminal according to one embodiment of the present invention.

In the mobile terminal according to an exemplary embodiment of the present invention, the locked state may be switched to the released state by methods other than moving the image object, described above with reference to FIGS. 4, 5A, and 5B.

For example, as illustrated in FIG. 6A, a lock icon 621 capable of controlling the locked state may be output on the lock screen 600 together with the area 610 for moving the image object 611.

That is, in the mobile terminal according to an embodiment of the present invention, as shown in FIGS. 6A and 6B, the locked state can be released by moving the image object 611 to the preset target point 612. In addition, as shown in FIGS. 6A and 6C, the locked state can be switched to the released state based on a preset type of touch applied to the lock icon 621. Here, the preset type may be a touch input dragging the lock icon 621 over a preset length, as shown in (c) of FIG. 6A.

Meanwhile, although not shown, in the mobile terminal according to an embodiment of the present invention, besides the method of switching the locked state to the released state using the lock icon, the locked state may be switched to the released state in response to a preset type of touch input applied to any point on the lock screen 600. Here, the preset touch input may be a drag touch input having a preset length.
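The preset drag-length condition for unlocking — whether on the lock icon or at an arbitrary point of the lock screen — amounts to comparing the drag distance against a threshold. The 120-pixel threshold below is a hypothetical value chosen for illustration:

```python
import math

def drag_unlocks(start, end, min_length=120.0):
    """True when a drag from `start` to `end` is at least `min_length`
    pixels long (the preset length; the value is invented)."""
    return math.dist(start, end) >= min_length
```

A short tap or small drag leaves the terminal locked, while a drag exceeding the preset length switches it to the released state.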

As described above, in the mobile terminal according to an embodiment of the present invention, releasing the locked state using an image object on the lock screen makes the unlocking process entertaining. Furthermore, the lock icon can be used to release the locked state quickly.

Meanwhile, in the mobile terminal according to an embodiment of the present disclosure, when the locked state is switched to the released state using the image object, a different screen may be provided in the released state.

For example, as shown in (a) and (b) of FIG. 6B, when the user releases the locked state by moving the image object 611 to the target point 612, the controller 180 may switch the locked state to the released state and execute a preset application, as illustrated in (c) of FIG. 6B. In this case, the execution screen of the preset application may be output on the display unit 151. For example, when the preset application is a camera application, an execution screen of the camera application may be output on the display unit 151, as shown in (c) of FIG. 6B. Meanwhile, the type of the preset application may be variously changed based on the user's selection.

On the other hand, in the mobile terminal according to an embodiment of the present invention, when the locked state is released using the image object 611, the controller 180 executes a preset application, whereas when the locked state is released through the lock icon 621, the home screen page may be output, as shown in (a) and (b) of FIG. 6C.

As such, the mobile terminal according to an exemplary embodiment may output different types of screen information on the display unit 151 according to a method for switching the locked state to the released state.

Therefore, when the user wants to execute a specific application immediately, the user can release the lock using the image object, and when the user wants to use the home screen page, the user can release the lock using a general unlocking method (for example, using the lock icon).
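The method-dependent screens after release can be modeled as a simple dispatch on the unlock method. The method names and screen identifiers below are illustrative only, not identifiers from the disclosure:

```python
def screen_after_unlock(method, preset_app="camera"):
    """Select the screen shown on release depending on how the lock was
    released: the image-object game launches a preset application, while
    the lock icon leads to the home screen page. All names are invented."""
    if method == "image_object":
        return f"{preset_app}_execution_screen"
    if method == "lock_icon":
        return "home_screen_page"
    raise ValueError(f"unknown unlock method: {method}")
```

This captures the behavior of FIGS. 6B and 6C: the same released state, but different screen information on the display unit 151 depending on the unlock path.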

Hereinafter, a method of providing different lock screens will be described in more detail with reference to the accompanying drawings. FIG. 7 is a conceptual diagram illustrating a method of providing different lock screens in a mobile terminal according to one embodiment of the present invention.

The mobile terminal according to an embodiment of the present disclosure may provide various methods for moving the image object to the target point in order to add enjoyment for the user through the lock screen. That is, the mobile terminal may provide a game screen on the lock screen so that the locked state is released through a kind of game. There may be a plurality of such games, and the controller 180 may provide a user interface for selecting, from among the plurality of games, the game to be used on the lock screen.

For example, as shown in (a) of FIG. 7, tab icons 710, 720, and 730 respectively corresponding to a plurality of games may be displayed on the lock screen 700 together with any one game screen 711. As shown in (a), (b), and (c) of FIG. 7, the game screen 711, 712, or 713 corresponding to the currently selected tab 710, 720, or 730 may be provided on the lock screen 700. In addition, when a predetermined target is achieved through the game screen currently output to the display unit 151, the controller 180 may switch the locked state to the released state.
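The tabbed selection among multiple unlock games can be sketched as follows; the class name and game names are invented for illustration:

```python
class LockScreenGames:
    """Sketch of the tabbed game selection on the lock screen: touching a
    tab icon switches the displayed game screen (cf. tabs 710, 720, 730)."""

    def __init__(self, games=("ball_maze", "tilt_race", "drag_puzzle")):
        self.games = list(games)
        self.selected = 0  # index of the currently shown game screen

    def touch_tab(self, index):
        # An out-of-range touch is ignored; the current screen stays shown.
        if 0 <= index < len(self.games):
            self.selected = index
        return self.games[self.selected]
```

Completing the goal of whichever game screen is currently shown would then trigger the switch from the locked state to the released state.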

As described above, the mobile terminal and its control method according to an embodiment of the present invention can provide various game screens on the lock screen, thereby increasing the user's enjoyment in using the mobile terminal.

In addition, the mobile terminal and its control method according to an embodiment of the present invention can move the image object on the lock screen displayed in the locked state based on a user input. The position of the image object may be changed in response to movement of the main body of the mobile terminal or a touch input applied to the mobile terminal, and when the image object reaches the target point, the locked state may be switched to the released state. Thus, the user can have the same fun as playing a game by continuously applying user input to the mobile terminal to move the image object to the target point in the locked state.

In addition, according to one embodiment disclosed herein, the above-described method may be implemented as processor-readable code on a medium in which a program is recorded. Examples of processor-readable media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and also include implementations in the form of a carrier wave (for example, transmission over the Internet).

The above-described mobile terminal is not limited to the configuration and method of the above-described embodiments; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.

Claims (5)

A terminal body;
A display unit disposed on the main body and configured to output a lock screen in a locked state in which input of control commands for applications is restricted; And
Output, on the lock screen, a plurality of tab icons and one game screen that corresponds to any one of the plurality of tab icons and includes an image object;
Change the position of the image object and output the image object based on a user input to the main body;
And a controller configured to switch the locked state to a released state in response to the image object being located at a predetermined target point in the game screen.
The user input corresponds to at least one of a touch input to the display unit and a motion input detected through a motion sensor provided in the main body.
The control unit,
If a touch is applied to any one of the plurality of tab icons, outputs a different game screen from the game screen,
And changes and displays a location of an image object included in the different game screen, based on a type of user input different from the user input, while the different game screen is displayed.
The mobile terminal of claim 1, wherein the control unit
When the display unit is activated, the image object is output on a predetermined start point on the display unit together with the lock screen and the game screen.
And switches the locked state to the released state when the position of the image object is changed, in dependence on the user input, from the preset start point to the preset target point.
The mobile terminal of claim 1,
The control unit,
And adjusting the speed at which the position of the image object is changed based on at least one of an inclination angle and a speed when the user input is a motion input to the main body.
The mobile terminal of claim 1, wherein the control unit
And when the image object is located at the preset target point, executing a preset application and outputting an execution screen of the application on the display unit.
The mobile terminal of claim 4, wherein
A lock icon for switching the lock state to a released state is output on the lock screen;
The control unit
When a preset type of touch input is applied to the lock icon, the locked state is switched to the released state, and the mobile terminal outputs a home screen page on the display unit.
KR1020120129721A 2012-11-15 2012-11-15 Mobile terminal and control method for the mobile terminal KR102018547B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020120129721A KR102018547B1 (en) 2012-11-15 2012-11-15 Mobile terminal and control method for the mobile terminal

Publications (2)

Publication Number Publication Date
KR20140062853A KR20140062853A (en) 2014-05-26
KR102018547B1 true KR102018547B1 (en) 2019-09-05

Family

ID=50890969

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120129721A KR102018547B1 (en) 2012-11-15 2012-11-15 Mobile terminal and control method for the mobile terminal

Country Status (1)

Country Link
KR (1) KR102018547B1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101732129B1 (en) * 2014-09-22 2017-05-02 주식회사 카우치그램 Secure Call Button

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100931926B1 (en) * 2008-06-16 2009-12-15 주식회사 인프라웨어 Mobile communication terminal with moving menu icon ui

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110008940A (en) * 2009-07-21 2011-01-27 엘지전자 주식회사 Mobile terminal and operation method thereof
KR101615975B1 (en) * 2009-10-19 2016-05-02 엘지전자 주식회사 Mobile terminal and operation control method thereof

Also Published As

Publication number Publication date
KR20140062853A (en) 2014-05-26

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right