WO2015083873A1 - Mobile terminal and corresponding cover - Google Patents

Mobile terminal and corresponding cover

Info

Publication number
WO2015083873A1
WO2015083873A1 (PCT/KR2013/012014)
Authority
WO
WIPO (PCT)
Prior art keywords
area
display unit
touch input
touch
mobile terminal
Prior art date
Application number
PCT/KR2013/012014
Other languages
English (en)
Korean (ko)
Inventor
장구앙
Original Assignee
LG Electronics Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc.
Publication of WO2015083873A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B1/00Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/38Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B1/40Circuits
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72409User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • H04M1/724092Interfacing with an external cover providing additional functionalities
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/22Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Definitions

  • the present invention relates to a mobile terminal mounted in a cover, and to the cover therefor.
  • Terminals may be divided into mobile / portable terminals and stationary terminals according to their mobility.
  • the mobile terminal may be further classified into a handheld terminal and a vehicle mount terminal according to whether a user can directly carry it.
  • the terminal is implemented in the form of a multimedia player having complex functions such as taking pictures or videos, playing music or video files, playing games, and receiving broadcasts. Further, in order to support and enhance such terminal functions, improvement of the structural and software parts of the terminal may be considered.
  • the cover surrounding the mobile terminal not only serves to protect the mobile terminal but may also provide a function of controlling various functions of the mobile terminal based on its open and closed states.
  • however, the cover surrounding most of the exterior of the mobile terminal still has the disadvantage of being treated as an accessory that is cumbersome for the user.
  • accordingly, a technical problem of the present invention is to provide a cover that improves the functionality of the terminal.
  • according to the present invention, the mobile terminal includes: a main body having a front side, a rear side, and side surfaces, the main body being placed, by a cover formed with a light-transmitting area, in a closed state or in an open state in which the front side is exposed; a display unit having a first area for outputting first content related to an event when the event occurs in the closed state, a second area for receiving a touch input of a user controlling the content, and a third area in which reception of the touch input is restricted; and a controller configured to control the first area to switch the first content to second content based on a touch input applied to the second area.
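The three-area behavior claimed above can be pictured with a short sketch. The following Java model is illustrative only; the Area enum, class, and method names are invented here and do not come from the patent.

    // Hypothetical model of the claimed partition: the first area shows event
    // content, the second area takes control touches, the third area is inert.
    enum Area { FIRST, SECOND, THIRD }

    class ClosedStateDisplay {
        private String firstAreaContent = "";  // first content (event information)

        // An event in the closed state puts its first content into the first area.
        void onEvent(String eventContent) {
            firstAreaContent = eventContent;
        }

        // A touch applied to the second area switches the first area to second content.
        void onTouch(Area touchedArea, String secondContent) {
            if (touchedArea == Area.SECOND) {
                firstAreaContent = secondContent;
            }
            // Area.THIRD: reception of touch input is restricted, so nothing happens.
        }

        String contentShownInFirstArea() { return firstAreaContent; }
    }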
  • the controller may control the display unit to output, in the second area, a first graphic image that receives a touch input for controlling the mobile terminal.
  • the first area is formed to receive a user's touch input, and the controller controls the display unit to output the first graphic image when a touch input is applied to either the first area or the second area.
  • the first content may include a control icon that receives a touch input to form a control command for controlling the mobile terminal, and the controller controls the mobile terminal based on the control command when a touch input is applied to the first graphic image.
  • the controller controls the display unit to change the first graphic image into a second graphic image, distinguished from the first graphic image, based on a touch input applied to the second area.
  • when the first graphic image corresponds to a virtual keyboard, the controller controls the display unit to output text to the first area based on touch inputs applied to the virtual keyboard.
  • in addition, the controller controls the display unit to switch the first content to the second content.
  • in a lock mode in which at least some functions of the mobile terminal are limited, the lock mode is released when a continuous touch input is applied from the first area to the second area.
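As a rough illustration of the unlock gesture just described (a continuous touch from the first area into the second area), consider the following sketch; it reuses the hypothetical Area enum from the earlier example and is not the patent's actual implementation.

    // Hypothetical lock-release detector: a drag that starts in the first area
    // and is released in the second area unlocks the terminal.
    class LockGestureDetector {
        private boolean locked = true;
        private boolean dragStartedInFirstArea = false;

        void onTouchDown(Area area) {
            dragStartedInFirstArea = (area == Area.FIRST);
        }

        void onTouchUp(Area area) {
            if (locked && dragStartedInFirstArea && area == Area.SECOND) {
                locked = false;  // the continuous touch reached the second area
            }
            dragStartedInFirstArea = false;
        }

        boolean isLocked() { return locked; }
    }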
  • the display unit may output screen information including at least one icon corresponding to an application in the open state, and the controller may output the icon to the second area in the closed state.
  • the control unit executes the application based on a touch input applied to an icon of the second area.
  • the controller controls the display unit to output a part of an execution screen of the application to the first area.
  • the controller controls the display unit to output another area of the execution screen to the first area based on a touch input applied to the second area.
  • the display unit further includes a touch sensing unit for receiving the touch input, and in the closed state the control unit changes the touch sensitivity of the first to third regions with respect to the touch input.
  • in the closed state, the first region is exposed to the outside through an opening region formed in the cover, and the control unit controls the touch sensing unit so that the touch sensitivity of the first region is adjusted to be lower than the touch sensitivity of the second region.
  • in the closed state, the controller adjusts the touch sensitivity of the third area so as to block reception of touch inputs applied to the third area.
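One plausible reading of this sensitivity adjustment is a per-area detection threshold: the exposed first area needs less gain than the second area, which is touched through the cover material, while the third area is disabled outright. The sketch below is an illustration only; the threshold values and names are invented assumptions.

    // Hypothetical per-area sensitivity table for the closed state.
    class TouchSensitivityController {
        private final java.util.EnumMap<Area, Double> threshold =
                new java.util.EnumMap<>(Area.class);

        void enterClosedState() {
            threshold.put(Area.FIRST, 0.8);               // exposed via opening: lower sensitivity
            threshold.put(Area.SECOND, 0.4);              // touched through cover: higher sensitivity
            threshold.put(Area.THIRD, Double.MAX_VALUE);  // blocked: never registers
        }

        // A touch registers only if the measured capacitance change clears the
        // threshold configured for its area.
        boolean registersTouch(Area area, double measuredDelta) {
            return measuredDelta >= threshold.getOrDefault(area, Double.MAX_VALUE);
        }
    }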
  • a control method of the mobile terminal includes: receiving an event; determining whether the display unit is in an open state in which it is exposed or in a closed state in which it is covered by the cover; outputting, in the closed state, first content related to the event to the first area of the display unit; and changing the first content to second content based on a touch input applied to the second area of the display unit in the closed state.
  • the control method may further include outputting screen information related to the event to at least one region of the display unit in the open state.
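Putting the method steps together, a minimal event-routing sketch might look as follows. All names are hypothetical, the full-screen rendering is a placeholder, and it reuses the ClosedStateDisplay class from the earlier sketch.

    // Hypothetical router for the claimed control method: events go to the
    // first area in the closed state and to the whole screen in the open state.
    class EventRouter {
        private final ClosedStateDisplay closedDisplay = new ClosedStateDisplay();
        private boolean coverClosed;

        void setCoverClosed(boolean closed) { coverClosed = closed; }

        void onIncomingEvent(String eventInfo) {
            if (coverClosed) {
                closedDisplay.onEvent(eventInfo);   // first content to the first area
            } else {
                renderFullScreen(eventInfo);        // screen information on the full display
            }
        }

        private void renderFullScreen(String info) {
            System.out.println("full-screen: " + info);  // stand-in for real rendering
        }
    }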
  • a cover mounted on a mobile terminal having a display unit for outputting screen information includes a first member and a second member connected to the first member, the second member being formed to cover the display unit or expose the display unit, wherein the second member is partitioned into first to third transmission areas that transmit the screen information to different degrees.
  • the first transmission area may be formed as an opening so that a part of the display unit is exposed, and a touch applied by the user to the second transmission area is transferred to the display unit.
  • one surface of the second transmission area facing the display unit may include at least one protrusion protruding toward the display unit, and the second transmission area transmits a graphic image output at a position corresponding to the protrusion.
  • the transmittance of the second transmission area may be changed so as to transmit the screen information output from the display unit.
  • according to the present invention, the user can be provided with information on an event received by the mobile terminal even in the closed state of the cover, and can control the mobile terminal without switching to the open state.
  • FIG. 1 is a block diagram illustrating a mobile terminal according to one embodiment disclosed herein.
  • FIGS. 2A and 2B are conceptual views of a communication system in which a mobile terminal according to the present invention can operate.
  • FIG. 3A is a front perspective view of an example of a mobile terminal related to the present invention.
  • FIG. 3B is a rear perspective view of the mobile terminal shown in FIG. 3A.
  • FIGS. 4A and 4B are conceptual views illustrating the mobile terminal mounted in the cover.
  • FIG. 5 is a flowchart illustrating a control method of a mobile terminal mounted in the cover of the present invention.
  • FIG. 6 is a conceptual view illustrating the control method of FIG. 5 according to an embodiment.
  • FIGS. 7A to 7C are conceptual views illustrating a method of controlling the first region according to various embodiments of the present disclosure.
  • FIGS. 8A to 8C are conceptual views illustrating a control method of controlling the mobile terminal based on touch inputs applied to the first and second areas according to various embodiments.
  • FIGS. 9A and 9B are conceptual views illustrating a control method of adjusting the touch sensitivity of the display unit in the open state and the closed state.
  • FIG. 10 is a conceptual view illustrating a touch input applied through the second area.
  • FIG. 11 is a conceptual view illustrating a control method of outputting an image to the third region.
  • the mobile terminal described herein may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and the like.
  • however, the configurations described herein may also be applied to fixed terminals such as digital TVs and desktop computers, except where applicable only to mobile terminals.
  • FIG. 1 is a block diagram illustrating a mobile terminal according to an exemplary embodiment disclosed herein.
  • the mobile terminal 100 may include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190.
  • the components shown in FIG. 1 are not essential, so a mobile terminal having more or fewer components may be implemented.
  • the wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and the wireless communication system or between the mobile terminal 100 and a network in which the mobile terminal 100 is located.
  • the wireless communication unit 110 may include at least one of the broadcast receiving module 111, the mobile communication module 112, the wireless internet module 113, the short range communication module 114, and the location information module 115. Can be.
  • the broadcast receiving module 111 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.
  • the broadcast channel may include a satellite channel and a terrestrial channel.
  • the broadcast management server may mean a server that generates and transmits a broadcast signal and / or broadcast related information or a server that receives a previously generated broadcast signal and / or broadcast related information and transmits the same to a terminal.
  • the broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal having a data broadcast signal combined with a TV broadcast signal or a radio broadcast signal.
  • the broadcast related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider.
  • the broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.
  • the broadcast related information may exist in various forms, for example, in the form of an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or an Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).
  • the broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T).
  • the broadcast receiving module 111 may be configured to be suitable for not only the above-described digital broadcasting system but also other broadcasting systems.
  • the broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.
  • the mobile communication module 112 transmits and receives a wireless signal with at least one of a base station, an external terminal, and a server on a mobile communication network.
  • the wireless signal may include various types of data according to transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.
  • the mobile communication module 112 is configured to implement a video call mode and a voice call mode.
  • the video call mode refers to a state of making a call while viewing the other party's video
  • the voice call mode refers to a state of making a call without viewing the other party's image.
  • the mobile communication module 112 is configured to transmit and receive at least one of audio and video.
  • the wireless internet module 113 refers to a module for wireless internet access and may be embedded or external to the mobile terminal 100.
  • wireless Internet technologies such as Wireless LAN (WLAN), Wireless Fidelity (WiFi) Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), and High Speed Downlink Packet Access (HSDPA) may be used.
  • the short range communication module 114 refers to a module for short range communication.
  • short range communication technologies such as Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and Near Field Communication (NFC) may be used.
  • the location information module 115 is a module for obtaining the location of the mobile terminal, and representative examples thereof include a Global Positioning System (GPS) module and a Wireless Fidelity (WiFi) module.
  • the A / V input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122.
  • the camera 121 processes image frames such as still images or moving images obtained by the image sensor in a video call mode or a photographing mode.
  • the processed image frame may be displayed on the display unit 151.
  • the image frame processed by the camera 121 may be stored in the memory 160 or transmitted to an external device through the wireless communication unit 110. Further, the user's location information may be obtained from the image frame acquired by the camera 121. Two or more cameras 121 may be provided depending on the use environment.
  • the microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data.
  • the processed voice data may be converted into a form transmittable to the mobile communication base station through the mobile communication module 112 and output in the call mode.
  • the microphone 122 may implement various noise removing algorithms for removing noise generated in the process of receiving an external sound signal.
  • the user input unit 130 generates input data according to a control command for controlling the operation of the mobile terminal 100 applied from the user.
  • the user input unit 130 may include a key pad, a dome switch, a touch pad (static pressure / capacitance), a jog wheel, a jog switch, and the like.
  • the sensing unit 140 detects the current state of the mobile terminal 100, such as its open/closed state, its position, the presence or absence of user contact, its orientation, and its acceleration/deceleration, and generates a sensing signal (or detection signal) for controlling the operation of the mobile terminal 100.
  • the sensing unit 140 may detect whether the slide phone is opened or closed when the mobile terminal 100 is in the form of a slide phone.
  • the sensing unit 140 may detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled to an external device.
  • the output unit 150 generates output related to visual, auditory, or tactile senses, and may include a display unit 151, a sound output module 153, an alarm unit 154, and a haptic module 155.
  • the display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and an e-ink display.
  • some of these displays may be configured as a transparent or light-transmissive type so that the outside can be seen through them. These may be referred to as transparent displays.
  • a representative example of a transparent display is the transparent OLED (TOLED).
  • the rear structure of the display unit 151 may also be configured as a light transmissive structure. With this structure, the user can see the object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.
  • two or more display units 151 may exist.
  • a plurality of display units may be spaced apart or integrally disposed on one surface of the mobile terminal 100, or may be disposed on different surfaces, respectively.
  • the display unit 151 may be configured as a stereoscopic display unit 152 for displaying a stereoscopic image.
  • the stereoscopic image represents a three-dimensional stereoscopic image, that is, an image that makes the viewer feel that the gradual depth and reality of an object on a monitor or screen are the same as in real space.
  • a 3D stereoscopic image is implemented using binocular disparity. Binocular disparity refers to the disparity created by the positions of the two eyes, which are set apart from each other. When the two eyes see different two-dimensional images and the images are transferred to the brain through the retinas and fused, the viewer can feel the depth and reality of a stereoscopic image.
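For reference, this relation is standard stereo geometry rather than something stated in the source: with viewing baseline b (the separation of the two viewpoints), focal length f, and disparity d between corresponding points in the left and right images, the recovered depth is Z = f·b/d, so a smaller disparity is perceived as a greater distance.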
  • the stereoscopic display unit 152 may adopt a three-dimensional display method such as a stereoscopic method (glasses type), an autostereoscopic method (glasses-free type), or a projection method (holographic type). Stereoscopic methods commonly used in home television receivers include the Wheatstone stereoscope method.
  • Examples of the auto stereoscopic method include a parallax barrier method, a lenticular method, an integrated imaging method, a switchable lens, and the like.
  • Projection methods include reflective holographic methods and transmissive holographic methods.
  • a 3D stereoscopic image is composed of a left image (left eye image) and a right image (right eye image).
  • methods of merging the left and right images into a three-dimensional stereoscopic image include a top-down method in which the left and right images are arranged up and down in one frame, an L-to-R (left-to-right, side by side) method in which the left and right images are arranged left and right in one frame, a checker board method in which pieces of the left and right images are arranged in tile form, an interlaced method in which the left and right images are alternately arranged in columns or rows, and a time sequential (frame by frame) method in which the left and right images are alternately displayed over time.
  • for a 3D thumbnail image, a left image thumbnail and a right image thumbnail may be generated from the left and right images of the original image frame, respectively, and combined to generate a single 3D thumbnail image.
  • a thumbnail refers to a reduced image or a reduced still image.
  • the left image thumbnail and the right image thumbnail generated in this way are displayed on the screen with a left-right distance difference corresponding to the parallax between the left and right images, thereby producing a sense of three-dimensional space.
  • the left image and the right image necessary for implementing the 3D stereoscopic image may be displayed on the stereoscopic display 152 by a stereoscopic processor (not shown).
  • the stereoscopic processing unit receives a 3D image and extracts a left image and a right image from it, or receives a 2D image and converts it into a left image and a right image.
  • when the display unit 151 and a sensor for detecting a touch operation (hereinafter referred to as a "touch sensor") form a mutual layer structure (hereinafter referred to as a "touch screen"), the display unit 151 may be used as an input device in addition to an output device.
  • the touch sensor may have, for example, a form of a touch film, a touch sheet, a touch pad, or the like.
  • the touch sensor may be configured to convert a change in pressure applied to a specific portion of the display unit 151 or capacitance generated in a specific portion of the display unit 151 into an electrical input signal.
  • the touch sensor may be configured to detect not only the position and area where the touch object is touched on the touch sensor but also the pressure at the touch.
  • the touch object is an object applying a touch to the touch sensor and may be, for example, a finger, a touch pen or a stylus pen, a pointer, or the like.
  • when there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the controller 180. As a result, the controller 180 can know which area of the display unit 151 has been touched.
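The signal path just described can be sketched as below. This is a generic illustration, not the terminal's actual firmware; the listener interface and the threshold are invented assumptions.

    // Hypothetical touch-controller stage: raw sensor samples above a threshold
    // become (x, y) touch events that are forwarded to the controller (180).
    class TouchEventPipeline {
        interface Controller { void onTouch(int x, int y); }

        private final Controller controller;
        private final double threshold;

        TouchEventPipeline(Controller controller, double threshold) {
            this.controller = controller;
            this.threshold = threshold;
        }

        // Called for each raw sample from the touch sensor layer.
        void onRawSample(int x, int y, double signalDelta) {
            if (signalDelta >= threshold) {
                controller.onTouch(x, y);  // controller now knows which area was touched
            }
        }
    }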
  • a proximity sensor 141 may be disposed in an inner region of a mobile terminal surrounded by the touch screen or near the touch screen.
  • the proximity sensor 141 may be provided as an example of the sensing unit 140.
  • the proximity sensor 141 refers to a sensor that detects, without mechanical contact, the presence or absence of an object approaching a predetermined detection surface or an object existing nearby, using the force of an electromagnetic field or infrared rays.
  • the proximity sensor 141 has a longer lifespan and higher utilization than a contact sensor.
  • Examples of the proximity sensor 141 include a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a high frequency oscillating proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.
  • when the touch screen is capacitive, it is configured to detect the proximity of the pointer by a change in the electric field according to the proximity of a conductive object (hereinafter referred to as a pointer). In this case, the touch screen (touch sensor) may be classified as a proximity sensor.
  • for convenience of description, the act of bringing the pointer close to the touch screen without contact so that the pointer is recognized as being located on the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch".
  • the position of a proximity touch by the pointer on the touch screen means the position at which the pointer is perpendicular to the touch screen when the pointer makes the proximity touch.
  • the proximity sensor 141 detects a proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch movement state). Information corresponding to the detected proximity touch operation and proximity touch pattern may be output on the touch screen.
  • when the stereoscopic display unit 152 and a touch sensor form a mutual layer structure (hereinafter referred to as a "stereoscopic touch screen"), or when the stereoscopic display unit 152 and a 3D sensor for detecting a touch operation are combined with each other, the stereoscopic display unit 152 may also be used as a three-dimensional input device.
  • the sensing unit 140 may include a proximity sensor 141, a stereoscopic touch sensing unit 142, an ultrasonic sensing unit 143, and a camera sensing unit 144.
  • the proximity sensor 141 measures the distance between a sensing object (for example, a user's finger or a stylus pen) and a detection surface to which a touch is applied without mechanical contact by using an electromagnetic force or infrared rays.
  • the terminal recognizes which part of the stereoscopic image is touched using this distance.
  • when the touch screen is capacitive, the proximity of the sensing object is detected by a change in the electric field according to the proximity of the sensing object, and the touch screen is configured to recognize a three-dimensional touch using this proximity.
  • the stereoscopic touch sensing unit 142 is configured to detect the intensity or duration of the touch applied to the touch screen. For example, the three-dimensional touch sensing unit 142 detects a pressure to apply a touch, and if the pressure is strong, recognizes it as a touch on an object located farther from the touch screen toward the inside of the terminal.
  • the ultrasonic sensing unit 143 uses ultrasonic waves to recognize position information of the sensing object.
  • the ultrasonic sensing unit 143 may be formed of, for example, an optical sensor and a plurality of ultrasonic sensors.
  • the optical sensor is configured to detect light, and the ultrasonic sensors are configured to detect ultrasonic waves. Because light is much faster than ultrasound, the time for light to reach the optical sensor is much shorter than the time for ultrasound to reach an ultrasonic sensor. Therefore, the position of the wave source can be calculated from the arrival time difference of the ultrasound relative to the light, using the light as a reference signal.
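The arithmetic behind this time-difference ranging is simple enough to sketch. The setup below is an assumption for illustration: light arrival marks time zero, and two ultrasonic sensors sit at known positions (real devices may use more sensors).

    // Hypothetical 2D locator: ultrasound delay (relative to the light pulse)
    // gives a range per sensor; two ranges intersect at the source position.
    class UltrasonicLocator {
        static final double SPEED_OF_SOUND = 343.0;  // m/s near room temperature

        static double range(double ultrasoundDelaySeconds) {
            return SPEED_OF_SOUND * ultrasoundDelaySeconds;  // light delay ~ 0
        }

        // Sensors at (0, 0) and (d, 0); returns {x, y} of the wave source,
        // taking the solution above the sensor axis.
        static double[] locate(double d, double delayA, double delayB) {
            double rA = range(delayA), rB = range(delayB);
            double x = (rA * rA - rB * rB + d * d) / (2 * d);
            double y = Math.sqrt(Math.max(0.0, rA * rA - x * x));
            return new double[] { x, y };
        }
    }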
  • the camera sensing unit 144 includes at least one of a camera 121, a photo sensor, and a laser sensor.
  • for example, the camera 121 and the laser sensor may be combined with each other to detect a touch of a sensing object on a 3D stereoscopic image, and 3D information may thereby be obtained.
  • a photo sensor may be stacked on the display element.
  • the photo sensor is configured to scan the movement of a sensing object in proximity to the touch screen. More specifically, the photo sensor has photodiodes and transistors (TR) mounted in rows and columns, and scans the content placed on it using an electrical signal that changes according to the amount of light applied to the photodiodes. That is, the photo sensor calculates the coordinates of the sensing object according to the change in the amount of light, and the position information of the sensing object is obtained therefrom.
  • the sound output module 153 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like.
  • the sound output module 153 may also output a sound signal related to a function (for example, a call signal reception sound or a message reception sound) performed in the mobile terminal 100.
  • the sound output module 153 may include a receiver, a speaker, a buzzer, and the like.
  • the alarm unit 154 outputs a signal for notifying occurrence of an event of the mobile terminal 100.
  • Examples of events generated in the mobile terminal 100 include call signal reception, message reception, key signal input, and touch input.
  • the alarm unit 154 may output a signal for notifying occurrence of an event by using a form other than a video signal or an audio signal, for example, vibration.
  • the video signal or the audio signal may also be output through the display unit 151 or the sound output module 153, so the display unit 151 and the sound output module 153 may be classified as part of the alarm unit 154.
  • the haptic module 155 generates various tactile effects that a user can feel.
  • a representative example of the tactile effect generated by the haptic module 155 may be vibration.
  • the intensity and pattern of vibration generated by the haptic module 155 may be controlled by the user's selection or the setting of the controller. For example, the haptic module 155 may output different synthesized vibrations or sequentially output them.
  • in addition to vibration, the haptic module 155 may generate various tactile effects, such as the effect of a pin array moving vertically against the contacted skin surface, a jetting or suction force of air through a jet or suction port, grazing of the skin surface, contact of an electrode, electrostatic force, and the reproduction of a sense of cold or warmth using an element capable of absorbing or generating heat.
  • the haptic module 155 may not only transmit a tactile effect through direct contact, but may also be implemented so that the user can feel the tactile effect through muscular senses such as a finger or an arm. Two or more haptic modules 155 may be provided depending on the configuration of the mobile terminal 100.
  • the memory 160 may store a program for the operation of the controller 180 and may temporarily store input / output data (for example, a phone book, a message, a still image, a video, etc.).
  • the memory 160 may store data relating to various patterns of vibration and sound output when a touch input on the touch screen is performed.
  • the memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk.
  • the mobile terminal 100 may be operated in connection with a web storage that performs a storage function of the memory 160 on the Internet.
  • the interface unit 170 serves as a path with all external devices connected to the mobile terminal 100.
  • the interface unit 170 receives data from an external device, receives power, transfers the power to each component inside the mobile terminal 100, or transmits data inside the mobile terminal 100 to an external device.
  • an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like may be included in the interface unit 170.
  • the identification module is a chip that stores various kinds of information for authenticating the usage rights of the mobile terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), and a universal subscriber identity module (USIM).
  • a device equipped with an identification module (hereinafter referred to as an "identification device") may be manufactured in the form of a smart card. Therefore, the identification device may be connected to the terminal 100 through the interface unit 170.
  • when the mobile terminal 100 is connected to an external cradle, the interface unit 170 may serve as a passage through which power from the cradle is supplied to the mobile terminal 100, or a passage through which various command signals input from the cradle by the user are transmitted to the mobile terminal 100.
  • the various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal 100 is correctly mounted on the cradle.
  • the controller 180 typically controls the overall operation of the mobile terminal 100. For example, control and processing related to voice calls, data communications, video calls, and the like are performed.
  • the controller 180 may include a multimedia module 181 for playing multimedia.
  • the multimedia module 181 may be implemented in the controller 180 or may be implemented separately from the controller 180.
  • the controller 180 may perform a pattern recognition process capable of recognizing handwriting input or drawing input performed on the touch screen as text and images, respectively.
  • the controller 180 may execute a lock state for limiting input of a user's control command to applications.
  • the controller 180 may control the lock screen displayed in the locked state based on the touch input sensed by the display unit 151 in the locked state.
  • the power supply unit 190 receives an external power source and an internal power source under the control of the controller 180 to supply power for operation of each component.
  • Various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware or a combination thereof.
  • in a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. In some cases, the embodiments described herein may be implemented by the controller 180 itself.
  • in a software implementation, embodiments such as the procedures and functions described herein may be implemented as separate software modules.
  • Each of the software modules may perform one or more functions and operations described herein.
  • the software code may be implemented as a software application written in a suitable programming language.
  • the software code may be stored in the memory 160 and executed by the controller 180.
  • 2A and 2B are conceptual views of a communication system in which the mobile terminal 100 according to the present invention can operate.
  • a communication system may use different air interfaces and / or physical layers.
  • radio interfaces that can be used by a communication system include frequency division multiple access (FDMA), time division multiple access (TDMA), code division multiple access (CDMA), Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), Global System for Mobile Communications (GSM), and the like.
  • the CDMA wireless communication system includes at least one terminal 100, at least one base station (BS) 270, at least one base station controller (BSC) 275, and a mobile switching center (MSC) 280.
  • the MSC 280 is configured to connect with a Public Switched Telephone Network (PSTN) 290 and with the BSCs 275.
  • the BSCs 275 may be coupled to the BS 270 through a backhaul line.
  • the backhaul line may be provided according to at least one of E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, and xDSL.
  • a plurality of BSCs 275 may be included in the system shown in FIG. 2A.
  • Each of the plurality of BSs 270 may include at least one sector, and each sector may include an omnidirectional antenna or an antenna pointing in a specific radial direction from the BS 270.
  • each sector may include two or more antennas of various types.
  • each BS 270 may be configured to support multiple frequency assignments, and each frequency assignment may have a specific spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
  • the BSs 270 may also be called base station transceiver subsystems (BTSs).
  • in this case, one BSC 275 and at least one BS 270 may collectively be referred to as a "base station".
  • the base station may also indicate a "cell site".
  • alternatively, each of the plurality of sectors of a particular BS 270 may be called a cell site.
  • a broadcasting transmitter (BT) 295 transmits a broadcast signal to terminals 100 operating in a system.
  • the broadcast receiving module 111 illustrated in FIG. 1 is provided in the terminal 100 to receive a broadcast signal transmitted by the BT 295.
  • FIG. 2A illustrates a satellite 300 of a Global Positioning System (GPS).
  • the satellite 300 helps to locate the mobile terminal 100. Although two satellites are shown in FIG. 2A, useful location information may be obtained with fewer or more satellites.
  • the location information module 115 shown in FIG. 1 cooperates with the satellite 300 shown in FIG. 2A to obtain desired location information.
  • the location of the mobile terminal 100 may be tracked using all the technologies capable of tracking the location as well as the GPS tracking technology.
  • at least one of the GPS satellites 300 may optionally or additionally be responsible for satellite DMB transmission.
  • BS 270 receives a reverse link signal from mobile terminal 100.
  • at this time, the mobile terminal 100 may be connecting a call, transmitting or receiving a message, or performing another communication operation.
  • each reverse link signal received by a particular base station 270 is processed within that base station 270.
  • the data generated as a result of the processing is transmitted to the connected BSC 275.
  • BSC 275 provides call resource allocation and mobility management functionality, including the organization of soft handoffs between base stations 270.
  • the BSCs 275 transmit the received data to the MSC 280, and the MSC 280 provides an additional transmission service for the connection with the PSTN 290.
  • in addition, the PSTN 290 is connected to the MSC 280, the MSC 280 is connected to the BSCs 275, and the BSCs 275 control the BSs 270 so that a forward link signal is transmitted to the mobile terminal 100.
  • FIG. 2B illustrates a method of acquiring location information of a mobile terminal using a Wi-Fi positioning system (WPS).
  • the Wi-Fi Positioning System (WPS) 300 refers to a location positioning technology based on a wireless local area network (WLAN) using WiFi, which tracks the location of the mobile terminal 100 using a WiFi module provided in the mobile terminal 100 and a wireless access point (AP) 320 that transmits wireless signals to or receives wireless signals from the WiFi module.
  • the Wi-Fi positioning system 300 may include a Wi-Fi positioning server 310, the mobile terminal 100, a wireless AP 320 connected to the mobile terminal 100, and a database 330 in which arbitrary wireless AP information is stored.
  • the Wi-Fi positioning server 310 extracts information of the wireless AP 320 connected to the mobile terminal 100 based on a location information request message (or signal) of the mobile terminal 100. The information of the wireless AP 320 connected to the mobile terminal 100 may be transmitted to the Wi-Fi positioning server 310 through the mobile terminal 100, or may be transmitted from the wireless AP 320 to the Wi-Fi positioning server 310.
  • the extracted wireless AP information may be at least one of MAC address, SSID, RSSI, channel information, privacy, network type, signal strength, and noise strength.
  • the Wi-Fi positioning server 310 receives the information of the wireless AP 320 connected to the mobile terminal 100 and compares the received wireless AP 320 information with information contained in the pre-built database 330, thereby extracting (or analyzing) the location information of the mobile terminal 100.
  • the wireless AP connected to the mobile terminal 100 is illustrated as the first, second and third wireless APs 320 as an example.
  • the number of wireless APs connected to the mobile terminal 100 may vary depending on the wireless communication environment in which the mobile terminal 100 is located.
  • the Wi-Fi location tracking system 300 may track the location of the mobile terminal 100 when the mobile terminal 100 is connected to at least one wireless AP.
  • the database 330 may store various information of arbitrary wireless APs disposed at different locations.
  • the information on arbitrary wireless APs stored in the database 330 may include the MAC address, SSID, RSSI, channel information, privacy, network type, latitude and longitude coordinates of the wireless AP, the name of the building where the wireless AP is located, the number of floors, detailed indoor location information (GPS coordinates where available), the AP owner's address, a phone number, and the like.
  • in this way, the Wi-Fi positioning server 310 searches the database 330 for wireless AP information corresponding to the information of the wireless AP 320 connected to the mobile terminal 100, and extracts the location information matched to the retrieved wireless AP information, thereby extracting the location information of the mobile terminal 100. The extracted location information of the mobile terminal 100 is transmitted to the mobile terminal 100 through the Wi-Fi positioning server 310, so the mobile terminal 100 can acquire its location information.
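A toy version of this database matching might compare the scanned APs' MAC addresses and signal strengths against stored fingerprints and return the best match's coordinates. Everything below (the record shape, the -100 dBm default, nearest-fingerprint scoring) is an assumption for illustration, not the patent's algorithm; it requires Java 16+ for records.

    // Hypothetical Wi-Fi fingerprint matcher for the positioning flow above.
    import java.util.List;
    import java.util.Map;

    class WifiPositioner {
        record Fingerprint(Map<String, Integer> rssiByMac, double lat, double lon) {}

        // Returns {latitude, longitude} of the closest stored fingerprint.
        static double[] locate(Map<String, Integer> scannedRssiByMac,
                               List<Fingerprint> database) {
            Fingerprint best = null;
            double bestScore = Double.MAX_VALUE;
            for (Fingerprint fp : database) {
                double score = 0.0;
                for (var entry : scannedRssiByMac.entrySet()) {
                    // APs missing from a fingerprint count as a weak -100 dBm.
                    int stored = fp.rssiByMac().getOrDefault(entry.getKey(), -100);
                    score += Math.pow(entry.getValue() - stored, 2);
                }
                if (score < bestScore) { bestScore = score; best = fp; }
            }
            return best == null ? null : new double[] { best.lat(), best.lon() };
        }
    }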
  • FIG. 3A is a front perspective view of an example of a mobile terminal 100 according to the present invention.
  • the disclosed mobile terminal 100 has a terminal body in the form of a bar.
  • however, the present invention is not limited thereto, and may be applied to various structures such as a watch type, a clip type, a glasses type, a folder type, a flip type, a slide type, a swing type, and a swivel type, in which two or more bodies are coupled to be movable relative to each other.
  • the body includes a case (frame, housing, cover, etc.) that forms an exterior.
  • the case may be divided into a front case 101 and a rear case 102.
  • Various electronic components are built in the space formed between the front case 101 and the rear case 102.
  • at least one intermediate case may be additionally disposed between the front case 101 and the rear case 102, and a battery cover 103 covering the battery 191 may be detachably attached to the rear case 102.
  • the cases may be formed by injection-molding synthetic resin, or may be formed of metal, for example, stainless steel (STS), aluminum (Al), titanium (Ti), or the like.
  • a display unit 151, a first sound output module 153a, a first camera 121a, a first manipulation unit 131, and the like are disposed on the front of the terminal body, and a microphone 122, an interface unit 170, and a second manipulation unit 132 may be provided on the side of the terminal body.
  • the display unit 151 is configured to display (output) information processed by the mobile terminal 100.
  • the display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and an e-ink display for visually representing information.
  • the display unit 151 may include a touch sensing means to receive a control command by a touch method.
  • the touch sensing means may be configured to detect the touch and input content corresponding to the touched position.
  • the content input by the touch method may be letters or numbers or menu items that can be indicated or designated in various modes.
  • the touch sensing means is formed to be translucent so that the visual information output from the display unit 151 can be seen, and may include a structure for enhancing the visibility of the touch screen in a bright place. Referring to FIG. 3A, the display unit 151 occupies most of the front surface of the front case 101.
  • the first sound output module 153a and the first camera 121a are disposed in an area adjacent to one of the two ends of the display unit 151, and the first manipulation unit 131 and the microphone 122 are disposed in an area adjacent to the other end.
  • the second manipulation unit 132 (see FIG. 3B), the interface unit 170, and the like may be disposed on the side of the terminal body.
  • the first sound output module 153a may be implemented in the form of a receiver for transmitting a call sound to a user's ear or a loud speaker for outputting various alarm sounds or reproduction sounds of multimedia.
  • Sound generated from the first sound output module 153a may be configured to be emitted along an assembly gap between the structures.
  • in this case, the hole independently formed for sound output is not visible or is hidden from the outside, so the appearance of the mobile terminal 100 may be simplified.
  • the present invention is not limited thereto, and a hole for emitting the sound may be formed in the window.
  • the first camera 121a processes an image frame such as a still image or a moving image obtained by the image sensor in a video call mode or a photographing mode.
  • the processed image frame may be displayed on the display unit 151.
  • the user input unit 130 is manipulated to receive a command for controlling the operation of the mobile terminal 100 and may include first and second manipulation units 131 and 132.
  • the first and second manipulation units 131 and 132 may be collectively referred to as a manipulating portion, and any manner may be employed as long as the user operates it with a tactile feel, such as touch, push, or scroll.
  • the first operation unit 131 is illustrated as a touch key, but the present invention is not limited thereto.
  • the first manipulation unit 131 may be a push key or a combination of a touch key and a push key.
  • Content input by the first and / or second manipulation units 131 and 132 may be variously set.
  • the first manipulation unit 131 receives commands such as menu, home key, cancel, and search, and the second manipulation unit 132 may receive commands such as adjusting the volume of the sound output from the first sound output module 153a or switching the display unit 151 to a touch recognition mode.
  • the microphone 122 is formed to receive a user's voice, other sounds, and the like.
  • the microphone 122 may be provided at a plurality of locations and configured to receive stereo sound.
  • the interface unit 170 serves as a path for allowing the mobile terminal 100 to exchange data with an external device.
  • the interface unit 170 may be at least one of a connection terminal for wired or wireless connection of an earphone, a port for short-range communication (for example, an IrDA port, a Bluetooth port, a wireless LAN port, and the like), and a power supply terminal for supplying power to the mobile terminal 100.
  • the interface unit 170 may be implemented in the form of a socket for receiving an external card such as a subscriber identity module (SIM), a user identity module (UIM), or a memory card for storing information.
  • FIG. 3B is a rear perspective view of the mobile terminal 100 shown in FIG. 3A.
  • a second camera 121b may be additionally mounted on the rear of the terminal body, that is, the rear case 102.
  • the second camera 121b has a photographing direction substantially opposite to that of the first camera 121a (see FIG. 3A), and may be a camera having different pixels from the first camera 121a.
  • for example, it is preferable that the first camera 121a has relatively low pixels so that the user's face can be photographed and transmitted to the counterpart during a video call, while the second camera 121b has relatively high pixels because it often photographs a general subject that is not transmitted immediately.
  • the first and second cameras 121a and 121b may be installed in the terminal body in a rotatable or pop-up manner.
  • the flash 123 and the mirror 124 are further disposed adjacent to the second camera 121b.
  • the flash 123 shines light toward the subject when the subject is photographed by the second camera 121b.
  • the mirror 124 allows the user to see his or her own face when photographing himself or herself (self-photographing) using the second camera 121b.
  • the second sound output module 153b may be further disposed on the rear surface of the terminal body.
  • the second sound output module 153b may implement a stereo function together with the first sound output module 153a (see FIG. 3A), and may be used to implement a speakerphone mode during a call.
  • An antenna (not shown) for receiving a broadcast signal may be additionally disposed on the side of the terminal body.
  • An antenna that forms part of the broadcast receiving module 111 (refer to FIG. 1) may be installed to be pulled out of the terminal body.
  • the terminal body is provided with a power supply unit 190 (see FIG. 1) for supplying power to the mobile terminal 100.
  • the power supply unit 190 may include a battery 191 embedded in the terminal body or detachably configured from the outside of the terminal body.
  • the battery cover 103 is coupled to the rear case 102 so as to cover the battery 191, thereby limiting detachment of the battery 191 and protecting the battery 191 from external shocks and foreign matter.
  • the cover according to the present invention includes a first member 510 and a second member 520.
  • the cover includes a connecting portion 530 connecting the first and second members 510 and 520.
  • the connection part 530 may be formed to cover the side of the mobile terminal.
  • the first and second members 510 and 520 may be moved relative to each other by deformation of the connection part 530, and the display unit of the mobile terminal is covered or exposed based on the movement of the second member 520 with respect to the first member 510.
  • the first and second members 510 and 520 may be formed in a plate shape formed to correspond to the rear side and the front side of the mobile terminal 100.
  • the first and second members 510 and 520 may be formed of a bendable material.
  • the first and second members 510 and 520 and the connection part 530 may include at least one of polyurethane, PVC, bakelite, and corrugated cardboard, but the materials constituting the members 510 and 520 and the connection part 530 are not limited thereto.
  • FIG. 4A is a conceptual diagram illustrating a case in which the mobile terminal is in a closed state by the cover 500.
  • FIG. 4B is a conceptual diagram illustrating a case in which the mobile terminal is in an open state by the cover 500.
  • a state in which the display is entirely exposed by the second member 520 is defined as an open state, and a state in which the display is covered by the second member 520 is defined as a closed state.
  • the controller activates the display unit in the open state, and the display unit outputs screen information over its entire area.
  • the control unit controls the display unit to activate a region of the display unit in the closed state. That is, in the closed state the display unit may output screen information that passes through the second member 520.
  • the front surface of the mobile terminal covered with the cover in the closed state is divided into first to third regions A1, A2, and A3.
  • the first region A1 corresponds to an upper region of the front surface, the second region A2 corresponds to a lower region of the display portion, and the third region A3 is adjacent to the display portion and may correspond to an area surrounding the first and second areas A1 and A2.
  • the first to third areas A1, A2, and A3 may output different screen information based on a user's control command.
  • the display unit is divided into first to third output regions 151a, 151b, and 151c corresponding to the first to third regions A1, A2, and A3.
  • the controller may control the display unit such that the first to third output regions 151a, 151b, and 151c selectively output an image.
  • the second member 520 formed to cover the display unit is also divided into first to third transmission regions 521, 522, and 523.
  • the first to third transmission regions 521, 522, and 523 may have different light transmittances.
  • the first transmission region 521 may be an opening region. Accordingly, an image output from the display unit may be directly provided through the first transmission region 521.
  • the second transmission area 522 may be formed so that the image is transmitted through it and is visible to the user.
  • the third transmission region 523 may be formed to limit transmission of the image output from the display unit.
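To make the relationship between the three transmission regions concrete, the following is a minimal Kotlin sketch of regions 521–523 and the visibility rule described above. The numeric transmittance values, the `Visibility` rule, and all names are illustrative assumptions; the description specifies only that the regions differ in light transmittance.

```kotlin
// Hypothetical model of the cover's three transmission regions (521-523)
// and how each treats an image output from the display unit beneath it.
enum class Visibility { DIRECT, THROUGH_COVER, BLOCKED }

data class TransmissionRegion(
    val id: Int,               // 521, 522 or 523 in the figures
    val transmittance: Double  // 1.0 = opening, 0.0 = opaque (assumed scale)
) {
    // An opening passes the image directly; a translucent region shows it
    // through the cover; a (near-)opaque region limits transmission.
    fun visibility(): Visibility = when {
        transmittance >= 1.0 -> Visibility.DIRECT
        transmittance > 0.0  -> Visibility.THROUGH_COVER
        else                 -> Visibility.BLOCKED
    }
}

fun main() {
    val cover = listOf(
        TransmissionRegion(521, 1.0), // first transmission region: opening
        TransmissionRegion(522, 0.5), // second: image visible through the cover
        TransmissionRegion(523, 0.0)  // third: transmission limited
    )
    cover.forEach { println("region ${it.id}: ${it.visibility()}") }
}
```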
  • the visual data is actually output from each area of the display unit and provided to the user through the cover; however, for convenience of description, the visual data will be described as being output from the first area A1 or the second area A2.
  • the touch input may be applied to the display unit through the first area A1 in the closed state.
  • a touch input may be applied to the display unit through the second area A2 in the closed state.
  • the touch input may not be applied to the third area A3.
  • the screen information is output to the entire area of the display unit in the open state, and the user can control the mobile terminal by applying a touch input to the entire area of the display unit.
  • screen information is provided through the first and second areas A1 and A2 and a touch input is applied to the first and second areas A1 and A2 to control the mobile terminal.
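The split between the open and closed states can be summarized in a short sketch. This Kotlin fragment, with hypothetical names, routes screen output and touch acceptance by cover state as described above: the whole display in the open state, only the areas behind A1 and A2 in the closed state. It is a sketch of the described behavior, not actual terminal firmware.

```kotlin
enum class CoverState { OPEN, CLOSED }
enum class Area { A1, A2, A3 }

class DisplayRouter(var state: CoverState) {
    // Areas that output screen information and accept touch in each state.
    fun activeAreas(): Set<Area> = when (state) {
        CoverState.OPEN   -> setOf(Area.A1, Area.A2, Area.A3) // entire display
        CoverState.CLOSED -> setOf(Area.A1, Area.A2)          // A3 excluded
    }

    fun acceptsTouch(area: Area): Boolean = area in activeAreas()
}

fun main() {
    val router = DisplayRouter(CoverState.CLOSED)
    println(router.acceptsTouch(Area.A2)) // true: A2 is touchable when closed
    println(router.acceptsTouch(Area.A3)) // false: A3 is blocked by the cover
}
```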
  • FIG. 5 is a flowchart illustrating a control method of the mobile terminal in the closed state
  • FIG. 6 is a conceptual diagram illustrating the control method of FIG. 5 according to an exemplary embodiment.
  • the control unit may switch the display unit to an inactive state when the control command is not input for a predetermined time by the user in the closed state or the open state.
  • the controller may switch the display unit to an inactive state based on a specific control command applied to the operation unit.
  • the wireless communication unit 110 receives an event in the closed state (S501).
  • the controller outputs first content 610 related to the event to the first area A1 (S502). That is, the controller controls the display unit to activate the first display area 151a of the display unit and output the first content 610 to the first display area 151a based on the received event.
  • the first content 610 is provided to the user through a first transmission area 521 including an opening area.
  • the first content 610 may be composed of an image having a predetermined shape and size corresponding to the first area A1, and may be distinguished from an execution screen of the application driven to receive the first content 610.
  • the first content 610 may be composed of some data included in an execution screen of the application.
  • a user's touch input applied to the second area A2 is received while the first content 610 is output.
  • the display unit receives the user's touch input through the second transmission area 522 of the cover.
  • the controller controls the display unit to output the first graphic image 620 to the second area A2 based on a touch input applied to the second area A2.
  • the first graphic image 620 output from the display unit is provided to the user through the cover.
  • the first graphic image 620 corresponds to an icon that receives a touch input to control an application related to the first content.
  • the first graphic image 620 may correspond to an icon for controlling a message application.
  • the first graphic image 620 may include an icon for outputting details of the first content 610, an icon for placing a call to the external terminal, an icon for replying to the external terminal, and the like.
  • the controller controls the display unit to switch the first content 610 to the second content 611 based on a touch input applied to the first graphic image 620 (S504).
  • the second content 611 may include details of the first content 610.
  • the second content 611 is composed of information related to the first content 610.
  • the user may be provided with information related to the first content 610 or control the application without switching from the closed state to the open state.
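The closed-state event flow of FIG. 5 (S501–S504) can be sketched as a small state machine: an event arrives, first content appears in A1, a touch in A2 brings up the first graphic image, and a touch on that image switches the first content to the second content. Class and function names below are invented for illustration.

```kotlin
sealed class A1Content {
    object FirstContent : A1Content()  // 610: notification-style summary
    object SecondContent : A1Content() // 611: details of the first content
}

class ClosedStateController {
    var a1: A1Content? = null
        private set
    var graphicImageShown = false      // first graphic image 620 in area A2
        private set

    fun onEventReceived() {            // S501 + S502: show first content in A1
        a1 = A1Content.FirstContent
    }

    fun onTouchInA2() {                // show the first graphic image in A2
        if (a1 == A1Content.FirstContent) graphicImageShown = true
    }

    fun onTouchOnGraphicImage() {      // S504: switch 610 -> 611
        if (graphicImageShown) a1 = A1Content.SecondContent
    }
}

fun main() {
    val c = ClosedStateController()
    c.onEventReceived()
    c.onTouchInA2()
    c.onTouchOnGraphicImage()
    println(c.a1 == A1Content.SecondContent) // true: details shown, cover still closed
}
```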
  • the control unit may be configured to adjust the degree of light transmission of the cover.
  • the controller may adjust the light transmittance of the second member 520 based on a control command.
  • the control unit may increase the light transmittance of the second area A2 only while a touch input is applied to the first area A1, so that the image output from the display unit is provided through the second area A2 (a sketch of this behavior follows).
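A hedged sketch of this transmittance control: the second area A2 becomes more transparent only while a touch is applied to the first area A1. The 0.1 and 0.8 levels and the controller interface are assumptions for illustration.

```kotlin
class TransmittanceController {
    private var a2Level = 0.1              // resting transmittance of A2 (assumed)

    // Raise the transmittance of A2 only for the duration of a touch on A1.
    fun onTouchA1(pressed: Boolean) {
        a2Level = if (pressed) 0.8 else 0.1
    }

    fun a2Transmittance(): Double = a2Level
}

fun main() {
    val t = TransmittanceController()
    t.onTouchA1(pressed = true)
    println(t.a2Transmittance()) // 0.8: image now visible through A2
}
```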
  • FIGS. 7A and 7B are conceptual views illustrating a method of controlling the first area according to various embodiments of the present disclosure.
  • a control method of controlling an application related to the first content in the closed state will be described with reference to FIG. 7A.
  • the first content 610 is output to the first area A1.
  • the controller may control the display unit to output the first graphic image 620 to the second area A2 along with the output of the first content 610.
  • the first content 610 may be converted into the second content 611 based on a touch input applied to the first area A1.
  • the controller may control the display unit to switch the first content 610 to the third content 612 based on a touch input received on the first graphic image 620 of the second area A2.
  • the controller may control the display unit to convert the first graphic image 620 into a second graphic image 622 based on a touch input applied to the first graphic image 620.
  • the second graphic image 622 may be implemented as a virtual keyboard that receives a touch input for inputting text, and the third content 612 may include a window for outputting the input text.
  • the controller may control the display unit to convert the second graphic image 622 into a third graphic image 623 based on another type of touch input applied to the second area A2.
  • the other type of touch input is distinguished from the touch input applied to the second area A2 for inputting the text.
  • the touch input may be a dragging touch input.
  • the controller may control the display unit to output a third graphic image 623 having another function for controlling the application when the other type of touch input is applied.
  • the third graphic image 623 is configured to receive a touch input to transmit an image, a voice message, etc. to an external terminal.
  • the present invention is not limited thereto; when another touch input is applied while the second graphic image 622 is output, the second graphic image 622 may be converted back to the first graphic image 621 (one possible gesture mapping is sketched below).
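The graphic-image switching of FIG. 7A can be expressed as a transition function over touch types. The mapping below (tap vs. drag, and which image a drag returns to) is one possible reading of the alternatives described above, since the description names dragging only as an example of "another type" of touch input.

```kotlin
enum class Gesture { TAP, DRAG }

// Transition between the graphic images in area A2 based on the gesture.
// 620/621: icon set, 622: virtual keyboard, 623: image/voice-message controls.
fun nextGraphicImage(current: Int, gesture: Gesture): Int = when {
    current == 620 && gesture == Gesture.TAP  -> 622 // open the keyboard
    current == 622 && gesture == Gesture.DRAG -> 623 // other control functions
    current == 623 && gesture == Gesture.DRAG -> 622 // drag again to return
    // Alternative described in the text: a drag on 622 could instead
    // return to the first graphic image (621).
    else -> current
}

fun main() {
    var image = 620
    image = nextGraphicImage(image, Gesture.TAP)  // 622: keyboard shown
    image = nextGraphicImage(image, Gesture.DRAG) // 623: message controls shown
    println(image)
}
```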
  • FIG. 7B illustrates a state in which the display unit is inactivated and auditory data is output through the sound output unit 153.
  • the controller may activate the first display area 151a of the display unit based on a touch input applied to the first area A1.
  • the first area A1 outputs fourth content 630 of an application related to the output auditory data.
  • the fourth content 630 may include screen information related to music playback of a music playback application.
  • the fourth content 630 may include at least one control icon that receives a user's touch input and controls the music playback. That is, when a touch input is applied to the control icon, the controller controls an application for playing the music.
  • the controller may control the display unit to output the fourth graphic image 640 based on a touch input applied to the first and second areas A1 and A2. Alternatively, when the fourth content 630 is output, the controller may control the display unit to output the fourth graphic image 640 together.
  • the fourth graphic image 640 includes at least some of a plurality of control icons included in the fourth content 630.
  • the display unit may output the control icons of the fourth graphic image 640 in an arrangement substantially the same as the arrangement of the control icons included in the fourth content 630.
  • the control icons included in the fourth content 630 and the control icons included in the fourth graphic image 640 may also differ from each other.
  • the controller controls the application based on a touch input applied to a control icon of the fourth graphic image 640.
  • the controller may control the display unit to modify the shape of the icon of the fourth content 630 corresponding to the control icon receiving the touch input. Accordingly, the user may confirm, from the first area A1, the control command input by the touch input applied to the second area A2.
  • the fourth content 630 may include additional icons other than the corresponding control icons.
  • the user's touch input may be facilitated by displaying the control icons of the relatively narrow first area A1 in the second area A2, which is relatively wide and easily accessible to the user, as in the sketch below.
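A minimal sketch of this icon mirroring: control icons shown in the narrow first area A1 (fourth content 630) are repeated in the same order in the wider second area A2 (fourth graphic image 640); a touch on a mirrored icon triggers the action and highlights the matching A1 icon so the user can confirm the command from A1. All names are hypothetical.

```kotlin
data class ControlIcon(val name: String, var highlighted: Boolean = false)

class MirroredControls(val a1Icons: List<ControlIcon>) {
    // Same arrangement in A2 as in A1 (a copy, per the description above).
    val a2Icons: List<ControlIcon> = a1Icons.map { ControlIcon(it.name) }

    // A touch on a mirrored icon in A2 executes the action and gives
    // visual feedback by modifying the corresponding icon in A1.
    fun onTouchA2(index: Int): String {
        a1Icons[index].highlighted = true
        return "execute:${a2Icons[index].name}" // e.g. control music playback
    }
}

fun main() {
    val controls = MirroredControls(
        listOf(ControlIcon("prev"), ControlIcon("play"), ControlIcon("next"))
    )
    println(controls.onTouchA2(1))           // execute:play
    println(controls.a1Icons[1].highlighted) // true: confirmed from A1
}
```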
  • a preset touch input may be applied to the first area A1 while the fourth content 630 is output to the first area A1 and the fourth graphic image 640 is output to the second area A2.
  • the preset touch input may be distinguished from a touch input for controlling an application related to the fourth content 630 and may correspond to, for example, a dragging touch input.
  • the controller may control the display unit to switch the fourth content 630 to the first content 610.
  • the fourth and first contents 630 and 610 may correspond to screen information related to different applications.
  • the display unit converts the fourth graphic image 640 of the second area A2 into the first graphic image 620. That is, images associated with substantially the same application are output to the first and second areas A1 and A2.
  • the controller controls the display unit to change images output to the first and second areas A1 and A2 based on a touch input applied to the second area A2.
  • the display unit may be controlled to output information related to an application to the first and second areas A1 and A2 based on a touch input applied to the first region A1 or the second region A2.
  • FIGS. 8A to 8C are conceptual views illustrating a control method of controlling the mobile terminal based on touch inputs applied to the first and second areas according to various embodiments.
  • the locked state refers to a state in which a part of the function of the mobile terminal is limited, and in the locked state, the mobile terminal may perform only the control allowed based on the user's touch input.
  • the controller may control the display unit to output the first content 610 related to the event to the first area A1. However, the controller may ignore a touch input even when the touch input for confirming additional content of the first content 610 is applied. In addition, in the locked state, the controller controls the display unit to limit the output of the graphic image related to the first content 610 to the second area A2.
  • the display unit may output an image indicating the locked state to the first area A1.
  • the controller may change the locked state to a released state based on a continuous touch input applied across the first and second areas A1 and A2. For example, when an initial touch input is applied to the image representing the locked state and the continuous touch input is released in the second area A2, the controller may change the locked state to the released state.
  • the controller may release the locked state when the touch input applied to the second area A2 is released in the first area A1 or when the touch input is reciprocated a predetermined number of times.
  • the controller may control the display unit to output the first graphic image 620 to the second area A2 when the locked state is released.
  • the locked state can be set even in the closed state, and the locked state can be released from the closed state; one possible unlock gesture is sketched below.
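A sketch of the closed-state unlock gesture described above: a touch that starts on the lock image in the first area A1 and is released in the second area A2 (or reciprocates between the areas a preset number of times) releases the lock. Thresholds, region tests, and names are assumptions for illustration.

```kotlin
enum class Region { A1, A2, A3 }

class LockGestureDetector(private val requiredReciprocations: Int = 1) {
    private var startedOnLockImage = false
    private var crossings = 0
    private var last: Region? = null

    fun onDown(region: Region, onLockImage: Boolean) {
        startedOnLockImage = region == Region.A1 && onLockImage
        crossings = 0
        last = region
    }

    fun onMove(region: Region) {
        if (last != null && last != region) crossings++ // an A1 <-> A2 crossing
        last = region
    }

    // Unlock when the touch began on the lock image and either ends in A2
    // or has reciprocated between the areas the preset number of times.
    fun onUp(region: Region): Boolean =
        startedOnLockImage &&
            (region == Region.A2 || crossings >= requiredReciprocations * 2)
}

fun main() {
    val d = LockGestureDetector()
    d.onDown(Region.A1, onLockImage = true)
    d.onMove(Region.A2)
    println(d.onUp(Region.A2)) // true: locked state released, cover still closed
}
```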
  • the first area A1 outputs first screen information 710.
  • the first screen information 710 may correspond to a part of an idle screen in the open state.
  • the idle screen may correspond to a home screen page including an icon, a widget, and the like of at least one application.
  • the controller controls the display unit to output second screen information 720 to the second area A2 when a preset touch input is applied while the first screen information 710 is output to the first area A1.
  • the preset touch input may correspond to a touch input moving from top to bottom, but the touch method is not limited thereto.
  • the second screen information 720 may correspond to another area of the idle screen. That is, the second screen information 720 may include at least one application icon included in the idle screen.
  • the controller may execute the application based on a touch input applied to an icon of the application on the second area A2.
  • the controller controls the display unit to output the first execution screen 730 of the application to the first area A1.
  • the first execution screen 730 corresponds to one area of the execution screen output on the display unit when the application is activated in the open state.
  • the controller controls the display unit to output the second execution screen 731 based on the touch input applied to the first area A1.
  • the second execution screen 731 may correspond to another area of the execution screen of the application.
  • the first area A1 provides the user with only the portion of the execution screen of the executed application that corresponds to the first area A1. Accordingly, the user can execute the application without changing the cover to the open state, and is provided with one region of the execution screen at a time.
  • the controller controls the display unit to output the first execution screen 730 of the executed application to the first area A1.
  • the controller controls the display unit to output the second execution screen 731 to the first area A1. That is, the execution screen of the activated application may be controlled by a touch input applied to the second area A2 (see the sketch below).
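The windowed execution screen of FIG. 8 can be sketched as a viewport: the first area A1 shows only a slice of the application's full execution screen, and a touch input (in A1 or A2) scrolls which slice is visible, so the application can be used without opening the cover. The pixel sizes below are arbitrary assumptions.

```kotlin
class A1Viewport(private val screenHeight: Int, private val windowHeight: Int) {
    var offset = 0 // top of the slice of the execution screen shown through A1
        private set

    // Driven by a drag in A1 or A2; clamps so the slice stays on-screen.
    fun scrollBy(dy: Int) {
        offset = (offset + dy).coerceIn(0, screenHeight - windowHeight)
    }

    fun visibleSlice(): IntRange = offset until offset + windowHeight
}

fun main() {
    val vp = A1Viewport(screenHeight = 1280, windowHeight = 320)
    vp.scrollBy(400)           // first execution screen 730 -> second screen 731
    println(vp.visibleSlice()) // 400..719: the slice now visible through A1
}
```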
  • FIGS. 9A and 9B are conceptual views illustrating a control method of adjusting the touch sensitivity of the display unit in the open state and the closed state.
  • the controller detects an open state or a closed state (S601).
  • in the open state, the controller controls the touch sensitivity to be within the same range over the entire area of the display unit (S602).
  • in the closed state, the controller adjusts the second area A2 to have the highest sensitivity and controls the display unit to limit touch detection in the third area A3 (S603).
  • the controller may detect the open state and the closed state.
  • the controller adjusts the touch sensitivity of the second and third regions A2 and A3 while outputting the first content 610 corresponding to the event.
  • the display unit may sense a user's touch applied to the second area A2 even though that area is covered by the material of the second transmission area.
  • the controller may output a notification image 610 ′ corresponding to the event to one region of the display unit, in which case the sensitivity of the display unit is controlled to be substantially the same.
  • the controller may detect the closed state or the open state by using a proximity sensor, an illuminance sensor, or a movement detection sensor mounted on the cover.
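The sensitivity policy of FIG. 9 (S601–S603) reduces to a small lookup: uniform sensitivity when open; when closed, the highest sensitivity for A2 (touched through the cover material) and sensing disabled for A3. The numeric levels, and treating the sensor-detected state as a simple enum, are assumptions.

```kotlin
enum class Cover { OPEN, CLOSED } // detected via proximity/illuminance/motion sensor
enum class Zone { A1, A2, A3 }

fun sensitivityFor(state: Cover, zone: Zone): Int = when (state) {
    Cover.OPEN -> 5               // S602: same range over the whole display
    Cover.CLOSED -> when (zone) { // S603:
        Zone.A1 -> 5              //   opening region: normal sensing
        Zone.A2 -> 9              //   highest, to sense touch through the cover
        Zone.A3 -> 0              //   touch detection limited
    }
}

fun main() {
    println(sensitivityFor(Cover.CLOSED, Zone.A2)) // 9
    println(sensitivityFor(Cover.CLOSED, Zone.A3)) // 0
}
```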
  • FIG. 10 is a conceptual diagram illustrating a touch input applied through the second area A2.
  • the first area A1 may be an opening area.
  • a touch input may be applied to the display unit through the second area A2.
  • At least one protrusion 520 ′ may be formed on a surface of the second area A2 facing the display unit.
  • the protrusion 520 ' is preferably formed to correspond to the graphic image displayed on the second display area 151b of the display unit in the closed state.
  • the protrusion 520′ may be brought into contact with the display unit by pressure applied by the user to the second transmission region 522, and thus a touch input may be applied to the display unit.
  • the protrusion 520 'to which the touch input is not applied may form a predetermined gap with the display unit.
  • the controller may control the display unit to output an image to the third area A3 according to the content output to the first area A1.
  • the controller may change an application activated in the first area A1 based on a touch input applied to the first area A1. For example, when the content 630 of the music playback application is output to the first area A1, the controller may control the display unit to output the equalizer image 960 of the reproduced music to the third area A3.
  • the mobile terminal and the cover described above are not limited to the configurations and methods of the embodiments described above; all or part of each embodiment may be selectively combined so that various modifications can be made.
  • Embodiments of the present invention include a method of controlling a mobile terminal in a closed state in which a cover is mounted on the mobile terminal and can be applied to various industrial fields.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)

Abstract

The present invention relates to a mobile terminal comprising: a terminal body, on one surface of which a display unit is formed; a cover which is coupled to the terminal body and configured to switch from a closed state covering the display unit to an open state exposing the display unit, and which comprises at least one magnet whose magnetic property changes when the cover switches from the closed state to the open state; a sensor unit disposed in the terminal body so as to correspond to the magnet and configured to detect the change in the magnetic property; and a controller which controls the display unit to output screen information related to a preset application based on the change in the magnetic property of the one or more magnets.
PCT/KR2013/012014 2013-12-03 2013-12-23 Terminal mobile, et couvercle correspondant WO2015083873A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020130149407A KR102135362B1 (ko) 2013-12-03 2013-12-03 이동 단말기 및 이의 커버
KR10-2013-0149407 2013-12-03

Publications (1)

Publication Number Publication Date
WO2015083873A1 true WO2015083873A1 (fr) 2015-06-11

Family

ID=53273610

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2013/012014 WO2015083873A1 (fr) 2013-12-03 2013-12-23 Terminal mobile, et couvercle correspondant

Country Status (2)

Country Link
KR (1) KR102135362B1 (fr)
WO (1) WO2015083873A1 (fr)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10642426B2 (en) 2015-12-23 2020-05-05 Lg Chem, Ltd. Touch screen sensor
KR20170098078A (ko) * 2016-02-19 2017-08-29 엘지전자 주식회사 이동 단말기 및 그 제어방법
KR102453572B1 (ko) * 2017-07-24 2022-10-14 삼성전자 주식회사 원격 조종 케이스를 구비하는 전자 장치
KR20230120024A (ko) * 2022-02-08 2023-08-16 삼성전자주식회사 디스플레이의 화면의 번인 방지를 위한 전자 장치 및 방법
WO2024128544A1 (fr) * 2022-12-12 2024-06-20 삼성전자 주식회사 Dispositif électronique et procédé de reconnaissance d'accessoire l'utilisant

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110009710U (ko) * 2011-09-01 2011-10-12 김용식 휴대폰 보호케이스
KR20130005320U (ko) * 2012-02-29 2013-09-06 지복진 비개방 사용이 가능한 모바일기기용 플립타입 보호케이스
KR20120005719U (ko) * 2012-06-25 2012-08-08 임동춘 외부 터치가 가능한 이동통신 단말기용 케이스
KR200465990Y1 (ko) * 2012-10-18 2013-03-20 박재영 폴더형 휴대폰 케이스
KR101285669B1 (ko) * 2013-01-28 2013-07-11 주식회사 신해 휴대단말기 보호케이스

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017065432A1 (fr) * 2015-10-13 2017-04-20 삼성전자 주식회사 Procédé de commande de sortie d'écran et dispositif électronique prenant en charge celui-ci
KR20170043317A (ko) * 2015-10-13 2017-04-21 삼성전자주식회사 화면 출력 제어 방법 및 이를 지원하는 전자 장치
US20180299926A1 (en) * 2015-10-13 2018-10-18 Samsung Electronics Co., Ltd Method for controlling screen output and electronic device supporting same
US10908645B2 (en) 2015-10-13 2021-02-02 Samsung Electronics Co., Ltd. Method for controlling screen output and electronic device supporting same
KR102395794B1 (ko) * 2015-10-13 2022-05-10 삼성전자주식회사 화면 출력 제어 방법 및 이를 지원하는 전자 장치

Also Published As

Publication number Publication date
KR102135362B1 (ko) 2020-07-17
KR20150064561A (ko) 2015-06-11

Similar Documents

Publication Publication Date Title
WO2015046636A1 (fr) Terminal mobile et son procédé de commande
WO2015020284A1 (fr) Terminal mobile et procédé de commande associé
WO2015190666A1 (fr) Terminal mobile et son procédé de commande
WO2015111778A1 (fr) Terminal du type lunettes et procédé de commande dudit terminal
WO2015050345A1 (fr) Appareil de commande pour terminal mobile et son procédé de commande
WO2014175513A1 (fr) Terminal mobile et son procédé de commande
WO2015020283A1 (fr) Terminal mobile et son procédé de commande
WO2017086576A1 (fr) Terminal mobile, et son procédé de commande
WO2015060501A1 (fr) Appareil et procédé de commande de terminal mobile
WO2015053449A1 (fr) Dispositif d'affichage d'image de type lunettes et son procédé de commande
WO2016010262A1 (fr) Terminal mobile et son procédé de commande
EP3028206A1 (fr) Terminal mobile, montre intelligente, et procédé de mise en uvre d'authentification à l'aide du terminal mobile et de la montre intelligente
WO2015083873A1 (fr) Terminal mobile, et couvercle correspondant
WO2018124334A1 (fr) Dispositif électronique
WO2019022328A1 (fr) Antenne réseau et terminal mobile
WO2015083894A1 (fr) Dispositif électronique et système de dispositifs électroniques
EP2982042A1 (fr) Terminal et son procédé de commande
WO2015026030A1 (fr) Dispositif d'affichage et son procédé de commande
WO2022045408A1 (fr) Terminal mobile pour afficher une interface utilisateur (ui) de notification et procédé de commande correspondant
WO2017010595A1 (fr) Clavier et système de terminal comprenant ce dernier
WO2014123306A1 (fr) Terminal mobile, et procédé de commande associé
WO2015125993A1 (fr) Terminal mobile et son procédé de commande
WO2014065595A1 (fr) Dispositif d'affichage d'image et procédé de commande associé
WO2019135458A1 (fr) Terminal mobile
WO2014010874A1 (fr) Terminal mobile et son procédé de commande

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13898562

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13898562

Country of ref document: EP

Kind code of ref document: A1