WO2017111192A1 - Rollable mobile terminal and control method thereof - Google Patents

Rollable mobile terminal and control method thereof

Info

Publication number
WO2017111192A1
Authority
WO
WIPO (PCT)
Prior art keywords
screen
touch
windows
mobile terminal
display area
Application number
PCT/KR2015/014291
Other languages
English (en)
Korean (ko)
Inventor
윤성혜
김수진
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Application filed by LG Electronics Inc. (엘지전자 주식회사)
Priority to PCT/KR2015/014291
Publication of WO2017111192A1

Classifications

    • G06F1/1652: Details related to the display arrangement, including those related to the mounting of the display in the housing, the display being flexible, e.g. mimicking a sheet of paper, or rollable
    • G06F1/1624: Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function, with sliding enclosures, e.g. sliding keyboard or display
    • G06F1/1677: Miscellaneous details related to the relative movement between the different enclosures or enclosure parts, for detecting open or closed state or particular intermediate positions assumed by movable parts of the enclosure, e.g. detection of display lid position with respect to main body in a laptop, detection of opening of the cover of battery compartment
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0486: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range; drag-and-drop
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886: Interaction techniques using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G09F9/30: Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements, in which the desired character or characters are formed by combining individual elements
    • H04M1/725: Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • G06F2203/04803: Split screen, i.e. subdividing the display area or the window area into separate subareas

Definitions

  • the present invention relates to a rollable mobile terminal having a rollable display and a control method thereof.
  • Terminals may be divided into mobile / portable terminals and stationary terminals according to their mobility.
  • the mobile terminal may be further classified into a handheld terminal and a vehicle mounted terminal according to whether a user can directly carry it.
  • The functions of mobile terminals are diversifying; examples include data and voice communication, capturing photos and videos with a camera, recording voice, playing music files through a speaker system, and outputting images or video to a display unit.
  • Some terminals have an electronic game play function or a multimedia player function.
  • recent mobile terminals may receive multicast signals that provide visual content such as broadcasting, video, and television programs.
  • Such a terminal is being implemented as a multimedia player having complex functions such as capturing photos or videos, playing music or video files, playing games, and receiving broadcasts.
  • However, the display area of a conventional mobile terminal is fixed to a constant size, and a mobile terminal with such a fixed-size display is inconvenient to carry.
  • Accordingly, flexible and rollable display devices, which can be bent and rolled up, have been researched and developed.
  • Using the characteristics of the rollable display, the user can adjust the display area in use according to his or her preference.
  • at least one region to which screen information is output may be selected from the entire region of the rollable display.
  • In general, the display is implemented as a touch screen, and user input is received through touches applied to the touch screen. In addition, user input that is difficult to provide by touch is received through a home button provided on the front of the mobile terminal. For example, when an application installed in the mobile terminal is executed, its execution screen is displayed; the mobile terminal controls the execution screen using touches applied to the touch screen, and when the home button is pressed, the execution screen is replaced with a home screen page.
  • However, the existing home button is designed to control a single screen output on a display area of fixed size. In a rollable display, the area in which screen information is output is changed by the user, so the function of the home button needs to be changed in consideration of the characteristics of the rollable display.
  • An object of the present invention is to provide a rollable mobile terminal that has a home button designed in consideration of the characteristics of a rollable display and that implements a new user interface using the home button, and a control method thereof.
  • To achieve this object, a rollable mobile terminal according to the present invention may include a touch screen formed to be rollable; a first body connected to one end of the touch screen; a second body connected to the other end of the touch screen, opposite to the one end; a guide unit disposed on at least one of the first and second bodies and configured to wind and receive the touch screen; and a controller configured to set at least a portion of the touch screen exposed outside the guide unit as a display area, to display a main screen on the display area, and to control the touch screen so that a multitasking screen including an execution screen of each running application is output to the display area when a predetermined user input is received. When the multitasking screen is displayed, the main screen is reduced to a predetermined size, and guide information for indicating one or more windows included in the main screen is displayed on the reduced main screen.
  • The one or more windows may include first and second windows, and the guide information may include a guide line for distinguishing the first and second windows.
  • the controller may change a screen included in any one of the first and second windows based on a touch input applied to the multitasking screen.
  • A part of the reduced main screen changes as the screen included in the one window is changed, and when the output of the multitasking screen ends, the controller may control the touch screen so that the main screen including the changed part is output to the display area.
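The flow described in the preceding paragraphs (reduce the main screen, draw guide lines for its windows, let the multitasking screen edit individual windows, then restore the possibly changed main screen) can be pictured with a small model. The Kotlin sketch below is only an illustrative reading of that flow under assumed names (RollableController, Window, and so on); it is not the patent's implementation.

```kotlin
// Illustrative model of entering and leaving the multitasking screen (all names are hypothetical).
data class Window(val id: Int, var content: String)

class RollableController(private val mainScreenWindows: MutableList<Window>) {
    var multitaskingVisible = false
        private set

    // Predetermined user input: reduce the main screen and derive guide information for its windows.
    fun enterMultitasking(): List<String> {
        multitaskingVisible = true
        return mainScreenWindows.zipWithNext { a, b -> "guide line between window ${a.id} and window ${b.id}" }
    }

    // A touch input on the multitasking screen changes the screen of exactly one window.
    fun changeWindowScreen(windowId: Int, executionScreen: String) {
        mainScreenWindows.first { it.id == windowId }.content = executionScreen
    }

    // When the multitasking screen ends, the main screen (including any changed part) returns to the display area.
    fun exitMultitasking(): List<Window> {
        multitaskingVisible = false
        return mainScreenWindows
    }
}

fun main() {
    val controller = RollableController(mutableListOf(Window(1, "browser"), Window(2, "map")))
    println(controller.enterMultitasking())        // [guide line between window 1 and window 2]
    controller.changeWindowScreen(2, "gallery")    // only window 2 changes
    println(controller.exitMultitasking())         // [Window(id=1, content=browser), Window(id=2, content=gallery)]
}
```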
  • When a touch input for moving any one execution screen included in the multitasking screen to the one window is applied, the controller may switch the screen displayed on the one window to that execution screen.
  • the screen displayed on one of the windows is switched to the one execution screen, but the screen displayed on the other window may be maintained.
  • the controller may switch a screen output from the one window to a home screen page.
  • different home screen pages may be set in the first and second windows.
  • When a touch input for moving one of the first and second windows onto the other is applied, the controller may control the touch screen to exchange the display positions of the first and second windows.
  • the controller may change the layout of the first and second windows based on a touch input with respect to the guide line.
  • When a touch input for moving the guide line to one end of the reduced main screen is applied, the controller may end the display of the first window or the second window corresponding to that end.
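As a rough illustration of how such a drag on the guide line might be interpreted, the sketch below maps the drop position of the guide line either to a new layout ratio or, near an end of the reduced main screen, to closing the window on that side. The normalized coordinate, the edge threshold, and all names are assumptions made for the example only.

```kotlin
// Hypothetical interpretation of a drag applied to the guide line between two windows.
// `position` is the guide line's drop position normalized to 0.0..1.0 across the reduced main screen.
sealed class LayoutChange
data class Resize(val firstWindowRatio: Double) : LayoutChange()
data class CloseWindow(val closedWindow: String) : LayoutChange()

fun onGuideLineDropped(position: Double, edgeThreshold: Double = 0.1): LayoutChange = when {
    position <= edgeThreshold       -> CloseWindow("first")   // dragged to one end: the first window's display ends
    position >= 1.0 - edgeThreshold -> CloseWindow("second")  // dragged to the other end: the second window's display ends
    else                            -> Resize(position)       // otherwise: the layout (split ratio) is changed
}

fun main() {
    println(onGuideLineDropped(0.65))  // Resize(firstWindowRatio=0.65)
    println(onGuideLineDropped(0.03))  // CloseWindow(closedWindow=first)
}
```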
  • The controller may display at least one of a screen split icon and a screen integration icon on the multitasking screen; the screen split icon is linked to a function of dividing one window included in the main screen into a plurality of windows, and the screen integration icon is linked to a function of integrating a plurality of windows included in the main screen into one window.
  • the controller may update the guide information when a function corresponding to the screen division icon or a function corresponding to the screen integration icon is executed.
  • the controller may selectively output the screen division icon or the screen integration icon according to the number of the one or more windows.
  • the number of one or more windows displayed on the display area may vary according to the size of the display area.
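Under the assumption (made only for this sketch) that the number of windows is derived from the exposed width of the rollable display and that the split and integration icons are offered depending on the current and maximum window counts, the idea can be illustrated as follows. The 300-pixel minimum window width and the icon rules are invented values, not figures from the patent.

```kotlin
// Hypothetical relationship between the exposed display area and the windows/icons offered.
const val MIN_WINDOW_WIDTH_PX = 300  // assumed minimum width per window

fun maxWindowCount(exposedWidthPx: Int): Int = maxOf(1, exposedWidthPx / MIN_WINDOW_WIDTH_PX)

fun iconsToShow(currentWindows: Int, maxWindows: Int): List<String> {
    val icons = mutableListOf<String>()
    if (currentWindows < maxWindows) icons.add("screen split icon")   // room for more windows: allow splitting
    if (currentWindows > 1) icons.add("screen integration icon")      // several windows: allow merging into one
    return icons
}

fun main() {
    val max = maxWindowCount(exposedWidthPx = 720)                 // e.g. two windows fit
    println(iconsToShow(currentWindows = 1, maxWindows = max))     // [screen split icon]
    println(iconsToShow(currentWindows = 2, maxWindows = max))     // [screen integration icon]
}
```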
  • In addition, an embodiment of the present invention for achieving the above object provides a control method of a rollable mobile terminal.
  • The control method may include setting at least a portion of the touch screen as a display area when at least the portion of the touch screen wound around the main body is exposed outside the main body; displaying a main screen on the display area; and displaying a multitasking screen including an execution screen of each running application on the display area when a predetermined user input is received while the main screen is displayed. The displaying of the multitasking screen may include reducing the main screen to a predetermined size and displaying, on the reduced main screen, guide information for indicating one or more windows included in the main screen.
  • The one or more windows may include first and second windows, and the guide information may include a guide line for distinguishing the first and second windows. The control method may further include changing a screen included in any one of the first and second windows based on a touch input applied to the multitasking screen.
  • A part of the reduced main screen changes as the screen included in the one window is changed, and the control method may further include controlling the touch screen so that the main screen including the changed part is output to the display area when the output of the multitasking screen ends.
  • The changing of the screen included in any one of the first and second windows may be switching the screen displayed on the one window to one execution screen included in the multitasking screen when a touch input for moving that execution screen to the one window is applied.
  • the screen displayed on one of the windows is switched to the one execution screen, but the screen displayed on the other window may be maintained.
  • one or more windows are output according to the size of the display area.
  • the multitasking screen is displayed according to the user's request. The user may control the one or more windows by using the multitasking screen.
  • the multitasking screen is used to control at least one of the plurality of windows included in the main screen, in addition to simply switching the entire main screen to another screen. Accordingly, a multitasking screen specialized for a rollable mobile terminal is provided, and a user can control one or more windows being output to the display area as needed.
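One way to picture the window-level control just described is drop-target resolution on the reduced main screen: the dragged execution screen replaces the content of whichever window contains the drop point, while the other windows keep their screens. The Kotlin sketch below is a simplified, hypothetical illustration of that idea; the geometry and names are assumptions.

```kotlin
// Hypothetical drop-target resolution on the reduced main screen.
data class Bounds(val left: Int, val right: Int) {
    operator fun contains(x: Int): Boolean = x in left until right
}
data class ScreenWindow(val bounds: Bounds, var screen: String)

fun dropExecutionScreen(windows: List<ScreenWindow>, dropX: Int, executionScreen: String) {
    // Only the window containing the drop point switches; the others are maintained.
    windows.firstOrNull { dropX in it.bounds }?.screen = executionScreen
}

fun main() {
    val windows = listOf(
        ScreenWindow(Bounds(0, 400), "messenger"),
        ScreenWindow(Bounds(400, 800), "calendar")
    )
    dropExecutionScreen(windows, dropX = 550, executionScreen = "video player")
    println(windows.map { it.screen })  // [messenger, video player]: only the second window changed
}
```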
  • FIG. 1A is a block diagram illustrating a mobile terminal related to the present invention.
  • FIGS. 1B, 1C, and 1D are conceptual views illustrating a rollable mobile terminal according to an embodiment of the present invention.
  • FIG. 2 is a conceptual diagram illustrating a rollable mobile terminal outputting a plurality of pieces of screen information.
  • FIGS. 3A to 3C are conceptual views illustrating a rollable mobile terminal outputting a plurality of windows or one window to a display area.
  • FIG. 4 is a flowchart illustrating a control method of a rollable mobile terminal according to the present invention.
  • FIGS. 5A to 5L are conceptual diagrams for describing the control method of FIG. 4.
  • FIGS. 6A and 6B are conceptual views illustrating a method of merging or dividing windows displayed in a display area.
  • The mobile terminal described herein may include a mobile phone, a smartphone, a laptop computer, a digital broadcasting terminal, a personal digital assistant, a portable multimedia player, a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device such as a smartwatch, smart glasses, or a head mounted display (HMD).
  • FIG. 1A is a block diagram illustrating a mobile terminal according to the present invention
  • FIGS. 1B and 1C are conceptual views of one example of the mobile terminal, viewed from different directions.
  • The mobile terminal 100 may include a wireless communication unit 110, an input unit 120, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180, and a power supply unit 190.
  • the components shown in FIG. 1A are not essential to implementing a mobile terminal, so that the mobile terminal described herein may have more or fewer components than those listed above.
  • Among these components, the wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and an external server.
  • the wireless communication unit 110 may include one or more modules for connecting the mobile terminal 100 to one or more networks.
  • The wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.
  • The input unit 120 may include a camera 121 or an image input unit for inputting an image signal, a microphone 122 or an audio input unit for inputting an audio signal, and a user input unit 123 (for example, a touch key or a mechanical key) for receiving information from a user.
  • the voice data or the image data collected by the input unit 120 may be analyzed and processed as a control command of the user.
  • the sensing unit 140 may include one or more sensors for sensing at least one of information in the mobile terminal, surrounding environment information surrounding the mobile terminal, and user information.
  • For example, the sensing unit 140 may include at least one of a proximity sensor 141, an illumination sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor, an optical sensor (e.g., the camera 121), a microphone (see 122), a battery gauge, and an environmental sensor.
  • the mobile terminal disclosed herein may use a combination of information sensed by at least two or more of these sensors.
  • The output unit 150 is for generating output related to sight, hearing, or touch, and may include at least one of a display unit 151, a sound output unit 152, a haptic module 153, and a light output unit 154.
  • the display unit 151 forms a layer structure with or is integrally formed with the touch sensor, thereby implementing a touch screen.
  • the touch screen may function as a user input unit 123 that provides an input interface between the mobile terminal 100 and the user, and may also provide an output interface between the mobile terminal 100 and the user.
  • the interface unit 160 serves as a path to various types of external devices connected to the mobile terminal 100.
  • For example, the interface unit 160 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, and an earphone port.
  • the memory 170 stores data supporting various functions of the mobile terminal 100.
  • the memory 170 may store a plurality of application programs or applications driven in the mobile terminal 100, data for operating the mobile terminal 100, and instructions. At least some of these applications may be downloaded from an external server via wireless communication.
  • At least some of these application programs may exist on the mobile terminal 100 from the time of shipment for the basic functions of the mobile terminal 100 (for example, call receiving, call sending, message receiving, and message sending functions).
  • the application program may be stored in the memory 170 and installed on the mobile terminal 100 to be driven by the controller 180 to perform an operation (or function) of the mobile terminal.
  • In addition to operations related to the application programs, the controller 180 typically controls the overall operation of the mobile terminal 100.
  • the controller 180 may provide or process information or a function appropriate to a user by processing signals, data, information, and the like, which are input or output through the above-described components, or by driving an application program stored in the memory 170.
  • controller 180 may control at least some of the components described with reference to FIG. 1A in order to drive an application program stored in the memory 170. Furthermore, the controller 180 may operate by combining at least two or more of the components included in the mobile terminal 100 to drive the application program.
  • the power supply unit 190 receives power from an external power source and an internal power source under the control of the controller 180 to supply power to each component included in the mobile terminal 100.
  • the power supply unit 190 includes a battery, which may be a built-in battery or a replaceable battery.
  • At least some of the components may operate in cooperation with each other to implement an operation, control, or control method of the mobile terminal according to various embodiments described below.
  • the operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the memory 170.
  • the broadcast receiving module 111 of the wireless communication unit 110 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.
  • the broadcast channel may include a satellite channel and a terrestrial channel.
  • Two or more broadcast receiving modules may be provided to the mobile terminal 100 for simultaneous broadcast reception or switching of broadcast channels for at least two broadcast channels.
  • The mobile communication module 112 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network established according to technical standards or communication schemes for mobile communication (for example, Global System for Mobile communication (GSM), Code Division Multi Access (CDMA), Code Division Multi Access 2000 (CDMA2000), Enhanced Voice-Data Optimized or Enhanced Voice-Data Only (EV-DO), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), and Long Term Evolution-Advanced (LTE-A)).
  • the wireless signal may include various types of data according to transmission and reception of a voice call signal, a video call call signal, or a text / multimedia message.
  • the wireless internet module 113 refers to a module for wireless internet access and may be embedded or external to the mobile terminal 100.
  • the wireless internet module 113 is configured to transmit and receive wireless signals in a communication network according to wireless internet technologies.
  • Examples of wireless Internet technologies include Wireless LAN (WLAN), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), and the like.
  • From this point of view, the wireless Internet module 113 that performs wireless Internet access through the mobile communication network may be understood as a kind of mobile communication module 112.
  • The short-range communication module 114 is for short-range communication and may support it using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, and Wireless Universal Serial Bus (Wireless USB) technologies.
  • The short-range communication module 114 may support wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and a network in which another mobile terminal 100 (or an external server) is located, through short-range wireless area networks. The short-range wireless area networks may be wireless personal area networks.
  • Here, the other mobile terminal 100 may be a wearable device capable of exchanging (or interworking) data with the mobile terminal 100 according to the present invention, for example, a smartwatch, smart glasses, or a head mounted display (HMD).
  • the short range communication module 114 may sense (or recognize) a wearable device that can communicate with the mobile terminal 100, around the mobile terminal 100.
  • The controller 180 may transmit at least a part of the data processed by the mobile terminal 100 to the wearable device through the short-range communication module 114. Accordingly, the user of the wearable device may use the data processed by the mobile terminal 100 through the wearable device. For example, when a call is received by the mobile terminal 100, the user may answer the call through the wearable device, and when a message is received by the mobile terminal 100, the user may check the received message through the wearable device.
  • the location information module 115 is a module for obtaining a location (or current location) of a mobile terminal, and a representative example thereof is a Global Positioning System (GPS) module or a Wireless Fidelity (WiFi) module.
  • the mobile terminal may acquire the location of the mobile terminal using a signal transmitted from a GPS satellite.
  • As another example, when the mobile terminal utilizes the Wi-Fi module, it may acquire its location based on information of a wireless access point (AP) that transmits or receives a wireless signal to or from the Wi-Fi module.
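As a rough sketch of the two positioning paths mentioned above, the example below prefers a GPS fix when one is available and otherwise estimates a position from the known locations of Wi-Fi access points currently in range (here simply averaged). The data classes, the AP list, and the averaging step are assumptions made for illustration, not the module's actual algorithm.

```kotlin
// Hypothetical chooser between GPS-based and Wi-Fi-AP-based positioning.
data class GeoPoint(val lat: Double, val lon: Double)

fun resolveLocation(gpsFix: GeoPoint?, visibleApLocations: List<GeoPoint>): GeoPoint? = when {
    gpsFix != null -> gpsFix                                   // a GPS fix is available: use it directly
    visibleApLocations.isNotEmpty() -> GeoPoint(               // otherwise estimate from nearby APs
        visibleApLocations.map { it.lat }.average(),
        visibleApLocations.map { it.lon }.average()
    )
    else -> null                                               // neither module can provide a position
}

fun main() {
    val aps = listOf(GeoPoint(37.56, 126.97), GeoPoint(37.57, 126.98))
    println(resolveLocation(gpsFix = null, visibleApLocations = aps))  // Wi-Fi based estimate
}
```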
  • the location information module 115 may perform any function of other modules of the wireless communication unit 110 to substitute or additionally obtain data regarding the location of the mobile terminal.
  • the location information module 115 is a module used to obtain the location (or current location) of the mobile terminal, and is not limited to a module that directly calculates or obtains the location of the mobile terminal.
  • The input unit 120 is for inputting image information (or an image signal), audio information (or an audio signal), data, or information input from a user, and the mobile terminal 100 may include one or a plurality of cameras 121 for inputting image information.
  • the camera 121 processes image frames such as still images or moving images obtained by the image sensor in the video call mode or the photographing mode.
  • the processed image frame may be displayed on the display unit 151 or stored in the memory 170.
  • The plurality of cameras 121 provided in the mobile terminal 100 may be arranged to form a matrix structure, and through the cameras 121 forming the matrix structure, a plurality of pieces of image information having various angles or focal points may be input to the mobile terminal 100.
  • the plurality of cameras 121 may be arranged in a stereo structure to acquire a left image and a right image for implementing a stereoscopic image.
  • the microphone 122 processes external sound signals into electrical voice data.
  • the processed voice data may be variously used according to a function (or an application program being executed) performed by the mobile terminal 100. Meanwhile, various noise reduction algorithms may be implemented in the microphone 122 to remove noise generated in the process of receiving an external sound signal.
  • The user input unit 123 is for receiving information from a user. When information is input through the user input unit 123, the controller 180 may control the operation of the mobile terminal 100 to correspond to the input information. The user input unit 123 may include a mechanical input means (for example, a button, a dome switch, a jog wheel, or a jog switch located on the front, rear, or side of the mobile terminal 100) and a touch-type input means. As an example, the touch-type input means may consist of a virtual key, a soft key, or a visual key displayed on the touch screen through software processing, or a touch key disposed on a portion other than the touch screen. The virtual key or the visual key may be displayed on the touch screen in various forms and may consist of, for example, graphics, text, icons, video, or a combination thereof.
  • the sensing unit 140 senses at least one of information in the mobile terminal, surrounding environment information surrounding the mobile terminal, and user information, and generates a sensing signal corresponding thereto.
  • the controller 180 may control driving or operation of the mobile terminal 100 or perform data processing, function or operation related to an application program installed in the mobile terminal 100 based on the sensing signal. Representative sensors among various sensors that may be included in the sensing unit 140 will be described in more detail.
  • the proximity sensor 141 refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface or an object present in the vicinity without using a mechanical contact by using an electromagnetic force or infrared rays.
  • the proximity sensor 141 may be disposed in an inner region of the mobile terminal covered by the touch screen described above or near the touch screen.
  • Examples of the proximity sensor 141 include a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.
  • the proximity sensor 141 may be configured to detect the proximity of the object by the change of the electric field according to the proximity of the conductive object.
  • the touch screen (or touch sensor) itself may be classified as a proximity sensor.
  • The proximity sensor 141 may detect a proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch movement state).
  • The controller 180 processes data (or information) corresponding to the proximity touch operation and the proximity touch pattern detected through the proximity sensor 141, and may further output visual information corresponding to the processed data on the touch screen. In addition, the controller 180 may control the mobile terminal 100 so that different operations or data (or information) are processed depending on whether a touch on the same point on the touch screen is a proximity touch or a contact touch.
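The idea that the same point can be handled differently for a proximity (hover) touch and a contact touch can be illustrated with a minimal dispatch sketch. The event types, field names, and the actions chosen here are purely illustrative assumptions.

```kotlin
// Hypothetical dispatch on touch kind: the same coordinates lead to different processing.
enum class TouchKind { PROXIMITY, CONTACT }
data class TouchEvent(val kind: TouchKind, val x: Int, val y: Int, val proximityMm: Int = 0)

fun handleTouch(event: TouchEvent): String = when (event.kind) {
    TouchKind.PROXIMITY -> "preview the item at (${event.x}, ${event.y}) sensed from ${event.proximityMm} mm away"
    TouchKind.CONTACT   -> "activate the item at (${event.x}, ${event.y})"
}

fun main() {
    println(handleTouch(TouchEvent(TouchKind.PROXIMITY, 120, 340, proximityMm = 8)))
    println(handleTouch(TouchEvent(TouchKind.CONTACT, 120, 340)))
}
```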
  • The touch sensor senses a touch (or a touch input) applied to the touch screen (or the display unit 151) using at least one of various touch methods, such as a resistive method, a capacitive method, an infrared method, an ultrasonic method, and a magnetic field method.
  • the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen or capacitance generated at the specific portion into an electrical input signal.
  • As an example, the touch sensor may be configured to detect the position and area at which a touch object touches the touch sensor on the touch screen, the pressure at the time of the touch, the capacitance at the time of the touch, and the like.
  • the touch object is an object applying a touch to the touch sensor and may be, for example, a finger, a touch pen or a stylus pen, a pointer, or the like.
  • When there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller; the touch controller processes the signal(s) and then transmits the corresponding data to the controller 180. In this way, the controller 180 can know which area of the display unit 151 has been touched.
  • the touch controller may be a separate component from the controller 180 or may be the controller 180 itself.
  • the controller 180 may perform different control or perform the same control according to the type of touch object that touches the touch screen (or a touch key provided in addition to the touch screen). Whether to perform different control or the same control according to the type of touch object may be determined according to the operation state of the mobile terminal 100 or an application program being executed.
  • The touch sensor and the proximity sensor described above may operate independently or in combination to sense various types of touches applied to the touch screen, such as a short (or tap) touch, a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.
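A simple classifier gives a feel for how the touch types listed above could be distinguished from raw samples. The simplified event model, the distance/duration/speed thresholds, and the two-finger heuristic below are all assumptions for this sketch, not values from the document.

```kotlin
import kotlin.math.hypot

// Illustrative classification of single- and two-finger touches (thresholds are assumed).
data class TouchSample(val x: Float, val y: Float, val timeMs: Long)

fun classifySingleTouch(samples: List<TouchSample>): String {
    val durationMs = samples.last().timeMs - samples.first().timeMs
    val distance = hypot(samples.last().x - samples.first().x, samples.last().y - samples.first().y)
    val speed = if (durationMs > 0) distance / durationMs else 0f
    return when {
        distance < 10f && durationMs < 300 -> "short (tap) touch"
        distance < 10f                     -> "long touch"
        speed > 1.5f                       -> "flick touch"
        else                               -> "drag or swipe touch"
    }
}

// Two simultaneous streams of equal length: a growing finger gap means pinch-out, a shrinking one means pinch-in.
fun classifyTwoFinger(a: List<TouchSample>, b: List<TouchSample>): String {
    fun gap(i: Int) = hypot(a[i].x - b[i].x, a[i].y - b[i].y)
    return if (gap(a.lastIndex) > gap(0)) "pinch-out touch" else "pinch-in touch"
}

fun main() {
    val drag = listOf(TouchSample(0f, 0f, 0), TouchSample(80f, 0f, 400))
    println(classifySingleTouch(drag))  // drag or swipe touch
}
```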
  • the ultrasonic sensor may recognize location information of a sensing object using ultrasonic waves.
  • the controller 180 can calculate the position of the wave generation source through the information detected from the optical sensor and the plurality of ultrasonic sensors.
  • The position of the wave generation source can be calculated using the property that light is much faster than ultrasonic waves, that is, the time for light to reach the optical sensor is much shorter than the time for the ultrasonic wave to reach the ultrasonic sensor. More specifically, the position of the wave generation source may be calculated from the time difference between the arrival of the ultrasonic wave and the arrival of the light, with the light serving as a reference signal.
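A small worked example of that time-difference idea: treating the light arrival as an (effectively instantaneous) reference, the ultrasonic travel time multiplied by the speed of sound gives the distance to the wave source. The numbers below are illustrative only.

```kotlin
// Distance from the time difference between the light reference and the ultrasonic arrival.
const val SPEED_OF_SOUND_M_PER_S = 343.0  // approximate speed of sound in air

fun distanceToSource(lightArrivalUs: Double, ultrasoundArrivalUs: Double): Double {
    val travelTimeS = (ultrasoundArrivalUs - lightArrivalUs) / 1_000_000.0
    return SPEED_OF_SOUND_M_PER_S * travelTimeS
}

fun main() {
    // Ultrasound arriving 500 microseconds after the light reference corresponds to roughly 0.17 m.
    println(distanceToSource(lightArrivalUs = 0.0, ultrasoundArrivalUs = 500.0))
}
```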
  • the camera 121 which has been described as the configuration of the input unit 120, includes at least one of a camera sensor (eg, CCD, CMOS, etc.), a photo sensor (or image sensor), and a laser sensor.
  • the camera 121 and the laser sensor may be combined with each other to detect a touch of a sensing object with respect to a 3D stereoscopic image.
  • The photo sensor may be stacked on the display element and is configured to scan the movement of a sensing object in proximity to the touch screen. More specifically, the photo sensor mounts photodiodes and transistors (TRs) in rows and columns and scans the content placed on the photo sensor using an electrical signal that changes according to the amount of light applied to the photodiodes. That is, the photo sensor calculates the coordinates of the sensing object according to the amount of change in light, and position information of the sensing object can be obtained through these coordinates.
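One hypothetical way to turn the per-photodiode light changes into coordinates, used here purely as an illustration, is an intensity-weighted centroid over the row/column grid; the grid values and the centroid approach are assumptions for this sketch.

```kotlin
// Estimate object coordinates as the centroid of light-change values over a photodiode grid.
fun estimateCoordinates(lightChange: Array<DoubleArray>): Pair<Double, Double>? {
    var total = 0.0
    var sumCol = 0.0
    var sumRow = 0.0
    for ((row, cols) in lightChange.withIndex()) {
        for ((col, value) in cols.withIndex()) {
            total += value
            sumCol += col * value
            sumRow += row * value
        }
    }
    return if (total > 0) Pair(sumCol / total, sumRow / total) else null
}

fun main() {
    val grid = arrayOf(
        doubleArrayOf(0.0, 0.1, 0.0),
        doubleArrayOf(0.1, 0.8, 0.1),   // the strongest change is near cell (row 1, column 1)
        doubleArrayOf(0.0, 0.1, 0.0)
    )
    println(estimateCoordinates(grid))  // approximately (1.0, 1.0)
}
```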
  • the display unit 151 displays (outputs) information processed by the mobile terminal 100.
  • the display unit 151 may display execution screen information of an application program driven in the mobile terminal 100 or user interface (UI) and graphical user interface (GUI) information according to the execution screen information. .
  • the display unit 151 may be configured as a stereoscopic display unit for displaying a stereoscopic image.
  • The stereoscopic display unit may employ a three-dimensional display method such as a stereoscopic method (glasses method), an autostereoscopic method (glasses-free method), or a projection method (holographic method).
  • the sound output unit 152 may output audio data received from the wireless communication unit 110 or stored in the memory 170 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like.
  • the sound output unit 152 may also output a sound signal related to a function (for example, a call signal reception sound or a message reception sound) performed in the mobile terminal 100.
  • the sound output unit 152 may include a receiver, a speaker, a buzzer, and the like.
  • the haptic module 153 generates various haptic effects that a user can feel.
  • a representative example of the tactile effect generated by the haptic module 153 may be vibration.
  • the intensity and pattern of vibration generated by the haptic module 153 may be controlled by the user's selection or the setting of the controller. For example, the haptic module 153 may synthesize different vibrations and output or sequentially output them.
  • In addition to vibration, the haptic module 153 can generate various tactile effects, such as a pin arrangement moving vertically against the contacted skin surface, a jet or suction force of air through a jet or suction opening, a brush against the skin surface, contact of an electrode, an electrostatic force, and the reproduction of a sense of cold or warmth using an element capable of absorbing or generating heat.
  • the haptic module 153 may not only deliver a tactile effect through direct contact, but also may allow a user to feel the tactile effect through a muscle sense such as a finger or an arm. Two or more haptic modules 153 may be provided according to a configuration aspect of the mobile terminal 100.
  • the light output unit 154 outputs a signal for notifying occurrence of an event by using light of a light source of the mobile terminal 100.
  • Examples of events occurring in the mobile terminal 100 may be message reception, call signal reception, missed call, alarm, schedule notification, email reception, information reception through an application, and the like.
  • the signal output from the light output unit 154 is implemented as the mobile terminal emits light of a single color or a plurality of colors to the front or the rear.
  • the signal output may be terminated by the mobile terminal detecting the user's event confirmation.
  • the interface unit 160 serves as a path to all external devices connected to the mobile terminal 100.
  • the interface unit 160 receives data from an external device, receives power, transfers the power to each component inside the mobile terminal 100, or transmits data inside the mobile terminal 100 to an external device.
  • A wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like may be included in the interface unit 160.
  • The identification module is a chip that stores various kinds of information for authenticating the usage rights of the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), and a universal subscriber identity module (USIM).
  • a device equipped with an identification module (hereinafter referred to as an 'identification device') may be manufactured in the form of a smart card. Therefore, the identification device may be connected to the terminal 100 through the interface unit 160.
  • In addition, the interface unit 160 may serve as a passage through which power from an external cradle is supplied to the mobile terminal 100 when the mobile terminal 100 is connected to the cradle, or as a passage through which various command signals input from the cradle by the user are transferred to the mobile terminal 100.
  • Various command signals or power input from the cradle may operate as signals for recognizing that the mobile terminal 100 is correctly mounted on the cradle.
  • the memory 170 may store a program for the operation of the controller 180 and may temporarily store input / output data (for example, a phone book, a message, a still image, a video, etc.).
  • the memory 170 may store data regarding vibration and sound of various patterns output when a touch input on the touch screen is performed.
  • The memory 170 may include at least one type of storage medium among a flash memory type, a hard disk type, a solid state disk (SSD) type, a silicon disk drive (SDD) type, a multimedia card micro type, a card-type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk.
  • the mobile terminal 100 may be operated in connection with a web storage that performs a storage function of the memory 170 on the Internet.
  • the controller 180 controls the operation related to the application program, and generally the overall operation of the mobile terminal 100. For example, if the state of the mobile terminal satisfies a set condition, the controller 180 may execute or release a lock state that restricts input of a user's control command to applications.
  • The controller 180 may perform control and processing related to a voice call, data communication, a video call, and the like, or may perform pattern recognition processing capable of recognizing handwriting input or drawing input performed on the touch screen as text and images, respectively. Furthermore, the controller 180 may control any one or a combination of the components described above in order to implement the various embodiments described below on the mobile terminal 100 according to the present invention.
  • the power supply unit 190 receives an external power source and an internal power source under the control of the controller 180 to supply power for operation of each component.
  • the power supply unit 190 includes a battery, and the battery may be a built-in battery configured to be rechargeable, and may be detachably coupled to the terminal body for charging.
  • The power supply unit 190 may be provided with a connection port, and the connection port may be configured as an example of the interface unit 160 to which an external charger supplying power for charging the battery is electrically connected.
  • the power supply unit 190 may be configured to charge the battery in a wireless manner without using the connection port.
  • In this case, the power supply unit 190 may receive power from an external wireless power transmitter using one or more of an inductive coupling method based on magnetic induction and a magnetic resonance coupling method based on electromagnetic resonance.
  • various embodiments of the present disclosure may be implemented in a recording medium readable by a computer or a similar device using, for example, software, hardware, or a combination thereof.
  • the disclosed mobile terminal 100 includes a terminal body in the form of a bar.
  • However, the present invention is not limited thereto and can be applied to various structures such as a watch type, a clip type, a glass type, a folder type, a flip type, a slide type, a swing type, and a swivel type in which two or more bodies are coupled to be movable relative to each other.
  • a description of a particular type of mobile terminal may generally apply to other types of mobile terminals.
  • the terminal body may be understood as a concept that refers to the mobile terminal 100 as at least one aggregate.
  • the mobile terminal 100 includes a case (eg, a frame, a housing, a cover, etc.) forming an external appearance. As shown, the mobile terminal 100 may include a front case 101 and a rear case 102. Various electronic components are disposed in the internal space formed by the combination of the front case 101 and the rear case 102. At least one middle case may be additionally disposed between the front case 101 and the rear case 102.
  • the display unit 151 may be disposed in front of the terminal body to output information. As shown, the window 151a of the display unit 151 may be mounted to the front case 101 to form a front surface of the terminal body together with the front case 101.
  • an electronic component may be mounted on the rear case 102.
  • Electronic components attachable to the rear case 102 include a removable battery, an identification module, a memory card, and the like.
  • A rear cover 103 may be detachably coupled to the rear case 102 to cover the mounted electronic components. Therefore, when the rear cover 103 is separated from the rear case 102, the electronic components mounted on the rear case 102 are exposed to the outside.
  • When the rear cover 103 is coupled to the rear case 102, a portion of the side surface of the rear case 102 may be exposed. In some cases, the coupled rear case 102 may be completely covered by the rear cover 103. Meanwhile, the rear cover 103 may be provided with an opening for exposing the camera 121b or the sound output unit 152b to the outside.
  • the cases 101, 102, and 103 may be formed by injecting a synthetic resin, or may be formed of a metal, for example, stainless steel (STS), aluminum (Al), titanium (Ti), or the like.
  • the mobile terminal 100 may be configured such that one case may provide the internal space, unlike the above example in which a plurality of cases provide an internal space for accommodating various electronic components.
  • In this case, a unibody mobile terminal 100 in which synthetic resin or metal extends from the side surface to the rear surface may be implemented.
  • the mobile terminal 100 may be provided with a waterproof portion (not shown) to prevent water from seeping into the terminal body.
  • the waterproof portion is provided between the window 151a and the front case 101, between the front case 101 and the rear case 102 or between the rear case 102 and the rear cover 103, and a combination thereof. It may include a waterproof member for sealing the inner space.
  • The mobile terminal 100 may include a display unit 151, first and second sound output units 152a and 152b, a proximity sensor 141, an illuminance sensor 142, an optical output unit 154, first and second cameras 121a and 121b, first and second manipulation units 123a and 123b, a microphone 122, an interface unit 160, and the like.
  • Hereinafter, as an example, a mobile terminal 100 in which the display unit 151, the first sound output unit 152a, the proximity sensor 141, the illuminance sensor 142, the light output unit 154, the first camera 121a, and the first manipulation unit 123a are disposed on the front surface of the terminal body, the second manipulation unit 123b, the microphone 122, and the interface unit 160 are disposed on the side surface of the terminal body, and the second sound output unit 152b and the second camera 121b are disposed on the rear surface of the terminal body will be described.
  • For example, the first manipulation unit 123a may not be provided on the front surface of the terminal body, and the second sound output unit 152b may be provided on the side surface of the terminal body instead of the rear surface.
  • the display unit 151 displays (outputs) information processed by the mobile terminal 100.
  • the display unit 151 may display execution screen information of an application program driven in the mobile terminal 100 or user interface (UI) and graphical user interface (GUI) information according to the execution screen information. .
  • The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and an e-ink display.
  • two or more display units 151 may exist according to an implementation form of the mobile terminal 100.
  • the plurality of display units may be spaced apart or integrally disposed on one surface of the mobile terminal 100, or may be disposed on different surfaces.
  • the display unit 151 may include a touch sensor that senses a touch on the display unit 151 so as to receive a control command by a touch method.
  • the touch sensor may sense the touch, and the controller 180 may generate a control command corresponding to the touch based on the touch sensor.
  • the content input by the touch method may be letters or numbers or menu items that can be indicated or designated in various modes.
  • The touch sensor may be formed of a film having a touch pattern and disposed between the window 151a and a display (not shown) on the rear surface of the window 151a, or may be directly patterned on the rear surface of the window 151a. Alternatively, the touch sensor may be formed integrally with the display. For example, the touch sensor may be disposed on a substrate of the display or provided inside the display.
  • the display unit 151 may form a touch screen together with the touch sensor.
  • the touch screen may function as the user input unit 123 (see FIG. 1A).
  • the touch screen may replace at least some functions of the first operation unit 123a.
  • The first sound output unit 152a may be implemented as a receiver that transmits call sound to the user's ear, and the second sound output unit 152b may be implemented in the form of a loudspeaker that outputs various alarm sounds or multimedia playback sounds.
  • a sound hole for emitting sound generated from the first sound output unit 152a may be formed in the window 151a of the display unit 151.
  • the present invention is not limited thereto, and the sound may be configured to be emitted along an assembly gap between the structures (for example, a gap between the window 151a and the front case 101).
  • In this case, the hole formed independently for sound output is invisible or hidden from the outside, which can further simplify the appearance of the mobile terminal 100.
  • the light output unit 154 is configured to output light for notifying when an event occurs. Examples of the event may include message reception, call signal reception, missed call, alarm, schedule notification, email reception, information reception through an application, and the like.
  • When the user's confirmation of the event is detected, the controller 180 may control the light output unit 154 to end the output of light.
  • the first camera 121a processes an image frame of a still image or a moving image obtained by the image sensor in a shooting mode or a video call mode.
  • the processed image frame may be displayed on the display unit 151 and stored in the memory 170.
  • The first and second manipulation units 123a and 123b are examples of the user input unit 123, which is manipulated to receive commands for controlling the operation of the mobile terminal 100, and may be collectively referred to as a manipulating portion.
  • The first and second manipulation units 123a and 123b may adopt any tactile manner in which the user operates them while receiving a tactile feeling, such as touching, pushing, or scrolling, and may also adopt a manner in which they are operated without a tactile feeling, such as a proximity touch or a hovering touch.
  • the first operation unit 123a is illustrated as being a touch key, but the present invention is not limited thereto.
  • the first manipulation unit 123a may be a mechanical key or a combination of a touch key and a push key.
  • the contents input by the first and second manipulation units 123a and 123b may be variously set.
  • For example, the first manipulation unit 123a may receive commands such as menu, home key, cancel, and search, and the second manipulation unit 123b may receive commands such as adjusting the volume of sound output from the first or second sound output units 152a and 152b and switching the display unit 151 to a touch recognition mode.
  • a rear input unit (not shown) may be provided on the rear surface of the terminal body.
  • the rear input unit is manipulated to receive a command for controlling the operation of the mobile terminal 100, and the input contents may be variously set. For example, commands such as power on / off, start, end, scroll, etc., control of the volume of sound output from the first and second sound output units 152a and 152b, and the touch recognition mode of the display unit 151. Commands such as switching can be received.
  • the rear input unit may be implemented in a form capable of input by touch input, push input, or a combination thereof.
  • the rear input unit may be disposed to overlap the front display unit 151 in the thickness direction of the terminal body.
  • the rear input unit may be disposed at the upper rear end of the terminal body so that the user can easily manipulate it with the index finger when gripping the terminal body with one hand.
  • the present invention is not necessarily limited thereto, and the position of the rear input unit may be changed.
  • when the rear input unit is provided at the rear of the terminal body, a new type of user interface using the rear input unit may be implemented.
  • when the touch screen or the rear input unit described above replaces at least some functions of the first operation unit 123a provided on the front of the terminal body, the first operation unit 123a may not be disposed on the front of the terminal body.
  • the display unit 151 may be configured with a larger screen.
  • the mobile terminal 100 may be provided with a fingerprint recognition sensor for recognizing a user's fingerprint, and the controller 180 may use fingerprint information detected through the fingerprint recognition sensor as an authentication means.
  • the fingerprint recognition sensor may be embedded in the display unit 151 or the user input unit 123.
  • the microphone 122 is configured to receive a user's voice, other sounds, and the like.
  • the microphone 122 may be provided at a plurality of locations and configured to receive stereo sound.
  • the interface unit 160 serves as a path for connecting the mobile terminal 100 to an external device.
  • the interface unit 160 may be at least one of a connection terminal for connecting to another device (for example, an earphone or an external speaker), a port for short-range communication (for example, an infrared (IrDA) port, a Bluetooth port, or a wireless LAN port), or a power supply terminal for supplying power to the mobile terminal 100.
  • the interface unit 160 may be implemented in the form of a socket for receiving an external card, such as a subscriber identification module (SIM), a user identity module (UIM), or a memory card for storing information.
  • the second camera 121b may be disposed on the rear surface of the terminal body. In this case, the second camera 121b has a photographing direction substantially opposite to that of the first camera 121a.
  • the second camera 121b may include a plurality of lenses arranged along at least one line.
  • the plurality of lenses may be arranged in a matrix format.
  • Such a camera may be referred to as an 'array camera'.
  • when the second camera 121b is configured as an array camera, images may be photographed in various ways using the plurality of lenses, and images of better quality may be obtained.
  • the flash 124 may be disposed adjacent to the second camera 121b.
  • the flash 124 shines light toward the subject when the subject is photographed by the second camera 121b.
  • the second sound output unit 152b may be additionally disposed on the terminal body.
  • the second sound output unit 152b may implement a stereo function together with the first sound output unit 152a and may be used to implement a speakerphone mode during a call.
  • the terminal body may be provided with at least one antenna for wireless communication.
  • the antenna may be built in the terminal body or formed in the case.
  • an antenna that forms part of the broadcast receiving module 111 (refer to FIG. 1A) may be configured to be pulled out from the terminal body.
  • the antenna may be formed in a film type and attached to the inner side of the rear cover 103, or may be configured such that a case including a conductive material functions as an antenna.
  • the terminal body is provided with a power supply unit 190 (see FIG. 1A) for supplying power to the mobile terminal 100.
  • the power supply unit 190 may include a battery 191 embedded in the terminal body or detachably configured from the outside of the terminal body.
  • the battery 191 may be configured to receive power through a power cable connected to the interface unit 160.
  • the battery 191 may be configured to enable wireless charging through a wireless charger.
  • the wireless charging may be implemented by a magnetic induction method or a resonance method (magnetic resonance method).
  • the rear cover 103 is coupled to the rear case 102 to cover the battery 191 to limit the detachment of the battery 191 and to protect the battery 191 from external shock and foreign matter.
  • the rear cover 103 may be detachably coupled to the rear case 102.
  • An accessory may be added to the mobile terminal 100 to protect the appearance or to assist or expand the function of the mobile terminal 100.
  • An example of such an accessory may be a cover or pouch that covers or accommodates at least one surface of the mobile terminal 100.
  • the cover or pouch may be configured to extend the function of the mobile terminal 100 in conjunction with the display unit 151.
  • Another example of the accessory may be a touch pen for assisting or extending a touch input to a touch screen.
  • the display unit 151 described above may be configured to be deformable by an external force.
  • the deformation may be at least one of bending, folding, twisting, and curling of the display unit 151.
  • the deformable display unit 151 may be referred to as a 'flexible display unit'.
  • the flexible display unit may include both a general flexible display, an e-paper, and a combination thereof.
  • a general flexible display is a light and durable display fabricated on a thin, flexible substrate that can be bent, folded, twisted, or curled like paper while maintaining the characteristics of a conventional flat panel display.
  • electronic paper is a display technology to which the characteristics of general ink are applied; unlike a conventional flat panel display, it uses reflected light.
  • Electronic paper can change information using twist balls or electrophoresis using capsules.
  • in the first state, in which the flexible display unit is not deformed, the display area of the flexible display unit is flat.
  • in the second state, in which the flexible display unit is deformed by an external force, the display area may be a curved surface.
  • the information displayed in the second state may be visual information output on the curved surface.
  • Such visual information is implemented by independently controlling the light emission of unit pixels (sub-pixels) arranged in a matrix form.
  • the unit pixel refers to the minimum unit for implementing one color.
  • At least a portion of the flexible display unit may be placed in a curved state (for example, a vertically or horizontally curved state) rather than a flat state in the first state.
  • the flexible display unit may be deformed into a flat state (or less bent state) or more bent state.
  • the flexible display unit may be combined with a touch sensor to implement a flexible touch screen.
  • the controller 180 (refer to FIG. 1A) may perform control corresponding to the touch input.
  • the flexible touch screen may be configured to detect a touch input not only in the first state but also in the second state.
  • the mobile terminal 100 may be provided with deformation detection means for detecting the deformation of the flexible display unit.
  • deformation detection means may be included in the sensing unit 140 (see FIG. 1A).
  • the deformation detecting means may be provided in the flexible display unit or the case to detect information related to deformation of the flexible display unit.
  • the information related to the deformation may be the direction in which the flexible display unit is deformed, the degree of deformation, the deformed position, the deformation time, the acceleration at which the deformed flexible display unit is restored, or any of a variety of other information detectable due to the deformation.
  • the deformation detection unit may detect at least one region exposed out of the terminal from the entire region of the flexible display unit.
  • the controller 180 may generate a control signal for changing the information displayed on the flexible display unit or controlling a function of the mobile terminal 100, based on the information related to the deformation of the flexible display unit detected by the deformation detecting means.
  • the mobile terminal 100 may include a case accommodating the flexible display unit.
  • the case may be configured to be deformable together with the flexible display unit 251 by an external force in consideration of characteristics of the flexible display unit.
  • the battery (not shown) included in the mobile terminal 200 may also be configured to be deformable together with the flexible display unit by an external force in consideration of characteristics of the flexible display unit.
  • a stack and folding method in which battery cells are stacked up may be applied.
  • the state deformation of the flexible display unit is not limited to external force.
  • the flexible display unit may be transformed into the second state by a command of a user or an application.
  • the rollable display unit is stored in a rolled state.
  • the user may apply an external force to the mobile terminal to expose or receive the display unit.
  • FIGS. 1B, 1C, and 1D are conceptual views illustrating a rollable mobile terminal according to an embodiment of the present invention.
  • FIG. 1B is a front view illustrating the mobile terminal in the closed state in which the first touch screen 151a is accommodated in the body portion
  • FIG. 1C illustrates the mobile terminal in the open state in which the first touch screen 151a is exposed from the body portion
  • FIG. 1D is a rear view of the mobile terminal in the closed state.
  • the mobile terminal 100 of the present invention may include first and second body parts 101 and 102, first and second touch screens 151a and 151b, and a first operation unit 123a.
  • the mobile terminal 100 of the present invention includes first and second body parts 101 and 102.
  • the first and second body parts 101 and 102 extend in a first direction D1 and are spaced apart from each other based on a tension applied in a direction crossing the first direction D1.
  • One surface of the first body portion 101 and one surface of the second body portion 102 may be in contact with each other in the closed state.
  • the first and second body parts 101 and 102 may extend along the first direction D1 such that the lengths of the first and second body parts 101 and 102 are substantially the same.
  • the shape of the one surface of the first body portion 101 may correspond to the shape of one surface of the second body portion 102.
  • the first touch screen 151a may be stored inside at least one of the first and second body parts 101 and 102 and may not be exposed. This is to prevent the first touch screen 151a from being damaged by an external environment.
  • the user can freely adjust the size of the terminal according to the use environment using the rollable characteristic.
  • the first touch screen 151a may be accommodated for easy portability to reduce the size of the terminal, or the entire area of the first touch screen 151a may be exposed to use a large display.
  • in the present embodiment, the first touch screen 151a is described as being stored in at least one of the first and second body parts 101 and 102 in the closed state and thus not exposed, but the invention is not limited thereto.
  • a predetermined area of the first touch screen 151a may be exposed without being received in the first and second body parts 101 and 102.
  • the user can check information related to an event occurring in the mobile terminal even in the closed state through the predetermined area.
  • the first body portion 101 and / or the second body portion 102 of the mobile terminal 100 includes a guide unit for guiding the first touch screen 151a to be rolled and accommodated.
  • when the guide unit is provided in the first body portion 101, the guide unit extends along the first direction D1 and includes a storage space for accommodating a stylus pen (not shown).
  • the mobile terminal 100 may further include a stylus pen accommodated in the first body portion 101.
  • the stylus pen extends along the first direction D1 and may be received in the first body portion 101 along the first direction D1.
  • the stylus pen is formed to be detachable from the first body portion 101 based on an external force of the user.
  • An end portion of the stylus pen may be exposed to the outside of the first body portion 101 in a state accommodated in the guide unit.
  • a user may apply a force to the stylus pen exposed from the first body portion 101 to separate the stylus pen from the first body portion 101.
  • one end of the first touch screen 151a is fixed to the first body portion 101 so as to be rollable by the guide unit, and the other end of the first touch screen 151a is fixed to the second body portion 102.
  • the first touch screen 151a is rolled by the guide unit and is gradually stored inside the first body part 101.
  • the first and second body parts 101 and 102 are spaced apart from each other based on tension applied in a direction crossing the first direction D1, and the first touch screen 151a is exposed between the first and second body parts 101 and 102.
  • as the first and second body parts 101 and 102 move away from each other by the external force, the exposed area of the first touch screen 151a increases.
  • the first body part 101 may include a light transmitting part (not shown) made of a light-transmitting material so that a part of the first touch screen 151a is visible through it. An image output from one area of the first touch screen 151a accommodated in the first body part 101 may be checked through the light transmitting part.
  • the first touch screen 151a may be integrally formed with a touch sensor unit that receives a user's touch input. Accordingly, the touch sensor included in the first touch screen 151a may sense the touch input applied through the light transmitting unit. An independent touch sensor may be included on the light transmitting part.
  • the controller may control one region of the first touch screen 151a corresponding to the light transmitting part to output, in the closed state, current state information of the mobile terminal 100 (current time, current location, current date, etc.) and received event information. In this case, the remaining area of the first touch screen 151a, excluding the one region corresponding to the light transmitting part, may be kept inactive. As a result, unnecessary waste of power can be prevented.
  • a second touch screen 151b may be formed in the first body portion 101 instead of the light transmitting portion.
  • the second touch screen 151b may include at least one of status information of the mobile terminal 100 and icons of preset applications.
  • the state information of the mobile terminal 100 may include at least one of antenna information, communication mode information, battery information, event information generated, information on a set function, time information, and weather information of the mobile terminal 100.
  • the preset application may include at least one of an icon and a widget corresponding to a specific application preset by the user's selection or the controller 180.
  • the second touch screen 151b may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and an e-ink display.
  • while the first touch screen 151a is accommodated in at least one of the first and second body parts 101 and 102, the state information of the mobile terminal 100 may be displayed on the second touch screen 151b.
  • one end of the first touch screen 151a is fixed to the first body portion 101, and the other end of the first touch screen 151a is fixed to the second body portion 102. Accordingly, the user may grasp and pull the first and second body parts 101 and 102 by hand to expose the display part that is rolled and accommodated in at least one of the first and second body parts 101 and 102.
  • of the entire area of the first touch screen 151a, at least one area that is exposed to the outside so that the user can check visual information is defined as the 'display area', and the area excluding the display area is defined as the 'remaining area'.
  • the controller 180 may detect at least one of the display area and the remaining area by using the sensing unit 140 or the deformation detection means. Based on the detection result, the controller 180 can control the first touch screen 151a to output the screen information while the display area is turned on and to maintain the remaining area off.
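  • as a minimal sketch of this on/off control (the FlexibleScreen interface and its methods are assumptions for illustration, not an actual platform API), the sensed display area can be driven and drawn while the rolled-in remainder stays off:

```kotlin
// Sketch: turn on and draw only the exposed display area; keep the rest off.
// Assumes the display area starts at the exposed edge (position 0 side).
interface FlexibleScreen {
    val totalWidthPx: Int
    fun setRegionEnabled(startPx: Int, widthPx: Int, enabled: Boolean)
    fun draw(screenInfo: String, startPx: Int, widthPx: Int)
}

data class DisplayArea(val startPx: Int, val widthPx: Int)   // measured by the sensing unit

fun refreshScreen(screen: FlexibleScreen, exposed: DisplayArea, screenInfo: String) {
    // Display area: turned on, screen information output.
    screen.setRegionEnabled(exposed.startPx, exposed.widthPx, true)
    screen.draw(screenInfo, exposed.startPx, exposed.widthPx)

    // Remaining (rolled-in) area: kept off to avoid wasting power.
    val restStart = exposed.startPx + exposed.widthPx
    val restWidth = screen.totalWidthPx - restStart
    if (restWidth > 0) screen.setRegionEnabled(restStart, restWidth, false)
}
```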
  • the display area displays (outputs) information processed by the mobile terminal 100 in the open state.
  • the display area may display execution screen information of an application program driven in the mobile terminal 100, or user interface (UI) and graphical user interface (GUI) information according to the execution screen information.
  • the first and second touch screens 151a and 151b may include a touch sensor that senses a touch on the touch screen so as to receive a control command by a touch method. When a touch is made on the touch screen, the touch sensor senses the touch, and the controller 180 may generate a control command corresponding to the sensed touch.
  • the content input by the touch method may be letters or numbers or menu items that can be indicated or designated in various modes.
  • the touch sensor may be formed of a film having a touch pattern and disposed between the window 151a and a display (not shown) on the rear surface of the window 151a, or may be directly patterned on the rear surface of the window 151a. Alternatively, the touch sensor may be integrally formed with the display. For example, the touch sensor may be disposed on a substrate of the display or provided inside the display.
  • the display unit illustrated in FIG. 1A may form a touch screen together with a touch sensor, and in this case, the touch screen may function as a user input unit 123 (see FIG. 1A). In some cases, the touch screen may replace at least some functions of the user input unit.
  • a first manipulation unit 123a is disposed in the second body portion 102.
  • the present invention is not necessarily limited thereto, and the position of the first manipulation unit 123a may be changed.
  • a control command for the first touch screen 151a may be received through the first user input unit 123a. More specifically, when a push input is applied to the first user input unit 123a in the open state, the first touch screen 151a is switched to an activated state. That is, the first user input unit 123a may be a home button for turning the first touch screen 151a on and off.
  • the first user input unit 123a includes a button disposed on a front surface of the second body part and configured to receive a push input, and a fingerprint scanner arranged to overlap with the button to recognize a fingerprint of a contacting finger.
  • the button is exposed to the outside on the front of the terminal and is configured to receive a push input for a first function.
  • the first function may be a function of displaying predetermined screen information on the display area.
  • the web page screen of the display area may be converted to a home screen page.
  • that is, the screen of the display area may be switched to the home screen page by a push input applied to the button.
  • the first user input unit 123a may be configured to scan a fingerprint from a finger of the user applying the push along with the push input applied. More specifically, a fingerprint scanner module is mounted on one surface of the button (upper surface in this example), thereby recognizing a fingerprint of a finger in contact with the button. The button is slidably formed by the push, thereby pressing the switch. The controller detects the push input when the switch is pressed to process the corresponding control command.
  • the push button may be replaced with a touch button.
  • the fingerprint scanner module may be disposed on the upper surface of the touch button. Therefore, in the example described below, the push input through the push button may be replaced with the touch input through the touch button.
  • the fingerprint scanner module may be utilized as a function key associated with power on and off. For example, when the button of the first user input unit 123a is pushed while the terminal is powered off, fingerprint recognition is performed while power is supplied to the fingerprint scanner module. The terminal determines whether the recognized fingerprint corresponds to a pre-stored user's fingerprint, and if so, continues the power on and booting of the terminal. In this way, as the power on command of the terminal is made in two stages (fingerprint scanner power on and terminal power on), the current consumption may be smaller. That is, a security function according to fingerprint recognition is provided to the power-on operation, but the fingerprint scanner module can be turned off together with the terminal body instead of being continuously turned on.
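  • the two-stage sequence described above can be sketched as follows (all names are hypothetical; the sketch only illustrates the order of operations: scanner power-on, fingerprint match, then terminal boot):

```kotlin
// Sketch of the two-stage power-on: stage 1 powers only the fingerprint
// scanner; stage 2 boots the terminal only if the scanned fingerprint matches
// the pre-stored user fingerprint.
interface FingerprintScanner {
    fun powerOn()
    fun powerOff()
    fun scan(): ByteArray          // template of the finger contacting the button
}

class PowerOnController(
    private val scanner: FingerprintScanner,
    private val storedTemplate: ByteArray,
    private val bootTerminal: () -> Unit
) {
    fun onPowerButtonPushed() {
        scanner.powerOn()                              // stage 1: scanner only
        val scanned = scanner.scan()
        if (scanned.contentEquals(storedTemplate)) {
            bootTerminal()                             // stage 2: terminal power on and booting
        } else {
            scanner.powerOff()                         // terminal stays off; scanner not kept on
        }
    }
}
```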
  • the display area may directly enter the home screen page without passing through the lock screen.
  • otherwise, the terminal may output a warning message indicating that the fingerprint is not the user's fingerprint.
  • the warning message may be configured to give a lighting effect to the front of the terminal for a predetermined time or output an effect sound.
  • the lock screen may be switched to a home screen page while being unlocked.
  • the first user input unit 123a may be a hot key for unlocking.
  • the unlocking is performed only when the fingerprint scanner performs fingerprint recognition on an object contacting the fingerprint scanner for push, and the recognized fingerprint corresponds to a pre-stored user's fingerprint.
  • a notification message indicating that fingerprint is being recognized may be output on the second touch screen 151b of the terminal.
  • the first body portion 101 and the second body portion 102 may move in a direction away from each other by a user gesture, or may move in a direction approaching each other.
  • the first and second body parts 101 and 102 are spaced apart from each other based on a tension applied in a direction crossing the first direction D1 in the locked state.
  • the first touch screen 151a is exposed between the body portions 101 and 102. Exposing at least one area of the first touch screen 151a, stored in the guide unit, out of the guide unit is defined as a 'roll-out'.
  • the closed state may be switched to the open state by the roll out.
  • the controller 180 determines whether to release the locked state based on the fingerprint scanner recognizing the fingerprint of the finger.
  • a lock screen configured to receive a password for unlocking while maintaining the lock state is output to the display area.
  • the password may be set in various forms such as a password set by a user or a pattern.
  • the first touch screen 151a may be in a state in which only touch inputs related to the operation of releasing the locked state are possible.
  • the operation of releasing the locked state may be an operation of inputting a pattern or a password set by a user.
  • the controller 180 switches the locked state to the released state and outputs a home screen page to the display area.
  • Guide information indicating that fingerprint recognition was successful may be output to the second touch screen 151b.
  • the home screen page may also be referred to as an idle screen, and a touch input to the display area is possible.
  • the first user input unit 123a may be a hot key for unlocking.
  • the home screen page may be output on the display unit 251 when the terminal is in the standby state. More specifically, the home screen page may display an icon or a widget of an application installed in the mobile terminal. In addition, a plurality of home screen pages may be configured according to a user's selection or the number of applications installed in the terminal.
  • switching to the home screen page may be performed in a form of returning to the home screen page in a state in which a specific application is being executed, rather than a lock mode.
  • the first user input unit 123a may be pushed to return to the home screen page. That is, the first user input unit 123a may operate as a home key for performing a control command to return to the home screen page.
  • the display area, which was turned off, is turned on by the roll-out, and screen information is displayed on it.
  • the locked state is released or maintained according to whether a fingerprint matching the previously stored user fingerprint is recognized by the first user input unit 123a.
  • the user may control the rollable terminal to output different screen information upon roll out.
  • a plurality of icons related to the execution of different applications may be displayed on the second touch screen 151b.
  • the controller 180 may output a plurality of icons to the second touch screen 151b.
  • the plurality of icons may be associated with an application that was selected by the user or recently executed.
  • the type of the preset application may be reset by user input.
  • when a touch is applied to any one of the icons, the controller 180 executes the application corresponding to the touched icon and outputs its execution screen to the display area. For example, if a touch is applied to any one of the icons displayed on the second touch screen 151b while the home screen page is output to the display area, the home screen page is switched to the execution screen of the application corresponding to the touched icon.
  • the icons disappear from the second touch screen 151b, and graphic objects corresponding to a function related to the application are displayed on the second touch screen.
  • the display area is changed from a vertically long rectangular shape to a horizontally long rectangular shape.
  • the rollable mobile terminal according to the present invention may display at least one of the graphic objects to be included in the execution screen on the second touch screen 151b instead of the display area (or on both the display area and the second touch screen 151b in an overlapping manner).
  • as a result, the user may execute a function related to the execution screen by using the hand holding the terminal.
  • the second touch screen 151b serves as an information provider for providing simple information in the locked state, provides predetermined icons for quick execution in the open state, or provides an interface for controlling the screen information being output to the display area.
  • the mobile terminal can effectively output information necessary for the user.
  • the controller 180 locks the terminal. That is, the terminal is switched to the locked state.
  • second and third user input units 123b and 123c may be disposed on the rear surface of the mobile terminal. More specifically, the second and third user input units 123b and 123c may be disposed on the rear surface of the first body unit 101.
  • alternatively, the second user input unit 123b may be provided on a side surface of the first body unit 101 instead of the rear surface thereof, and the third user input unit 123c may be provided on the rear surface of the second body portion 102 instead of the first body unit 101.
  • the on / off of the terminal itself may be performed by the second user input unit 123b.
  • the third user input unit 123c may serve as a volume key. More specifically, the third user input unit 123c is formed along the longitudinal direction of the first body unit 101 and is exposed on the rear surface of the terminal body. The third user input unit 123c includes keys disposed above and below each other, and is configured to receive push inputs for a function related to adjusting the volume of the sound output from the terminal body.
  • a first camera may be disposed on the rear surface of the first body portion 101
  • a second camera may be disposed on the rear surface of the second body portion 102.
  • a home screen page may be output to the display area in the open state, and preset icons may be output to the second touch screen 151b.
  • the controller determines the number of pages to be displayed in the display area based on the size of the display area. For example, when the display area is smaller than a reference size, a first page corresponding to any one of the plurality of pages is output, and when the display area is larger than or equal to the reference size, the first page and a second page of the plurality of pages may be output. In other words, if the display area is smaller than the reference size, one page may be output; if the display area is larger than or equal to the reference size, two or more pages may be output.
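  • a rough sketch of this decision is shown below; the reference width and page width are assumed parameters, not values taken from the description:

```kotlin
// Sketch: decide how many home screen pages to show in the display area.
// Below the reference size -> one page; at or above it -> two or more, as many as fit.
fun pagesToShow(displayWidthPx: Int, referenceWidthPx: Int, pageWidthPx: Int, totalPages: Int): Int =
    if (displayWidthPx < referenceWidthPx) {
        1
    } else {
        minOf(totalPages, maxOf(2, displayWidthPx / pageWidthPx))
    }
```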
  • the display area varies depending on roll-in or roll-out.
  • the rollable mobile terminal 100 includes a sensing unit 140 (see FIG. 1), and measures at least one of the display area and the remaining area by using the sensing unit 140. Based on the measurement result, the number of pages to be displayed in the display area is determined.
  • the controller 180 divides the display area into different areas based on the size of the display area, and outputs different pages to the divided areas. For example, when the number of pages to be output is determined to be two, the display area is divided into first and second areas A and B, and the first page is output to the first area A and the second page to the second area B, respectively.
  • the first and second regions A and B may be used as windows that operate independently. Specifically, when a touch input is applied to the first area A, the screen information being output to the first area A is converted to other screen information, but the output of the screen information being displayed in the second area B is maintained.
  • when the display area becomes larger than the reference size while the first window is displayed in the first area A, screen division is performed on the display area.
  • the display area, now larger than the reference size, is divided into the first and second areas A and B; the first area A becomes a first window, and the second area B becomes a second window.
  • a touch input applied to the first area A or the first window generates a control command related to the screen information displayed on the first window, and changes the screen information output to the first area A to other screen information.
  • likewise, a touch input applied to the second area B or the second window generates a control command related to the screen information displayed on the second window, and changes the screen information output to the second area B to other screen information. That is, a touch input applied to any one of the plurality of windows output to the display area generates a control command for that window and does not affect the other windows.
  • for example, the execution screen of a first application may be displayed in the first area A or the first window, and the execution screen of a second application among the plurality of applications may be displayed in the second area B or the second window.
  • the term 'window' may refer to a region itself in which any one of a plurality of user interfaces operating independently of each other in the entire region of the touch screen is output.
  • the first area is called a first window
  • the second area is called a second window.
  • a window is a visual area with a rectangular shape.
  • Each window contains a separate user interface that accepts user input and displays output.
  • the present invention will be described with reference to an embodiment in which a plurality of windows do not overlap, but the plurality of windows may overlap each other.
  • the vertical length of the display area is fixed. Therefore, the size of the display area is determined by the horizontal length of the portion exposed to the outside of the entire horizontal length of the touch screen.
  • the controller 180 may divide the display state into at least two states. For example, when the horizontal length W of the display area is within a first range (0 < W < a), it may be determined as the first state; when W is within a second range (a ≤ W < b), it may be determined as the second state; and when W is within a third range (b ≤ W ≤ max), it may be determined as the third state.
  • the first to third states are merely examples, and the display state may be divided into at least two states according to the overall width of the touch screen.
  • the controller 180 may output a first window in the first state, output first and second windows in the second state, and output first to third windows in the third state.
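  • that mapping from the exposed width W to a display state and a window count can be sketched as follows (a and b are the example thresholds above):

```kotlin
// Sketch: classify the exposed horizontal length W into a display state,
// then list the windows output in that state.
fun displayState(w: Int, a: Int, b: Int): Int = when {
    w < a -> 1      // first state:  0 < W < a
    w < b -> 2      // second state: a <= W < b
    else  -> 3      // third state:  b <= W <= max
}

fun windowsForState(state: Int): List<String> = when (state) {
    1    -> listOf("first window")
    2    -> listOf("first window", "second window")
    else -> listOf("first window", "second window", "third window")
}
```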
  • an area in which the first to third windows are output may be fixed among the entire areas of the touch screen.
  • for example, a first window is output in the portion of the touch screen from 0 to a (the first region, or A), a second window is output in the portion from a to b (the second region, or B), and a third window is output in the portion from b to c (the third region, or C).
  • if the horizontal length W of the display area is within the second range, the first window is output in the portion A from 0 to a, and the second window is output in the portion from a to W (at least part of B).
  • in this case, the user can adjust the display area to an appropriate size by performing a roll-out or roll-in while checking the boundary between the windows.
  • alternatively, the areas in which the first to third windows are output may vary depending on the display area. If the horizontal length W of the display area is within the second range, the controller 180 divides the display area into two different areas and displays the first and second windows in the divided areas, respectively. Since one or more windows vary in size depending on the size of the display area, the user can be provided with an interface optimized for the display area.
  • when the home screen page is composed of a plurality of pages, different pages among the plurality of pages may be preset for the windows output to the display area. For example, a first page may be preset for the first window, a second page for the second window, and a third page for the third window.
  • the preset page is displayed instead of the screen information being output. In this case, the first page is displayed in the first window, and the second page is displayed in the second window.
  • different home screen pages may be displayed on the plurality of windows, or execution screens of different applications may be displayed.
  • although the number of windows displayed in the display area is determined according to the size of the display area, it may also vary according to a user's request.
  • the controller 180 may integrate the plurality of windows into one window in response to a user request.
  • for example, the controller 180 may control the touch screen 151 to enlarge and display any one of the plurality of windows in the display area and to make the remaining windows disappear from the display area.
  • conversely, the controller 180 may divide one window into a plurality of windows in response to a user request. In this case, the controller 180 divides the display area into a plurality of areas, reduces the one window and displays it in any one of the plurality of areas, and displays, in the remaining areas, the execution screens of applications running by multitasking or preset home screen pages.
  • the controller 180 can display one window or a plurality of windows in the display area according to a user's request.
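  • a minimal sketch of this integrate/split behaviour is given below; Window is only an illustrative placeholder for whatever content a window holds:

```kotlin
// Sketch: toggle between a plurality of windows and one window on user request.
data class Window(val content: String)

// Integration: keep only the selected window, which is then enlarged over the display area.
fun integrate(windows: List<Window>, selectedIndex: Int): List<Window> =
    listOf(windows[selectedIndex])

// Division: keep the reduced window in the first slot and fill the remaining
// slots with multitasked execution screens or preset home screen pages.
fun split(window: Window, fillers: List<Window>, count: Int): List<Window> =
    (listOf(window) + fillers).take(count)
```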
  • FIGS. 3A to 3C are conceptual views illustrating a rollable mobile terminal that outputs a plurality of windows or one window to the display area.
  • referring to FIGS. 3A to 3C, an embodiment in which one or more windows are displayed in the display area will be described in detail.
  • in the illustrated examples, the first to third windows A, B, and C are displayed in the display area, or a fourth window D is displayed as a single window.
  • the present invention is not limited thereto and may be applied to an embodiment in which two windows or more windows are displayed in the display area.
  • Substantially the same screen information may be displayed in the display area both when one window is displayed and when a plurality of windows are displayed. For example, as shown in FIG. 3A, the first to third pages included in the home screen page may be displayed in different windows A-C, or, as shown in FIG. 3B, they may be displayed in one window D.
  • the controller 180 individually or independently controls the plurality of windows.
  • the controller 180 detects a control command related to information displayed on the first window by using a touch input applied to the first window, The control command related to the information displayed on the second window is sensed using a touch input applied to the second window.
  • a first page may be displayed in the first window A, and the first page may include a shooting icon 310 linked to a shooting application.
  • when a touch is applied to the shooting icon 310, an execution screen of the shooting application is displayed in the first window A.
  • in the other windows B and C, the information being output is continuously output even though the touch is applied to the shooting icon 310. Therefore, the execution screen of the shooting application is displayed only in the portion of the display area where the second and third windows B and C are not displayed.
  • the user may set the number of windows displayed in the display area to one or a plurality, or change a setting related to the number of windows. More specifically, the controller 180 may set the number of windows displayed in the display area to one or a plurality according to a user's request. For example, as illustrated in FIG. 3C, the first to third pages included in the home screen page may be displayed in a plurality of windows A-C and then in one window D in response to a user request, or may be displayed in one window D and then in a plurality of windows A-C.
  • the user request may be input to the rollable mobile terminal in various forms.
  • the user request may be defined as a user input of a preset method applied to the first user input unit 123a.
  • a virtual toggle button for displaying one window or a plurality of windows may be displayed on the display area.
  • when the user wants to use the execution screen of a specific application on a big screen, the user can set the number of windows displayed on the display area to one, so that the execution screen is displayed on the entire display area.
  • execution screens of different applications may be output to the plurality of windows.
  • the output screen size is smaller than when the window is one, but there is an advantage in that the execution screens of various applications can be checked simultaneously.
  • FIG. 4 is a flowchart illustrating a control method of a rollable mobile terminal according to the present invention.
  • the controller 180 sets at least a portion of the touch screen 151 exposed out of the main body as a display area (S410).
  • the controller 180 senses at least one area exposed to the outside of the main body by using the sensing unit 140 (refer to FIG. 1A), and sets the detected at least one area as the display area.
  • the main body may be at least one of a guide unit and the first and second bodies configured to wind and receive the touch screen 151.
  • the controller 180 may detect at least one of a horizontal length, a size, a position, and an aspect ratio of the display area by using the sensing unit 140.
  • the controller 180 controls the touch screen 151a such that screen information is output to the detected display area and the remaining area is turned off.
  • when a roll-in or roll-out occurs, the display area is changed.
  • in this case, the controller 180 resets the display area.
  • for example, when the exposed area is changed from a first area to a second area, the second area is set as the display area.
  • since roll-in or roll-out can occur continuously, the display area may be continuously set differently accordingly.
  • the controller 180 controls the touch screen 151 so that screen information is output to the display area and the remaining area is turned off. As the roll in / roll out occurs, the display area is changed, and thus the area where the screen information is displayed is changed.
  • the controller 180 displays a main screen on the display area (S430).
  • the main screen is defined as a screen displayed on the display area. Since the display area includes one or more windows according to the size of the display area, the main screen includes all the information output from the one or more windows.
  • when the screen capture function is executed in the terminal, an image capturing the main screen is generated.
  • the user may recognize how many windows are included in the main screen.
  • the number of windows displayed in the display area may vary by roll out or roll in, and the information displayed in the one or more windows may vary according to a user gesture.
  • the information displayed in the one or more windows may be changed by touch input, roll in, or roll out.
  • the controller 180 sets the display area, and the touch screen 151 displays different screen information on the display area based on the method in which the roll in or roll out occurs. Can be controlled.
  • when the closed state is switched to the open state by at least one of the movements, the controller 180 may control the touch screen 151 to output, to the display area, different screen information according to which movement occurred.
  • for example, when the closed state is switched to the open state by the movement of the first body 101, first screen information may be output to the display area.
  • when the closed state is switched to the open state by the movement of the second body 102, second screen information may be output to the display area.
  • when the closed state is switched to the open state by the movement of both the first and second bodies 101 and 102, third screen information may be output to the display area.
  • the controller 180 may execute any one of functions related to the application based on the at least one movement. In this case, the controller 180 may display the first execution screen in the first area, and display a second execution screen corresponding to any one function in the second area.
  • for example, when the second area is exposed by the movement of the first body 101, the controller 180 executes a first function among the functions related to the application, and when the second area is exposed by the movement of the second body 102, a second function among the functions related to the application may be executed. In contrast, when the second area is exposed by the movement of both the first and second bodies 101 and 102, the controller 180 may execute a third function among the functions related to the application.
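  • expressed as a small sketch (the enum and the named 'functions' are placeholders for whatever application functions are actually mapped):

```kotlin
// Sketch: select an application function according to which body part's
// movement exposed the second area.
enum class MovedBody { FIRST_BODY, SECOND_BODY, BOTH_BODIES }

fun functionFor(moved: MovedBody): String = when (moved) {
    MovedBody.FIRST_BODY  -> "first function"
    MovedBody.SECOND_BODY -> "second function"
    MovedBody.BOTH_BODIES -> "third function"
}
```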
  • the controller 180 displays a multitasking screen on the display area in response to a preset user input (S450).
  • the controller 180 controls the touch screen so that the multitasking screen is displayed in the display area instead of the main screen when the preset user input is received while the main screen is displayed.
  • the multitasking screen may be output by various methods.
  • the multi-task screen may be output by a push input of tapping the first user input unit 123a a plurality of times.
  • a hardware key associated with an output function of the multitasking screen may be separately provided, or a software key or menu associated with an output function of the multitasking screen may be displayed on the touch screen.
  • the multitasking screen may be output by a touch input applied to a corresponding key.
  • the multitasking screen includes an execution screen of each of the applications. Specifically, it includes an execution screen of each of the applications being executed by multitasking in the foreground and / or background.
  • Applications that are being executed by multitasking refer to applications that are executed by user input but have not received a termination command for terminating the execution.
  • An application is a concept including a widget, a home launcher, etc., and means any type of program that can be run in a terminal. Accordingly, the application may be a program that performs a function of a web browser, video playback, message transmission, reception, schedule management, and application updating.
  • the execution screen of the application included in the multitasking screen refers to the screen of the last state executed by the user. As the application is executed, the execution screen is changed, and the last changed execution screen is output on the multitasking screen.
  • the last changed execution screen may be a screen which was most recently output on the display area.
  • Displaying the multitasking screen on the display area (S450) includes reducing the main screen to a predetermined size and displaying it on the multitasking screen, and displaying, on the reduced main screen, guide information for guiding the one or more windows included in the main screen.
  • in other words, the multitasking screen may include not only the execution screens of the applications executed by multitasking but also the main screen that was displayed in the display area immediately before the multitasking screen, reduced to a predetermined size, and may further include guide information for guiding the windows included in the main screen.
  • the guide information may include a guide line for distinguishing one or more windows included in the main screen.
  • the guide line may mean a boundary line of the one or more windows.
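  • the structure described above can be pictured with the following illustrative data model (the class and field names are assumptions for the sketch, not the terminal's actual classes):

```kotlin
// Illustrative model of the multitasking screen: a reduced main screen made of
// windows, guide lines marking their boundaries, and the last execution screen
// of each application running by multitasking.
data class WindowSnapshot(val appName: String, val lastExecutionScreen: String)

data class MultitaskingScreen(
    val reducedMainScreen: List<WindowSnapshot>,  // the main screen, reduced to a predetermined size
    val guideLinePositions: List<Int>,            // boundaries between the windows of the main screen
    val multitaskedApps: List<WindowSnapshot>     // most recently executed first
)
```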
  • the controller 180 changes at least a part of the main screen based on the touch input applied to the multitasking screen (S470).
  • the controller 180 may change a screen included in at least one of the windows included in the main screen based on a touch input applied to the multitasking screen. For example, when the main screen includes first and second windows, the screen included in at least one of the first and second windows may be changed.
  • for example, when a touch input is applied to move any one execution screen included in the multitasking screen to any one of the windows included in the main screen, the screen output to that window can be switched to the moved execution screen.
  • as another example, the controller 180 may switch the screen output on the one window to a home screen page. Since different home screen pages are set for the plurality of windows included in the main screen, different home screen pages may be output depending on the window.
  • in addition, when a touch input is applied to move any one window among the windows included in the reduced main screen to another window, the controller 180 may control the touch screen 151 to change the display positions of the one window and the other window.
  • the controller 180 may change the layout of the windows included in the main screen based on a touch input with respect to the guide line. As the layout is changed, the size of the content displayed in the window may be enlarged or reduced to fit the layout.
  • the controller 180 changes a part of the reduced main screen as the screen included in the one window is changed.
  • the screen displayed on the one window is switched to the one execution screen, but the screen displayed on the other window is maintained.
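  • this per-window switch can be sketched as follows (purely illustrative; the main screen is modelled as a mutable list of window screens):

```kotlin
// Sketch: dropping an execution screen from the multitasking list onto one
// window replaces only that window's screen; the other windows are untouched.
fun switchWindowScreen(windows: MutableList<String>, windowIndex: Int, executionScreen: String) {
    windows[windowIndex] = executionScreen
}

// Usage: windows showing apps 1, 2, 3; dropping app 9 onto the third window
// yields [app1, app2, app9].
fun main() {
    val windows = mutableListOf("app1", "app2", "app3")
    switchWindowScreen(windows, 2, "app9")
    println(windows)
}
```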
  • the controller 180 displays a main screen on which the at least part is changed in the display area (S490).
  • the controller 180 controls the touch screen to output the main screen, the part of which has been changed, to the display area.
  • the output of the multitasking screen may be terminated when the home button is pressed while the multitasking screen is output.
  • the multitasking screen serves to quickly switch the output screen to the execution screen of the recently executed application.
  • one or more windows are displayed in the display area.
  • the question is how to control the plurality of windows displayed in the display area.
  • the multitasking screen is used to control at least one of the plurality of windows included in the main screen, in addition to simply switching the entire main screen to another screen. Accordingly, a multitasking screen specialized for a rollable mobile terminal is provided, and a user can control one or more windows being output to the display area using the multitasking screen.
  • FIGS. 5A to 5L are examples of operations implemented by the control method of FIG. 4, and are conceptual views illustrating control of a window using a touch input applied to a multitasking screen.
  • the multitasking screen 500 may be output by a preset user input while a main screen including a plurality of windows is displayed.
  • first to third windows A-C are sequentially output to the display area.
  • the figure shows an example in which the first to third windows A, B, and C are arranged along the width direction of the terminal.
  • each window may display an execution screen or a preset home screen page of an application installed in the terminal.
  • the controller 180 may output guide information for guiding the plurality of windows.
  • the guide information may include a guide line for guiding a boundary line of the windows.
  • identification information indicating which page of the home screen the output page corresponds to may be included in each window.
  • for convenience of description, the execution screen of an application is represented by a circle in the drawings.
  • the circle diagram means the execution screen of the application, and the number in the circle diagram is a number for distinguishing different applications.
  • the rectangle surrounding the circular shape means the area where the execution screen is displayed.
  • an execution screen 1 of the first application is displayed in the first window A
  • an execution screen 2 of the second application is displayed in the second window B
  • an execution screen 3 of the third application may be displayed in the third window C.
  • the controller 180 sets the screen that was output in the display area before the push input was applied as the main screen.
  • the multitasking screen 500 is output in response to the push input, and the main screen reduced to a predetermined size is included in the multitasking screen.
  • the main screen reduced to the predetermined size is referred to as a reduced main screen 510.
  • the multitasking screen 500 includes not only the reduced main screen 510 but also application information executed by multitasking.
  • Application information executed by multitasking includes at least one of a name, an icon, and an execution screen of an application executed by multitasking in the foreground and / or background.
  • for example, information on the first to third applications (1, 2, 3), which are executed in the foreground and whose execution screens are included in the main screen, and information on some of the applications (4, 5, 6) executed in the background may be included in the multitasking screen 500.
  • using the multitasking screen, a user may not only check the applications running in the foreground and the applications running in the background, but also confirm the layout of the one or more windows included in the main screen.
  • the list of applications being executed by multitasking may be scrolled through and checked.
  • for example, as the list is scrolled, the first to sixth application information may be changed to the fourth to ninth application information.
  • the application information may be displayed sequentially from left to right in the order of execution. For example, since the fifth application information 5 is displayed on the right side of the fourth application information 4, the user can intuitively recognize that the fourth application has been executed more recently than the fifth application.
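  • this left-to-right recency ordering can be sketched as a simple sort (field names are illustrative):

```kotlin
// Sketch: order application information so that the most recently executed
// application appears first (leftmost) on the multitasking screen.
data class AppInfo(val name: String, val lastExecutedAtMillis: Long)

fun orderForMultitaskingScreen(apps: List<AppInfo>): List<AppInfo> =
    apps.sortedByDescending { it.lastExecutedAtMillis }
```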
  • the user can check all the applications that are being executed by multitasking, and can recognize the execution order of the applications.
  • the controller 180 may control at least one of one or more windows included in the main screen based on a touch input applied to the multitasking screen.
  • the screen displayed on the one window is switched to the one execution screen.
  • one of the windows included in the main screen may be selected by a user input, and information displayed only on the selected window may vary.
  • the screen displayed on the third window is different, but the screen displayed on the first and second windows is maintained.
  • for example, the screen displayed in the third window is switched from the execution screen 3 of the third application to the execution screen 9 of the ninth application.
  • as the screen included in the third window is changed, a part of the reduced main screen 510 is also changed.
  • the ninth application moves from the background to the foreground
  • the third application moves from the foreground to the background.
  • the display order of application information included in the multitasking screen is also changed.
  • in addition, when a preset touch input is applied, the controller 180 may switch the screen displayed on one of the windows to the home screen page.
  • the touch input of the preset method may be a touch input applied to a region where the one window is displayed and released after the applied touch is continuously moved in a preset direction.
  • for example, the execution screen 1 of the first application included in the first window may move along the touch. As the execution screen 1 of the first application moves, the home screen page that was covered by the execution screen 1 may appear. When the touch is released while satisfying the preset method, the execution screen 1 of the first application disappears and the home screen page set for the first window is fully displayed.
  • the controller 180 may terminate execution of a specific application in response to the preset touch input. For example, referring to FIG. 5D, the first application included in the first window may be terminated, and the first application information may disappear from the multitasking screen 500. This is because the first application is not included in the list of applications executed by multitasking as the execution of the first application ends.
  • the execution of the application running in the background may be terminated by the touch input applied to the multitasking screen 500.
  • for example, when such a touch input is applied to the sixth application information, the controller 180 terminates execution of the sixth application and controls the touch screen 151 so that the sixth application information 6 disappears from the multitasking screen.
  • the space generated as the sixth application information 6 disappears is filled with the seventh application information 7.
  • the multitasking screen 500 may include at least one of a screen integration icon 520 and a screen division icon 522.
  • the screen integration icon 520 is linked to a function of integrating a plurality of windows included in the main screen into a single window
  • the screen split icon 522 is linked to a function of dividing one window included in the main screen into a plurality of windows.
  • the screen division icon or the screen integration icon may be selectively output according to the number of one or more windows included in the main screen.
  • the screen integration icon 520 and the screen division icon 522 may be displayed on the multitasking screen like a toggle button.
  • the screen integration icon may be activated and the screen division icon may be deactivated.
  • the screen integration icon may be deactivated and the screen division icon may be activated.
  • the deactivation of an icon may mean that the function associated with the icon is not executed even if a touch input is applied, or that the icon disappears from the display area.
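A small sketch of the toggle logic described above, with invented type and field names; the rule that the integration icon is active for several windows and the division icon for a single window follows the description:

```kotlin
// Sketch only: which of the two icons is active depends on how many windows
// the main screen currently holds.
data class IconState(val integrationActive: Boolean, val divisionActive: Boolean)

fun iconStateFor(windowCount: Int): IconState =
    if (windowCount >= 2)
        IconState(integrationActive = true, divisionActive = false)   // several windows: merging offered
    else
        IconState(integrationActive = false, divisionActive = true)   // single window: splitting offered

fun main() {
    println(iconStateFor(3))   // IconState(integrationActive=true, divisionActive=false)
    println(iconStateFor(1))   // IconState(integrationActive=false, divisionActive=true)
}
```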
  • any one of the first to third windows may be selected by touch while the first to third windows are included in the main screen.
  • An indicator may be displayed so that the user may recognize that the one window is selected.
  • the indicator may be a border of a preset color output along the edge region of the selected window, or a graphic object displayed on the selected window.
  • the controller 180 may integrate a plurality of windows included in the main screen into one window.
  • the selected third window among the first to third windows is enlarged and displayed on the reduced main screen 510, and the first and second windows may disappear from the reduced main screen 510.
  • the second application is executed in the background.
  • the screen integration icon 520 may disappear and the screen division icon 522 may appear.
  • the controller 180 updates the guide information when a function corresponding to the screen division icon or a function corresponding to the screen integration icon is executed.
  • the user may check how many windows are included in the reduced main screen 510 through the updated guide information.
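As an illustrative sketch (names and the return type are assumptions), merging the main screen into the selected window while keeping the other applications in the background could be expressed as:

```kotlin
// Sketch: merging keeps only the selected window; the applications of the discarded
// windows keep running in the background, and the guide information (window count)
// is updated to one.
data class MergeResult(val foregroundApp: String, val backgroundApps: List<String>, val windowCount: Int)

fun mergeIntoSelected(windowApps: List<String>, selectedIndex: Int): MergeResult =
    MergeResult(
        foregroundApp = windowApps[selectedIndex],                                // e.g. the third window
        backgroundApps = windowApps.filterIndexed { i, _ -> i != selectedIndex }, // first and second windows
        windowCount = 1                                                           // reflected in the guide information
    )

fun main() {
    println(mergeIntoSelected(listOf("app1", "app2", "app3"), selectedIndex = 2))
    // MergeResult(foregroundApp=app3, backgroundApps=[app1, app2], windowCount=1)
}
```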
  • the screen displayed on the one window may be switched to any one execution screen. For example, as shown in FIG. 5G, when the execution screen 7 of the seventh application is dragged to the one window while the execution screen 9 of the ninth application is displayed on that window, the reduced main screen 510 is switched from the execution screen 9 of the ninth application to the execution screen 7 of the seventh application.
  • the controller 180 may divide the one window into a plurality of windows in response to a touch applied to the screen division icon 522. Accordingly, the execution screen 7 of the seventh application output from the one window is divided into a plurality of areas, and each of the divided areas is displayed in a different window.
  • a plurality of windows appear on the reduced main screen 510, and guide information for guiding the plurality of windows is updated.
  • the controller 180 determines the number of windows to be displayed on the main screen based on the size of the display area, and divides the main screen by the determined number. For example, although the main screen is divided into three windows in FIG. 5G, the main screen may be divided into two or more windows according to the size of the display area.
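Purely illustrative, since the patent states only that the number of windows depends on the size of the display area (the minimum window width used here is an assumption):

```kotlin
// Sketch: derive how many windows fit in the exposed display area and split the
// main screen into that many equal regions.
data class Region(val left: Int, val right: Int)

fun windowCountFor(displayWidthPx: Int, minWindowWidthPx: Int = 360): Int =
    maxOf(1, displayWidthPx / minWindowWidthPx)

fun splitMainScreen(displayWidthPx: Int): List<Region> {
    val count = windowCountFor(displayWidthPx)
    val width = displayWidthPx / count
    return List(count) { i -> Region(i * width, if (i == count - 1) displayWidthPx else (i + 1) * width) }
}

fun main() {
    println(splitMainScreen(1200))   // three windows when a wide area is exposed
    println(splitMainScreen(700))    // a single window when the exposed area is narrow
}
```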
  • the reduced main screen 510 includes a guide line for guiding one or more windows, and the controller 180 may change the layout of the one or more windows based on a touch input to the guide line.
  • the first and second guide lines 540 and 542 distinguishing the three windows may be output on the reduced main screen 510.
  • when the second guide line 542 is combined with the first guide line 540 by a drag input, the second guide line 542 disappears from the reduced main screen 510.
  • the windows included in the reduced main screen 510 are reduced from three to two.
  • the execution screen 5 of the fifth application may be dragged to the second window while the execution screen 7 of the seventh application is displayed across the first and second windows.
  • the execution screen 7 of the seventh application is reduced so as to fit the first window, and the execution screen 5 of the fifth application is displayed on the second window.
  • the guide line 540 for guiding the first and second windows is displayed.
  • the layout of the first and second windows may be changed by a drag input to the guide line 540.
  • the guide line 540 is moved by a drag input, and a new boundary of the first and second windows may be defined according to the movement.
  • at least one of positions and sizes of the first and second windows may be changed, and sizes of contents included in the first and second windows may be changed.
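A hedged sketch of the guide-line interaction above; the snap distance and the boundary-list representation are assumptions, not taken from the patent:

```kotlin
// Sketch: dragging a guide line either merges two windows (when it is dropped onto
// another guide line or a screen edge) or simply moves the boundary so both adjacent
// windows are resized.
import kotlin.math.abs

data class Layout(val boundaries: MutableList<Int>, val width: Int)   // inner boundaries only

fun onGuideLineDragged(layout: Layout, index: Int, newX: Int, snapDistance: Int = 24) {
    val targets = layout.boundaries.filterIndexed { i, _ -> i != index } + listOf(0, layout.width)
    val nearest = targets.minByOrNull { abs(it - newX) }!!
    if (abs(nearest - newX) <= snapDistance) {
        layout.boundaries.removeAt(index)   // guide lines combine: one window disappears
    } else {
        layout.boundaries[index] = newX     // boundary moves: neighbouring windows are resized
    }
}

fun main() {
    val layout = Layout(mutableListOf(400, 800), width = 1200)   // three windows
    onGuideLineDragged(layout, index = 1, newX = 410)            // drop one guide line onto the other
    println(layout.boundaries)                                   // [400]: two windows remain
    onGuideLineDragged(layout, index = 0, newX = 600)            // move the remaining guide line
    println(layout.boundaries)                                   // [600]: new boundary, both windows resized
}
```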
  • the controller 180 may control the touch screen 151 such that the multitasking screen disappears from the display area and the reduced main screen 510 is displayed on the display area.
  • the reduced main screen 510 is enlarged at a predetermined ratio and displayed on the display area. That is, execution screens 5 and 7 of the fifth and seventh applications included in the reduced main screen 510 may be enlarged at a predetermined ratio and displayed on the display area.
  • the home screen page is displayed.
  • as the home button is pressed, the controller 180 displays, on the display area, a home screen page preset according to the size of the display area.
  • the controller 180 displays a predetermined number of windows based on the size of the display area, and displays different home screen pages preset in each of the windows.
  • first to third pages may be displayed on the display area.
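As a minimal illustration (page identifiers are invented), selecting one preset home screen page per window for the current window count could be as simple as:

```kotlin
// Sketch: pressing the home key shows as many preset home screen pages as there are
// windows for the current display-area size, one page per window.
fun homePagesFor(windowCount: Int, presetPages: List<String>): List<String> =
    presetPages.take(windowCount)

fun main() {
    val presetPages = listOf("page1", "page2", "page3", "page4")
    println(homePagesFor(3, presetPages))   // [page1, page2, page3]: first to third pages
    println(homePagesFor(1, presetPages))   // [page1]: a single home screen page
}
```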
  • the multitasking screen is used to control at least one of the plurality of windows included in the main screen, in addition to simply switching the entire main screen to another screen. Accordingly, a multitasking screen specialized for a rollable mobile terminal is provided, and a user can control one or more windows being output to the display area using the multitasking screen.
  • FIGS. 6A and 6B are conceptual diagrams for describing a method of merging or dividing windows displayed in a display area.
  • a multitasking screen 600 may be output while one window is displayed on the display area.
  • the controller 180 sets the one window as the main screen and reduces it to a predetermined size.
  • the reduced main screen 610 is included in the multitasking screen 600.
  • the screen split icon 522 is activated and the screen merge icon 520 is deactivated.
  • the reduced main screen 610 is divided into at least two windows according to the size of the display area.
  • An example of dividing the window into three windows is illustrated in FIG. 6A, but the present invention is not limited thereto.
  • first and second guide lines 610 and 612 for guiding the plurality of windows are displayed on the reduced main screen 610.
  • the screen split icon 522 is deactivated, and the screen merge icon 520 is activated.
  • the controller 180 may integrate the one or more windows into a single window. For example, as illustrated in FIG. 6B, when a touch input for moving the first guide line 610 to the left end of the reduced main screen 610 is applied, the plurality of windows are integrated into one window. Accordingly, the display of the first and second guide lines 610 and 612 is terminated, the screen division icon 522 is activated, and the screen integration icon 520 is deactivated.
  • the user may use the multitasking screen to control the one or more windows displayed in the display area, either as a whole or individually.
  • when a plurality of windows are displayed in the display area, not only the layout but also the display position of the windows can be changed.
  • the rollable mobile terminal described above is not limited to the configuration and method of the above-described embodiments; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.
  • the present invention described above can be embodied as computer readable codes on a medium in which a program is recorded.
  • the computer-readable medium includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of computer-readable media include hard disk drives (HDDs), solid state disks (SSDs), silicon disk drives (SDDs), ROMs, RAMs, CD-ROMs, magnetic tapes, floppy disks, optical data storage devices, and the like. This also includes implementations in the form of carrier waves (eg, transmission over the Internet).
  • the computer may include the controller 180 of the terminal. Accordingly, the above detailed description should not be construed as limiting in all aspects and should be considered as illustrative. The scope of the invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the invention are included in the scope of the invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Mathematical Physics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The present invention relates to a rollable mobile terminal provided with a rollable display, and to a control method therefor, the rollable mobile terminal comprising: a touch screen formed to be rollable; a first body connected to one end of the touch screen; a second body connected to the other end opposite the first end of the touch screen; a guide unit disposed in the first body and/or the second body and formed to wind and accommodate the touch screen; and a controller for setting, as a display area, at least a part of the touch screen exposed to the outside of the guide unit, and controlling the touch screen such that a multitasking screen including an execution screen of each application is output to the display area when a preset user input is received while a main screen is displayed in the display area, the controller reducing the main screen to a predetermined size, displaying the reduced main screen on the multitasking screen, and displaying, on the reduced main screen, guide information for guiding one or more windows included in the main screen.
PCT/KR2015/014291 2015-12-24 2015-12-24 Terminal mobile apte à rouler et son procédé de commande WO2017111192A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2015/014291 WO2017111192A1 (fr) 2015-12-24 2015-12-24 Terminal mobile apte à rouler et son procédé de commande


Publications (1)

Publication Number Publication Date
WO2017111192A1 true WO2017111192A1 (fr) 2017-06-29

Family

ID=59090605

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/014291 WO2017111192A1 (fr) 2015-12-24 2015-12-24 Terminal mobile apte à rouler et son procédé de commande

Country Status (1)

Country Link
WO (1) WO2017111192A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130127799A1 (en) * 2011-11-18 2013-05-23 Samsung Display Co., Ltd. Display Device Having a Rollable Display Unit
KR20140025231A (ko) * 2012-08-22 2014-03-04 삼성전자주식회사 플렉서블 디스플레이 장치 및 플렉서블 디스플레이 장치의 제어 방법
US20140098075A1 (en) * 2012-10-04 2014-04-10 Samsung Electronics Co., Ltd. Flexible display apparatus and control method thereof
KR20140112988A (ko) * 2013-03-15 2014-09-24 김지하 휘어지는 디스플레이를 이용하여 멀티윈도우 시스템을 제공하는 방법
KR20150068823A (ko) * 2013-12-12 2015-06-22 엘지전자 주식회사 이동 단말기

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022261892A1 (fr) * 2021-06-17 2022-12-22 深圳传音控股股份有限公司 Procédé de commande, terminal mobile et support de stockage lisible
US20220404870A1 (en) * 2021-06-18 2022-12-22 International Business Machines Corporation Articulated display of flexible display device dividable into separate units
US11940841B2 (en) * 2021-06-18 2024-03-26 International Business Machines Corporation Articulated display of flexible display device dividable into separate units
US11934501B2 (en) 2021-09-22 2024-03-19 International Business Machines Corporation Rollable extended mobile device display
CN114415894A (zh) * 2022-01-25 2022-04-29 京东方科技集团股份有限公司 终端分屏处理方法、装置、设备及介质

Similar Documents

Publication Publication Date Title
WO2017099276A1 (fr) Terminal mobile enroulable et son procédé de commande
WO2017104860A1 (fr) Terminal mobile enroulable
WO2017119529A1 (fr) Terminal mobile
WO2017057803A1 (fr) Terminal mobile et son procédé de commande
WO2017090823A1 (fr) Terminal mobile enroulable et son procédé de commande
WO2016182132A1 (fr) Terminal mobile et son procédé de commande
WO2015199270A1 (fr) Terminal mobile, et procédé de commande correspondant
WO2020171287A1 (fr) Terminal mobile et dispositif électronique comportant un terminal mobile
WO2017082508A1 (fr) Terminal de type montre, et procédé de commande associé
WO2017030223A1 (fr) Terminal mobile à unité de carte et son procédé de commande
WO2017119531A1 (fr) Terminal mobile et son procédé de commande
WO2017126737A1 (fr) Terminal mobile
WO2017047854A1 (fr) Terminal mobile et son procédé de commande
WO2017003055A1 (fr) Appareil d'affichage et procédé de commande
WO2016032045A1 (fr) Terminal mobile et son procédé de commande
WO2017034126A1 (fr) Terminal mobile
WO2017039051A1 (fr) Terminal mobile de type montre et son procédé de commande
WO2018043844A1 (fr) Terminal mobile
WO2017051959A1 (fr) Appareil de terminal et procédé de commande pour appareil de terminal
WO2016129778A1 (fr) Terminal mobile et procédé de commande associé
WO2016039498A1 (fr) Terminal mobile et son procédé de commande
WO2021182692A1 (fr) Terminal mobile, dispositif électronique ayant un terminal mobile, et procédé de commande du dispositif électronique
WO2018030619A1 (fr) Terminal mobile
WO2016190484A1 (fr) Terminal mobile et procédé de commande associé
WO2015167128A1 (fr) Terminal mobile et son procédé de commande

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15911451

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15911451

Country of ref document: EP

Kind code of ref document: A1