EP2728437B1 - Mobile terminal and control method therefor - Google Patents

Mobile terminal and control method therefor

Info

Publication number
EP2728437B1
Authority
EP
European Patent Office
Prior art keywords
transparent substrate
transparent
displayed
user
display unit
Prior art date
Legal status
Active
Application number
EP13182840.2A
Other languages
English (en)
French (fr)
Other versions
EP2728437A3 (de)
EP2728437A2 (de)
Inventor
Jiyoung Park
Sujin Kim
Jumin Chi
Jaeho Choi
Sunghye Yoon
Current Assignee
LG Electronics Inc
Original Assignee
LG Electronics Inc
Priority date
Filing date
Publication date
Application filed by LG Electronics Inc
Publication of EP2728437A2
Publication of EP2728437A3
Application granted
Publication of EP2728437B1
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F 1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F 1/1694 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F 1/1637 Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F 1/1643 Details related to the display arrangement, including those related to the mounting of the display in the housing the display being associated to a digitizer, e.g. laptops that can be used as penpads
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/1626 Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F 1/1637 Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F 1/1647 Details related to the display arrangement, including those related to the mounting of the display in the housing including at least an additional display
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0483 Interaction with page-structured environments, e.g. book metaphor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 Indexing scheme relating to G06F3/048
    • G06F 2203/04804 Transparency, e.g. transparent or translucent windows

Definitions

  • This specification relates to a mobile terminal, and particularly, to a mobile terminal capable of controlling contents displayed on a display unit, and a control method thereof.
  • Terminals may be divided into mobile/portable terminals and stationary terminals according to their mobility.
  • A portable terminal can capture still images or moving images, play music or video files, play games, receive broadcasts and the like, so as to function as a multimedia player.
  • Various attempts have been made, in hardware and software, to implement such complicated functions in multimedia devices.
  • The terminal may output contents on a display unit.
  • However, when an event is generated while the contents are output, at least part of the screen displaying the contents is obscured in order to display information relating to the generated event, which causes inconvenience.
  • US2012/0600089 discloses a method of controlling the transparency of first and second transparent display units to control an image viewed by a user.
  • the contents of the first transparent display unit and the second transparent display unit may be changed in correspondence to touch inputs on the first transparent display unit and the second transparent display unit, respectively.
  • KR20110125356 A1 discloses a mobile terminal with a terminal body and a transparent display unit having first and second transparent substrates configured to sense a touch input, and a control method thereof. If an event is generated during output of a content, an icon is displayed. Thereafter, if the mobile terminal is turned and the side of the transparent display unit oriented towards the user is changed, information related to the event is displayed.
  • EP2101243 A1 and EP2157769 A1 disclose mobile terminals with displays on two sides.
  • Therefore, an aspect of the detailed description is to provide a mobile terminal capable of improving user convenience when displaying information relating to an event generated while a content display screen is shown on a display unit, and a control method thereof.
  • FIG. 1 is a block diagram of a mobile terminal 100 in accordance with one exemplary embodiment.
  • the mobile terminal 100 may comprise components, such as a wireless communication unit 110, an Audio/Video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply 190 and the like.
  • FIG. 1 shows the mobile terminal 100 having various components, but it is understood that implementing all of the illustrated components is not a requirement. Greater or fewer components may alternatively be implemented.
  • the wireless communication unit 110 may typically include one or more modules which permit wireless communications between the mobile terminal 100 and a wireless communication system or between the mobile terminal 100 and a network within which the mobile terminal 100 is located.
  • the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, a location information module 115 and the like.
  • the broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast managing entity via a broadcast channel.
  • broadcast associated information may include information associated with a broadcast channel, a broadcast program, a broadcast service provider, and the like.
  • the broadcast associated information may be provided via a mobile communication network, and received by the mobile communication module 112.
  • Broadcast signals and/or broadcast associated information received via the broadcast receiving module 111 may be stored in a suitable device, such as a memory 160.
  • The mobile communication module 112 transmits/receives wireless signals to/from at least one network entity (e.g., a base station, an external mobile terminal, a server, etc.) on a mobile communication network.
  • the wireless signals may include audio call signal, video (telephony) call signal, or various formats of data according to transmission/reception of text/multimedia messages.
  • the wireless Internet module 113 supports wireless Internet access for the mobile terminal. This module may be internally or externally coupled to the mobile terminal 100. Examples of such wireless Internet access may include Wireless LAN (WLAN) (Wi-Fi), Wireless Broadband (Wibro), Worldwide Interoperability for Microwave Access (Wimax), High Speed Downlink Packet Access (HSDPA) and the like.
  • The short-range communication module 114 denotes a module for short-range communications. Suitable technologies for implementing this module may include BLUETOOTH™, Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra-WideBand (UWB), ZigBee™, and the like.
  • the location information module 115 denotes a module for detecting or calculating a position of a mobile terminal.
  • An example of the location information module 115 may include a Global Positioning System (GPS) module.
  • the A/V input unit 120 is configured to provide audio or video signal input to the mobile terminal.
  • The A/V input unit 120 may include a camera 121 and a microphone 122.
  • the camera 121 receives and processes image frames of still pictures or video obtained by image sensors in a video call mode or a capturing mode.
  • the processed image frames may be displayed on a display unit 151.
  • the image frames processed by the camera 121 may be stored in the memory 160 or transmitted to the exterior via the wireless communication unit 110. Two or more cameras 121 may be provided according to the configuration of the mobile terminal.
  • the microphone 122 may receive an external audio signal while the mobile terminal is in a particular mode, such as a phone call mode, a recording mode, a voice recognition mode, or the like. This audio signal is processed into digital data. The processed digital data is converted for output into a format transmittable to a mobile communication base station via the mobile communication module 112 in case of the phone call mode.
  • the microphone 122 may include assorted noise removing algorithms to remove noise generated in the course of receiving the external audio signal.
  • the user input unit 130 may generate input data input by a user to control the operation of the mobile terminal.
  • the user input unit 130 may include a keypad, a dome switch, a touchpad (e.g., static pressure/capacitance), a jog wheel, a jog switch and the like.
  • the sensing unit 140 provides status measurements of various aspects of the mobile terminal. For instance, the sensing unit 140 may detect an open/close status of the mobile terminal, a change in a location of the mobile terminal 100, a presence or absence of user contact with the mobile terminal 100, the location of the mobile terminal 100, acceleration/deceleration of the mobile terminal 100, and the like, so as to generate a sensing signal for controlling the operation of the mobile terminal 100. For example, regarding a slide-type mobile terminal, the sensing unit 140 may sense whether a sliding portion of the mobile terminal is open or closed. Other examples include sensing functions, such as the sensing unit 140 sensing the presence or absence of power provided by the power supply 190, the presence or absence of a coupling or other connection between the interface unit 170 and an external device.
  • the sensing unit 140 may include a proximity sensor 141. Also, the sensing unit 140 includes a touch sensor (not shown) which senses a touch operation with respect to the display unit 151.
  • the touch sensor may be implemented as a touch film, a touch sheet, a touchpad, and the like.
  • the touch sensor may be configured to convert changes of a pressure applied to a specific part of the display unit 151, or a capacitance occurring from a specific part of the display unit 151, into electric input signals. Also, the touch sensor may be configured to sense not only a touched position and a touched area, but also touch pressure.
  • When the display unit 151 and a touch-sensitive sensor have a layered structure therebetween, the display unit 151 may be used as an input device rather than an output device.
  • The display unit 151 having such a structure may be referred to as a touch screen.
  • When touch inputs are applied via the touch screen, corresponding signals are transmitted to a touch controller.
  • the touch controller processes the received signals, and then transmits corresponding data to the controller 180. Accordingly, the controller 180 may sense which region of the display unit 151 has been touched.
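The signal flow described above (touch sensor → touch controller → controller 180) can be sketched as follows. This is an illustrative model only; the class and method names, the region layout, and the shape of the raw signal are assumptions for the example, not details from the patent.

```python
# Illustrative sketch: a touch controller that turns raw sensor signals
# into coordinates and reports which logical region of the display unit
# was touched, so a main controller can act on the touched region.

class TouchController:
    def __init__(self, regions):
        # regions: name -> (x_min, y_min, x_max, y_max) in display coordinates
        self.regions = regions

    def process(self, raw_signal):
        # In a real device the raw signal would encode capacitance or
        # pressure changes; here it already carries position and pressure.
        x, y, pressure = raw_signal
        return {"position": (x, y), "pressure": pressure,
                "region": self._hit_test(x, y)}

    def _hit_test(self, x, y):
        # Determine which configured region contains the touch point.
        for name, (x0, y0, x1, y1) in self.regions.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                return name
        return None


# Example: a display split into an output window (top) and input window
# (bottom); the controller can then sense which region was touched.
controller = TouchController({"output_window": (0, 0, 480, 400),
                              "input_window": (0, 400, 480, 800)})
event = controller.process((240, 600, 0.7))
```

A touch at (240, 600) falls inside the hypothetical input window, so the reported region would let the main controller dispatch the input accordingly.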
  • When the touch screen is implemented as a capacitance type, proximity of a pointer to the touch screen is sensed by changes of an electromagnetic field.
  • In this case, the touch screen (touch sensor) may be categorized as a proximity sensor 141.
  • The proximity sensor 141 denotes a sensor that senses the presence or absence of an object approaching a surface to be sensed, by using an electromagnetic field or infrared rays, without mechanical contact.
  • the proximity sensor 141 has a longer lifespan and a more enhanced utility than a contact sensor.
  • Examples of the proximity sensor 141 may include a transmissive type photoelectric sensor, a direct reflective type photoelectric sensor, a mirror reflective type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitance type proximity sensor, a magnetic type proximity sensor, an infrared rays proximity sensor, and so on.
  • A status in which the object to be sensed is positioned close to the touch screen without contact will be referred to as a 'proximity touch'.
  • A status in which the object to be sensed substantially comes in contact with the touch screen will be referred to as a 'contact touch'.
  • the proximity sensor 141 senses proximity touch, and proximity touch patterns (e.g., distance, direction, speed, time, position, moving status, etc.). Information relating to the sensed proximity touch and the sensed proximity touch patterns may be output onto the touch screen.
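The proximity-touch patterns listed above (distance, direction, speed, moving status) could be derived from timed samples of the pointer's position and height above the screen. The sample format and function below are hypothetical, offered only to make the idea concrete.

```python
# Hypothetical sketch: deriving proximity-touch patterns from timed
# samples of a pointer hovering above the touch screen.
import math

def proximity_pattern(samples):
    """samples: list of (t_seconds, x, y, z_height_mm) tuples,
    ordered in time; at least two samples are assumed."""
    (t0, x0, y0, z0) = samples[0]
    (t1, x1, y1, z1) = samples[-1]
    planar = math.hypot(x1 - x0, y1 - y0)   # movement parallel to the screen
    dt = t1 - t0
    return {
        "distance": z1,                                  # current height
        "direction": "approaching" if z1 < z0 else "receding",
        "speed": planar / dt if dt > 0 else 0.0,         # planar speed
        "moving": planar > 0,                            # moving status
    }

# A pointer that descends from 8 mm to 5 mm while sliding sideways.
pattern = proximity_pattern([(0.0, 10, 10, 8.0), (0.1, 13, 14, 5.0)])
```

Information such as this pattern could then be output onto the touch screen, as the text describes.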
  • the output unit 150 is configured to output an audio signal, a video signal or a tactile signal.
  • the output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153 and a haptic module 154.
  • The display unit 151 may output information processed in the mobile terminal 100. For example, when the mobile terminal is operating in a phone call mode, the display unit 151 will provide a User Interface (UI) or a Graphic User Interface (GUI), which includes information associated with the call. As another example, if the mobile terminal is in a video call mode or a capturing mode, the display unit 151 may additionally or alternatively display captured and/or received images, a UI, or a GUI.
  • the display unit 151 may be implemented using, for example, at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor-Liquid Crystal Display (TFT-LCD), an Organic Light-Emitting Diode (OLED), a flexible display, a three-dimensional (3D) display, an e-ink display or the like.
  • Some of such display units 151 may be implemented as a transparent type or an optically transparent type through which the exterior is visible, referred to as a 'transparent display'.
  • a representative example of the transparent display may include a Transparent OLED (TOLED), and the like.
  • the rear surface of the display unit 151 is also implemented to be optically transparent. Under this configuration, a user can view an object positioned at a rear side of a terminal body through a region occupied by the display unit 151 of the terminal body.
  • The display unit 151 may be implemented as two or more display units according to the configuration of the mobile terminal 100. For instance, a plurality of displays 151 may be arranged on one surface, spaced apart from or integrated with each other, or may be arranged on different surfaces.
  • the audio output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160, in a call-receiving mode, a call-placing mode, a recording mode, a voice recognition mode, a broadcast reception mode, and so on.
  • the audio output module 152 may output audio signals relating to functions performed in the mobile terminal 100, e.g., sound alarming a call received or a message received, and so on.
  • the audio output module 152 may include a receiver, a speaker, a buzzer, and so on.
  • the alarm unit 153 outputs signals notifying occurrence of events from the mobile terminal 100.
  • the events occurring from the mobile terminal 100 may include call received, message received, key signal input, touch input, and so on.
  • the alarm unit 153 may output not only video or audio signals, but also other types of signals such as signals notifying occurrence of events in a vibration manner. Since the video or audio signals can be output through the display unit 151 or the audio output module 152, the display unit 151 and the audio output module 152 may be categorized into a part of the alarm unit 153.
  • the haptic module 154 generates various tactile effects which a user can feel.
  • a representative example of the tactile effects generated by the haptic module 154 includes vibration.
  • Vibration generated by the haptic module 154 may have a controllable intensity, a controllable pattern, and so on. For instance, different vibration may be output in a synthesized manner or in a sequential manner.
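The controllable intensity and pattern of vibration, output either in a synthesized or a sequential manner, can be modeled with a simple representation. The list-of-steps encoding and function names below are assumptions made for this sketch, not taken from the patent.

```python
# Illustrative sketch: vibration patterns as lists of
# (intensity, duration_ms) steps that a haptic module could play back.

def sequential(*patterns):
    # Sequential manner: play one pattern after another.
    out = []
    for p in patterns:
        out.extend(p)
    return out

def synthesized(a, b):
    # Synthesized manner: overlay two equal-length patterns by summing
    # intensities, clamped to an assumed device maximum of 1.0.
    return [(min(1.0, ia + ib), d) for (ia, d), (ib, _) in zip(a, b)]

short_buzz = [(0.5, 100)]   # half intensity for 100 ms
long_buzz = [(0.8, 300)]    # stronger, longer vibration
```

For example, `sequential(short_buzz, long_buzz)` yields the two steps back to back, while `synthesized` combines overlapping steps into one stronger vibration.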
  • the haptic module 154 may generate various tactile effects, including not only vibration, but also arrangement of pins vertically moving with respect to a skin being touched (contacted), air injection force or air suction force through an injection hole or a suction hole, touch by a skin surface, presence or absence of contact with an electrode, effects by stimulus such as an electrostatic force, reproduction of cold or hot feeling using a heat absorbing device or a heat emitting device, and the like.
  • the haptic module 154 may be configured to transmit tactile effects (signals) through a user's direct contact, or a user's muscular sense using a finger or a hand.
  • Two or more haptic modules 154 may be provided according to the configuration of the mobile terminal 100.
  • the memory 160 may store a program for the processing and control of the controller 180. Alternatively, the memory 160 may temporarily store input/output data (e.g., phonebook data, messages, still images, video and the like). Also, the memory 160 may store data related to various patterns of vibrations and audio output upon the touch input on the touch screen.
  • the memory 160 may be implemented using any type of suitable storage medium including a flash memory type, a hard disk type, a multimedia card micro type, a memory card type (e.g., SD or DX memory), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Programmable Read-Only Memory (PROM), magnetic memory, magnetic disk, optical disk, and the like.
  • Also, the mobile terminal 100 may operate in association with a web storage which performs the storage function of the memory 160 over the Internet.
  • the interface unit 170 may generally be implemented to interface the mobile terminal with external devices.
  • the interface unit 170 may allow a data reception from an external device, a power delivery to each component in the mobile terminal 100, or a data transmission from the mobile terminal 100 to an external device.
  • the interface unit 170 may include, for example, wired/wireless headset ports, external charger ports, wired/wireless data ports, memory card ports, ports for coupling devices having an identification module, audio Input/Output (I/O) ports, video I/O ports, earphone ports, and the like.
  • the identification module may be configured as a chip for storing various information required to authenticate an authority to use the mobile terminal 100, which may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), and the like. Also, the device having the identification module (hereinafter, referred to as 'identification device') may be implemented in a type of smart card. Hence, the identification device can be coupled to the mobile terminal 100 via a port.
  • the interface unit 170 may serve as a path for power to be supplied from an external cradle to the mobile terminal 100 when the mobile terminal 100 is connected to the external cradle or as a path for transferring various command signals input from the cradle by a user to the mobile terminal 100.
  • Such various command signals or power input from the cradle may operate as signals for recognizing that the mobile terminal 100 has accurately been mounted to the cradle.
  • the controller 180 typically controls the overall operations of the mobile terminal 100. For example, the controller 180 performs the control and processing associated with telephony calls, data communications, video calls, and the like.
  • the controller 180 may include a multimedia module 181 which provides multimedia playback.
  • the multimedia module 181 may be configured as part of the controller 180 or as a separate component.
  • the controller 180 can perform a pattern recognition processing so as to recognize writing or drawing input on the touch screen as text or image.
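A minimal, hypothetical illustration of the kind of pattern recognition mentioned above: classifying a drawn stroke by its sequence of dominant movement directions and matching against a toy template set. Real handwriting or drawing recognition is far more involved; every name and template here is an assumption for the example.

```python
# Toy sketch of pattern recognition on touch-screen strokes: reduce a
# stroke to a string of dominant directions, then match templates.
# Display coordinates are assumed: x grows rightward, y grows downward.

def directions(points):
    codes = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if abs(dx) >= abs(dy):
            code = "R" if dx > 0 else "L"   # dominant horizontal motion
        else:
            code = "D" if dy > 0 else "U"   # dominant vertical motion
        if not codes or codes[-1] != code:
            codes.append(code)              # collapse repeated directions
    return "".join(codes)

# Assumed toy templates: direction signature -> recognized character.
TEMPLATES = {"L": "DR", "V": "DU", "7": "RD"}

def recognize(points):
    code = directions(points)
    return next((ch for ch, t in TEMPLATES.items() if t == code), None)

# An "L"-shaped stroke: straight down, then to the right.
stroke = [(0, 0), (0, 5), (0, 10), (5, 10), (10, 10)]
```

Under this scheme the stroke above reduces to the signature "DR" and matches the template for "L"; unmatched signatures return nothing.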
  • the power supply 190 provides power required by various components under the control of the controller 180.
  • the provided power may be internal power, external power, or combination thereof.
  • the embodiments described herein may be implemented within one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), processors, microprocessors, other electronic units designed to perform the functions described herein, or a selective combination thereof. In some cases, such embodiments are implemented by the controller 180.
  • the software codes can be implemented with a software application written in any suitable programming language. Also, the software codes may be stored in the memory 160 and executed by the controller 180.
  • The user input unit 130 may be manipulated to input commands for controlling operations of the mobile terminal 100, and may include a plurality of manipulation units 131, 132.
  • the plurality of manipulation units 131, 132 may be referred to as a manipulating portion.
  • Such a manipulating portion can employ any tactile manner allowing a user to perform manipulation by touching or tapping.
  • the information may be displayed in the form of characters, numbers, symbols, graphics, icons, etc., or may be configured as a 3D stereoscopic image.
  • At least one of the characters, numbers, symbols, graphics and icons may be displayed in a certain array so as to be implemented in the form of a keypad.
  • Such a keypad may be called a 'soft key'.
  • the display unit 151 may be operated as a whole region or may be divided into a plurality of regions and accordingly operated. In the latter case, the plurality of regions may operate in association with each other.
  • an output window and an input window may be displayed at upper and lower portions of the display unit 151, respectively.
  • the output window and the input window are regions allocated for outputting or inputting information, respectively.
  • Soft keys including numbers for inputting a phone number, or the like, are outputted to the input window.
  • a number corresponding to the touched soft key is displayed on the output window.
  • When the first manipulation unit is manipulated, a call connection to a phone number displayed on the output window is attempted, or text displayed on the output window may be input to an application.
  • the display unit 151 or the touch pad may be configured to receive a touch through scrolling.
  • the user may move a cursor or a pointer positioned on an entity, e.g., an icon, or the like, displayed on the display unit 151 by scrolling the display unit 151 or the touch pad.
  • the path along which the user's fingers move may be visually displayed on the display unit 151. This would be useful in editing an image displayed on the display unit 151.
  • One function of the terminal may be executed in case where the display unit 151 (touch screen) and the touch pad are touched together within a certain time range.
  • Both touches may correspond to clamping the terminal body with the user's thumb and index finger.
  • the executed one function of the mobile terminal 100 may be, for example, activation or deactivation of the display unit 151 or the touch pad.
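The combined-touch behavior described above can be expressed as a minimal sketch. The timestamps, the 200 ms window, and all names are illustrative assumptions, not values from the disclosure:

```python
# Sketch: toggle activation of the display unit when the touch screen and
# the rear touch pad are touched together within a short time window, as
# when a user clamps the terminal body with thumb and index finger.
# The 200 ms window and all names are hypothetical.

TOUCH_WINDOW_MS = 200  # maximum gap between the two touches (assumed)

def is_clamping_gesture(screen_touch_ms, pad_touch_ms, window_ms=TOUCH_WINDOW_MS):
    """Return True if both surfaces were touched within the window."""
    return abs(screen_touch_ms - pad_touch_ms) <= window_ms

class DisplayUnit:
    def __init__(self):
        self.active = True

    def on_touches(self, screen_touch_ms, pad_touch_ms):
        # One function of the terminal (here: activation/deactivation of
        # the display unit) executes only for the combined gesture.
        if is_clamping_gesture(screen_touch_ms, pad_touch_ms):
            self.active = not self.active
        return self.active
```

Touches that arrive too far apart leave the display state unchanged, which matches the "within a certain time range" condition above.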
  • FIGS. 2A and 2B are perspective views showing an appearance of the mobile terminal in accordance with the one exemplary embodiment.
  • FIG. 2A shows a front surface and one side surface of the mobile terminal 100.
  • FIG. 2B shows a rear surface and another side surface of the mobile terminal 100.
  • the mobile terminal 100 disclosed herein is provided with a bar-type terminal body.
  • the present application is not limited to this type of terminal, but is also applicable to various structures of terminals such as slide type, folder type, swivel type, swing type, and the like, in which two or more bodies are combined with each other in a relatively movable manner.
  • a terminal body may include a case (or referred to as casing, housing, cover, etc.) defining an appearance of the mobile terminal 100.
  • the case may be divided into a front case 101 and a rear case 102.
  • Various electronic components are installed in the space between the front case 101 and the rear case 102.
  • At least one intermediate case may be additionally disposed between the front case 101 and the rear case 102.
  • Such cases may be injection-molded using a synthetic resin or be formed of a metal, such as stainless steel (STS), titanium (Ti), aluminum (Al) or the like.
  • The terminal body, usually the front case 101, is shown having a display unit 151, an audio output module 152, a user input unit 130 (see FIG. 1 ), a microphone 122, an interface unit 170 and the like.
  • the display unit 151 may occupy most of a principal surface of the front case 101.
  • the audio output module 152 and the camera 121 may be disposed at one end portion of the display unit 151, and a first user input unit 131 and the microphone 122 at the other end portion of the display unit 151.
  • a second user input unit 132, the interface unit 170 and the like may be disposed on side surfaces of the front and rear cases 101 and 102.
  • the user input unit 130 may be manipulated to allow inputting of commands for controlling operations of the mobile terminal 100, and include a plurality of manipulation units 131 and 132.
  • the first and second manipulation units 131 and 132 may be set to allow inputting of various contents.
  • the first manipulation unit 131 may be configured to input commands such as START, END, SCROLL or the like
  • the second manipulation unit 132 may be configured to input commands, such as a volume adjustment of sounds output from the audio output module 152, conversion of the display unit 151 into a touch recognition mode, or the like.
  • the rear surface, namely, the rear case 102 of the terminal body may further be provided with a rear camera 121'.
  • the rear camera 121' faces a direction which is opposite to a direction faced by the front camera 121 (see FIG. 2A ), and may have different pixels from those of the front camera 121.
  • the front camera 121 may operate with relatively lower pixels (lower resolution) and the rear camera 121' may operate with relatively higher pixels.
  • the front camera 121 may be useful when a user captures his or her face and sends it to another party during a video call or the like. This may reduce the size of the transmitted data.
  • the rear camera 121' may be useful for a user to obtain higher quality pictures for later use.
  • the cameras 121 and 121' may be installed in the terminal body to be rotatable or popped up.
  • a flash 123 and a mirror 124 may additionally be disposed adjacent to the camera 121'.
  • the flash 123 operates in conjunction with the camera 121' when taking a picture using the camera 121'.
  • the mirror 124 can cooperate with the camera 121' to allow a user to photograph himself in a self-portrait mode.
  • a rear audio output module 152' may further be disposed at a rear face of the terminal body, namely, the rear case 102.
  • the rear audio output module 152' can cooperate with the front audio output module 152 (see FIG. 2A ) to provide stereo output.
  • the rear audio output module 152' may be configured to operate as a speakerphone.
  • a broadcast signal receiving antenna 116 may further be disposed at the side surface of the terminal body, in addition to an antenna for call connection.
  • the antenna forming a part of the broadcast receiving module 111 may be retractable into the terminal body.
  • a power supply unit 190 for supplying power to the mobile terminal 100 may be mounted in the terminal body.
  • the power supply unit 190 may be mounted in the terminal body or detachably coupled directly onto the outside of the terminal body.
  • the rear case 102 may be further provided with a touchpad 135 for detecting a touch input. Similar to the display unit 151 (see FIG. 2A ), the touchpad 135 is implemented as a light-transmissive type. A rear display unit is mounted on the touchpad 135 to output visual information even on the touchpad 135. Here, the information output on the front display unit 151 and the rear display unit may be controlled by the touchpad 135.
  • the touchpad 135 may operate mutually in association with the display unit 151.
  • the touchpad 135 may be provided on the rear of the display unit 151, in parallel with the display unit 151.
  • the touchpad 135 may have a size the same as or smaller than the size of the display unit 151.
  • the transparent display unit 155 has a characteristic of transmitting light while displaying an image on a screen. This may allow a user to visually recognize an object located at the opposite side of the transparent display unit 155.
  • an inorganic thin film electroluminescent display and an organic light emitting diode may be used.
  • Those modules are of a passive matrix type, so no thin-film transistor (TFT) is required. This may greatly increase optical transmittance.
  • Using such modules, the transparent display unit 155 may be implemented.
  • FIG. 3 is a block diagram showing a transparent display unit 155 in accordance with one exemplary embodiment.
  • the transparent display unit 155 may include a first transparent substrate 410, a second transparent substrate 420, and an image layer 430 interposed between the first transparent substrate 410 and the second transparent substrate 420.
  • the image layer 430 located between the first transparent substrate 410 and the second transparent substrate 420 may be referred to as an organic cell.
  • light can be transmitted through the first transparent substrate 410 and the second transparent substrate 420.
  • the image layer 430 may include an anode 431, a hole transport layer 432, an emitting layer 433, an electron transport layer 434 and a cathode 435.
  • When a grayscale current is supplied to the anode 431 and the cathode 435, electrons generated from the cathode 435 may be carried to the emitting layer 433 via the electron transport layer 434.
  • holes generated from the anode 431 may be carried to the emitting layer 433 via the hole transport layer 432.
  • the electrons supplied from the electron transport layer 434 and the holes supplied from the hole transport layer 432 may collide with each other on the emitting layer 433 so as to be re-coupled to each other.
  • the collision between the electrons and the holes may generate light from the emitting layer 433.
  • Brightness of the light generated from the emitting layer 433 may be proportional to an amount of the grayscale current supplied to the anode 431.
  • When the light is generated from the emitting layer 433, the light may be emitted either toward the first transparent substrate 410 or toward the second transparent substrate 420. This may allow the user to view an image through the second transparent substrate 420 as well as the first transparent substrate 410.
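The stated proportionality between the grayscale current and the emitted brightness can be sketched as a simple mapping. The gain and the 0-255 output range are illustrative assumptions, not values from the disclosure:

```python
# Sketch: brightness of the light generated from the emitting layer
# modeled as proportional to the grayscale current supplied to the
# anode. The gain and the 0-255 level range are hypothetical.

MAX_LEVEL = 255

def brightness_level(grayscale_current_ua, gain=2.0):
    """Map a grayscale current (microamperes) to a clamped brightness level."""
    level = gain * grayscale_current_ua
    return max(0, min(MAX_LEVEL, int(level)))
```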
  • FIG. 4 is a conceptual view showing the transparent display unit 155 in accordance with the one exemplary embodiment.
  • the transparent display unit 155 may include the first and second transparent substrates 410 and 420 and first and second touch panels 411 and 421.
  • the first and second touch panels 411 and 421 may be disposed on the first transparent substrate 410 and the second transparent substrate 420 of the transparent display unit 155, respectively.
  • the first and second touch panels 411 and 421 may have various structures, such as a capacitance touch panel, a press-type touch panel, an optical touch panel and the like.
  • The first and second touch panels 411 and 421 may not be illustrated for convenience of explanation; however, a touch input being generated means that one of the first and second touch panels 411 and 421 is disposed on the area where the touch is generated.
  • FIGS. 5 and 6 are exemplary views showing a content output on the transparent display unit in accordance with the one exemplary embodiment.
  • When a user faces the first transparent substrate 410, the user may view an image output on the transparent display unit 155 in the form as shown in FIG. 5(a) .
  • When the user turns over the body of the mobile terminal 100 to the other side, the user may view an image output on the second transparent substrate 420 in the form as shown in FIG. 5(b) . That is, the images output on the first and second transparent substrates 410 and 420 may be left-right inverted images of each other.
  • the first and second transparent substrates 410 and 420 may output different images from each other.
  • the first transparent substrate 410 may output a first image 510
  • the second transparent substrate 420 may output a second image 520 which is different from the first image 510.
  • the first image 510 may be an image obtained by capturing an object in a first direction
  • the second image 520 may be an image obtained by capturing the object in a second direction which is reverse to the first direction.
  • the first image 510 may be the front view of the object and the second image may be the rear view of the object.
  • When the object is a house, the first image 510 may be an image of the house captured in a first direction, and the second image 520 may be an image of the house captured in a second direction which is reverse to the first direction.
  • A provider of image contents, for example a broadcasting station, may send to the mobile terminal 100 both the image of the object captured in the first direction and the image of the object captured in the second direction which is reverse to the first direction.
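A renderer for the two substrates might select content as sketched below. The row-of-pixels data model and the mirroring helper are assumptions for illustration; when only one image is available, the opposite substrate would see its left-right inverted transmission, as described above:

```python
# Sketch: a provider supplies two captures of the same object; the first
# transparent substrate shows the first-direction image, the second
# substrate the reverse-direction image. With only one image, the other
# substrate shows its left-right inverted (transmitted) view instead.

def mirror(image_rows):
    """Left-right inversion of an image represented as rows of pixels."""
    return [row[::-1] for row in image_rows]

def images_for_substrates(front_image, rear_image=None):
    """Return (first_substrate_image, second_substrate_image)."""
    if rear_image is not None:
        return front_image, rear_image          # independent contents
    return front_image, mirror(front_image)     # transmitted, inverted view
```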
  • Hereinafter, description will be given, with reference to the accompanying drawings, of the mobile terminal 100 capable of improving a user's convenience in displaying information relating to an event generated while a content display screen is output on the first transparent substrate 410 of the transparent display unit 155, and a control method thereof.
  • FIG. 7A is a flowchart showing one exemplary embodiment of a control method for a mobile terminal 100 (see FIG. 1 ) according to the present disclosure.
  • the mobile terminal 100 may include the transparent display unit 155 (see FIG. 1 ), the sensing unit 140 (see FIG. 1 ) and the controller 180 (see FIG. 1 ).
  • a content may be output on the first transparent substrate 410 of the transparent display unit 155 (S110).
  • the transparent display unit 155 may include the first and second transparent substrates 410 and 420.
  • the first and second transparent substrates 410 and 420 may be configured to display contents, respectively. Also, contents may be displayed on one of the first and second transparent substrates 410 and 420, and the contents displayed on the one substrate may be transmitted onto the other.
  • the first transparent substrate 410 of the transparent display unit 155 may display the contents.
  • the contents refer to various information or the like output by the mobile terminal 100, a computer and the like.
  • the contents may indicate information relating to producing, processing and circulating at least one of text data, image data, video data and audio data in a digital manner.
  • the contents may be provided to the mobile terminal 100 or the computer via an Internet or computer communication.
  • At least one of motion and rotation of a terminal body may be sensed (S120).
  • the sensing unit 140 may include a motion recognition sensor (not shown).
  • the motion recognition sensor may include at least one of a terrestrial magnetism sensor, a gyro sensor and an acceleration sensor.
  • the terrestrial magnetism sensor is a sensor which detects direction and magnitude of the terrestrial magnetism and generates an electric signal using the detected results.
  • the gyro sensor is a sensor which detects a rotation speed of a terminal body and generates an electric signal using the detected result.
  • the acceleration sensor is a sensor which measures a direction of a gravity acceleration, detects the change of acceleration in any one direction and generates an electric signal using the obtained results.
  • the sensing unit 140 may sense whether or not the terminal body has rotated. That is, the sensing unit 140 may detect a displacement according to the rotation of the terminal body, namely, a rotated direction and a rotated angle, and generate an electric signal using the detected results. The sensing unit 140 may thus sense an orientation that the transparent display unit 155 faces by detecting the rotated direction and the rotated angle of the terminal body.
  • the sensing unit 140 for sensing the at least one of the motion and the rotation of the terminal body, includes an eye detector (not shown).
  • the eye detector may detect user's eyes using at least one of the camera 121 (see FIG. 1 ) and an infrared sensor (not shown).
  • infrared rays emitted from the infrared sensor may be reflected on the retina in the user's eye within a predetermined viewing range based on the transparent display unit 155.
  • the eye detector may detect the user's view using the inputted infrared rays or a user image obtained by the camera 121. Accordingly, the sensing unit 140 may sense whether the user is viewing either the first transparent substrate 410 or the second transparent substrate 420 of the transparent display unit 155.
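The sensing described above might be combined as in the following sketch, preferring the eye detector when it reports a result and falling back to the rotation sensed by the motion recognition sensor. The fusion rule, the angle convention, and all names are illustrative assumptions:

```python
# Sketch: decide which substrate of the transparent display unit faces
# the user. The eye detector result wins when available; otherwise the
# accumulated rotation of the terminal body is used, where an odd number
# of half-turns means the body has been flipped over. Names are
# hypothetical.

FIRST, SECOND = "first_substrate", "second_substrate"

def viewed_substrate(eye_result=None, rotation_deg=0.0):
    """eye_result: FIRST, SECOND, or None if no gaze was detected."""
    if eye_result in (FIRST, SECOND):
        return eye_result
    half_turns = int(round(rotation_deg / 180.0))
    return SECOND if half_turns % 2 else FIRST
```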
  • the controller 180 may set the touch input-sensed surface (for example, the first transparent substrate 410) to a main screen.
  • the controller 180 may set a surface where a touch input is not sensed (for example, the second transparent substrate 420) to a sub screen. The controller 180 may thus recognize the first transparent substrate 410, set as the main screen, as the surface of the transparent display unit 155 facing the user.
  • the main screen and the sub screen may be switched according to on which surface a touch input has been sensed each time when the sleep mode is released. Also, the main screen and the sub screen may be switched according to on which surface a touch input has been sensed each time when a lock screen is unlocked.
  • the sensing unit 140 may detect an orientation of the main screen. For example, the sensing unit 140 may detect that the main screen faces a preset direction. To this end, the sensing unit 140 may include a light sensor which is disposed on at least one of the first and second transparent substrates 410 and 420. For example, when the mobile terminal 100 is laid on a floor, the sensing unit 140 may detect using the light sensor that the main screen faces the top (or ceiling) and the sub screen faces the bottom.
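The main/sub screen assignment and the light-sensor orientation check described above could be sketched as follows; the lux threshold and all names are assumed for illustration:

```python
# Sketch: when the sleep mode is released (or a lock screen is
# unlocked), the surface that received the waking touch becomes the main
# screen and the other surface becomes the sub screen. A light sensor
# reading near zero suggests that a substrate faces the floor. The lux
# threshold and all names are hypothetical.

FIRST, SECOND = "first_substrate", "second_substrate"
DARK_THRESHOLD_LUX = 5  # assumed

def assign_screens(touched_surface):
    """Return (main_screen, sub_screen) after a wake-up touch."""
    other = SECOND if touched_surface == FIRST else FIRST
    return touched_surface, other

def facing_floor(lux_reading, threshold=DARK_THRESHOLD_LUX):
    """True when a substrate's light sensor suggests it faces the floor."""
    return lux_reading < threshold
```

Calling `assign_screens` again on the next wake-up naturally switches the roles when the other surface is touched, matching the switching behavior described above.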
  • the sensing unit 140 may include a motion recognition sensor (a terrestrial magnetism sensor, a gyro sensor, an acceleration sensor, etc.).
  • the controller 180 may recognize that the orientation in which the transparent display unit 155 is facing the user has been switched to the second transparent substrate 420.
  • Those methods of sensing the at least one of the motion and the rotation of the terminal body may be employed by combination or individually.
  • an event may be generated when the user is viewing the first transparent substrate 410 on which the content is being output.
  • the event may indicate a variety of events received from a network or a server to the mobile terminal 100.
  • the event may include every event generated within the mobile terminal 100 or generated by the user.
  • the controller 180 may display an object (for example, a pop-up window) on the first transparent substrate 410 to indicate the generation of the event.
  • the pop-up window may be displayed on the first transparent substrate 410 for a preset time.
  • the pop-up window may automatically disappear from (not be displayed any more on) the first transparent substrate 410 after the lapse of the preset time. Also, the pop-up window may not be displayed any more on the first transparent substrate 410 when the user selects a stop icon displayed on the pop-up window. In addition, the pop-up window may not be displayed any more on the first transparent substrate 410 when the user touches the second transparent substrate 420.
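The three dismissal conditions listed above can be folded into one check, sketched below; the timeout value and all names are assumptions:

```python
# Sketch: the pop-up window stops being displayed when any of the three
# conditions above holds: the preset display time has elapsed, the user
# selected the stop icon on the pop-up window, or the user touched the
# second transparent substrate. The 3-second timeout is hypothetical.

POPUP_TIMEOUT_S = 3.0  # preset display time (assumed)

def popup_should_close(elapsed_s, stop_icon_selected, rear_touched,
                       timeout_s=POPUP_TIMEOUT_S):
    """Return True when the pop-up should disappear from the first substrate."""
    return elapsed_s >= timeout_s or stop_icon_selected or rear_touched
```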
  • the controller 180 may display information related to the event on the second transparent substrate 420.
  • the pop-up window displayed on the first transparent substrate 410 may display a message indicating the reception of the text message.
  • an execution screen of a text message application which includes the text (contents) of the received text message may be displayed on the second transparent substrate 420.
  • the content may be continuously reproduced on the first transparent substrate 410.
  • the reproduced content may be continuously output on the first transparent substrate 410.
  • the content which is output on the first transparent substrate 410 may be paused.
  • a paused screen of the content may be output on the first transparent substrate 410.
  • the content output on the first transparent substrate 410 may be terminated.
  • Instead of the content output screen displayed on the first transparent substrate 410, a default screen (for example, a home screen) or a contents list may be displayed on the first transparent substrate 410.
  • the pop-up window which has been output on the first transparent substrate 410 may not be output any more simultaneously when the execution screen of the text message application is displayed on the second transparent substrate 420.
  • brightness of the second transparent substrate 420 may be automatically adjusted.
  • the brightness of the second transparent substrate 420 may be adjusted to a brighter level than the current brightness level.
  • the brightness of the first transparent substrate 410 which the user does not view may also be automatically adjusted.
  • brightness of the first transparent substrate 410 may be adjusted to a darker level than the current brightness level.
  • the brightness of each of the first and second transparent substrates 410 and 420 may be adjusted in reverse to the aforementioned ways.
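The automatic brightness adjustment described above might be sketched like this; the 0-100 scale and the step size are illustrative assumptions:

```python
# Sketch: when the viewed surface switches to the second transparent
# substrate, that substrate is brightened and the no-longer-viewed first
# substrate is dimmed. The 0-100 scale and 20-point step are
# hypothetical; the disclosure also allows the reverse adjustment.

STEP = 20  # assumed adjustment step

def adjust_on_flip(viewed_brightness, hidden_brightness, step=STEP):
    """Return (new_viewed, new_hidden) brightness, clamped to 0..100."""
    return (min(100, viewed_brightness + step),
            max(0, hidden_brightness - step))
```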
  • the exemplary embodiments of the present disclosure may be applied to the mobile terminal 100 having the transparent display unit 155 with the first and second transparent substrates 410 and 420, and further applied even to a mobile terminal having a plurality of display units (for example, "first and second display units") 151a and 151b.
  • first and second display units 151a and 151b may be attached to each other.
  • a content output screen may be displayed on the first transparent substrate 410 of the transparent display unit 155 and information related to a generated event may be displayed on the second transparent substrate 420. This may allow a user to fully view the content output screen and the event-related information without an obscured portion.
  • the information related to the generated event may be displayed on the second transparent substrate 420 based on the orientation of the transparent display unit 155. That is, an entrance path for displaying the event-related information may be provided by a simplified control operation. This may facilitate the user to check the event-related information even without a complicated manipulation, resulting in improvement of the user's convenience.
  • FIG. 7B is a flowchart showing a detailed exemplary embodiment of the control method for the mobile terminal 100 according to FIG. 7A .
  • the mobile terminal 100 may include the transparent display unit 155 (see FIG. 1 ), the sensing unit 140 (see FIG. 1 ) and the controller 180 (see FIG. 1 ).
  • a content which is being reproduced is output on the first transparent substrate 410 of the transparent display unit 155 (S140). While the content is reproduced on the first transparent substrate 410, when an event is generated, a pop-up window for indicating the generation of the event may be displayed (S150).
  • a pop-up window as an object for indicating the event generation may be displayed on at least one of the first and second transparent substrates 410 and 420.
  • the pop-up window may be displayed on the first transparent substrate 410.
  • the pop-up window may also be displayed on the second transparent substrate 420.
  • the pop-up window displayed on the second transparent substrate 420 may be transmitted onto the first transparent substrate 410.
  • The displayed pop-up window may be terminated (S160).
  • the pop-up window displayed may disappear when a position on the first transparent substrate 410 or the second transparent substrate 420 is touched.
  • the pop-up window displayed may not be displayed any more only when a stop icon displayed on the pop-up window is selected by a user.
  • the controller 180 may display information relating to the event on the second transparent substrate 420.
  • an execution screen of a text message application including the text of the received text message may be displayed on the second transparent substrate 420.
  • the content which is being reproduced on the first transparent substrate 410 may be paused. Accordingly, the paused screen of the content may be displayed on the first transparent substrate 410.
  • the content may be reproduced again on the first transparent substrate 410 (S180).
  • the controller 180 may reproduce the content again.
  • the controller 180 may reproduce the content again, starting from the paused time point. Consequently, the user may keep viewing the content from the paused time point via the first transparent substrate 410.
  • the controller 180 may reproduce the content from the beginning according to the user's setting.
  • a pop-up window may be output on the first transparent substrate 410 such that a user can select whether to reproduce the content again from the beginning or from the paused time point.
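The resume behavior of steps S170-S180 might look like the following sketch; the choice constants and the default are illustrative assumptions:

```python
# Sketch: when the user returns to the first transparent substrate, the
# paused content is reproduced again either from the paused time point
# (default) or from the beginning, according to the user's setting or a
# choice made on a pop-up window. Names are hypothetical.

FROM_PAUSE, FROM_START = "from_pause", "from_start"

def resume_position(paused_at_s, choice=FROM_PAUSE):
    """Return the playback position (seconds) to resume from."""
    return 0.0 if choice == FROM_START else paused_at_s
```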
  • FIGS. 8 to 14 are conceptual views showing exemplary operations of the mobile terminal according to FIG. 7A .
  • the mobile terminal 100 may include the transparent display unit 155, the sensing unit 140 (see FIG. 1 ) and the controller 180 (see FIG. 1 ).
  • the transparent display unit 155 may include the first and second transparent substrates 410 and 420.
  • FIG. 8 shows a user interface in case where an orientation of the transparent display unit is switched when a pop-up window 251 associated with an event generated is displayed on the first transparent substrate 410 of the transparent display unit 155.
  • a content may be output on the first transparent substrate 410 of the transparent display unit 155.
  • a video content may be output on the first transparent substrate 410.
  • FIG. 8 illustrates that a separate content is not output on the second transparent substrate 420 for convenience of explanation, but a separate content may also be output on the second transparent substrate 420.
  • the separate content output on the second transparent substrate 420 may be transmitted onto the first transparent substrate 410. That is, a left-right inverted image of the content output on the second transparent substrate 420 may be displayed on the first transparent substrate 410 together with the video content which is being output on the first transparent substrate 410.
  • an event may be generated while the video content is output on the first transparent substrate 410.
  • an object 251 indicating the event generation may be displayed on the first transparent substrate 410 together with the video content.
  • the object 251 may include an icon, a widget, an application execution menu, a thumbnail image, a pop-up window and the like.
  • the type of the object 251 will be illustrated as the pop-up window.
  • the pop-up window 251 displayed on the first transparent substrate 410 may output a message indicating the reception of the text message.
  • the pop-up window 251 may be displayed in an overlapping manner with the video content.
  • the pop-up window 251 may be displayed in a non-transparent form, or displayed in a semi-transparent or transparent form such that it does not obscure the video content.
  • the controller 180 may display information relating to the received text message on the second transparent substrate 420.
  • the video content may be continuously output on the first transparent substrate 410, or a paused image of the video content may be output on the first transparent substrate 410.
  • the second transparent substrate 420 may display a left-right inverted image of the video content or the paused image output on the first transparent substrate 410.
  • the degree to which the left-right inverted image of the video content or the paused image, which is being output on the first transparent substrate 410, is transmitted onto the second transparent substrate 420 may depend on transparency information set for each of the first and second transparent substrates 410 and 420.
  • a pop-up window 252 including a text of the received text message as information relating to the received text message may be displayed on the second transparent substrate 420.
  • an execution screen of a text message application including the text of the received text message may be displayed on the second transparent substrate 420.
  • the pop-up window 252 including the text of the received text message may be displayed overlapping the video content.
  • the pop-up window 252 may be displayed in a non-transparent form, or displayed in a semi-transparent or transparent form such that it does not obscure the left-right inverted image of the video content which is being output on the first transparent substrate 410.
  • FIG. 9 shows a user interface in case where the pop-up window 251 is selected when the pop-up window 251 associated with an event generated is displayed on the first transparent substrate 410 of the transparent display unit 155.
  • a text message may be received while outputting a video content on the first transparent substrate 410 of the transparent display unit 155.
  • the controller 180 may display a pop-up window 251, which includes a message indicating the reception of the text message, on the first transparent substrate 410.
  • the controller 180 may display information relating to the received text message on the first transparent substrate 410.
  • a pop-up window 252 including a text of the received text message as information relating to the received text message may be displayed on the first transparent substrate 410.
  • an execution screen of a text message application including the text of the received text message may be displayed on the first transparent substrate 410.
  • the pop-up window 252 including the text of the received text message may be displayed overlapping the video content.
  • the pop-up window 252 may be displayed in a non-transparent form, or displayed in a semi-transparent or transparent form such that it does not obscure the video content.
  • the user may send a text message to another party by applying a touch input onto the pop-up window 252. That is, the user may send a text message directly to another party on the pop-up window 252. Or, when a text message execution screen is displayed after selecting the pop-up window 252, the user may send a text message to another party on the text message execution screen.
  • FIGS. 10 and 11 show user interfaces in case where the pop-up window 251 is not displayed any more when the pop-up window 251 associated with an event generated is displayed on the first transparent substrate 410 of the transparent display unit 155.
  • a text message may be received while outputting a video content on the first transparent substrate 410 of the transparent display unit 155.
  • the controller 180 may display a pop-up window 251 including a message, which indicates the reception of the text message, on the first transparent substrate 410.
  • the controller 180 may control the pop-up window 251, which is displayed on the first transparent substrate 410 and includes the message indicating the reception of the text message, not to be displayed any more on the first transparent substrate 410.
  • the one position on the second transparent substrate 420 may be an arbitrary position on the second transparent substrate 420 or a position onto which the pop-up window 251 displayed on the first transparent substrate 410 is transmitted.
  • the controller 180 may control the pop-up window 251 not to be displayed on the first transparent substrate 410 any more.
  • FIG. 12 shows a user interface for the case where the orientation of the transparent display unit 155 is switched while the pop-up window 251 associated with a generated event is displayed on the first transparent substrate 410 of the transparent display unit 155.
  • a text message may be received while outputting a video content on the first transparent substrate 410 of the transparent display unit 155.
  • the controller 180 may display the pop-up window 251, which includes a message indicating the reception of the text message, on the first transparent substrate 410.
  • the controller 180 may display an event list including generated event items on the second transparent substrate 420.
  • the second transparent substrate 420 may display an event list that includes not only an item indicating the text message reception but also items indicating a missed call, an email reception and the like.
  • the first transparent substrate 410 may continuously output the video content, or may display a still image at the point where the video content was paused.
  • the second transparent substrate 420 may display a left-right inverted image of the video content or the paused image output on the first transparent substrate 410.
  • the controller 180 may display information relating to an event corresponding to the selected event item on the second transparent substrate 420.
  • the controller 180 may display an execution screen 254 of a text message application, which includes the text of the received text message, on the second transparent substrate 420.
  • the execution screen 254 of the text message application may be displayed as a full screen on the second transparent substrate 420 or displayed on a partial area of the second transparent substrate 420.
  • the user may send a text message to another party on the execution screen 254 of the text message application.
  • the video content, or a still image at the point where it was paused, may be continuously output on the first transparent substrate 410.
  • FIGS. 13 and 14 show user interfaces for the case where an object 255 associated with a generated event is displayed on the second transparent substrate 420 of the transparent display unit 155.
  • a text message may be received while outputting a video content on the first transparent substrate 410 of the transparent display unit 155.
  • the controller 180 may display an object 255 (hereinafter, referred to as "text message icon"), which indicates the reception of the text message, on the second transparent substrate 420 rather than the first transparent substrate 410.
  • a left-right inverted image of the text message icon 255 displayed on the second transparent substrate 420 is transmitted onto the first transparent substrate 410.
  • the level at which the left-right inverted image of the text message icon 255 output on the second transparent substrate 420 is transmitted onto the first transparent substrate 410 depends on the transparency information set for each of the first and second transparent substrates 410 and 420.
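The transparency-dependent transmission just described can be illustrated with a simple multiplicative model. This is an assumption for illustration only; the patent does not specify how the transmitted intensity is computed, and the function name and scaling model are hypothetical.

```python
def transmitted_intensity(icon_intensity, t_first, t_second):
    """Intensity at which a rear-substrate icon is seen from the front.

    t_first and t_second are the transparency values (0.0 = opaque,
    1.0 = fully transparent) set for the first and second transparent
    substrates; the multiplicative model is an illustrative assumption.
    """
    for t in (t_first, t_second):
        if not 0.0 <= t <= 1.0:
            raise ValueError("transparency must be in [0, 1]")
    # Each substrate's transparency setting attenuates the transmitted icon,
    # so the icon disappears as soon as either substrate is fully opaque.
    return icon_intensity * t_first * t_second
```

Under this model, setting either substrate's transparency to `0.0` blocks the icon entirely, matching the idea that the transmission level depends on both substrates' settings.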
  • the controller 180 may display information relating to the generated event on the first transparent substrate 410.
  • the controller 180 may display a pop-up window 252, which includes the text of the received text message, on the first transparent substrate 410.
  • the controller 180 may display an execution screen of a text message application including the text of the received text message on the first transparent substrate 410.
  • the controller 180 may also output a video content, which is being output on the first transparent substrate 410, on the second transparent substrate 420. Accordingly, the video content output on the second transparent substrate 420 may be transmitted onto the first transparent substrate 410.
  • the controller 180 may control the object 255 (hereinafter, referred to as "first icon") so that it is no longer displayed on the second transparent substrate 420, and, as shown in FIG. 14(b), display on the second transparent substrate 420 an object 256 (hereinafter, referred to as "second icon") which indicates the presence of an unchecked text message among previously received text messages.
  • the first icon 255 and the second icon 256 may be displayed on different positions of the second transparent substrate 420 with different sizes.
  • the first icon 255 may be displayed on a central region of the second transparent substrate 420 to indicate that a text message has just been received.
  • the second icon 256 may be displayed on an edge region of the second transparent substrate 420, in a smaller size than the first icon 255, to indicate that there is an unchecked text message among previously received text messages (received earlier than "just before").
  • the first icon 255 and the second icon 256 may differ from each other in at least one of shape, color, edge thickness, transparency and three-dimensional (3D) depth value.
  • the controller 180 may display a pop-up window 252, which includes information relating to an event generated, for example, the text of the received text message, on the first transparent substrate 410.
  • the controller 180 may simultaneously output a video content, which is being output on the first transparent substrate 410, on the second transparent substrate 420. Accordingly, the video content output on the second transparent substrate 420 may be transmitted onto the first transparent substrate 410.
  • FIGS. 15 and 16 are conceptual views showing user interfaces for adjusting transparency of the transparent display unit 155 in the mobile terminal 100 according to FIGS. 7A and 7B .
  • the mobile terminal 100 may include the transparent display unit 155, the sensing unit 140 (see FIG. 1 ) and the controller 180 (see FIG. 1 ).
  • the transparent display unit 155 may include the first and second transparent substrates 410 and 420.
  • a video content may be output on the first transparent substrate 410 of the transparent display unit 155.
  • the controller 180 may sense a touch input applied onto the transparent display unit 155 to adjust transparency of the transparent display unit 155.
  • the controller 180 may generate a control command for adjusting the transparency of at least one of the first and second transparent substrates 410 and 420 only when touch inputs (hereinafter, referred to as "first and second touch inputs") are sensed simultaneously on the first and second transparent substrates 410 and 420 of the transparent display unit 155.
  • the controller 180 may adjust the transparency of the respective first and second transparent substrates 410 and 420 based on attribute information relating to each of the sensed first and second touch inputs.
  • the controller 180 may adjust the transparency of the first transparent substrate 410 based on at least one of direction information and length information both relating to the first touch input. Similarly, the controller 180 may adjust the transparency of the second transparent substrate 420 based on at least one of direction information and length information both relating to the second touch input.
  • the controller 180 may lower the transparency of the first transparent substrate 410 based on the length of the drag input. That is, the controller 180 may make the first transparent substrate 410 more opaque such that a content displayed on the second transparent substrate 420 is not transmitted onto the first transparent substrate 410.
  • the controller 180 may increase the transparency of the second transparent substrate 420 based on the length of the drag input. That is, the controller 180 may increase the transparency of the second transparent substrate 420 such that an image content displayed on the first transparent substrate 410 can be fully transmitted onto the second transparent substrate 420.
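The drag-based adjustment above amounts to mapping a drag's direction and length onto a transparency value. A minimal sketch, assuming a hypothetical pixel-to-transparency scale factor and clamping to the [0, 1] range:

```python
def adjust_transparency(current, drag_dx, scale=1.0 / 300.0):
    """Return a substrate transparency in [0, 1] after a horizontal drag.

    The drag direction (sign of drag_dx, in pixels) decides whether the
    transparency rises or falls, and the drag length decides by how much.
    The scale factor is an illustrative assumption.
    """
    new = current + drag_dx * scale
    return max(0.0, min(1.0, new))  # clamp to the valid range
```

A long rightward drag from `0.9` saturates at fully transparent (`1.0`); the same leftward drag from `0.1` saturates at fully opaque (`0.0`).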
  • the controller 180 may display on the transparent display unit 155 indicator bars 257 (hereinafter, referred to as "first and second indicator bars"), which indicate transparency information relating to each of the first and second transparent substrates 410 and 420.
  • the first and second indicator bars 257a and 257b may be displayed overlapping the video content.
  • the first and second indicator bars 257a and 257b may be displayed in a non-transparent form, or in a semi-transparent or transparent form such that they do not obscure the video content.
  • the transparency of each of the first and second indicator bars 257a and 257b may be adjusted based on a user's touch input.
  • the controller 180 may decide, based on an orientation of the transparent display unit 155, which one of the first transparent substrate 410 and the second transparent substrate 420 is used to display the indicator bars 257a and 257b.
  • the controller 180 may display the first and second indicator bars 257a and 257b on the first transparent substrate 410.
  • the controller 180 may display the first and second indicator bars 257a and 257b on the second transparent substrate 420.
  • the controller 180 may always display the first indicator bar 257a on the first transparent substrate 410 and the second indicator bar 257b on the second transparent substrate 420, irrespective of the orientation of the transparent display unit 155.
  • the user may adjust the transparency of each of the first and second transparent substrates 410 and 420 by use of first and second touch inputs applied onto the first and second transparent substrates 410 and 420, respectively.
  • the user may adjust the transparency of each of the first and second transparent substrates 410 and 420 by use of first and second touch inputs applied onto the first and second indicator bars 257a and 257b, respectively.
  • the controller 180 may first sense touch inputs applied simultaneously on the first and second transparent substrates 410 and 420 of the transparent display unit 155, and then adjust the transparency of one of the first and second transparent substrates 410 and 420 based on the touch input sensed on that substrate.
  • the touch inputs applied on the first and second transparent substrates 410 and 420 of the transparent display unit 155 may not always have to be sensed simultaneously in order to generate a control command for transparency adjustment. Even though a touch input is sensed on only one of the first and second transparent substrates 410 and 420, the controller 180 may generate a control command for adjusting the transparency of the one substrate with the touch input sensed thereon.
  • the controller 180 may adjust the transparency of the first transparent substrate 410 based on at least one of direction information and length information relating to a touch input sensed on the first transparent substrate 410.
  • the controller 180 may increase the transparency of the first transparent substrate 410 based on the length of the drag input. That is, the controller 180 may increase the transparency of the first transparent substrate 410 such that a content displayed on the second transparent substrate 420 can be fully transmitted onto the first transparent substrate 410.
  • the controller 180 may switch screens, which are displayed respectively on the first and second transparent substrates 410 and 420, with each other.
  • the controller 180 may display a content output screen, which has been displayed on the first transparent substrate 410, on the second transparent substrate 420, and an object 255, which has been displayed on the second transparent substrate 420, on the first transparent substrate 410.
  • the object 255 may be displayed on the first transparent substrate 410, and a left-right inverted image of the output image content may be displayed on the second transparent substrate 420 by being transmitted thereonto.
  • FIG. 17 is a flowchart showing another example of a control method for a mobile terminal 100 (see FIG. 1 ) not being part of the present invention.
  • the mobile terminal may include the transparent display unit 155 (see FIG. 1 ), the sensing unit 140 (see FIG. 1 ) and the controller 180 (see FIG. 1 ).
  • a content may be output on the first transparent substrate 410 of the transparent display unit 155 (S210).
  • the transparent display unit 155 may include the first and second transparent substrates 410 and 420.
  • the first and second transparent substrates 410 and 420 may display contents thereon.
  • At least one of motion and rotation of a terminal body may be sensed (S220).
  • the sensing unit 140 may include at least one of an eye detector (not shown) and a motion recognition sensor (not shown).
  • the eye detector and the motion recognition sensor may cooperatively work to sense at least one of motion and rotation of a terminal body, or only one of the eye detector and the motion recognition sensor may operate to sense the at least one of the motion and the rotation of the terminal body.
  • while a content output screen is displayed on the first transparent substrate 410, it may be determined whether or not the sensed motion and/or rotation of the terminal body meets preset condition information, and it may be decided, based on the determination result, whether or not to display another screen associated with the content output screen on the second transparent substrate 420 (S230).
  • the content output screen may be displayed on the first transparent substrate 410 when the user is facing the first transparent substrate 410 on which the content is being output.
  • the controller 180 may display another screen associated with the content output screen on the second transparent substrate 420.
  • the other screen displayed on the second transparent substrate 420 may include at least one of a detailed information display screen for the content, a content setting screen and a content edit screen.
  • the user may check detailed information relating to the content on the detailed information display screen displayed on the second transparent substrate 420. Also, the user may change settings associated with the content on the content setting screen displayed on the second transparent substrate 420. The user may also edit the content on the content edit screen displayed on the second transparent substrate 420.
  • when settings associated with the content are changed on the content setting screen displayed on the second transparent substrate 420, the controller 180 may apply the changed information to the content output screen displayed on the first transparent substrate 410. Also, when the content is edited on the content edit screen displayed on the second transparent substrate 420, the controller 180 may apply the edited information to the content output screen displayed on the first transparent substrate 410.
  • FIGS. 18 to 23 are conceptual views showing exemplary control operations of the mobile terminal 100 according to FIG. 17 .
  • the mobile terminal 100 may include the transparent display unit 155 (see FIG. 1 ), the sensing unit 140 (see FIG. 1 ) and the controller 180 (see FIG. 1 ).
  • the transparent display unit 155 may include the first and second transparent substrates 410 and 420.
  • FIG. 18 shows a user interface for the case where a content indicating an alarm time (hereinafter, referred to as "alarm content") 258 is displayed on the first transparent substrate 410.
  • an alarm sound may be output via the audio output module 152 (see FIG. 1 ). Or, as shown, the alarm content 258 may be displayed on the first transparent substrate 410 of the transparent display unit 155.
  • the sensing unit 140 may include a light sensor disposed on at least one of the first and second transparent substrates 410 and 420. For example, when the mobile terminal 100 is laid on a floor, the sensing unit 140 may detect using the light sensor that the first transparent substrate 410 faces the top (or ceiling) and the second transparent substrate 420 faces the bottom.
  • the controller 180 may display the alarm content 258 on the first transparent substrate 410.
  • the sensing unit 140 may detect using the light sensor that the first transparent substrate 410 faces the bottom and the second transparent substrate 420 faces the top. As such, when the user is facing the second transparent substrate 420 of the transparent display unit 155, the controller 180 may display another screen associated with a content output screen on the second transparent substrate 420.
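The light-sensor check described above can be sketched as a comparison of the ambient-light readings at the two substrates: the brighter side is taken to face the top (and thus the user when the terminal lies on a floor). The function name, units, and noise margin are hypothetical, not from the patent.

```python
def facing_substrate(lux_first, lux_second, margin=5.0):
    """Decide which substrate faces the top from per-substrate light levels.

    lux_first / lux_second are illustrative ambient-light readings for the
    first and second transparent substrates; margin guards against sensor
    noise. Returns "first", "second", or None when readings are too close.
    """
    if lux_first - lux_second > margin:
        return "first"
    if lux_second - lux_first > margin:
        return "second"
    return None  # ambiguous, e.g. terminal held upright
```

The `None` case leaves the current screen assignment unchanged rather than flipping it on a marginal reading.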
  • detailed information 259 relating to the alarm content 258 may be displayed on the second transparent substrate 420.
  • as the detailed information 259 relating to the alarm content 258, the user's schedule information for the day may be displayed.
  • a set alarm time, an alarm name, an alarm repetitive count, an alarm sound and the like may also be displayed as the detailed information 259 relating to the alarm content 258.
  • FIG. 19 shows a user interface for the case where a contact item content (hereinafter, referred to as "contact content") is displayed on the first transparent substrate 410.
  • a contact content may be displayed on the first transparent substrate 410 of the transparent display unit 155. That is, while the user is facing the first transparent substrate 410 of the transparent display unit 155, the controller 180 may display the contact content on the first transparent substrate 410. As shown, the contact content may include names and images of a plurality of third parties, respectively.
  • the controller 180 may display, as another screen associated with a content output screen, detailed information relating to the contact content on the second transparent substrate 420.
  • contact information relating to each third party may be displayed.
  • the controller 180 may execute an edit mode for the contact content. That is, an edit screen for the contact content may be displayed on the second transparent substrate 420. Or the contact content may be editable directly on the detailed information display screen displayed on the second transparent substrate 420.
  • the user may edit the phone number or email address of a third party on the second transparent substrate 420.
  • the user may edit the third party's name or image on the second transparent substrate 420 as well.
  • when a scroll input is sensed on the first transparent substrate 410, the controller 180 may maintain the screen displayed on the second transparent substrate 420 as it is while scrolling the screen displayed on the first transparent substrate 410. Similarly, when the scroll input is sensed on the second transparent substrate 420, the controller 180 may scroll only the screen displayed on the second transparent substrate 420.
  • alternatively, when a scroll input is sensed on the first transparent substrate 410, the controller 180 may scroll the screens on both the first and second transparent substrates 410 and 420 at the same time. Similarly, when the scroll input is sensed on the second transparent substrate 420, the controller 180 may also scroll both screens at the same time.
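The two scroll policies above, scrolling only the touched substrate versus scrolling both together, can be sketched as a single function with a policy flag. All names here are illustrative:

```python
def apply_scroll(offsets, touched, delta, linked=False):
    """Return new scroll offsets after a scroll input on one substrate.

    offsets: dict mapping "first"/"second" to current scroll offsets
    (pixels); touched names the substrate the input was sensed on.
    With linked=False only the touched substrate scrolls; with
    linked=True both substrates scroll together.
    """
    new = dict(offsets)  # leave the caller's state untouched
    if linked:
        for key in new:
            new[key] += delta
    else:
        new[touched] += delta
    return new
```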
  • FIG. 20 shows a user interface in case where a calendar content is displayed on the first transparent substrate 410.
  • the calendar content may be displayed on the first transparent substrate 410 of the transparent display unit 155. That is, while the user is facing the first transparent substrate 410 of the transparent display unit 155, the controller 180 may display the calendar content on the first transparent substrate 410. As shown, the calendar content may include day items. A day item corresponding to a scheduled day may be provided with a separate identification item.
  • the controller 180 may display, as another screen associated with a content output screen, detailed information relating to the calendar content on the second transparent substrate 420.
  • schedule information corresponding to each day item may be displayed.
  • the controller 180 may execute an edit mode for the calendar content.
  • the second transparent substrate 420 may display left-right inverted images of the day items displayed on the first transparent substrate 410.
  • the controller 180 may execute a schedule edit mode corresponding to "21st day."
  • a pop-up window 261 for receiving a schedule corresponding to "21st day" may be displayed on the second transparent substrate 420.
  • the controller 180 may apply the input schedule to the calendar content output screen displayed on the first transparent substrate 410.
  • a separate identification item 262 may be displayed on the day item corresponding to "21st day" on the first transparent substrate 410.
  • FIG. 21 shows a user interface in case where an image content is displayed on the first transparent substrate 410.
  • the image content may be displayed on the first transparent substrate 410 of the transparent display unit 155. That is, while the user is facing the first transparent substrate 410 of the transparent display unit 155, the controller 180 may display the image content on the first transparent substrate 410.
  • the controller 180 may display, as another screen associated with the content output screen, an image content list on the second transparent substrate 420.
  • when an image content item 264 is selected from the list, the controller 180 may display an image content 265 corresponding to the selected image content item 264 on the first transparent substrate 410.
  • FIG. 22 shows a user interface in case where an image content is displayed on the first transparent substrate 410.
  • an image content may be displayed on the first transparent substrate 410 of the transparent display unit 155. That is, while the user is facing the first transparent substrate 410 of the transparent display unit 155, the controller 180 may display a webpage including the image content or an execution screen of an application including the image content on the first transparent substrate 410.
  • the image content displayed on the first transparent substrate 410 may be an image showing one surface of an object to be captured.
  • the controller 180 may display, as another screen associated with a content output screen, an image showing the other surface, which is opposite to the one surface of the object to be captured, on the second transparent substrate 420.
  • the user may view the images showing each of the one surface and the other surface of the object via the first and second transparent substrates 410 and 420.
  • FIG. 23 shows a user interface in case where a page of an electronic document is displayed on the first transparent substrate 410.
  • the controller 180 may display a predetermined page of an electronic document on the first transparent substrate 410.
  • a representative example of the electronic document is an electronic book.
  • the electronic book may include multimedia information such as sound or image as well as text.
  • the controller 180 may display, as another screen associated with a content output screen, either the previous page or the next page of the predetermined page on the second transparent substrate 420.
  • the sensing unit 140 may sense a rotated direction of the terminal body.
  • the sensing unit 140 may include at least one of a terrestrial magnetism sensor, a gyro sensor and an acceleration sensor. Accordingly, the sensing unit 140 may sense whether or not the terminal body rotates.
  • the sensing unit 140 may detect displacement according to the rotation of the terminal body, namely, a rotated direction and a rotated angle, using at least one of those sensors.
  • the sensing unit 140 may generate a different control command depending on the rotated direction of the terminal body.
  • when the terminal body rotates in a first direction, the controller 180 may display, as another screen associated with a content output screen, the previous page of the predetermined page, which has been displayed on the first transparent substrate 410, on the second transparent substrate 420.
  • when the terminal body rotates in a second direction, the controller 180 may display the next page of the predetermined page, which has been displayed on the first transparent substrate 410, on the second transparent substrate 420.
  • the user may view the different page on the second transparent substrate 420 according to the rotation of the terminal body in the first direction or the second direction.
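The rotation-to-page mapping described above can be sketched as follows; which physical rotation counts as the "first" or "second" direction is an assumption, as is the clamping at the document boundaries.

```python
def rear_page(current_page, direction, last_page):
    """Page to show on the second substrate after the body is rotated.

    direction is "first" (show the previous page) or "second" (show the
    next page); pages are 1-based and clamped to the document range.
    """
    if direction == "first":
        return max(1, current_page - 1)
    if direction == "second":
        return min(last_page, current_page + 1)
    raise ValueError("unknown rotation direction")
```

Clamping keeps the rear substrate showing the first or last page when the user rotates past either end of the document.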
  • the method can be implemented as computer-readable codes in a program-recorded medium.
  • the computer-readable medium may include all types of recording devices, each storing data readable by a computer system. Examples of such computer-readable media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, optical data storage elements and the like. The computer-readable medium may also be implemented in the form of a carrier wave (e.g., transmission over the Internet).
  • the computer may include the controller 180 of the mobile terminal.

Claims (4)

  1. A mobile terminal, comprising:
    a terminal body;
    a transparent display unit (155) having a first transparent substrate (410) configured to sense a touch input and a second transparent substrate (420) configured to sense a touch input, wherein the first and second transparent substrates are stacked on one another with an image layer (430) therebetween;
    a sensing unit (140) configured to sense a motion of the terminal body and/or a rotation of the terminal body, wherein the sensing unit comprises an eye detector configured to sense whether a user is viewing either the first transparent substrate (410) or the second transparent substrate (420); and
    a controller (180) configured to:
    output a content on the first transparent substrate (410) facing the user;
    detect an occurrence of an event;
    display an icon (255) indicating the event generation on the second transparent substrate (420) not facing the user while the content is displayed on the first transparent substrate (410) facing the user, wherein the icon displayed on the second transparent substrate is transmitted onto the first substrate at a level which depends on transparency information set for each of the first and second transparent substrates;
    when the sensing unit senses that the user has switched to viewing the second transparent substrate (420):
    display information (252) relating to the event on the second transparent substrate (420); and
    stop the content currently being displayed on the first transparent substrate (410); and
    when a touch input is applied to a position where the icon is displayed on the second transparent substrate (420):
    display information (252) relating to the event on the first transparent substrate (410) facing the user and simultaneously output the content currently being output on the first transparent substrate (410) on the second transparent substrate (420).
  2. The mobile terminal according to claim 1, wherein the controller (180) is configured to display the icon for a prescribed time.
  3. A method of displaying on a mobile terminal that comprises a terminal body, a first transparent substrate (410), a second transparent substrate (420), and a sensing unit (140) comprising an eye detector configured to sense whether a user is viewing either the first transparent substrate (410) or the second transparent substrate (420), wherein the first and second transparent substrates are stacked on one another with an image layer (430) therebetween, the method comprising:
    displaying a content on the first transparent substrate (410);
    detecting an occurrence of an event;
    displaying an icon (255) indicating the event generation on the second transparent substrate (420), wherein the icon displayed on the second transparent substrate is transmitted onto the first substrate at a level which depends on transparency information set for each of the first and second transparent substrates;
    upon the sensing unit sensing that the user has switched to viewing the second transparent substrate (420):
    displaying information (252) relating to the event on the second transparent substrate (420); and
    stopping the content currently being displayed on the first transparent substrate (410); and
    upon receiving a touch input applied to a position where the icon is displayed on the second transparent substrate (420):
    in response to the touch input, displaying the information (252) relating to the event on the first transparent substrate (410) and simultaneously outputting the content currently being output on the first transparent substrate (410) on the second transparent substrate (420).
  4. The method according to claim 3, wherein the icon is displayed for a prescribed time.
EP13182840.2A 2012-10-31 2013-09-03 Mobiles Endgerät und Steuerungsverfahren dafür Active EP2728437B1 (de)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020120122509A KR101907949B1 (ko) 2012-10-31 2012-10-31 이동 단말기 및 그것의 제어 방법

Publications (3)

Publication Number Publication Date
EP2728437A2 EP2728437A2 (de) 2014-05-07
EP2728437A3 EP2728437A3 (de) 2017-05-31
EP2728437B1 true EP2728437B1 (de) 2021-05-26

Family

ID=49123696

Family Applications (1)

Application Number Title Priority Date Filing Date
EP13182840.2A Active EP2728437B1 (de) 2012-10-31 2013-09-03 Mobiles Endgerät und Steuerungsverfahren dafür

Country Status (4)

Country Link
US (1) US9189101B2 (de)
EP (1) EP2728437B1 (de)
KR (1) KR101907949B1 (de)
CN (1) CN103793167B (de)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD765110S1 (en) * 2014-04-25 2016-08-30 Tencent Technology (Shenzhen) Company Limited Portion of a display screen with animated graphical user interface
USD763882S1 (en) * 2014-04-25 2016-08-16 Tencent Technology (Shenzhen) Company Limited Portion of a display screen with animated graphical user interface
KR20150134949A (ko) * 2014-05-23 2015-12-02 엘지전자 주식회사 이동 단말기 및 그것의 제어방법
KR101659032B1 (ko) * 2014-07-25 2016-09-23 엘지전자 주식회사 이동 단말기 및 그것의 제어방법
USD763890S1 (en) * 2014-12-04 2016-08-16 Dalian Situne Technology Co., Ltd. Display screen or portion thereof with graphical user interface
US10254863B2 (en) * 2014-12-19 2019-04-09 Lg Electronics Inc. Mobile terminal
WO2016108439A1 (en) 2014-12-29 2016-07-07 Samsung Electronics Co., Ltd. Foldable device and method of controlling the same
KR102308645B1 (ko) 2014-12-29 2021-10-05 삼성전자주식회사 사용자 단말 장치 및 그의 제어 방법
KR102121533B1 (ko) * 2015-01-09 2020-06-10 삼성전자주식회사 투명 디스플레이를 구비한 디스플레이 장치 및 그 디스플레이 장치의 제어 방법
USD788733S1 (en) * 2015-02-09 2017-06-06 Lg Electronics Inc. Mobile phone
US20180129461A1 (en) * 2016-11-10 2018-05-10 SK Commercial Construction, Inc. Method and system for advertising and screen identification using a mobile device transparent screen
US10339569B2 (en) * 2016-11-10 2019-07-02 SK Commercial Construction, Inc. Method and system for advertising and screen identification using a mobile device transparent screen, bendable and multiple non-transparent screen
US10789617B2 (en) * 2017-03-20 2020-09-29 SK Commercial Construction, Inc. Method and system for advertising and screen identification using a mobile device transparent screen, bendable and multiple non-transparent screen
USD865796S1 (en) * 2017-07-19 2019-11-05 Lenovo (Beijing) Co., Ltd. Smart glasses with graphical user interface
CN109710031A (zh) * 2018-12-24 2019-05-03 Wuhan Xishan Yichuang Culture Co., Ltd. Tablet computer based on a transparent liquid crystal display and interaction method therefor
CN110413115B (zh) * 2019-07-23 2023-02-28 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Display control method and related device
CN115278326A (zh) 2021-04-29 2022-11-01 Tencent Technology (Shenzhen) Company Limited Video display method and apparatus, computer-readable medium, and electronic device

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0605945B1 (de) * 1992-12-15 1997-12-29 Sun Microsystems, Inc. Displaying information in a display system with transparent windows
US8514182B2 (en) * 2001-12-27 2013-08-20 Lg Display Co., Ltd. Touch panel display device and method of fabricating the same
TW594184B (en) * 2003-10-01 2004-06-21 Display Optronics Corp M Multi-display monitor
EP1607847A1 (de) * 2004-06-15 2005-12-21 Research In Motion Limited Method and apparatus for changing the transparency level of a virtual keyboard on a touch-sensitive screen
KR101474429B1 (ko) * 2008-03-11 2014-12-19 LG Electronics Inc. Mobile terminal and display method thereof
KR101482115B1 (ko) * 2008-07-07 2015-01-13 LG Electronics Inc. Portable terminal having a gyro sensor and control method thereof
KR101533099B1 (ko) * 2008-08-22 2015-07-01 LG Electronics Inc. Portable terminal and operation control method thereof
KR101555055B1 (ko) * 2008-10-10 2015-09-22 LG Electronics Inc. Mobile terminal and display method thereof
TWI498810B (zh) * 2008-10-27 2015-09-01 Htc Corp Display method and display control module
KR101557353B1 (ko) * 2008-12-16 2015-10-06 LG Electronics Inc. Mobile terminal having a transparent display and screen capture method thereof
KR101021857B1 (ko) * 2008-12-30 2011-03-17 Samsung Electronics Co., Ltd. Apparatus and method for inputting a control signal using dual touch sensors
KR101544364B1 (ko) * 2009-01-23 2015-08-17 Samsung Electronics Co., Ltd. Portable terminal having a dual touch screen and content control method thereof
KR101563523B1 (ko) * 2009-01-30 2015-10-28 Samsung Electronics Co., Ltd. Portable terminal having a dual touch screen and method of displaying a user interface thereof
US20100277420A1 (en) * 2009-04-30 2010-11-04 Motorola, Inc. Hand Held Electronic Device and Method of Performing a Dual Sided Gesture
US9696809B2 (en) * 2009-11-05 2017-07-04 Will John Temple Scrolling and zooming of a portable device display with device motion
KR20110125356A (ko) * 2010-05-13 2011-11-21 LG Electronics Inc. Mobile terminal
KR101708304B1 (ko) * 2010-08-18 2017-02-20 LG Electronics Inc. Mobile terminal and motion recognition method thereof
KR101688942B1 (ko) 2010-09-03 2016-12-22 LG Electronics Inc. Method for providing a user interface based on multiple displays and mobile terminal using the same
WO2012036327A1 (ko) * 2010-09-15 2012-03-22 LG Electronics Inc. Method and apparatus for displaying a schedule in a mobile communication terminal
US8698771B2 (en) * 2011-03-13 2014-04-15 Lg Electronics Inc. Transparent display apparatus and method for operating the same
US9122457B2 (en) * 2012-05-11 2015-09-01 Htc Corporation Handheld device and unlocking method thereof

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
EP2728437A3 (de) 2017-05-31
EP2728437A2 (de) 2014-05-07
CN103793167A (zh) 2014-05-14
US9189101B2 (en) 2015-11-17
KR101907949B1 (ko) 2018-10-16
KR20140055510A (ko) 2014-05-09
CN103793167B (zh) 2017-10-03
US20140118258A1 (en) 2014-05-01

Similar Documents

Publication Publication Date Title
EP2728437B1 (de) Mobile terminal and control method therefor
US9310996B2 (en) Mobile terminal and method for providing user interface thereof
KR101868352B1 (ko) Mobile terminal and control method thereof
US9207854B2 (en) Mobile terminal and user interface of mobile terminal
KR102080741B1 (ko) Mobile terminal and control method thereof
US8847996B2 (en) Mobile terminal and control method thereof
US8934949B2 (en) Mobile terminal
US9001151B2 (en) Mobile terminal for displaying a plurality of images during a video call and control method thereof
US9081541B2 (en) Mobile terminal and method for controlling operation thereof
KR102058368B1 (ko) Mobile terminal and control method thereof
KR20100030968A (ko) Terminal and menu display method thereof
KR20100028239A (ko) Mobile terminal and method for providing a user interface using the same
KR20100039024A (ko) Mobile terminal and display control method thereof
KR20110045659A (ko) Method for controlling icon display in a mobile communication terminal and mobile communication terminal applying the same
KR20110068666A (ko) Mobile terminal having side touch input means and method of performing functions thereof
EP2680134A1 (de) Mobile terminal and control method therefor
KR20140000742A (ko) Mobile terminal and control method thereof
KR20130090965A (ko) Mobile terminal and control method thereof
US10025483B2 (en) Mobile terminal and method of controlling the mobile terminal
KR20100104562A (ko) Mobile terminal and background screen display control method thereof
US10628012B2 (en) Mobile terminal having front surface and rear surface with preset input applied to rear input unit causing portion of screen information and portion of home screen displayed together on front surface
KR20130059123A (ko) Mobile terminal and control method thereof
KR20100027548A (ko) Mobile terminal and method of performing a sticker function in the mobile terminal
KR20100026362A (ko) Mobile communication terminal and content playback method using the same
KR20100039977A (ko) Mobile terminal and wireless communication channel changing method thereof

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20130903

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

RIC1 Information provided on ipc code assigned before grant

Ipc: G06F 3/0484 20130101ALI20170419BHEP

Ipc: G06F 3/0488 20130101ALI20170419BHEP

Ipc: G06F 3/0487 20130101ALI20170419BHEP

Ipc: G06F 1/16 20060101AFI20170419BHEP

Ipc: G06F 3/0483 20130101ALI20170419BHEP

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20181108

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20201223

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

RAP3 Party data changed (applicant data changed or rights of an application transferred)

Owner name: LG ELECTRONICS INC.

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1396863

Country of ref document: AT

Kind code of ref document: T

Effective date: 20210615

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602013077636

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1396863

Country of ref document: AT

Kind code of ref document: T

Effective date: 20210526

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210826

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210526

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210526

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210526

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210526

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20210526

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210926

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210526

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210827

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210526

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210526

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210927

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210526

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210826

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210526

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210526

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210526

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210526

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210526

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210526

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210526

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210526

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602013077636

Country of ref document: DE

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 602013077636

Country of ref document: DE

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

26N No opposition filed

Effective date: 20220301

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20210930

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20210903

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210926

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210526

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210526

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210903

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210526

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210903

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210903

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210930

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220401

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210930

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210930

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210930

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20130903

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210526

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210526