WO2011083975A2 - Mobile device and method for operating content displayed on transparent display panel - Google Patents

Mobile device and method for operating content displayed on transparent display panel Download PDF

Info

Publication number
WO2011083975A2
WO2011083975A2 (PCT/KR2011/000066)
Authority
WO
WIPO (PCT)
Prior art keywords
touch event
touch
content
brightness
mobile device
Prior art date
Application number
PCT/KR2011/000066
Other languages
French (fr)
Other versions
WO2011083975A3 (en)
Inventor
Eun Hye Lee
Joon Ho Won
Bo Eun Park
Byeong Cheol Hwang
Original Assignee
Samsung Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd. filed Critical Samsung Electronics Co., Ltd.
Priority to EP11731913A priority Critical patent/EP2521960A2/en
Priority to CN2011800054519A priority patent/CN102696004A/en
Publication of WO2011083975A2 publication Critical patent/WO2011083975A2/en
Publication of WO2011083975A3 publication Critical patent/WO2011083975A3/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/169Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
    • G06F1/1692Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes the I/O peripheral being a secondary touch screen used as control interface, e.g. virtual buttons or sliders
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412Digitisers structurally integrated in a display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B1/00Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/38Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B1/40Circuits
    • H04B1/401Circuits for selecting or indicating operating mode
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72427User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/22Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Definitions

  • the present invention relates to a mobile device having a touch screen. More particularly, the present invention relates to a mobile device and method for operating displayed content in response to a touch event using at least one of front and back faces of the mobile device having a transparent display panel.
  • a mobile device refers to a type of electronic device designed for mobility and portability. With the remarkable growth of related technologies, a great variety of mobile devices capable of supporting various end-user functions have become increasingly popular.
  • some of conventional mobile devices offer a touch-based input interface such as a touch screen that normally includes a touch panel and a display unit.
  • the mobile device generates a touch event in response to a user's touch input made on the touch panel and then performs a predefined function based on the touch event.
  • One advanced type of touch screen is a dual touch screen in which two touch sensors are disposed on both sides of a display unit formed of a transparent display panel.
  • This dual touch screen can detect a touch event from both sides through the front and back touch sensors.
  • the display unit of the dual touch screen is transparent and hence its transparency varies according to the brightness of displayed colors. Namely, the transparency of the display unit approaches 0% as the colors displayed on the display unit get near to white. In addition, the transparency of the display unit approaches 100% as the colors displayed on the display unit get near to black.
  • a conventional mobile device having the transparent display panel, however, remains at a simple stage that merely allows the background behind the mobile device to be seen through the transparent display panel. More improved and effective user interfaces and their operation methods may be required for the mobile device having the transparent display panel.
  • An aspect of the present invention is to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present invention is to provide a mobile device having a touch screen and also to provide a method for operating displayed content using the touch screen.
  • Another aspect of the present invention is to provide a mobile device having a dual touch screen based on a transparent display panel and also to provide a method for operating displayed content in response to a touch event on the dual touch screen.
  • Still another aspect of the present invention is to provide an apparatus and method for operating displayed content in response to a touch on the front side, on the back side, or on both sides of a dual touch screen based on a transparent display panel in a mobile device.
  • Yet another aspect of the present invention is to provide an apparatus and method for modifying the brightness of parts or all of displayed content in response to a touch event on a dual touch screen based on a transparent display panel in a mobile device.
  • a method for operating displayed content in a mobile device having a transparent display panel includes displaying the content on the transparent display panel with a predefined brightness, receiving a touch event through the transparent display panel, and modifying the brightness of a part or the whole of the displayed content in response to the touch event.
  • a mobile device includes a touch screen for receiving a touch event and for modifying the brightness of content in response to the touch event, and a control unit for determining the type of the touch event received from the touch screen and for performing, in response to the touch event, a transparentizing process by modifying the brightness of a part or the whole of the content displayed on the touch screen.
  • the mobile device having the transparent display panel may realize optimal environments for varying the transparency of displayed content according to various touch events. Therefore, this mobile device can offer new and useful additional functions based on the transparent display panel.
  • a method for operating the displayed content is provided by allowing a user to intuitively modify the brightness of a part or the whole of the displayed content through various touch events such as front, back or double-sided touch events.
  • This technique may be favorably and widely applied to any type of device that employs the transparent display panel.
  • the double-sided touch-based changes in transparency of displayed content may promote usability, accessibility and competitiveness of the mobile device.
  • FIG. 1 is a block diagram illustrating a configuration of a mobile device having a dual touch screen based on a transparent display panel in accordance with an exemplary embodiment of the present invention.
  • FIG. 2 is a schematic view illustrating a transparent display panel of a mobile device in accordance with an exemplary embodiment of the present invention.
  • FIGs. 3 to 5 are schematic views illustrating three types of touch events on a transparent display panel of a mobile device in accordance with exemplary embodiments of the present invention.
  • FIGs. 6 to 11 are views illustrating various ways to operate content displayed on a display unit in response to a touch event in a mobile device in accordance with exemplary embodiments of the present invention.
  • FIG. 12 is a flowchart illustrating a method for operating displayed content in response to a touch event on a dual touch screen based on a transparent display panel in a mobile device in accordance with an exemplary embodiment of the present invention.
  • Exemplary embodiments of the present invention relate to a mobile device that has a touch screen and executes a method for operating displayed content using the touch screen. More particularly, the mobile device includes a dual touch screen based on a transparent display panel. Additionally, this invention allows the mobile device to perform a particular function in response to a touch event through a touch on the front side, on the back side, or on both sides of the dual touch screen.
  • a front touch refers to a touch gesture on the front side of a touch screen, including a single touch and a multi touch.
  • a back touch refers to a touch gesture on the back side of a touch screen, including a single touch and a multi touch.
  • a double-sided touch refers to a multi touch gesture on both sides of a touch screen.
  • FIG. 1 is a block diagram illustrating a configuration of a mobile device having a dual touch screen based on a transparent display panel in accordance with an exemplary embodiment of the present invention.
  • the mobile device includes a touch screen 110, a memory unit 150 and a control unit 170. Additionally, the mobile device may include an audio processing unit having a microphone and a speaker, a digital broadcast module for receiving and playing digital broadcasting such as Digital Multimedia Broadcasting (DMB) or Digital Video Broadcasting (DVB), a camera module for taking a photo or recording a video, a Bluetooth communication module for performing a Bluetooth communication function, a radio frequency communication module for performing a communication function based on a mobile communication service, an Internet communication module for performing an Internet communication function, a touch pad for a touch-based input, a key input unit for a mechanical key input, a battery for a power supply, and the like. Since these elements are well known in the art, related illustration and description will be omitted herein.
  • the touch screen 110 includes a first touch sensor 120, a display unit 130, and a second touch sensor 140.
  • a user-facing side of the display unit 130 is referred to as the front side, and the opposite side is referred to as the back side.
  • the first touch sensor 120 may be disposed on the front side, and the second touch sensor 140 may be disposed on the back side.
  • the first and second touch sensors 120 and 140 may include well known touch-sensitive sensors of capacitive overlay type, resistive overlay type, piezoelectric type, and the like.
  • the display unit 130 represents an operation state of the mobile device and related data. Under the control of the control unit 170, the display unit 130 may display a variety of contents (e.g., photos, images, lists, text, icons, menus, etc.). Also, under the control of the control unit 170, the display unit 130 may display content modified in response to a touch event that is detected through at least one of the first and second touch sensors 120 and 140. Here, the display unit 130 may display content that is partially transparent according to a touch event. Namely, the brightness of the displayed content may vary from region to region.
  • When a touch event (hereinafter referred to as a front touch event) is inputted on the front side of the display unit 130, content displayed on the display unit 130 may be modified through the execution of a particular function (e.g., a content move, a content enlargement, a content reduction, a content scroll, a content change, an application correlated with content, etc.) depending on the front touch event.
  • When a touch event (hereinafter referred to as a back touch event) is inputted on the back side of the display unit 130, content displayed on the display unit 130 may be modified through the execution of a particular function (e.g., erasing an object in a specific region from which the back touch event is detected, etc.) depending on the back touch event. Here, the region erased from the content by the back touch event, namely the region where the back touch event occurs, may become transparent or have an increased brightness.
  • When a touch event (hereinafter referred to as a double-sided touch event) is inputted through correlated regions on both sides of the display unit 130, content displayed on the display unit 130 may be modified through the execution of a particular function (e.g., a partial focusing of content at a specific region from which the double-sided touch event is detected, a partial cutting of content at that region, etc.) depending on the double-sided touch event.
  • In the case of focusing, all regions except the focused region may become transparent or have an increased brightness; alternatively, only the focused region may become transparent or have an increased brightness.
  • In the case of cutting, the cut region may become transparent or have an increased brightness.
  • Each of the first and second touch sensors 120 and 140 may detect a touch event inputted by a certain input tool such as a user's finger on the surface of the touch screen 110. Then the first and second touch sensors 120 and 140 may find the coordinates of a detection region of the touch event and send them to the control unit 170. As discussed above, the first and second touch sensors 120 and 140 may be disposed on the front and back sides of the display unit 130, respectively. Therefore, the first and second touch sensors 120 and 140 may be referred to as a front touch sensor and a back touch sensor, respectively.
  • When an input gesture occurs on the front side of the display unit 130, the front touch sensor 120 may detect the front touch event corresponding to the input gesture.
  • When an input gesture occurs on the back side of the display unit 130, the back touch sensor 140 may detect the back touch event corresponding to the input gesture.
  • When an input gesture occurs on both sides of the display unit 130 at the same time, the front and back touch sensors 120 and 140 may simultaneously detect the front and back touch events, i.e., the double-sided touch event, corresponding to the input gesture.
  • a touch event may refer to a particular input gesture such as a touch, a touch move (e.g., a drag), or a touch and release (e.g., a tap), which may be detected through at least one of the front and back touch sensors 120 and 140.
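As an illustration of how such gestures might be distinguished in software, the following Kotlin sketch reduces the samples reported by one touch sensor to the three event types named above. The type names and thresholds are assumptions made only for this example, not part of the disclosure.

```kotlin
// Sketch only: classify one sensor's samples as a touch, a touch move (drag),
// or a touch and release (tap). Thresholds are assumed values.
data class TouchSample(val x: Float, val y: Float, val downMillis: Long, val upMillis: Long?)

enum class Gesture { TOUCH, TOUCH_MOVE, TOUCH_AND_RELEASE }

fun classify(samples: List<TouchSample>, moveThresholdPx: Float = 10f, tapMillis: Long = 200L): Gesture {
    val first = samples.first()
    val last = samples.last()
    val dx = last.x - first.x
    val dy = last.y - first.y
    val moved = dx * dx + dy * dy > moveThresholdPx * moveThresholdPx
    val releasedAt = last.upMillis
    return when {
        moved -> Gesture.TOUCH_MOVE                                            // drag
        releasedAt != null && releasedAt - first.downMillis <= tapMillis ->
            Gesture.TOUCH_AND_RELEASE                                          // tap
        else -> Gesture.TOUCH                                                  // plain or sustained touch
    }
}
```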
  • the memory unit 150 stores a variety of programs and data executed and processed in the mobile device and may be formed of at least one of a non-volatile memory, such as a Read Only Memory (ROM) or a flash memory, and a volatile memory such as a Random Access Memory (RAM).
  • the memory unit 150 may permanently or temporarily store an operating system of the mobile device, programs and data in connection with display control operations of the display unit 130, programs and data in connection with input control operations of the first and second touch sensors 120 and 140, programs and data in connection with a transparentizing process of the display unit 130, and the like.
  • the memory unit 150 may store execution information 163 including a relation between particular functions and touch event types (i.e., a front touch event, a back touch event, and a double-sided touch event), and setting values for the brightness of content to be modified in response to a touch event. Also, the memory unit 150 may store various contents 165 to be displayed on the display unit 130. Here, setting values of content brightness may vary from 0% to 100%.
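A minimal sketch of how the execution information 163 could be organized is given below; the type names, the particular event-to-function mapping, and the default brightness values are illustrative assumptions rather than the stored format described here.

```kotlin
// Sketch only: execution information relating touch event types to functions,
// plus brightness setting values in the 0..100 % range. Names and defaults
// here are assumptions for illustration.
enum class TouchSource { FRONT, BACK, DOUBLE_SIDED }
enum class ContentFunction { MOVE_OR_SCROLL, ERASE_REGION, FOCUS_OR_CUT_REGION }

data class ExecutionInfo(
    val functionFor: Map<TouchSource, ContentFunction>,
    val semitransparentBrightness: Int,   // e.g. 50 % (semitransparent)
    val transparentBrightness: Int        // e.g. 100 % (fully transparent)
) {
    init {
        require(semitransparentBrightness in 0..100 && transparentBrightness in 0..100)
    }
}

val defaultExecutionInfo = ExecutionInfo(
    functionFor = mapOf(
        TouchSource.FRONT to ContentFunction.MOVE_OR_SCROLL,
        TouchSource.BACK to ContentFunction.ERASE_REGION,
        TouchSource.DOUBLE_SIDED to ContentFunction.FOCUS_OR_CUT_REGION
    ),
    semitransparentBrightness = 50,
    transparentBrightness = 100
)
```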
  • the control unit 170 controls general states and operations of the mobile device. Particularly, the control unit 170 may control the operation of content in response to a touch event detected through the touch screen 110. For instance, the control unit 170 may modify displayed content in response to at least one touch event delivered from the first and second touch sensors 120 and 140. Here, the control unit 170 may check a region from which a touch event is detected, and then determine whether the detected region is on the front side, on the back side, or on both sides of the display unit 130. Also, for each touch event, the control unit 170 may change the transparency of a partial or whole region of content according to a predefined brightness.
  • the control unit 170 may include a touch region check unit 180 that identifies a specific region of a touch event delivered from the front and back touch sensors 120 and 140, and a brightness modification unit 190 that modifies the brightness of content according to a touch event.
  • the touch region check unit 180 may identify a specific region on which a touch event is inputted, by finding coordinates of a touched region from a touch signal received from the touch screen 110. Namely, the touch region check unit 180 may receive a touch signal from at least one of the front and back touch sensors 120 and 140 and then identify a touch region by using its coordinates. More specifically, through coordinates received from the touch screen 110, the touch region check unit 180 may determine whether a touch event occurs on the front side, on the back side, or on both sides of the display unit 130. Additionally, by using the coordinates received from the touch screen 110, the touch region check unit 180 may determine whether a touch input is moved across the surface of the touch screen 110. Also, by using the coordinates received from the touch screen 110, the touch region check unit 180 may determine the size of a touch region.
  • the brightness modification unit 190 may increase or decrease the brightness of displayed content according to predefined levels of a touch event. Namely, the brightness modification unit 190 may receive information about a touch event from the touch region check unit 180 and then perform a transparentizing process to change the brightness of displayed content depending on the level of the received touch event. For instance, if the brightness of certain content is set to 0%, the brightness modification unit 190 may change the brightness of a selected partial region or the whole region of content from 1% to 100%.
  • The control unit 170 may identify a touch event through the touch region check unit 180 and then, based thereon, change the brightness of content displayed on the display unit 130 to a predefined brightness corresponding to the identified touch event through the brightness modification unit 190. Control operations of the control unit 170 will be described later.
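The cooperation of the two sub-units might look roughly like the following sketch. The Region type, the sensor-report parameters, and the per-region brightness store are hypothetical; the disclosure does not prescribe these interfaces.

```kotlin
// Sketch only: a touch region check unit that decides which side(s) produced an
// event, and a brightness modification unit that performs the transparentizing
// process on a region (0 % = opaque, 100 % = fully transparent).
data class Region(val left: Int, val top: Int, val right: Int, val bottom: Int)
enum class Side { FRONT, BACK, DOUBLE_SIDED }

class TouchRegionCheckUnit {
    fun identifySide(frontRegion: Region?, backRegion: Region?): Side? = when {
        frontRegion != null && backRegion != null -> Side.DOUBLE_SIDED
        frontRegion != null -> Side.FRONT
        backRegion != null -> Side.BACK
        else -> null  // no touch reported by either sensor
    }
}

class BrightnessModificationUnit {
    private val brightnessOfRegion = mutableMapOf<Region, Int>()

    fun setBrightness(region: Region, percent: Int) {
        brightnessOfRegion[region] = percent.coerceIn(0, 100)
    }

    fun brightnessOf(region: Region): Int = brightnessOfRegion[region] ?: 0  // default 0 % (opaque)
}
```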
  • The control unit 170 can also control various operations in connection with typical functions of the mobile device. For instance, the control unit 170 may control the operation and data display of a running application. Also, the control unit 170 may receive an input signal through various touch-based input interfaces and then control the execution of a corresponding function. In addition, the control unit 170 may control transmission and reception of data based on a wired or wireless communication.
  • the mobile device shown in FIG. 1 may include communication devices, multimedia players and their application equipment, each of which has a touch screen based on a transparent display panel.
  • the mobile device may include many types of mobile communication terminals based on various communication protocols, a tablet PC, a smart phone, a Portable Multimedia Player (PMP), a digital broadcasting player, a Personal Digital Assistant (PDA), a portable game console, and the like.
  • a content operation method using the transparent display panel according to this invention may be applied to a monitor, a notebook, a TV, a Large Format Display (LFD), a Digital Signage (DS), a media pole, etc.
  • FIG. 2 is a schematic view illustrating a transparent display panel of a mobile device in accordance with an exemplary embodiment of the present invention.
  • the mobile device has the touch screen 110 which is formed of a transparent display panel. Therefore, the touch screen 110 allows a background 200 behind the mobile device to be shown through the transparent display panel. Namely, a brightness of more than 0% is assigned to the display unit 130, so the background 200 can be seen according to a degree of transparency that depends on the brightness. For instance, as the brightness approaches 0%, the color displayed on the display unit 130 gets near to white; namely, transparency becomes lower. On the contrary, as the brightness approaches 100%, the color displayed on the display unit 130 gets near to black; namely, transparency becomes higher. Contents displayed on the display unit 130 may basically be set to a brightness of 0%, i.e., the lowest transparency.
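The brightness convention used throughout this description (0% corresponds to a near-white, opaque state and 100% to a near-black, fully transparent state) can be summarized in a small sketch; the linear mapping to an 8-bit grey level is an assumption made only to illustrate the relationship.

```kotlin
// Sketch only: the brightness setting of this description mapped to a grey
// level and an effective transparency. 0 % -> near white, lowest transparency;
// 100 % -> near black, highest transparency. The linear mapping is assumed.
fun greyLevelFor(brightnessPercent: Int): Int {
    val b = brightnessPercent.coerceIn(0, 100)
    return 255 - (255 * b) / 100          // 0 % -> 255 (white), 100 % -> 0 (black)
}

fun transparencyPercentFor(brightnessPercent: Int): Int = brightnessPercent.coerceIn(0, 100)
```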
  • the background 200 is expressed as diagonal lines. In practice, however, the background 200 would be the actual scene of real objects seen through the transparent display panel.
  • the touch screen 110 includes two touch sensors 120 and 140 and the display unit 130.
  • the touch sensors 120 and 140 may be attached to both sides 210 and 230 of the display unit 130, respectively.
  • a user-facing side 210 of the display unit 130 is referred to as the front side
  • the opposite side 230 is referred to as the back side.
  • the touch sensor disposed on the front side 210 is referred to as a front touch sensor 120
  • the touch sensor disposed on the back side 230 is referred to as a back touch sensor 140.
  • the mobile device may control the operation of the content 165 displayed on the display unit 130, depending on a touch event detected through at least one of the front and back touch sensors 120 and 140. Related examples will be described later.
  • FIGs. 3 to 5 are schematic views illustrating three types of touch events on a transparent display panel of a mobile device in accordance with exemplary embodiments of the present invention.
  • a user may input a front touch event by touching (or dragging, tapping, etc.) a selected region with the finger on the front side 210 of the display unit 130. Then the front touch sensor 120 finds the coordinates of the region from which the front touch event is detected, and sends them to the control unit 170. Thus the control unit 170 controls the execution of a particular function according to the front touch event. Functions executed by the front touch event may be to move, enlarge, reduce, or scroll the content selected by the front touch event or to draw or input a certain object (e.g., a line, a figure, text, etc.) depending on the front touch event.
  • a user may input a back touch event by touching (or dragging, tapping, etc.) a selected region with the finger on the back side 230 of the display unit 130. Then the back touch sensor 140 finds the coordinates of the region from which the back touch event is detected, and sends them to the control unit 170. Thus the control unit 170 controls the execution of a particular function according to the back touch event.
  • Functions executed by the back touch event may be to erase or remove the content or object selected by the back touch event. Erasing the content may include a process of transparentizing a specific region, on which the back touch event is inputted, in the content according to a predefined brightness.
  • removing the object may include partially or completely removing the object (e.g., a line, a figure, text, etc.) that is drawn or inputted by the front touch event.
  • a user may input a double-sided touch event by simultaneously touching (or dragging, tapping, etc.) selected regions with the fingers on the front and back sides 210 and 230 of the display unit 130. Then each of the front and back touch sensors 120 and 140 finds the coordinates of each region from which the double-sided touch event is detected, and sends them to the control unit 170. Thus the control unit 170 controls the execution of a particular function according to the double-sided touch event. Functions executed by the double-sided touch event may be to focus or cut parts of the content selected by the double-sided touch event.
  • Focusing the content may include a process of transparentizing, depending on a predefined brightness, all regions in the content except a specific region on which the double-sided touch event is inputted. Also, cutting the content may include a process of transparentizing a cut region in the content, depending on a predefined brightness.
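As a rough sketch of these two functions, the content can be modelled as a coarse grid of brightness values; the grid abstraction and the method names are assumptions for this example only.

```kotlin
// Sketch only: focusing and cutting modelled on a coarse grid of per-cell
// brightness values (0 % everywhere means fully opaque content).
class ContentGrid(private val rows: Int, private val cols: Int) {
    private val brightness = Array(rows) { IntArray(cols) }

    // Focus: transparentize every cell except the selected one.
    fun focus(selRow: Int, selCol: Int, percent: Int = 50) {
        for (r in 0 until rows) for (c in 0 until cols) {
            if (r != selRow || c != selCol) brightness[r][c] = percent.coerceIn(0, 100)
        }
    }

    // Cut: transparentize the selected cell itself (its content is moved elsewhere).
    fun cut(selRow: Int, selCol: Int, percent: Int = 100) {
        brightness[selRow][selCol] = percent.coerceIn(0, 100)
    }

    fun brightnessAt(r: Int, c: Int): Int = brightness[r][c]
}
```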
  • the transparency of the display unit 130 may be varied according to the brightness of displayed colors. In other words, the transparency becomes lower as colors with a higher brightness are displayed on the display unit 130. On the contrary, the transparency becomes higher as colors with a lower brightness are displayed on the display unit 130. Here, a color with a higher brightness approaches white, and a color with a lower brightness approaches black. Therefore, the display unit 130 with a higher transparency allows the background behind the mobile device to be seen, and vice versa.
  • all contents displayed on the display unit may be transparent to allow the background to be seen or may be opaque to disallow the background to be seen. Alternatively, some contents may be transparent and the others may be opaque.
  • FIG. 6 is a view illustrating a way to operate content displayed on a display unit in response to a touch event in a mobile device in accordance with an exemplary embodiment of the present invention. More particularly, FIG. 6 shows an example of operating displayed content in response to the front touch event.
  • the display unit 130 represents opaque contents (0% brightness) such as image or photo contents.
  • a user may input a front touch event on the front side 210 of the display unit 130. For instance, as shown, a user may touch a certain region with the finger on the front side 210 of the display unit 130 and then move the finger leftward.
  • the control unit 170 modifies the displayed content in response to the front touch event. For instance, as shown in a next stage 620, the display unit 130 may represent another content changed by the front touch event.
  • FIG. 7 is a view illustrating a way to operate content displayed on a display unit in response to a touch event in a mobile device in accordance with an exemplary embodiment of the present invention. More particularly, FIG. 7 shows an example of transparentizing a part of displayed content in response to the back touch event.
  • the display unit 130 represents opaque contents (0% brightness) such as calendar content.
  • a background 700 is simply expressed as diagonal lines. In practice, however, the background 700 would be the actual scene of real objects seen through the transparent display panel.
  • a user may input the first back touch event on the back side 230 of the display unit 130. For instance, as shown in the first stage 710 and a second stage 720, a user may touch a certain region with the finger on the back side 230 of the display unit 130 and wait for a while.
  • the control unit 170 modifies the displayed content in response to the first back touch event.
  • the display unit 130 may represent the content modified through a transparentizing process with a given brightness (e.g., 50% brightness, i.e., semitransparent) by the first back touch event.
  • the semitransparent content in this stage 720 allows a user to conveniently and accurately input the second back touch event. Therefore, this stage 720 may be omitted.
  • To make the content semitransparent, the first back touch event requires an input that continues for a given time.
  • After the back touch event is released, a given waiting time may be allowed for an input of the next back touch event, as shown in a third stage 730. If there is no back touch event for a given time, the semitransparent content may return to the original state.
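The two-step interaction described for FIG. 7 could be modelled as a small state machine, sketched below; the state names and the hold and waiting durations are assumptions, since the description only speaks of a "given time".

```kotlin
// Sketch only: a sustained back touch makes the content semitransparent, a
// following scratch transparentizes the scratched part, and expiry of the
// waiting time restores the original state. Durations are assumed values.
enum class BackTouchState { ORIGINAL, SEMITRANSPARENT, SCRATCHING }

class BackTouchInteraction(
    private val holdMillis: Long = 500L,
    private val waitMillis: Long = 2000L
) {
    var state: BackTouchState = BackTouchState.ORIGINAL
        private set
    private var lastEventAt: Long = 0L

    fun onBackTouchHeld(heldForMillis: Long, nowMillis: Long) {
        if (state == BackTouchState.ORIGINAL && heldForMillis >= holdMillis) {
            state = BackTouchState.SEMITRANSPARENT       // e.g. show content at 50 % brightness
            lastEventAt = nowMillis
        }
    }

    fun onBackTouchMove(nowMillis: Long) {
        if (state != BackTouchState.ORIGINAL) {
            state = BackTouchState.SCRATCHING            // scratched part -> 100 % brightness
            lastEventAt = nowMillis
        }
    }

    fun onTick(nowMillis: Long) {
        if (state == BackTouchState.SEMITRANSPARENT && nowMillis - lastEventAt > waitMillis) {
            state = BackTouchState.ORIGINAL              // no follow-up input: revert
        }
    }
}
```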
  • a user may input the second back touch event on the back side 230 of the display unit 130. For instance, as shown in the third stage 730, a user may scratch a certain region in the displayed content.
  • the control unit 170 modifies the displayed content in response to the second back touch event.
  • the display unit 130 may represent a scratched part 705 of the content modified through a transparentizing process with a given brightness (e.g., 100% brightness) by the second back touch event.
  • a user may release the second touch event inputted on the back side 230 of the display unit 130. Then the release of the second back touch event is detected through the back touch sensor 140, and the control unit 170 modifies the displayed content in response to the release of the second back touch event. For instance, as shown in a fourth stage 740, the display unit 130 may represent the scratched part 705 only as being transparent by the second back touch event, while keeping the original brightness (e.g., 0% brightness) of the other parts in the displayed content.
  • this stage 740 may also be omitted. Namely, in the first stage 710, the scratched part 705 may be directly transparentized in response to the second back touch event.
  • FIG. 8 is a view illustrating a way to operate content displayed on a display unit in response to a touch event in a mobile device in accordance with an exemplary embodiment of the present invention. More particularly, FIG. 8 shows an example of cutting and transparentizing a part of displayed content in response to the double-sided touch event.
  • the display unit 130 represents opaque contents (0% brightness) such as photo content.
  • a background 800 is simply expressed as diagonal lines. In practice, however, the background 800 would be the actual scene of real objects seen through the transparent display panel.
  • a user may input the first double-sided touch event on both sides 210 and 230 of the display unit 130. For instance, as shown in a second stage 820, a user may touch coinciding regions with the fingers on both sides 210 and 230 of the display unit 130 and wait for a while.
  • the control unit 170 modifies the displayed content in response to the first double-sided touch event. For instance, as shown in a third stage 830, the display unit 130 may transparentize all regions with a given brightness (e.g., 50% brightness, i.e., semitransparent) except a specific region selected by the first double-sided touch event.
  • This stage 830 serves only to improve visibility for the user and therefore may be omitted.
  • a user may input the second double-sided touch event on both sides of the display unit 130. For instance, as shown in the fourth stage 840, a user may move a specific region selected by the first double-sided touch event to another position.
  • the control unit 170 modifies the displayed content in response to the second double-sided touch event. For instance, as shown in a fourth stage 840, the region selected by the first double-sided touch event is cut and moved in response to the second double-sided touch event. At this time, a semitransparent state of regions other than the selected region may be maintained. Also, a cut part 805, i.e., the original position of the selected region, may be transparentized with a given brightness (e.g., 100% brightness).
  • a user may release the second touch event inputted on both sides 210 and 230 of the display unit 130. Then the release of the second double-sided touch event is detected through the front and back touch sensors 120 and 140, and the control unit 170 modifies the displayed content in response to the release of the second double-sided touch event. For instance, as shown in a fifth stage 850, the display unit 130 may represent the cut and moved part 805 at a new position. At this time, the display unit 130 may keep the transparency of the cut and moved part 805 and also returns the other parts of semitransparent content to the original brightness (e.g., 0% brightness).
  • this stage 850 may also be omitted. Namely, in the second stage 820, the cut part 805 may be directly transparentized in response to the first and second double-sided touch events.
  • FIG. 9 is a view illustrating a way to operate content displayed on a display unit in response to a touch event in a mobile device in accordance with an exemplary embodiment of the present invention. More particularly, FIG. 9 shows an example of focusing and transparentizing a part of displayed content in response to the double-sided touch event.
  • the display unit 130 represents opaque contents (0% brightness) such as message list content.
  • a background 900 is simply expressed as diagonal lines. In practice, however, the background 900 would be the actual scene of real objects seen through the transparent display panel.
  • a user may input a double-sided touch event on both sides 210 and 230 of the display unit 130. For instance, as shown in a second stage 920, a user may touch coinciding regions with the fingers on both sides 210 and 230 of the display unit 130 and wait for a while.
  • the display unit 130 may transparentize all regions with a given brightness (e.g., a certain brightness between 1% and 100%) except a specific region (e.g., a specific item in the message list) selected by the double-sided touch event.
  • the display unit 130 may transparentize a specific region (e.g., a specific item in the message list) selected by the double-sided touch event with a given brightness (e.g., a certain brightness between 1% and 100%).
  • a part (referred to as an object) of the displayed content selected by the double-sided touch event may be transparentized with a predefined brightness, or the other region except the selected object may be transparentized with a predefined brightness.
  • This option of a transparentizing process may rely on a user's setting.
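That setting could be captured by something as simple as the following sketch; the enum and function names are hypothetical and serve only to make the option concrete.

```kotlin
// Sketch only: a user setting choosing whether the selected object or
// everything except it is transparentized.
enum class TransparentizeTarget { SELECTED_OBJECT, EVERYTHING_ELSE }

fun <T> regionsToTransparentize(all: Set<T>, selected: Set<T>, target: TransparentizeTarget): Set<T> =
    when (target) {
        TransparentizeTarget.SELECTED_OBJECT -> selected
        TransparentizeTarget.EVERYTHING_ELSE -> all - selected
    }
```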
  • a further double-sided touch event (e.g., an event of moving or copying the selected object to other position in the list) may be inputted after the third or fourth stage 930 or 940 as discussed in FIG. 8.
  • FIG. 10 is a view illustrating a way to operate content displayed on a display unit in response to a touch event in a mobile device in accordance with an exemplary embodiment of the present invention. More particularly, FIG. 10 shows an example in which content is created in response to the front touch event and then operated in response to the back touch event.
  • the display unit 130 has a predefined brightness (e.g., certain brightness between 1% and 100%) and hence allows a background 1000 to be seen through the transparent display panel.
  • any opaque content may be displayed on the whole or parts of the display unit 130 as discussed above.
  • the background 1000 is simply expressed as diagonal lines. In practice, however, the background 1000 would be the actual scene of real objects seen through the transparent display panel.
  • a user may input a front touch event on the front side 210 of the display unit 130.
  • a user may touch a specific region with the finger on the front side 210 of the display unit 130 and then, as shown in a third stage 1030, move the touch across the front side 210. Namely, a user may perform a drawing through the front touch event on the front side 210 of the display unit 130.
  • specific content may be created depending on the front touch event.
  • zigzag-shaped content 1005 may be drawn on the display unit 130 by using the front touch event. This content 1005 may be different from the former content that has already been displayed on the display unit 130 with a predefined brightness.
  • a user may input a back touch event on the back side 230 of the display unit 130.
  • a user may touch a specific region with the finger on the back side 230 of the display unit 130 and then, as shown in a sixth stage 1060, move the touch across the back side 230. Namely, a user may perform an erasing through the back touch event on the back side 230 of the display unit 130.
  • specific content 1005 may be created in response to the front touch event and then partially or completely erased in response to the back touch event. For instance, as shown in a seventh stage 1070, a part of the displayed content 1005 may be erased from the display unit 130 through the back touch event. Namely, a selected part of the content 1005 may be transparentized by the back touch event.
  • a text input process may be supported according to the content operation method as shown in FIG. 10.
  • FIG. 11 is a view illustrating a way to operate content displayed on a display unit in response to a touch event in a mobile device in accordance with an exemplary embodiment of the present invention. More particularly, FIG. 11 shows an example of selecting and transparentizing several regions in displayed content in response to the front touch event, the back touch event, or the double-sided touch event.
  • the display unit 130 represents opaque contents (0% brightness) such as photo content.
  • a background 1100 is simply expressed as diagonal lines. In practice, however, the background 1100 would be the actual scene of real objects seen through the transparent display panel.
  • a user may input one type of touch event (e.g., the front touch event, the back touch event, or the double-sided touch event) on at least one of the front and back sides 210 and 230 of the display unit 130.
  • a user may simultaneously touch two regions with the fingers on both sides 210 and 230 of the display unit 130.
  • a user may repeatedly input the double-sided touch event. Namely, a user may sequentially input several touch events on several regions.
  • the double-sided touch event is detected through the front and back touch sensors 120 and 140, and the control unit 170 modifies the displayed content in response to the double-sided touch event.
  • the display unit 130 may transparentize all regions with a given brightness (e.g., certain brightness between 1% and 100%) except several regions 1105 selected by several touch events.
  • the display unit 130 may transparentize several regions 1105 selected by several touch events with a given brightness (e.g., certain brightness between 1% and 100%).
  • two or more regions may be selected by the front touch event, the back touch event or the double-sided touch event.
  • the selected regions may be transparentized with a predefined brightness, or the other region except the selected regions may be transparentized with a predefined brightness.
  • This option of a transparentizing process may rely on a user's setting.
  • a transparentizing process for several regions may be performed sequentially whenever each region is selected or performed at once after all regions are selected.
  • a further touch event (e.g., an event of moving or copying the selected several objects to other positions) may be inputted after the third or fourth stage 1130 or 1140 as discussed earlier.
  • FIG. 12 is a flowchart illustrating a method for operating displayed content in response to a touch event on a dual touch screen based on a transparent display panel in a mobile device in accordance with an exemplary embodiment of the present invention.
  • the control unit 170 displays a user interface by a user's selection in step 1201.
  • the user interface may have a predefined brightness (e.g., 100%) and thus allow the background to be seen through the display unit 130 based on the transparent display panel.
  • the user interface may represent various contents such as photos, icons, menus, lists, etc., or screen data of a running application such as a web browser.
  • the user interface in exemplary embodiments of this invention may include all types of screens regardless of being transparent or not.
  • In the following description, it is assumed that the user interface corresponds to the case in which at least one content is displayed on the display unit 130.
  • The control unit 170 detects a touch event in step 1203. For instance, the control unit 170 may determine whether a given touch event is detected through the touch screen 110. In this step, the control unit 170 may detect one of the front touch event, the back touch event and the double-sided touch event through at least one of the front and back touch sensors 120 and 140.
  • The control unit 170 identifies the detected touch event in step 1205. For instance, the control unit 170 may find a specific region from which the touch event is detected, by using the touch region check unit 180. Namely, the control unit 170 may determine whether the detected region of the touch event is on the front side, on the back side or on both sides of the display unit 130.
  • If the detected region is on the front side, the control unit 170 recognizes the touch event as a front touch event in step 1211. Then the control unit 170 identifies the type of the front touch event in step 1213 and modifies the displayed content according to the type of the front touch event in step 1241. For instance, if the type of the front touch event is to move the displayed content, the control unit 170 may transfer the content across the display unit 130 depending on the front touch event. Also, if the type of the front touch event is to draw or input text, the control unit 170 may create the content on the display unit 130 depending on the front touch event.
  • the control unit 170 may change the brightness of a specific region selected by the front touch event, of a non-selected region, or of the displayed content depending on the front touch event.
  • changes in brightness may be performed by the brightness modification unit 190 according to predefined levels of a touch event.
  • If the detected region is on the back side, the control unit 170 recognizes the touch event as a back touch event in step 1221. Then the control unit 170 identifies the type of the back touch event in step 1223 and modifies the displayed content according to the type of the back touch event in step 1241. For instance, if the type of the back touch event is to touch at least one region of the displayed content for a while, the control unit 170 may modify the brightness of the content. Also, if the type of the back touch event is to select at least one region of the displayed content, the control unit 170 may modify the brightness of the selected or non-selected region in the content.
  • The control unit 170 may transparentize a specific region selected by the back touch event.
  • changes in brightness may be performed by the brightness modification unit 190 according to predefined levels of a touch event.
  • If the detected region is on both sides, the control unit 170 recognizes the touch event as a double-sided touch event in step 1231. Then the control unit 170 identifies the type of the double-sided touch event in step 1233 and modifies the displayed content according to the type of the double-sided touch event in step 1241. For instance, if the type of the double-sided touch event is to touch at least one region of the displayed content for a while, the control unit 170 may modify the brightness of the content. Also, if the type of the double-sided touch event is to select at least one region of the displayed content, the control unit 170 may modify the brightness of the selected or non-selected region in the content.
  • The control unit 170 may transparentize a cut region selected by the double-sided touch event.
  • changes in brightness may be performed by the brightness modification unit 190 according to predefined levels of a touch event.
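Taken together, the decision flow of FIG. 12 (detect the event, identify which side it came from, identify its type, then modify the content in step 1241) might be dispatched roughly as in the sketch below; the event model and the handler callbacks are assumptions made only for illustration.

```kotlin
// Sketch only: dispatching a detected touch event according to the flow of
// FIG. 12. The DetectedEvent model and the callbacks are assumed for this example.
enum class EventSide { FRONT, BACK, DOUBLE_SIDED }
data class DetectedEvent(val side: EventSide, val isMove: Boolean, val isHold: Boolean)

fun handleTouchEvent(
    event: DetectedEvent,
    modifyContent: (String) -> Unit,      // e.g. move, scroll, draw
    modifyBrightness: (String) -> Unit    // e.g. the transparentizing process
) {
    when (event.side) {
        EventSide.FRONT -> if (event.isMove) {                 // steps 1211 and 1213
            modifyContent("move, scroll or draw depending on the gesture")
        } else {
            modifyContent("select or change the displayed content")
        }
        EventSide.BACK -> if (event.isHold) {                  // steps 1221 and 1223
            modifyBrightness("make the content semitransparent")
        } else {
            modifyBrightness("transparentize the erased (selected) region")
        }
        EventSide.DOUBLE_SIDED -> if (event.isHold) {          // steps 1231 and 1233
            modifyBrightness("focus: transparentize everything except the selection")
        } else {
            modifyBrightness("transparentize the cut region")
        }
    }                                                          // step 1241: content is modified
}
```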
  • the above-described methods according to the present invention can be implemented in hardware, or as software or computer code that can be stored in a recording medium such as a CD-ROM, a RAM, a floppy disk, a hard disk, or a magneto-optical disk, or downloaded over a network, so that the methods described herein can be rendered in such software using a general-purpose computer, a special processor, or programmable or dedicated hardware such as an ASIC or an FPGA.
  • the computer, the processor or the programmable hardware includes memory components, e.g., RAM, ROM, Flash, etc., that may store or receive software or computer code that, when accessed and executed by the computer, processor or hardware, implements the processing methods described herein.
  • the execution of the code transforms the general purpose computer into a special purpose computer for executing the processing shown herein.

Abstract

A mobile device having a dual touch screen based on a transparent display panel and a method for operating displayed content are provided. The method operates displayed content in response to a touch event made on at least one of the front and back faces of the dual touch screen. The mobile device displays the content on the transparent display panel with a predefined brightness. When receiving a touch event through the transparent display panel, the mobile device modifies the brightness of a part or the whole of the displayed content in response to the touch event.

Description

MOBILE DEVICE AND METHOD FOR OPERATING CONTENT DISPLAYED ON TRANSPARENT DISPLAY PANEL
The present invention relates to a mobile device having a touch screen. More particularly, the present invention relates to a mobile device and method for operating displayed content in response to a touch event using at least one of front and back faces of the mobile device having a transparent display panel.
As well known in the art, a mobile device refers to a type of electronic device designed for mobility and portability. With the remarkable growth of related technologies, a great variety of mobile devices capable of supporting various end-user functions have become increasingly popular.
Nowadays such mobile devices may employ many input techniques. In particular, some of conventional mobile devices offer a touch-based input interface such as a touch screen that normally includes a touch panel and a display unit. In this case, the mobile device generates a touch event in response to a user's touch input made on the touch panel and then performs a predefined function based on the touch event.
One advanced type of touch screen is a dual touch screen in which two touch sensors are disposed on both sides of a display unit formed of a transparent display panel. This dual touch screen can detect a touch event from both sides through the front and back touch sensors. The display unit of the dual touch screen is transparent and hence its transparency varies according to the brightness of displayed colors. Namely, the transparency of the display unit approaches 0% as the colors displayed on the display unit get near to white. In addition, the transparency of the display unit approaches 100% as the colors displayed on the display unit get near to black.
However, a conventional mobile device having the transparent display panel remains at a simple stage that merely allows the background behind the mobile device to be seen through the transparent display panel. More improved and effective user interfaces and their operation methods may be required for the mobile device having the transparent display panel.
An aspect of the present invention is to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present invention is to provide a mobile device having a touch screen and also to provide a method for operating displayed content using the touch screen.
Another aspect of the present invention is to provide a mobile device having a dual touch screen based on a transparent display panel and also to provide a method for operating displayed content in response to a touch event on the dual touch screen.
Still another aspect of the present invention is to provide an apparatus and method for operating displayed content in response to a touch on the front side, on the back side, or on both sides of a dual touch screen based on a transparent display panel in a mobile device.
Yet another aspect of the present invention is to provide an apparatus and method for modifying the brightness of parts or all of displayed content in response to a touch event on a dual touch screen based on a transparent display panel in a mobile device.
According to an aspect of the present invention, a method for operating displayed content in a mobile device having a transparent display panel is provided. The method includes displaying the content on the transparent display panel with a predefined brightness, receiving a touch event through the transparent display panel, and modifying the brightness of a part or the whole of the displayed content in response to the touch event.
According to another aspect of the present invention, a mobile device is provided. The device includes a touch screen for receiving a touch event and for modifying the brightness of content in response to the touch event, and a control unit for determining the type of the touch event received from the touch screen and for performing, in response to the touch event, a transparentizing process by modifying the brightness of a part or the whole of the content displayed on the touch screen.
Other aspects, advantages, and salient features of the invention will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses exemplary embodiments of the invention.
In various aspects of this invention, the mobile device having the transparent display panel may realize optimal environments for varying the transparency of displayed content according to various touch events. Therefore, this mobile device can offer new and useful additional functions based on the transparent display panel.
According to another aspect of the present invention, a method for operating the displayed content is provided that allows a user to intuitively modify the brightness of a part or the whole of the displayed content through various touch events such as front, back or double-sided touch events. This technique may be favorably and widely applied to any type of device that employs the transparent display panel. The double-sided touch-based changes in the transparency of displayed content may promote the usability, accessibility and competitiveness of the mobile device.
The above and other aspects, features, and advantages of certain exemplary embodiments of the present invention will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a block diagram illustrating a configuration of a mobile device having a dual touch screen based on a transparent display panel in accordance with an exemplary embodiment of the present invention.
FIG. 2 is a schematic view illustrating a transparent display panel of a mobile device in accordance with an exemplary embodiment of the present invention.
FIGs. 3 to 5 are schematic views illustrating three types of touch events on a transparent display panel of a mobile device in accordance with exemplary embodiments of the present invention.
FIGs. 6 to 11 are views illustrating various ways to operate content displayed on a display unit in response to a touch event in a mobile device in accordance with exemplary embodiments of the present invention.
FIG. 12 is a flowchart illustrating a method for operating displayed content in response to a touch event on a dual touch screen based on a transparent display panel in a mobile device in accordance with an exemplary embodiment of the present invention.
Throughout the drawings, it should be noted that like reference numbers are used to depict the same or similar elements, features, and structures.
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of exemplary embodiments of the invention as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
The terms and words used in the following description and claims are not limited to their bibliographical meanings, but are merely used by the inventor to enable a clear and consistent understanding of the invention. Accordingly, it should be apparent to those skilled in the art that the following description of exemplary embodiments of the present invention is provided for illustration purposes only and not for the purpose of limiting the invention as defined by the appended claims and their equivalents.
It is to be understood that the singular forms "a," "an," and "the" include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to "a component surface" includes reference to one or more of such surfaces.
Exemplary embodiments of the present invention relate to a mobile device that has a touch screen and executes a method for operating displayed content using the touch screen. More particularly, the mobile device includes a dual touch screen based on a transparent display panel. Additionally, this invention allows the mobile device to perform a particular function in response to a touch event through a touch on the front side, on the back side, or on both sides of the dual touch screen.
Among terms set forth herein, a front touch refers to a touch gesture on the front side of a touch screen, including a single touch and a multi touch. A back touch refers to a touch gesture on the back side of a touch screen, including a single touch and a multi touch. A double-sided touch refers to a multi touch gesture on both sides of a touch screen.
Now, a mobile device according to exemplary embodiments of this invention and its operation control method will be described. The following exemplary embodiments are illustrative only and are not to be considered as a limitation of this invention. As will be understood by those skilled in the art, other alternative exemplary embodiments may also be favorably used.
FIG. 1 is a block diagram illustrating a configuration of a mobile device having a dual touch screen based on a transparent display panel in accordance with an exemplary embodiment of the present invention.
Referring to FIG. 1, the mobile device includes a touch screen 110, a memory unit 150 and a control unit 170. Additionally, the mobile device may include an audio processing unit having a microphone and a speaker, a digital broadcast module for receiving and playing digital broadcasting such as Digital Multimedia Broadcasting (DMB) or Digital Video Broadcasting (DVB), a camera module for taking a photo or recording a video, a Bluetooth communication module for performing a Bluetooth communication function, a radio frequency communication module for performing a communication function based on a mobile communication service, an Internet communication module for performing an Internet communication function, a touch pad for a touch-based input, a key input unit for a mechanical key input, a battery for a power supply, and the like. Since these elements are well known in the art, related illustration and description will be omitted herein.
The touch screen 110 includes a first touch sensor 120, a display unit 130, and a second touch sensor 140. Herein, a user-facing side of the display unit 130 is referred to as the front side, and the opposite side is referred to as the back side. The first touch sensor 120 may be disposed on the front side, and the second touch sensor 140 may be disposed on the back side. The first and second touch sensors 120 and 140 may include well known touch-sensitive sensors of capacitive overlay type, resistive overlay type, piezoelectric type, and the like.
The display unit 130 represents an operation state of the mobile device and related data. Under the control of the control unit 170, the display unit 130 may display a variety of contents (e.g., photos, images, lists, text, icons, menus, etc.). Also, under the control of the control unit 170, the display unit 130 may display content modified in response to a touch event that is detected through at least one of the first and second touch sensors 120 and 140. Here, the display unit 130 may display content that is partially transparent according to a touch event; that is, the brightness of the displayed content may be changed.
For instance, suppose that one of a front touch, a back touch, and a double-sided touch is inputted while content with a specific image is displayed on the display unit 130.
When a touch event (hereinafter, referred to as a front touch event) is inputted on the front side of the display unit 130, content displayed on the display unit 130 may be modified through the execution of a particular function (e.g., a content move, a content enlargement, a content reduction, a content scroll, a content change, an application correlated with content, etc.) depending on the front touch event.
When a touch event (hereinafter, referred to as a back touch event) is inputted on the back side of the display unit 130, content displayed on the display unit 130 may be modified through the execution of a particular function (e.g., erasing an object from a specific region in which the back touch event is detected, etc.) depending on the back touch event. Here, the region erased from the content by the back touch event (namely, the region where the back touch event occurs) may become transparent or have an increased brightness.
When a touch event (hereinafter, referred to as a double-sided touch event) is inputted through correlated regions on both sides of the display unit 130, content displayed on the display unit 130 may be modified through the execution of a particular function (e.g., a partial focusing of content at a specific region from which the double-sided touch event is detected, a partial cutting of content at a specific region from which the double-sided touch event is detected, etc.) depending on the double-sided touch event. Here, all regions except the focused region may become transparent or have an increased brightness. Alternatively, only the focused region may become transparent or have an increased brightness. In addition, the cut region may become transparent or have an increased brightness.
Each of the first and second touch sensors 120 and 140 may detect a touch event inputted by a certain input tool such as a user's finger on the surface of the touch screen 110. Then the first and second touch sensors 120 and 140 may find the coordinates of a detection region of the touch event and send them to the control unit 170. As discussed above, the first and second touch sensors 120 and 140 may be disposed on the front and back sides of the display unit 130, respectively. Therefore, the first and second touch sensors 120 and 140 may be referred to as a front touch sensor and a back touch sensor, respectively. When a user takes an input gesture (e.g., a touch, a drag, a tap, etc.) with the finger on the front side of the display unit 130, the front touch sensor 120 may detect the front touch event corresponding to the input gesture. When a user takes an input gesture (e.g., a touch, a drag, a tap, etc.) with the finger on the back side of the display unit 130, the back touch sensor 140 may detect the back touch event corresponding to the input gesture. When a user takes a simultaneous input gesture (e.g., a multi touch, a multi drag, a multi tap, etc.) with two fingers on both sides of the display unit 130, the front and back touch sensors 120 and 140 may simultaneously detect the front and back touch events, i.e., the double-sided touch event, corresponding to the input gesture.
Meanwhile, a touch event may refer to a particular input gesture such as a touch, a touch move (e.g., a drag), or a touch and release (e.g., a tap), which may be detected through at least one of the front and back touch sensors 120 and 140.
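By way of illustration only, the following Kotlin sketch shows one possible data model for such touch events, assuming that each sensor reports the gesture type and the coordinates of the touched region to the control unit. The type and function names (TouchSide, Gesture, TouchEvent, classify) are illustrative assumptions and do not appear in the patent.

    // Hypothetical data model; names are illustrative, not from the patent.
    enum class TouchSide { FRONT, BACK, DOUBLE_SIDED }
    enum class Gesture { TOUCH, DRAG, TAP }

    // One detected touch event, as it might be delivered to the control unit:
    // the detecting side, the gesture type, and the coordinates of the touched region.
    data class TouchEvent(
        val side: TouchSide,
        val gesture: Gesture,
        val points: List<Pair<Int, Int>>   // (x, y) coordinates reported by the sensor(s)
    )

    // Combine two near-simultaneous sensor reports into a single event,
    // mirroring the front/back/double-sided distinction described above.
    fun classify(
        frontPoints: List<Pair<Int, Int>>,
        backPoints: List<Pair<Int, Int>>,
        gesture: Gesture
    ): TouchEvent = when {
        frontPoints.isNotEmpty() && backPoints.isNotEmpty() ->
            TouchEvent(TouchSide.DOUBLE_SIDED, gesture, frontPoints + backPoints)
        frontPoints.isNotEmpty() -> TouchEvent(TouchSide.FRONT, gesture, frontPoints)
        else -> TouchEvent(TouchSide.BACK, gesture, backPoints)
    }

For instance, classify(listOf(1 to 2), emptyList(), Gesture.DRAG) would yield a FRONT drag event, while reports from both sensors would yield a DOUBLE_SIDED event.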
The memory unit 150 stores a variety of programs and data executed and processed in the mobile device and may be formed of at least one of a non-volatile memory, such as a Read Only Memory (ROM) or a flash memory, and a volatile memory such as a Random Access Memory (RAM). The memory unit 150 may permanently or temporarily store an operating system of the mobile device, programs and data in connection with display control operations of the display unit 130, programs and data in connection with input control operations of the first and second touch sensors 120 and 140, programs and data in connection with a transparentizing process of the display unit 130, and the like.
Additionally, the memory unit 150 may store execution information 163 including a relation between particular functions and touch event types (i.e., a front touch event, a back touch event, and a double-sided touch event), and setting values for the brightness of content to be modified in response to a touch event. Also, the memory unit 150 may store various contents 165 to be displayed on the display unit 130. Here, setting values of content brightness may vary from 0% to 100%.
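As one hypothetical illustration of the execution information 163 and the brightness setting values described above, the following Kotlin sketch keeps a mapping from touch event type to permitted functions together with named brightness levels in the 0% to 100% range. It reuses the TouchSide enum from the earlier sketch; all other names and values are illustrative assumptions.

    // Hypothetical layout of the execution information and brightness settings
    // the memory unit might hold; names and values are illustrative only.
    enum class ContentFunction { MOVE, ENLARGE, REDUCE, SCROLL, DRAW, ERASE, FOCUS, CUT }

    data class ExecutionInfo(
        // Which functions each touch event type may trigger.
        val functionsBySide: Map<TouchSide, Set<ContentFunction>>,
        // Target brightness values (0..100 %) used by the transparentizing process.
        val brightnessSettings: Map<String, Int>
    )

    val defaultExecutionInfo = ExecutionInfo(
        functionsBySide = mapOf(
            TouchSide.FRONT to setOf(ContentFunction.MOVE, ContentFunction.ENLARGE,
                                     ContentFunction.REDUCE, ContentFunction.SCROLL, ContentFunction.DRAW),
            TouchSide.BACK to setOf(ContentFunction.ERASE),
            TouchSide.DOUBLE_SIDED to setOf(ContentFunction.FOCUS, ContentFunction.CUT)
        ),
        brightnessSettings = mapOf(
            "opaque" to 0,           // original content
            "semitransparent" to 50, // preview state
            "transparent" to 100     // fully transparentized region
        )
    )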
The control unit 170 controls general states and operations of the mobile device. Particularly, the control unit 170 may control the operation of content in response to a touch event detected through the touch screen 110. For instance, the control unit 170 may modify displayed content in response to at least one touch event delivered from the first and second touch sensors 120 and 140. Here, the control unit 170 may check a region from which a touch event is detected, and then determine whether the detected region is on the front side, on the back side, or on both sides of the display unit 130. Also, for each touch event, the control unit 170 may change the transparency of a partial or whole region of content according to a predefined brightness.
The control unit 170 may include a touch region check unit 180 that identifies a specific region of a touch event delivered from the front and back touch sensors 120 and 140, and a brightness modification unit 190 that modifies the brightness of content according to a touch event.
The touch region check unit 180 may identify a specific region on which a touch event is inputted, by finding coordinates of a touched region from a touch signal received from the touch screen 110. Namely, the touch region check unit 180 may receive a touch signal from at least one of the front and back touch sensors 120 and 140 and then identify a touch region by using its coordinates. More specifically, through coordinates received from the touch screen 110, the touch region check unit 180 may determine whether a touch event occurs on the front side, on the back side, or on both sides of the display unit 130. Additionally, by using the coordinates received from the touch screen 110, the touch region check unit 180 may determine whether a touch input is moved across the surface of the touch screen 110. Also, by using the coordinates received from the touch screen 110, the touch region check unit 180 may determine the size of a touch region.
The brightness modification unit 190 may increase or decrease the brightness of displayed content according to predefined levels of a touch event. Namely, the brightness modification unit 190 may receive information about a touch event from the touch region check unit 180 and then perform a transparentizing process to change the brightness of displayed content depending on the level of the received touch event. For instance, if the brightness of certain content is set to 0%, the brightness modification unit 190 may change the brightness of a selected partial region, or of the whole region, of the content to a value between 1% and 100%.
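A minimal Kotlin sketch of the two sub-units described above is given below, reusing the TouchSide enum from the earlier sketch. It assumes the touch region check unit only needs to decide which side was touched and to compute a bounding box of the reported coordinates, and that the brightness modification unit simply clamps a requested level to the 0% to 100% range; the class and method names are illustrative, not the patent's.

    // Minimal sketch of the two sub-units; names are illustrative assumptions.
    class TouchRegionCheckUnit {
        // Decide which side was touched from the sensors that reported coordinates.
        fun checkSide(frontReported: Boolean, backReported: Boolean): TouchSide = when {
            frontReported && backReported -> TouchSide.DOUBLE_SIDED
            frontReported -> TouchSide.FRONT
            else -> TouchSide.BACK
        }

        // Bounding box [minX, minY, maxX, maxY] of the touched region from its coordinates.
        fun regionOf(points: List<Pair<Int, Int>>): IntArray {
            val xs = points.map { it.first }
            val ys = points.map { it.second }
            return intArrayOf(xs.minOrNull() ?: 0, ys.minOrNull() ?: 0,
                              xs.maxOrNull() ?: 0, ys.maxOrNull() ?: 0)
        }
    }

    class BrightnessModificationUnit(private var level: Int = 0) {
        // Clamp to the 0..100 % range used for the transparentizing process.
        fun setBrightness(percent: Int) { level = percent.coerceIn(0, 100) }
        fun brightness(): Int = level
    }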
As discussed heretofore, the control unit 170 may identify a touch event through the touch region check unit 180 and then, based thereon, change the brightness of content displayed on the display unit 130 to a predefined brightness corresponding to the identified touch event through the brightness modification unit 190. Control operations of the control unit 170 will be described later.
Besides, the control unit 170 can control various operations in connection with typical functions of the mobile device. For instance, the control unit 170 may control the operation and data display of a running application. Also, the control unit 170 may receive an input signal through various touch-based input interfaces and then control the execution of a corresponding function. And also, the control unit 170 may control transmission and reception of data based on a wired or wireless communication.
Meanwhile, the mobile device shown in FIG. 1 may be any of various communication devices, multimedia players, and their application equipment, each of which has a touch screen based on a transparent display panel. For instance, the mobile device may include many types of mobile communication terminals based on various communication protocols, a tablet PC, a smart phone, a Portable Multimedia Player (PMP), a digital broadcasting player, a Personal Digital Assistant (PDA), a portable game console, and the like. Also, a content operation method using the transparent display panel according to this invention may be applied to a monitor, a notebook, a TV, a Large Format Display (LFD), a Digital Signage (DS), a media pole, etc.
FIG. 2 is a schematic view illustrating a transparent display panel of a mobile device in accordance with an exemplary embodiment of the present invention.
Referring to FIG. 2, the mobile device has the touch screen 110, which is formed of a transparent display panel. Therefore, the touch screen 110 allows a background 200 behind the mobile device to be shown through the transparent display panel. Namely, a brightness greater than 0% is assigned to the display unit 130, so the background 200 can be seen to a degree of transparency that depends on the brightness. For instance, as the brightness approaches 0%, the color displayed on the display unit 130 approaches white; namely, transparency becomes lower. On the contrary, as the brightness approaches 100%, the color displayed on the display unit 130 approaches black; namely, transparency becomes higher. Contents displayed on the display unit 130 may be basically set to a brightness of 0%, i.e., the lowest transparency.
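By way of illustration only, the brightness-to-transparency convention of FIG. 2 can be sketched in Kotlin as follows, assuming the brightness is the 0% to 100% setting value described above: 0% maps to a color near white with the lowest transparency, and 100% maps to a color near black with the highest transparency. The function names and the 0 to 255 grey scale are illustrative assumptions.

    // 0 %   -> colour near white, lowest transparency (background hidden)
    // 100 % -> colour near black, highest transparency (background fully visible)
    fun transparencyOf(brightnessPercent: Int): Double =
        brightnessPercent.coerceIn(0, 100) / 100.0   // transparencyOf(50) == 0.5 (semitransparent)

    fun greyLevelOf(brightnessPercent: Int): Int {
        // 255 = white at 0 % brightness, 0 = black at 100 % brightness.
        val b = brightnessPercent.coerceIn(0, 100)
        return 255 - (255 * b) / 100                 // greyLevelOf(100) == 0 (near black)
    }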
In the drawings, the background 200 is expressed as diagonal lines. However, actually, the background 200 may be appearances of real things that are seen through the transparent display panel.
Meanwhile, as discussed above, the touch screen 110 includes two touch sensors 120 and 140 and the display unit 130. The touch sensors 120 and 140 may be attached to both sides 210 and 230 of the display unit 130, respectively. Herein, a user-facing side 210 of the display unit 130 is referred to as the front side, and the opposite side 230 is referred to as the back side. Also, the touch sensor disposed on the front side 210 is referred to as a front touch sensor 120, and the touch sensor disposed on the back side 230 is referred to as a back touch sensor 140. According to an exemplary embodiment of this invention, the mobile device may control the operation of the content 165 displayed on the display unit 130, depending on a touch event detected through at least one of the front and back touch sensors 120 and 140. Related examples will be described later.
FIGs. 3 to 5 are schematic views illustrating three types of touch events on a transparent display panel of a mobile device in accordance with exemplary embodiments of the present invention.
Referring to FIG. 3, while certain contents are displayed on the display unit 130, a user may input a front touch event by touching (or dragging, tapping, etc.) a selected region with the finger on the front side 210 of the display unit 130. Then the front touch sensor 120 finds the coordinates of the region from which the front touch event is detected, and sends them to the control unit 170. Thus the control unit 170 controls the execution of a particular function according to the front touch event. Functions executed by the front touch event may be to move, enlarge, reduce, or scroll the content selected by the front touch event or to draw or input a certain object (e.g., a line, a figure, text, etc.) depending on the front touch event.
Referring to FIG. 4, while certain contents are displayed on the display unit 130, a user may input a back touch event by touching (or dragging, tapping, etc.) a selected region with the finger on the back side 230 of the display unit 130. Then the back touch sensor 140 finds the coordinates of the region from which the back touch event is detected, and sends them to the control unit 170. Thus the control unit 170 controls the execution of a particular function according to the back touch event. Functions executed by the back touch event may be to erase or remove the content or object selected by the back touch event. Erasing the content may include a process of transparentizing a specific region, on which the back touch event is inputted, in the content according to a predefined brightness. Also, removing the object may include partially or completely removing the object (e.g., a line, a figure, text, etc.) that is drawn or inputted by the front touch event.
Referring to FIG. 5, while certain contents are displayed on the display unit 130, a user may input a double-sided touch event by simultaneously touching (or dragging, tapping, etc.) selected regions with the fingers on the front and back sides 210 and 230 of the display unit 130. Then each of the front and back touch sensors 120 and 140 finds the coordinates of each region from which the double-sided touch event is detected, and sends them to the control unit 170. Thus the control unit 170 controls the execution of a particular function according to the double-sided touch event. Functions executed by the double-sided touch event may be to focus or cut parts of the content selected by the double-sided touch event. Focusing the content may include a process of transparentizing, depending on a predefined brightness, all regions in the content except a specific region on which the double-sided touch event is inputted. Also, cutting the content may include a process of transparentizing a cut region in the content, depending on a predefined brightness.
As discussed heretofore, the transparency of the display unit 130 may be varied according to the brightness assigned to the displayed colors. In other words, the transparency becomes lower as colors with a lower brightness setting (i.e., colors near white) are displayed on the display unit 130. On the contrary, the transparency becomes higher as colors with a higher brightness setting (i.e., colors near black) are displayed on the display unit 130. Therefore, the display unit 130 with a higher transparency allows the background behind the mobile device to be seen, and vice versa.
Now, examples of modifying the brightness of displayed content in response to a touch event inputted in the mobile device will be described. In the initial state, all content displayed on the display unit may be transparent, allowing the background to be seen, or opaque, preventing the background from being seen. Alternatively, some content may be transparent and the rest opaque.
FIG. 6 is a view illustrating a way to operate content displayed on a display unit in response to a touch event in a mobile device in accordance with an exemplary embodiment of the present invention. More particularly, FIG. 6 shows an example of operating displayed content in response to the front touch event.
Referring to FIG. 6, in an initial stage 610, the display unit 130 represents opaque contents (0% brightness) such as image or photo contents.
In the initial stage 610, a user may input a front touch event on the front side 210 of the display unit 130. For instance, as shown, a user may touch a certain region with the finger on the front side 210 of the display unit 130 and then move the finger leftward.
Then the front touch event is detected through the front touch sensor 120, and the control unit 170 modifies the displayed content in response to the front touch event. For instance, as shown in a next stage 620, the display unit 130 may represent another content changed by the front touch event.
FIG. 7 is a view illustrating a way to operate content displayed on a display unit in response to a touch event in a mobile device in accordance with an exemplary embodiment of the present invention. More particularly, FIG. 7 shows an example of transparentizing a part of displayed content in response to the back touch event.
Referring to FIG. 7, in a first stage 710, the display unit 130 represents opaque contents (0% brightness) such as calendar content. In FIG. 7, a background 700 is simply expressed as diagonal lines. However, actually, the background 700 may be appearances of real things that are seen through the transparent display panel.
In the first stage 710, a user may input the first back touch event on the back side 230 of the display unit 130. For instance, as shown in the first stage 710 and a second stage 720, a user may touch a certain region with the finger on the back side 230 of the display unit 130 and wait for a while.
Then the first back touch event is detected through the back touch sensor 140, and the control unit 170 modifies the displayed content in response to the first back touch event. For instance, as shown in the second stage 720, the display unit 130 may represent the content modified through a transparentizing process with a given brightness (e.g., 50% brightness, i.e., semitransparent) by the first back touch event. The semitransparent content in this stage 720 allows a user to conveniently and precisely input the second back touch event. Accordingly, this stage 720 may be omitted.
Producing the semitransparent content requires a first back touch event that continues for a given time. Similarly, after the back touch event is released, the device may wait for a given time for an input of the next back touch event, as shown in a third stage 730. If no back touch event occurs within the given time, the semitransparent content may return to its original state.
In the second stage 720, a user may input the second back touch event on the back side 230 of the display unit 130. For instance, as shown in the third stage 730, a user may scratch a certain region in the displayed content.
Then the second back touch event is detected through the back touch sensor 140, and the control unit 170 modifies the displayed content in response to the second back touch event. For instance, as shown in the third stage 730, the display unit 130 may represent a scratched part 705 of the content modified through a transparentizing process with a given brightness (e.g., 100% brightness) by the second back touch event.
Next, in the third stage 730, a user may release the second back touch event inputted on the back side 230 of the display unit 130. Then the release of the second back touch event is detected through the back touch sensor 140, and the control unit 170 modifies the displayed content in response to the release of the second back touch event. For instance, as shown in a fourth stage 740, the display unit 130 may represent only the scratched part 705 as being transparent by the second back touch event, while keeping the original brightness (e.g., 0% brightness) of the other parts of the displayed content.
If the stage 720 is omitted as discussed above, this stage 740 may also be omitted. Namely, in the first stage 710, the scratched part 705 may be directly transparentized in response to the second back touch event.
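By way of illustration only, the scratch-to-transparentize flow of FIG. 7 can be sketched in Kotlin as follows, assuming the displayed content is modelled as a coarse grid of per-cell brightness values (0% opaque, 50% semitransparent preview, 100% transparent). The class and method names are illustrative assumptions.

    class ContentBrightnessMask(val width: Int, val height: Int) {
        private val cells = Array(height) { IntArray(width) }   // all cells start at 0 % (opaque)

        // First back touch held for a while: preview the whole content as semitransparent.
        fun previewSemitransparent() = setAll(50)

        // Second back touch dragged ("scratched") over the content: the touched cells
        // are transparentized with 100 % brightness.
        fun scratch(path: List<Pair<Int, Int>>) =
            path.forEach { (x, y) -> if (y in 0 until height && x in 0 until width) cells[y][x] = 100 }

        // Release of the back touch: non-scratched cells return to the original 0 % brightness.
        fun restoreExceptScratched() {
            for (row in cells) for (x in row.indices) if (row[x] != 100) row[x] = 0
        }

        fun brightnessAt(x: Int, y: Int) = cells[y][x]
        private fun setAll(value: Int) { for (row in cells) row.fill(value) }
    }

In terms of FIG. 7, previewSemitransparent() would correspond to stage 720, scratch() to stage 730, and restoreExceptScratched() to stage 740.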
FIG. 8 is a view illustrating a way to operate content displayed on a display unit in response to a touch event in a mobile device in accordance with an exemplary embodiment of the present invention. More particularly, FIG. 8 shows an example of cutting and transparentizing a part of displayed content in response to the double-sided touch event.
Referring to FIG. 8, in a first stage 810, the display unit 130 represents opaque contents (0% brightness) such as photo content. In FIG. 8, a background 800 is simply expressed as diagonal lines. However, actually, the background 800 may be appearances of real things that are seen through the transparent display panel.
In the first stage 810, a user may input the first double-sided touch event on both sides 210 and 230 of the display unit 130. For instance, as shown in a second stage 820, a user may touch coinciding regions with the fingers on both sides 210 and 230 of the display unit 130 and wait for a while.
Then the first double-sided touch event is detected through the front and back touch sensors 120 and 140, and the control unit 170 modifies the displayed content in response to the first double-sided touch event. For instance, as shown in a third stage 830, the display unit 130 may transparentize all regions with a given brightness (e.g., 50% brightness, i.e., semitransparent) except a specific region selected by the first double-sided touch event. This stage 830 serves only to improve visibility for the user and therefore may be omitted.
In the third stage 830, a user may input the second double-sided touch event on both sides of the display unit 130. For instance, as shown in the fourth stage 840, a user may move a specific region selected by the first double-sided touch event to another position.
Then the second double-sided touch event is detected through the front and back touch sensors 120 and 140, and the control unit 170 modifies the displayed content in response to the second double-sided touch event. For instance, as shown in a fourth stage 840, the region selected by the first double-sided touch event is cut and moved in response to the second double-sided touch event. At this time, a semitransparent state of regions other than the selected region may be maintained. Also, a cut part 805, i.e., the original position of the selected region, may be transparentized with a given brightness (e.g., 100% brightness).
Next, in the fourth stage 840, a user may release the second double-sided touch event inputted on both sides 210 and 230 of the display unit 130. Then the release of the second double-sided touch event is detected through the front and back touch sensors 120 and 140, and the control unit 170 modifies the displayed content in response to the release of the second double-sided touch event. For instance, as shown in a fifth stage 850, the display unit 130 may represent the cut and moved part 805 at a new position. At this time, the display unit 130 may keep the transparency of the cut and moved part 805 and return the other parts of the semitransparent content to the original brightness (e.g., 0% brightness).
If the third stage 830 is omitted as discussed above, this stage 850 may also be omitted. Namely, in the second stage 820, the cut part 805 may be directly transparentized in response to the first and second double-sided touch events.
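A minimal Kotlin sketch of the cut-and-move flow of FIG. 8 is given below, assuming the content is modelled as a grid of pixel values with a parallel brightness mask; cutting a region pastes its pixels at the new position and transparentizes the original position with 100% brightness. All names are illustrative assumptions.

    data class Region(val x: Int, val y: Int, val w: Int, val h: Int)

    class CuttableContent(val width: Int, val height: Int, initPixel: (Int, Int) -> Int) {
        val pixels = Array(height) { y -> IntArray(width) { x -> initPixel(x, y) } }
        val brightness = Array(height) { IntArray(width) }   // 0 % everywhere: opaque

        // Double-sided touch held on a region, then dragged: cut the region out,
        // paste it at the new position, and transparentize the original position (100 %).
        fun cutAndMove(src: Region, dstX: Int, dstY: Int) {
            val patch = Array(src.h) { dy -> IntArray(src.w) { dx -> pixels[src.y + dy][src.x + dx] } }
            for (dy in 0 until src.h) for (dx in 0 until src.w) {
                brightness[src.y + dy][src.x + dx] = 100          // cut part becomes transparent
                val ty = dstY + dy
                val tx = dstX + dx
                if (ty in 0 until height && tx in 0 until width) pixels[ty][tx] = patch[dy][dx]
            }
        }
    }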
FIG. 9 is a view illustrating a way to operate content displayed on a display unit in response to a touch event in a mobile device in accordance with an exemplary embodiment of the present invention. More particularly, FIG. 9 shows an example of focusing and transparentizing a part of displayed content in response to the double-sided touch event.
Referring to FIG. 9, in a first stage 910, the display unit 130 represents opaque contents (0% brightness) such as message list content. In FIG. 9, a background 900 is simply expressed as diagonal lines. However, actually, the background 900 may be appearances of real things that are seen through the transparent display panel.
In the first stage 910, a user may input a double-sided touch event on both sides 210 and 230 of the display unit 130. For instance, as shown in a second stage 920, a user may touch coinciding regions with the fingers on both sides 210 and 230 of the display unit 130 and wait for a while.
Then the double-sided touch event is detected through the front and back touch sensors 120 and 140, and the control unit 170 modifies the displayed content in response to the double-sided touch event. For instance, as shown in a third stage 930, the display unit 130 may transparentize all regions with a given brightness (e.g., a certain brightness between 1% and 100%) except a specific region (e.g., a specific item in the message list) selected by the double-sided touch event. Alternatively, as shown in a fourth stage 940, the display unit 130 may transparentize a specific region (e.g., a specific item in the message list) selected by the double-sided touch event with a given brightness (e.g., a certain brightness between 1% and 100%).
In this way, a part (referred to as an object) of the displayed content selected by the double-sided touch event may be transparentized with a predefined brightness, or the remaining region except the selected object may be transparentized with a predefined brightness. Which transparentizing option is used may depend on a user's setting.
Although not illustrated in FIG. 9, a further double-sided touch event (e.g., an event of moving or copying the selected object to another position in the list) may be inputted after the third or fourth stage 930 or 940, as discussed with reference to FIG. 8.
FIG. 10 is a view illustrating a way to operate content displayed on a display unit in response to a touch event in a mobile device in accordance with an exemplary embodiment of the present invention. More particularly, FIG. 10 shows an example in which content is created in response to the front touch event and then operated in response to the back touch event.
Referring to FIG. 10, in a first stage 1010, the display unit 130 has a predefined brightness (e.g., a certain brightness between 1% and 100%) and hence allows a background 1000 to be seen through the transparent display panel. Although no content is displayed on the display unit 130 in FIG. 10, any opaque content may be displayed on the whole or parts of the display unit 130 as discussed above. In FIG. 10, the background 1000 is simply expressed as diagonal lines. However, actually, the background 1000 may be appearances of real things that are seen through the transparent display panel.
In the first stage 1010, a user may input a front touch event on the front side 210 of the display unit 130. For instance, as shown in a second stage 1020, a user may touch a specific region with the finger on the front side 210 of the display unit 130 and then, as shown in a third stage 1030, move the touch across the front side 210. Namely, a user may perform a drawing through the front touch event on the front side 210 of the display unit 130.
In other words, specific content may be created depending on the front touch event. For instance, as shown in a fourth stage 1040, zigzag-shaped content 1005 may be drawn on the display unit 130 by using the front touch event. This content 1005 may be different from the content that has already been displayed on the display unit 130 with a predefined brightness.
Then, in the fourth stage 1040, a user may input a back touch event on the back side 230 of the display unit 130. For instance, as shown in a fifth stage 1050, a user may touch a specific region with the finger on the back side 230 of the display unit 130 and then, as shown in a sixth stage 1060, move the touch across the back side 230. Namely, a user may perform an erasing through the back touch event on the back side 230 of the display unit 130.
In other words, specific content 1005 may be created in response to the front touch event and then partially or completely erased in response to the back touch event. For instance, as shown in a seventh stage 1070, a part of the displayed content 1005 may be erased from the display unit 130 through the back touch event. Namely, a selected part of the content 1005 may be transparentized by the back touch event.
Although not illustrated, a text input process may be supported according to the content operation method as shown in FIG. 10.
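By way of illustration only, the draw-then-erase flow of FIG. 10 can be sketched in Kotlin as follows, assuming the drawn content is stored as a list of stroke points and that a back touch erases (transparentizes) the points lying near the erase path. The names and the distance threshold are illustrative assumptions.

    import kotlin.math.hypot

    class DrawnContent {
        private val points = mutableListOf<Pair<Int, Int>>()

        // Front touch moved across the front side: add the drawn points.
        fun draw(path: List<Pair<Int, Int>>) = points.addAll(path)

        // Back touch moved across the back side: remove drawn points within `radius`
        // of the erase path, i.e. the selected part of the content is transparentized.
        fun erase(path: List<Pair<Int, Int>>, radius: Double = 5.0) =
            points.removeAll { (px, py) ->
                path.any { (ex, ey) -> hypot((px - ex).toDouble(), (py - ey).toDouble()) <= radius }
            }

        fun remaining(): List<Pair<Int, Int>> = points.toList()
    }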
FIG. 11 is a view illustrating a way to operate content displayed on a display unit in response to a touch event in a mobile device in accordance with an exemplary embodiment of the present invention. More particularly, FIG. 11 shows an example of selecting and transparentizing several regions in displayed content in response to the front touch event, the back touch event, or the double-sided touch event.
Referring to FIG. 11, in a first stage 1110, the display unit 130 represents opaque contents (0% brightness) such as photo content. In FIG. 11, a background 1100 is simply expressed as diagonal lines. However, actually, the background 1100 may be appearances of real things that are seen through the transparent display panel.
In the first stage 1110, a user may input one type of touch event (e.g., the front touch event, the back touch event, or the double-sided touch event) on at least one of the front and back sides 210 and 230 of the display unit 130. For instance, as shown in a second stage 1120, a user may simultaneously touch two regions with the fingers on both sides 210 and 230 of the display unit 130. In this case, a user may repeatedly input the double-sided touch event. Namely, a user may sequentially input several touch events on several regions.
Then the double-sided touch event is detected through the front and back touch sensors 120 and 140, and the control unit 170 modifies the displayed content in response to the double-sided touch event. For instance, as shown in a third stage 1130, the display unit 130 may transparentize all regions with a given brightness (e.g., a certain brightness between 1% and 100%) except several regions 1105 selected by several touch events. Alternatively, as shown in a fourth stage 1140, the display unit 130 may transparentize the several regions 1105 selected by several touch events with a given brightness (e.g., a certain brightness between 1% and 100%).
In this way, two or more regions may be selected by the front touch event, the back touch event, or the double-sided touch event. The selected regions may be transparentized with a predefined brightness, or the remaining region except the selected regions may be transparentized with a predefined brightness. Which transparentizing option is used may depend on a user's setting. In addition, a transparentizing process for several regions may be performed sequentially whenever each region is selected, or performed at once after all regions are selected.
Although not illustrated in FIG. 11, a further touch event (e.g., an event of moving or copying the several selected objects to other positions) may be inputted after the third or fourth stage 1130 or 1140, as discussed earlier.
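A minimal Kotlin sketch of the multi-region option of FIG. 11 is given below, reusing the Region class from the earlier cut-and-move sketch. It assumes a user setting selects whether the selected regions or all regions except the selection are transparentized; all names are illustrative assumptions.

    enum class TransparentizeTarget { SELECTED_REGIONS, ALL_EXCEPT_SELECTED }

    class MultiRegionSelection(private val target: TransparentizeTarget) {
        private val selected = mutableListOf<Region>()   // Region from the earlier sketch

        fun select(region: Region) { selected.add(region) }

        // Brightness (0..100 %) to apply at (x, y) once the transparentizing process runs,
        // either after each selection or in one batch after all regions are selected.
        fun brightnessAt(x: Int, y: Int, transparentBrightness: Int = 100): Int {
            val inSelection = selected.any { r -> x in r.x until r.x + r.w && y in r.y until r.y + r.h }
            return when (target) {
                TransparentizeTarget.SELECTED_REGIONS -> if (inSelection) transparentBrightness else 0
                TransparentizeTarget.ALL_EXCEPT_SELECTED -> if (inSelection) 0 else transparentBrightness
            }
        }
    }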
FIG. 12 is a flowchart illustrating a method for operating displayed content in response to a touch event on a dual touch screen based on a transparent display panel in a mobile device in accordance with an exemplary embodiment of the present invention.
Referring to FIG. 12, the control unit 170 displays a user interface by a user's selection in step 1201. The user interface may have a predefined brightness (e.g., 100%) and thus allow the background to be seen through the display unit 130 based on the transparent display panel. Alternatively, the user interface may represent various contents such as photos, icons, menus, and lists, or screen data of a running application such as a web browser. Namely, the user interface in exemplary embodiments of this invention may include all types of screens, whether transparent or not. In FIG. 12, it is supposed that the user interface corresponds to the second of the three cases above; namely, at least one content is displayed on the display unit 130.
Next, the control unit 170 detects a touch event in step 1203. For instance, the control unit 170 may determine whether a given touch event is detected through the touch screen 110. In this step, the control unit 170 may detect one of the front touch event, the back touch event and the double-sided touch event through at least one of the front and back touch sensors 120 and 140.
Next, the control unit 170 identifies the detected touch event in step 1205. For instance, the control unit 170 may find a specific region from which the touch event is detected, by using the touch region check unit 180. Namely, the control unit 170 may determine whether the detected region of the touch event is on the front side, on the back side or on both sides of the display unit 130.
If the touch event is inputted on the front side of the display unit 130, the control unit 170 recognizes the touch event as a front touch event in step 1211. Then the control unit 170 identifies the type of the front touch event in step 1213 and modifies the displayed content according to the type of the front touch event in step 1241. For instance, if the type of the front touch event is to move the displayed content, the control unit 170 may move the content across the display unit 130 depending on the front touch event. Also, if the type of the front touch event is to draw or input text, the control unit 170 may create content on the display unit 130 depending on the front touch event. And also, if the type of the front touch event is to modify the brightness, the control unit 170 may change the brightness of a specific region selected by the front touch event, of a non-selected region, or of the whole displayed content, depending on the front touch event. Here, changes in brightness may be performed by the brightness modification unit 190 according to predefined levels of a touch event. Related examples in connection with the front touch event have already been discussed with reference to FIGs. 3, 6 and 11.
On the other hand, if the touch event is inputted on the back side of the display unit 130, the control unit 170 recognizes the touch event as a back touch event in step 1221. Then the control unit 170 identifies the type of the back touch event in step 1223 and modifies the displayed content according to the type of the back touch event in step 1241. For instance, if the type of the back touch event is to touch at least one region of the displayed content for a while, the control unit 170 may modify the brightness of the content. Also, if the type of the back touch event is to select at least one region of the displayed content, the control unit 170 may modify the brightness of the selected or non-selected region in the content. And also, if the type of the back touch event is to erase the displayed content, the control unit 170 may transparentize a specific region selected by the back touch event. Here, changes in brightness may be performed by the brightness modification unit 190 according to predefined levels of a touch event. Related examples in connection with the back touch event have already been discussed with reference to FIGs. 4, 7, 10 and 11.
On the other hand, if the touch event is inputted on both sides of the display unit 130, the control unit 170 recognizes the touch event as a double-sided touch event in step 1231. Then the control unit 170 identifies the type of the double-sided touch event in step 1233 and modifies the displayed content according to the type of the double-sided touch event in step 1241. For instance, if the type of the double-sided touch event is to touch at least one region of the displayed content for a while, the control unit 170 may modify the brightness of the content. Also, if the type of the double-sided touch event is to select at least one region of the displayed content, the control unit 170 may modify the brightness of the selected or non-selected region in the content. And also, if the type of the double-sided touch event is to cut a part of the displayed content, the control unit 170 may transparentize the cut region selected by the double-sided touch event. Here, changes in brightness may be performed by the brightness modification unit 190 according to predefined levels of a touch event. Related examples in connection with the double-sided touch event have already been discussed with reference to FIGs. 5, 8, 9 and 11.
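By way of illustration only, the branching of FIG. 12 can be sketched in Kotlin as follows, reusing the TouchEvent, TouchSide, Gesture, and BrightnessModificationUnit types from the earlier sketches. The handler is a simplified dispatch under the assumption that each branch ends in a brightness change; the concrete functions and brightness values are illustrative, not the patent's.

    fun handleTouchEvent(event: TouchEvent, brightnessUnit: BrightnessModificationUnit) {
        when (event.side) {
            // Steps 1211/1213: front touch events move, scroll, enlarge, or draw content.
            TouchSide.FRONT -> when (event.gesture) {
                Gesture.DRAG  -> { /* move, scroll, or draw content along event.points */ }
                Gesture.TAP   -> { /* change or select content */ }
                Gesture.TOUCH -> brightnessUnit.setBrightness(0)   // e.g. keep the touched region opaque
            }
            // Steps 1221/1223: back touch events erase, i.e. transparentize, the touched region.
            TouchSide.BACK -> brightnessUnit.setBrightness(100)
            // Steps 1231/1233: double-sided touch events focus or cut the selected region;
            // a semitransparent preview level is applied here as a placeholder.
            TouchSide.DOUBLE_SIDED -> brightnessUnit.setBrightness(50)
        }
    }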
The above-described methods according to the present invention can be implemented in hardware, or as software or computer code that can be stored in a recording medium such as a CD-ROM, a RAM, a floppy disk, a hard disk, or a magneto-optical disk, or downloaded over a network, so that the methods described herein can be rendered in such software using a general purpose computer, a special processor, or programmable or dedicated hardware such as an ASIC or FPGA. As would be understood in the art, the computer, the processor, or the programmable hardware includes memory components, e.g., RAM, ROM, Flash, etc., that may store or receive software or computer code that, when accessed and executed by the computer, processor, or hardware, implements the processing methods described herein. In addition, it would be recognized that when a general purpose computer accesses code for implementing the processing shown herein, the execution of the code transforms the general purpose computer into a special purpose computer for executing the processing shown herein.
While the invention has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims and their equivalents.

Claims (15)

  1. A method for operating displayed content in a mobile device having a transparent display panel, the method comprising:
    displaying the content on the transparent display panel with a predefined brightness;
    receiving a touch event through the transparent display panel; and
    modifying the brightness of parts or whole of the displayed content in response to the touch event.
  2. The method of claim 1, wherein the displaying of the content includes:
    representing one of transparent content with the predefined brightness in order to allow a background to be seen, non-transparent content on the entire screen, and both transparent content and non-transparent content in order to allow the background to be partially seen.
  3. The method of claim 2, wherein the receiving of the touch event includes:
    receiving one of a front touch event, a back touch event and a double-sided touch event through at least one of the front and back sides of a display unit.
  4. The method of claim 3, wherein the representing includes:
    performing one of a transparentizing process by modifying the brightness of a specific region corresponding to the touch event in the displayed content; and a transparentizing process by modifying the brightness of all regions except the specific region corresponding to the touch event in the displayed content.
  5. The method of claim 3, wherein the representing includes:
    cutting a specific region selected by the touch event; and
    performing a transparentizing process by modifying the brightness of the cut region in the content.
  6. The method of claim 3, wherein the representing includes:
    performing one of a transparentizing process by modifying the brightness of two or more specific regions selected by the touch event; and a transparentizing process by modifying the brightness of all regions except the two or more specific regions selected by the touch event in the displayed content.
  7. The method of claim 3, wherein the representing includes:
    creating and representing content in response to the first touch event; and
    performing a transparentizing process by modifying the brightness of parts or whole of the created content in response to the second touch event.
  8. The method of claim 7, wherein the created content is drawn or inputted and is different from the displayed content, depending on the predefined brightness.
  9. A mobile device comprising:
    a touch screen for receiving a touch event and then for modifying the brightness of content in response to the touch event; and
    a control unit for determining the type of the touch event received from the touch screen and then, in response to the touch event, for performing a transparentizing process by modifying the brightness of parts or whole of the content displayed on the touch screen.
  10. The mobile device of claim 9, wherein the touch screen includes:
    a display unit for displaying the content modified in response to the touch event;
    a front touch sensor disposed on the front side of the display unit and for receiving the touch event; and
    a back touch sensor disposed on the back side of the display unit and for receiving the touch event.
  11. The mobile device of claim 10, wherein the touch event includes:
    a front touch event detected by the front touch sensor;
    a back touch event detected by the back touch sensor; and
    a double-sided touch event detected by both the front and back touch sensors.
  12. The mobile device of claim 10, wherein the control unit modifies the brightness of parts or whole of the displayed content in response to the front touch event, the back touch event or the double-sided touch event.
  13. The mobile device of claim 12, wherein the control unit performs a transparentizing process by modifying the brightness of at least one specific region corresponding to the touch event in the displayed content, by modifying the brightness of all regions except the at least one specific region corresponding to the touch event in the displayed content, or by modifying the brightness of a specific region selected by the touch event and cut in the displayed content.
  14. The mobile device of claim 12, wherein the control unit creates and represents content in response to the first touch event, and performs a transparentizing process by modifying the brightness of parts or whole of the created content in response to the second touch event.
  15. The mobile device of claim 11, wherein the control unit includes:
    a touch region check unit for identifying a specific region on which the touch event is inputted, by finding coordinates of a touched region from a touch signal received from at least one of the front and back touch sensors; and
    a brightness modification unit for increasing or decreasing the brightness of the displayed content according to predefined levels of the touch event identified by the touch region check unit.
PCT/KR2011/000066 2010-01-06 2011-01-06 Mobile device and method for operating content displayed on transparent display panel WO2011083975A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP11731913A EP2521960A2 (en) 2010-01-06 2011-01-06 Mobile device and method for operating content displayed on transparent display panel
CN2011800054519A CN102696004A (en) 2010-01-06 2011-01-06 Mobile device and method for operating content displayed on transparent display panel

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US33542810P 2010-01-06 2010-01-06
US61/335,428 2010-01-06
KR1020100135745A KR20110081040A (en) 2010-01-06 2010-12-27 Method and apparatus for operating content in a portable terminal having transparent display panel
KR10-2010-0135745 2010-12-27

Publications (2)

Publication Number Publication Date
WO2011083975A2 true WO2011083975A2 (en) 2011-07-14
WO2011083975A3 WO2011083975A3 (en) 2011-12-01

Family

ID=44919830

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2011/000066 WO2011083975A2 (en) 2010-01-06 2011-01-06 Mobile device and method for operating content displayed on transparent display panel

Country Status (5)

Country Link
US (1) US20110163986A1 (en)
EP (1) EP2521960A2 (en)
KR (1) KR20110081040A (en)
CN (1) CN102696004A (en)
WO (1) WO2011083975A2 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103399702A (en) * 2013-07-05 2013-11-20 广东欧珀移动通信有限公司 Operation method and mobile terminal for sending or playing voice message
US20150009303A1 (en) * 2012-01-11 2015-01-08 Ultra-D Coöperatief U.A. Mobile display device
US9239642B2 (en) 2013-04-15 2016-01-19 Samsung Electronics Co., Ltd. Imaging apparatus and method of controlling the same
WO2016119264A1 (en) * 2015-01-30 2016-08-04 华为技术有限公司 Terminal wallpaper control method and terminal
US10296127B2 (en) 2012-04-07 2019-05-21 Samsung Electronics Co., Ltd. Object control method performed in device including transparent display, the device, and computer readable recording medium thereof

Families Citing this family (111)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9041661B2 (en) * 2010-08-13 2015-05-26 Nokia Corporation Cover for an electronic device
KR101843337B1 (en) 2010-10-28 2018-03-30 삼성전자주식회사 Display module and display system
TW201220152A (en) * 2010-11-11 2012-05-16 Wistron Corp Touch control device and touch control method with multi-touch function
US8698771B2 (en) * 2011-03-13 2014-04-15 Lg Electronics Inc. Transparent display apparatus and method for operating the same
EP2500814B1 (en) * 2011-03-13 2019-05-08 LG Electronics Inc. Transparent display apparatus and method for operating the same
JP5784960B2 (en) * 2011-04-26 2015-09-24 京セラ株式会社 Mobile terminal, touch panel operation program, and touch panel operation method
JP5822577B2 (en) * 2011-07-19 2015-11-24 キヤノン株式会社 Display device and control method thereof
EP2763378B1 (en) * 2011-09-27 2019-07-24 NEC Corporation Portable electronic device, touch operation processing method and program
CN102360254A (en) * 2011-09-28 2012-02-22 广东美的电器股份有限公司 Touch display screen and terminal equipment using same
WO2013048443A1 (en) * 2011-09-30 2013-04-04 Intel Corporation Convertible computing device
US9298304B2 (en) * 2011-10-12 2016-03-29 Htc Corporation Electronic device and touch-sensing method
JP2013089202A (en) * 2011-10-21 2013-05-13 Sony Computer Entertainment Inc Input control unit, input control method and input control program
KR101447283B1 (en) * 2011-11-03 2014-10-10 주식회사 네오위즈인터넷 Method, terminal, and recording medium for controlling screen output
CN102495690B (en) * 2011-11-24 2015-03-11 汉王科技股份有限公司 Touch detection method, device and mobile terminal
US9223138B2 (en) 2011-12-23 2015-12-29 Microsoft Technology Licensing, Llc Pixel opacity for augmented reality
TW201329837A (en) * 2012-01-13 2013-07-16 Fih Hong Kong Ltd System and method for unlocking an electronic device
US9606586B2 (en) 2012-01-23 2017-03-28 Microsoft Technology Licensing, Llc Heat transfer device
KR101899810B1 (en) * 2012-02-07 2018-11-02 엘지전자 주식회사 Mobile terminal and method for controlling thereof
US9779643B2 (en) 2012-02-15 2017-10-03 Microsoft Technology Licensing, Llc Imaging structure emitter configurations
US9726887B2 (en) 2012-02-15 2017-08-08 Microsoft Technology Licensing, Llc Imaging structure color conversion
US9368546B2 (en) 2012-02-15 2016-06-14 Microsoft Technology Licensing, Llc Imaging structure with embedded light sources
US9297996B2 (en) 2012-02-15 2016-03-29 Microsoft Technology Licensing, Llc Laser illumination scanning
CN103257704B (en) * 2012-02-20 2016-12-14 联想(北京)有限公司 Messaging device and method
US9578318B2 (en) 2012-03-14 2017-02-21 Microsoft Technology Licensing, Llc Imaging structure emitter calibration
US11068049B2 (en) 2012-03-23 2021-07-20 Microsoft Technology Licensing, Llc Light guide display and field of view
US10191515B2 (en) * 2012-03-28 2019-01-29 Microsoft Technology Licensing, Llc Mobile device light guide display
US9558590B2 (en) 2012-03-28 2017-01-31 Microsoft Technology Licensing, Llc Augmented reality light guide display
US9717981B2 (en) 2012-04-05 2017-08-01 Microsoft Technology Licensing, Llc Augmented reality and physical games
KR102164453B1 (en) * 2012-04-07 2020-10-13 삼성전자주식회사 Object control method performed in device including transparent display, the device, and computer readable recording medium thereof
US8954890B2 (en) * 2012-04-12 2015-02-10 Supercell Oy System, method and graphical user interface for controlling a game
GB2511668A (en) 2012-04-12 2014-09-10 Supercell Oy System and method for controlling technical processes
US8814674B2 (en) 2012-05-24 2014-08-26 Supercell Oy Graphical user interface for a gaming system
US10502876B2 (en) 2012-05-22 2019-12-10 Microsoft Technology Licensing, Llc Waveguide optics focus elements
US8989535B2 (en) 2012-06-04 2015-03-24 Microsoft Technology Licensing, Llc Multiple waveguide imaging structure
US9429997B2 (en) * 2012-06-12 2016-08-30 Apple Inc. Electronic device with wrapped display
JP6080401B2 (en) * 2012-06-27 2017-02-15 京セラ株式会社 apparatus
WO2014000203A1 (en) * 2012-06-28 2014-01-03 Intel Corporation Thin screen frame tablet device
US20140002374A1 (en) * 2012-06-29 2014-01-02 Lenovo (Singapore) Pte. Ltd. Text selection utilizing pressure-sensitive touch
WO2014021658A1 (en) * 2012-08-01 2014-02-06 Samsung Electronics Co., Ltd. Transparent display apparatus and display method thereof
KR20140017420A (en) * 2012-08-01 2014-02-11 삼성전자주식회사 Transparent display apparatus and method thereof
KR102080741B1 (en) * 2012-08-29 2020-02-24 엘지전자 주식회사 Mobile terminal and control method thereof
KR102121527B1 (en) * 2012-08-30 2020-06-10 삼성전자주식회사 Method and device for adjusting transparency of display being used for packaging product
TWI637312B (en) * 2012-09-19 2018-10-01 三星電子股份有限公司 Method for displaying information on transparent display device, display device therewith, and computer-readable recording medium therefor
CN103793163A (en) * 2012-10-30 2014-05-14 联想(北京)有限公司 Information processing method and electronic device
US9635305B1 (en) 2012-11-03 2017-04-25 Iontank, Ltd. Display apparatus including a transparent electronic monitor
KR102121021B1 (en) * 2012-11-12 2020-06-09 삼성전자주식회사 Apparatas and method for changing a setting value in an electronic device
CN102968211A (en) * 2012-11-13 2013-03-13 鸿富锦精密工业(深圳)有限公司 Double-sided transparent remote controller
WO2014082303A1 (en) * 2012-11-30 2014-06-05 东莞宇龙通信科技有限公司 Terminal and control method for screen backlight
US10192358B2 (en) 2012-12-20 2019-01-29 Microsoft Technology Licensing, Llc Auto-stereoscopic augmented reality display
KR20140080220A (en) * 2012-12-20 2014-06-30 삼성전자주식회사 Method for zoomming for contents an electronic device thereof
US9830068B2 (en) * 2012-12-28 2017-11-28 Intel Corporation Dual configuration computer
US20150370523A1 (en) * 2013-02-27 2015-12-24 Nec Corporation Portable electronic apparatus, method for controlling the same, and program
US20160034132A1 (en) * 2013-03-13 2016-02-04 Google Technology Holdings LLC Systems and methods for managing displayed content on electronic devices
US20140285504A1 (en) * 2013-03-21 2014-09-25 Au Optronics Corporation Controllable display apparatus and applications thereof
WO2014157357A1 (en) * 2013-03-27 2014-10-02 NEC Casio Mobile Communications, Ltd. Information terminal, display control method, and program therefor
KR102107728B1 (en) * 2013-04-03 2020-05-07 Samsung Medison Co., Ltd. Portable ultrasound apparatus, portable ultrasound system and method for diagnosis using ultrasound
US20170115693A1 (en) * 2013-04-25 2017-04-27 Yonggui Li Frameless Tablet
KR101433751B1 (en) * 2013-06-21 2014-08-27 Korea Advanced Institute of Science and Technology Double-sided interactive apparatus using transparent display
US10775869B2 (en) 2013-07-18 2020-09-15 Samsung Electronics Co., Ltd. Mobile terminal including display and method of operating the same
KR102156642B1 (en) * 2013-07-30 2020-09-16 Samsung Electronics Co., Ltd. Method and apparatus for controlling lock or unlock in a portable terminal
US9367280B2 (en) * 2013-08-06 2016-06-14 Intel Corporation Dual screen visibility with virtual transparency
CN104424318A (en) * 2013-09-09 2015-03-18 Alibaba Group Holding Limited Method and device for controlling page elements
CN104461326B (en) * 2013-09-16 2017-12-26 Lenovo (Beijing) Co., Ltd. Information processing method and electronic equipment
US9189614B2 (en) * 2013-09-23 2015-11-17 GlobalFoundries, Inc. Password entry for double sided multi-touch display
WO2015113209A1 (en) * 2014-01-28 2015-08-06 Huawei Device Co., Ltd. Terminal equipment processing method and terminal equipment
CN104331182B (en) * 2014-03-06 2017-08-25 Guangzhou Samsung Telecommunication Technology Research Co., Ltd. Portable terminal with auxiliary touch-screen
US9891743B2 (en) * 2014-05-02 2018-02-13 Semiconductor Energy Laboratory Co., Ltd. Driving method of an input device
CN105260167A (en) * 2014-06-10 2016-01-20 ZTE Corporation Method and device for controlling terminal
CN105278721A (en) * 2014-07-25 2016-01-27 Hannstar Display (Nanjing) Corporation Non-shielded touch hand-held electronic apparatus and touch outer cover
CN105278769A (en) * 2014-07-25 2016-01-27 Hannstar Display (Nanjing) Corporation Method for improving sunlight readability of display panel of hand-held electronic apparatus in strong light environment
CN105302385A (en) * 2014-07-25 2016-02-03 Hannstar Display (Nanjing) Corporation Unblocked touch type handheld electronic apparatus and touch outer cover thereof
US20160026305A1 (en) * 2014-07-25 2016-01-28 Hannstar Display (Nanjing) Corporation Shadeless touch hand-held electronic device and touch-sensing cover thereof
CN105278770A (en) * 2014-07-25 2016-01-27 Hannstar Display (Nanjing) Corporation Non-blocking touch handheld electronic device and outer touch cover
CN105278723A (en) * 2014-07-25 2016-01-27 Hannstar Display (Nanjing) Corporation Non-blocking touch handheld electronic device, outer touch cover and computer execution method
CN105278724A (en) * 2014-07-25 2016-01-27 Hannstar Display (Nanjing) Corporation Non-blocking touch handheld electronic device, outer touch cover and computer execution method
CN105320325A (en) * 2014-07-25 2016-02-10 Hannstar Display (Nanjing) Corporation Non-blocking touch handheld electronic device, touch cover and computer execution method
US9304235B2 (en) 2014-07-30 2016-04-05 Microsoft Technology Licensing, Llc Microfabrication
US10678412B2 (en) 2014-07-31 2020-06-09 Microsoft Technology Licensing, Llc Dynamic joint dividers for application windows
US10592080B2 (en) 2014-07-31 2020-03-17 Microsoft Technology Licensing, Llc Assisted presentation of application windows
KR20160015843A (en) * 2014-07-31 2016-02-15 Samsung Electronics Co., Ltd. Display apparatus and method for controlling the apparatus thereof
US10254942B2 (en) 2014-07-31 2019-04-09 Microsoft Technology Licensing, Llc Adaptive sizing and positioning of application windows
KR20160114413A (en) * 2015-03-24 2016-10-05 LG Electronics Inc. Mobile terminal and control method for the mobile terminal
US9671828B2 (en) * 2014-09-19 2017-06-06 Lg Electronics Inc. Mobile terminal with dual touch sensors located on different sides of terminal body and method of controlling the same
US9910518B2 (en) * 2014-10-01 2018-03-06 Rockwell Automation Technologies, Inc. Transparency augmented industrial automation display
KR102161694B1 (en) 2014-10-20 2020-10-05 Samsung Electronics Co., Ltd. Display apparatus and display method thereof
TWI536239B (en) * 2014-10-27 2016-06-01 Wistron Corporation Touch apparatus and touch method
KR101667727B1 (en) 2014-10-31 2016-10-28 LG Electronics Inc. Mobile terminal and method of controlling the same
US10235037B2 (en) * 2014-12-30 2019-03-19 Lg Electronics Inc. Digital device and control method therefor
KR102121533B1 (en) * 2015-01-09 2020-06-10 Samsung Electronics Co., Ltd. Display apparatus having a transparent display and controlling method for the display apparatus thereof
US9513480B2 (en) 2015-02-09 2016-12-06 Microsoft Technology Licensing, Llc Waveguide
US9372347B1 (en) 2015-02-09 2016-06-21 Microsoft Technology Licensing, Llc Display system
US11086216B2 (en) 2015-02-09 2021-08-10 Microsoft Technology Licensing, Llc Generating electronic components
US10317677B2 (en) 2015-02-09 2019-06-11 Microsoft Technology Licensing, Llc Display system
US9827209B2 (en) 2015-02-09 2017-11-28 Microsoft Technology Licensing, Llc Display system
US9429692B1 (en) 2015-02-09 2016-08-30 Microsoft Technology Licensing, Llc Optical components
US9423360B1 (en) 2015-02-09 2016-08-23 Microsoft Technology Licensing, Llc Optical components
US9535253B2 (en) 2015-02-09 2017-01-03 Microsoft Technology Licensing, Llc Display system
US10018844B2 (en) 2015-02-09 2018-07-10 Microsoft Technology Licensing, Llc Wearable image display system
CN104965657B (en) * 2015-06-30 2019-03-08 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Touch control method and device
KR102399724B1 (en) 2015-09-24 2022-05-20 Samsung Electronics Co., Ltd. Display apparatus, door and refrigerator having the same
CN105183095B (en) * 2015-10-19 2019-03-15 BOE Technology Group Co., Ltd. Handheld terminal with transparent display
CN105224234B (en) * 2015-10-28 2019-06-07 Nubia Technology Co., Ltd. Text content selection method and mobile terminal
CN105302444A (en) * 2015-10-30 2016-02-03 Nubia Technology Co., Ltd. Picture processing method and apparatus
GB2547055B (en) * 2016-03-24 2021-06-16 Buckland Tracy Physicians eye
KR102583929B1 (en) * 2017-02-24 2023-10-04 Samsung Electronics Co., Ltd. Display apparatus and control method thereof
US11126258B2 (en) 2017-10-14 2021-09-21 Qualcomm Incorporated Managing and mapping multi-sided touch
CN112286434B (en) 2017-10-16 2021-12-10 Huawei Technologies Co., Ltd. Floating button display method and terminal device
KR20190054397A (en) * 2017-11-13 2019-05-22 Samsung Electronics Co., Ltd. Display apparatus and the control method thereof
CN109218527A (en) * 2018-08-31 2019-01-15 Nubia Technology Co., Ltd. Screen brightness control method, mobile terminal and computer readable storage medium
JP7197007B2 (en) * 2019-06-12 2022-12-27 Nippon Telegraph and Telephone Corporation Touch panel type information terminal device and its information input processing method
WO2023037476A1 (en) * 2021-09-09 2023-03-16 Nippon Telegraph and Telephone Corporation Display device, method for controlling display device, and program

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7800592B2 (en) * 2005-03-04 2010-09-21 Apple Inc. Hand held electronic device with multiple touch sensing devices
US20050166158A1 (en) * 2004-01-12 2005-07-28 International Business Machines Corporation Semi-transparency in size-constrained user interface
US20070291008A1 (en) * 2006-06-16 2007-12-20 Daniel Wigdor Inverted direct touch sensitive input devices
EP2129090B1 (en) * 2008-05-29 2016-06-15 LG Electronics Inc. Mobile terminal and display control method thereof

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007126801A2 (en) * 2006-03-30 2007-11-08 Cirque Corporation Circular scrolling touchpad functionality determined by starting position of pointing object on touchpad surface
US20070291014A1 (en) * 2006-06-16 2007-12-20 Layton Michael D Method of scrolling that is activated by touchdown in a predefined location on a touchpad that recognizes gestures for controlling scrolling functions
US20090077497A1 (en) * 2007-09-18 2009-03-19 Lg Electronics Inc. Mobile terminal including touch screen and method of controlling operation thereof
JP2009265768A (en) * 2008-04-22 2009-11-12 Autonetworks Technologies Ltd Operation device

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150009303A1 (en) * 2012-01-11 2015-01-08 Ultra-D Coöperatief U.A. Mobile display device
US9681121B2 (en) 2012-01-11 2017-06-13 Ultra-D Coöperatief U.A. Mobile display device
US10225547B2 (en) 2012-01-11 2019-03-05 Ultra-D Coöperatief U.A. Mobile display device
US10296127B2 (en) 2012-04-07 2019-05-21 Samsung Electronics Co., Ltd. Object control method performed in device including transparent display, the device, and computer readable recording medium thereof
US9239642B2 (en) 2013-04-15 2016-01-19 Samsung Electronics Co., Ltd. Imaging apparatus and method of controlling the same
CN103399702A (en) * 2013-07-05 2013-11-20 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Operation method and mobile terminal for sending or playing voice message
WO2016119264A1 (en) * 2015-01-30 2016-08-04 Huawei Technologies Co., Ltd. Terminal wallpaper control method and terminal

Also Published As

Publication number Publication date
US20110163986A1 (en) 2011-07-07
KR20110081040A (en) 2011-07-13
WO2011083975A3 (en) 2011-12-01
CN102696004A (en) 2012-09-26
EP2521960A2 (en) 2012-11-14

Similar Documents

Publication Publication Date Title
WO2011083975A2 (en) Mobile device and method for operating content displayed on transparent display panel
WO2012018212A2 (en) Touch-sensitive device and touch-based folder control method thereof
WO2011099720A2 (en) Mobile device with dual display units and method for providing a clipboard function using the dual display units
WO2011099806A2 (en) Method and apparatus for providing information of multiple applications
WO2011099712A2 (en) Mobile terminal having multiple display units and data handling method for the same
WO2010038985A2 (en) Function execution method and mobile terminal operating with the same
AU2011339167B2 (en) Method and system for displaying screens on the touch screen of a mobile device
WO2012039587A1 (en) Method and apparatus for editing home screen in touch device
WO2014119886A1 (en) Method and apparatus for multitasking
WO2015119378A1 (en) Apparatus and method of displaying windows
WO2013073890A1 (en) Apparatus including a touch screen under a multi-application environment and controlling method thereof
WO2012026753A2 (en) Mobile device and method for offering a graphic user interface
WO2011108797A1 (en) Mobile terminal and control method thereof
WO2010134710A2 (en) List search method and mobile terminal supporting the same
WO2013077537A1 (en) Flexible display apparatus and method of providing user interface by using the same
WO2011099713A2 (en) Screen control method and apparatus for mobile terminal having multiple touch screens
WO2013032234A1 (en) Method of providing of user interface in portable terminal and apparatus thereof
WO2011043601A2 (en) Method for providing gui using motion and display apparatus applying the same
WO2010134748A2 (en) Mobile device and method for executing particular function through touch event on communication related list
WO2012077906A1 (en) Method and apparatus for displaying lists
WO2013058539A1 (en) Method and apparatus for providing search function in touch-sensitive device
WO2012108620A2 (en) Operating method of terminal based on multiple inputs and portable terminal supporting the same
WO2015105271A1 (en) Apparatus and method of copying and pasting content in a computing device
WO2013105771A1 (en) Display apparatus and item selecting method using the same
WO2013125914A1 (en) Method and apparatus for object size adjustment on a screen

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application
Ref document number: 11731913
Country of ref document: EP
Kind code of ref document: A1

REEP Request for entry into the European phase
Ref document number: 2011731913
Country of ref document: EP

WWE WIPO information: entry into national phase
Ref document number: 2011731913
Country of ref document: EP

NENP Non-entry into the national phase
Ref country code: DE