US20180196581A9 - System and method for displaying information on transparent display device - Google Patents
- Publication number
- US20180196581A9 (application Ser. No. 14/031,483)
- Authority
- US
- United States
- Prior art keywords
- transparent display
- display device
- information
- screen
- external device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/038—Indexing scheme relating to G06F3/038
- G06F2203/0383—Remote input, i.e. interface arrangements in which the signals generated by a pointing device are transmitted to a PC at a remote location, e.g. to a PC in a LAN
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- Methods and apparatuses consistent with exemplary embodiments relate to displaying information, and more particularly, to a system and method for displaying information related to an external object or an external device on a transparent display device.
- Transparent display devices are considered next generation display devices.
- a transparent display device has a degree of transparency that enables a user to see an external object or an external device through the transparent display device.
- a transparent display device does not display information related to the external object or the external device.
- Exemplary embodiments provide a system, a method, and an apparatus for displaying information related to an external device seen through a screen of a transparent display device on the screen of the transparent display device, and a recording medium thereof.
- Exemplary embodiments also provide a system, a method, and an apparatus for displaying information related to an object displayed on a screen of an external device seen through a screen of a transparent display device on the screen of the transparent display device, and a recording medium thereof.
- Exemplary embodiments also provide a system, a method, and an apparatus for displaying information related to an external object seen through a screen of a transparent display device on the screen of the transparent display device, and a recording medium thereof.
- a method of displaying information on a transparent display device including: receiving a touch input on the transparent display device that selects an object displayed on an external device that is viewable through a screen of the transparent display device; requesting the external device for information related to the object; receiving the information related to the object from the external device; and displaying the received information on the screen of the transparent display device.
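The claimed request-and-display flow above can be sketched as follows. The class names, the `get_related_info` message, and the stored information are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical sketch of the claimed method: select an object by touch,
# request related information from the external device, receive it, and
# display it on the transparent display. All names here are assumptions.

class ExternalDevice:
    """Stub standing in for the external device's information service."""
    def __init__(self, objects):
        self._objects = objects  # object id -> related information

    def get_related_info(self, object_id):
        # Respond to a request from the transparent display device.
        return self._objects.get(object_id)

class TransparentDisplayDevice:
    def __init__(self, external_device):
        self.external = external_device
        self.screen = []  # lines of information shown on the screen

    def on_touch_select(self, object_id):
        # Request + receive the information related to the selected object.
        info = self.external.get_related_info(object_id)
        if info is not None:
            self.screen.append(info)  # display the received information
        return info

device = TransparentDisplayDevice(ExternalDevice({"photo_1": "Taken 2013-09-19"}))
device.on_touch_select("photo_1")
```

Selecting an object the external device does not know about simply yields no displayed information in this sketch.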
- the touch input may indicate a contour line of the object that is viewable through the screen, a tap-based touch indicating a location on the screen at which the object is viewable through the screen, or indicate a closed region on the screen at which the object is viewable through the screen.
- the information related to the object indicates at least one other object having a type that is the same as a type of the object, and a display location on a screen of the external device of the at least one other object differs from that of the object.
- the information related to the object indicates information that is not displayed on a screen of the external device.
- the displaying comprises displaying the received information at a display location on the screen of the transparent display device that corresponds to a display location of the object on a screen of the external device.
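The display-location correspondence described above can be illustrated with a simple linear mapping from the external device's screen coordinates into the region of the transparent display where that screen is seen. The patent does not fix a formula, so the scaling below is an assumption:

```python
def map_to_transparent_display(obj_xy, ext_screen_size, boundary_top_left, boundary_size):
    """Map an (x, y) location on the external device's screen to the
    corresponding location on the transparent display, given the boundary
    of the external device as traced by the first touch input.
    Linear scaling is an assumption made for illustration."""
    ox, oy = obj_xy
    ew, eh = ext_screen_size       # external device's screen resolution
    bx, by = boundary_top_left     # traced boundary on the transparent display
    bw, bh = boundary_size
    return (bx + ox * bw / ew, by + oy * bh / eh)
```

For example, an object at (200, 150) on a 400 x 300 external screen, seen through a boundary of size 200 x 150 whose upper-left corner is at (100, 50), maps to the center of that boundary.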
- the method may further include editing the information that is displayed on the screen of the transparent display device based on an interaction between the transparent display device and the external device.
- the method may further include displaying information related to the external device based on an augmented reality service on the screen of the transparent display device.
- the requesting and the receiving the information are performed based on one of a direct communication between devices, a communication via a server, and a communication via a repeater.
- a transparent display device including: a transparent display configured to receive a touch input that selects an object displayed on an external device that is viewable through the transparent display; a communication unit configured to communicate with an external device that is viewable through the transparent display; and a processor configured to request the external device for information related to the object based on the touch input, via the communication unit, receive information related to the object from the external device in response to the request, via the communication unit, and control the transparent display to display the received information.
- a method of displaying information on a transparent display device including: receiving a first touch input on a screen of the transparent display device indicating first position information of an external device that is viewable through the screen of the transparent display device and receiving a second touch input on the screen of the transparent display device indicating second position information of an object displayed on a screen of the external device viewable through the screen of the transparent display device; requesting the external device for information related to the object based on the first position information and the second position information; receiving information related to the object from the external device in response to the requesting; and displaying the received information on the screen of the transparent display device.
- the first position information indicates a contour line of the external device viewable through the screen of the transparent display device.
- the first touch input may be independent touch operations on a first point and a second point on the screen of the transparent display device that indicate a contour line of the external device that is viewable through the screen of the transparent display device.
- the first touch input may be a touch-and-drag operation for connecting a first point and a second point on the screen of the transparent display device that indicates a contour line of the external device that is viewable through the screen of the transparent display device.
- the first touch input may indicate a touch-based region adjusting operation for guide information displayed on the screen of the transparent display device, and a range related to the touch-based region adjusting operation for the guide information may be based on a contour line of the external device that is viewable through the screen of the transparent display device.
- the first touch input may be a touch operation for selecting screen information of the external device, wherein the screen information may be included in a selectable screen information menu item about the external device, which is displayed on the screen of the transparent display device, and the screen information may include at least one of screen size information and screen type information.
- the second position information may indicate a contour line of the object that is viewable through the screen on the transparent display device.
- the second touch input may be a tap-based touch indicating a location on the screen of the transparent display device at which the object is viewable through the screen of the transparent display device.
- the second touch input may indicate a closed region on the screen of the transparent display device through which the object is viewable on the screen of the transparent display device.
- the information related to the object may indicate at least one other object having a type that is the same as a type of the object, and a display location on the screen of the external device of the at least one other object may differ from that of the object.
- the method may further include editing the information that is displayed on the screen of the transparent display device based on an interaction between the transparent display device and the external device.
- a transparent display device including: a transparent display configured to receive a touch input indicating first position information of an external device that is viewable through the transparent display, and to receive a second touch input indicating second position information of an object displayed on a screen of the external device viewable through the transparent display; a communication unit configured to communicate with the external device; and a processor configured to request the external device for information related to the object based on the first position information and the second position information, via the communication unit, receive information related to the object from the external device in response to the request, via the communication unit, and display the received information on the transparent display.
- a method of displaying information on a screen of a transparent display device including: receiving from the transparent display device a request for information related to at least one object displayed on the screen of an external device that is viewable through a screen of the transparent display device; selecting the at least one object in response to the request; and transmitting the information related to the selected object to the transparent display device, wherein the request for information related to the object comprises first position information of the external device indicated by a first touch input on the transparent display device and second position information of the object displayed on the screen of the external device indicated by a second touch input on the transparent display device.
- a non-transitory computer-readable recording medium having embodied thereon a program for implementing the methods of displaying information on the transparent display device discussed above.
- FIG. 1A through FIG. 1C are block diagrams of an information display system according to an exemplary embodiment
- FIG. 2 is a flowchart illustrating a method of displaying information in a transparent display device, according to an exemplary embodiment
- FIGS. 3A through 3H are diagrams showing examples of a first touch input according to exemplary embodiments
- FIGS. 4A through 4E are diagrams showing other examples of a first touch input according to exemplary embodiments.
- FIGS. 5A through 5E are diagrams showing other examples of a first touch input according to exemplary embodiments.
- FIGS. 6A through 6C are diagrams showing a first touch input, a second touch input, and a screen displayed on a transparent display device according to the first and second touch inputs, according to exemplary embodiments;
- FIGS. 7A through 7D are diagrams showing screens for illustrating the first touch input, the second touch input, and editing processes according to the exemplary embodiments;
- FIGS. 8A through 8G are diagrams showing screens for illustrating the first touch input and the second touch input according to the exemplary embodiments
- FIGS. 9A through 9C are diagrams showing screens for illustrating the first touch input and the second touch input according to the exemplary embodiments in a case where a transparent display device and an external device have equal size;
- FIGS. 10A through 10D are diagrams showing examples of the first touch input and the second touch input according to the exemplary embodiments
- FIG. 11 is a flowchart illustrating a method of displaying information to be performed by a transparent display device, according to another exemplary embodiment
- FIG. 12 is a flowchart illustrating a method of displaying information to be performed by a transparent display device, according to another exemplary embodiment
- FIGS. 13A and 13B are side views of the transparent display device and the external device shown in FIG. 12 ;
- FIG. 14 is a flowchart illustrating a method of displaying information to be performed by a transparent display device, according to another exemplary embodiment
- FIG. 15 is a flowchart illustrating a method of displaying information to be performed by a transparent display device, according to another exemplary embodiment
- FIG. 16 is a functional block diagram of a transparent display device according to an exemplary embodiment
- FIG. 17 is a diagram showing an example of a transparent display unit shown in FIG. 16 ;
- FIG. 18 is a diagram illustrating a software layer stored in a storage unit of a transparent display device, according to an exemplary embodiment
- FIG. 19 is a functional block diagram of a transparent display device according to another exemplary embodiment.
- FIG. 20 is a flowchart illustrating a method of displaying information to be performed by an external device, according to an exemplary embodiment.
- FIG. 21 is a flowchart illustrating a method of displaying information to be performed by a transparent display device according to another exemplary embodiment.
- the term "and/or" includes any and all combinations of one or more of the associated listed items. Expressions such as "at least one of," when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list.
- An object denotes a component or information displayed on an external device of a transparent display device, or on a screen of the external device.
- an object may include an image, an image included in another image, an icon, a folder icon, an icon included in a folder icon, text, a pop-up window, an application execution window, a content included in an application execution window, a list, an item, a content, and a file included in a list; however, the present invention is not limited thereto. Examples of an object will be described in detail in various examples of screens that will be described later.
- the object may be referred to as an external object of the transparent display device.
- a touch input denotes input information of a user input through a touch-based gesture using a finger of the user or a touch tool.
- the touch tool may be referred to as an external input device, a stylus, or a stylus pen.
- the touch-based gesture may be variously defined.
- examples of the touch-based gesture may include touch-based motions on a touch screen, such as tap, touch-and-hold, double tap, drag, touch-and-drag, panning, flick, drag-and-drop, sweep, and swipe, but the touch-based gesture is not limited thereto.
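As a rough illustration of how a few of the touch-based gestures named above might be distinguished from raw touch samples, consider the classifier below; the duration and distance thresholds are assumptions, not values from the disclosure:

```python
def classify_gesture(events):
    """Very rough classifier for a few of the touch gestures named above.
    `events` is a list of (t_seconds, x, y) touch samples from press to
    release. The 10 px and 0.2/0.5 s thresholds are illustrative assumptions."""
    (t0, x0, y0) = events[0]
    (t1, x1, y1) = events[-1]
    duration = t1 - t0
    moved = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    if moved < 10:
        # Little movement: distinguish tap from touch-and-hold by duration.
        return "touch-and-hold" if duration > 0.5 else "tap"
    # Significant movement: a quick stroke is a flick, a slow one a drag.
    return "flick" if duration < 0.2 else "touch-and-drag"
```

A real implementation would also consider intermediate samples and multi-contact input (e.g., for sweep or drag-and-drop), which this sketch ignores.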
- the touch input may be replaced by a gesture based on an image captured by a camera, according to the input that the touch is intended to represent. For example, if the touch input is an input for selecting an object displayed on an external device, the touch input may be replaced by a gesture or motion determined from a moving direction or sign of the hand captured by the camera.
- the camera may be configured based on an image sensor or an optical sensor.
- the touch input may be replaced by a user voice signal based on natural language, according to the input that the touch is intended to represent. For example, if a touch input is an input for selecting an object including a certain letter or a name displayed on an external device, the touch input may be replaced by a user voice signal based on natural language representing the certain letter or the name of the object.
- FIG. 1A is a block diagram of an information display system according to an exemplary embodiment.
- the information display system includes a transparent display device 100 and an external device 110 .
- the information display system is not limited to the example shown in FIG. 1A . That is, the information display system may further include other components, in addition to the components shown in FIG. 1A .
- the information display system may further include a server 120 .
- the transparent display device 100 and the external device 110 may transmit and/or receive information via the server 120 , and the transparent display device 100 may receive information based on an augmented reality service about the external device 110 from the server 120 .
- the communication through the server 120 may use wired or wireless Internet, but is not limited thereto.
- the server 120 may include at least one of a cloud server, an information supply server, and a service server.
- the server 120 may manage and provide information based on the augmented reality service.
- the information display system may further include an access point 130 , as shown in FIG. 1C .
- the transparent display device 100 and the external device 110 may transmit and/or receive information via the access point 130 .
- the communication method via the access point 130 may be, for example, an infrastructure-mode wireless LAN (Wi-Fi) communication method, but is not limited thereto.
- the transparent display device 100 and the external device 110 may transmit and/or receive information through a device-to-device direct communication.
- the device-to-device direct communication method may use, for example, a local area wireless communication method such as ad-hoc mode wireless LAN communication (e.g., Wi-Fi Direct), Bluetooth communication, ultra wideband (UWB) communication, and ZigBee communication, but is not limited thereto.
- the transparent display device 100 and the external device 110 may be connected to each other via a wire.
- the transparent display device 100 and the external device 110 may be connected to each other via a universal serial bus (USB) or a universal asynchronous receiver/transmitter (UART) to transmit/receive data.
- the device-to-device direct communication method may be referred to as a machine-to-machine (M2M) communication method, a device-to-device (D2D) communication method, or a peer-to-peer (P2P) communication method.
- the communication between the transparent display device 100 and the external device 110 may be performed based on one of the direct communication between devices, the communication method via the access point 130 , and the communication method via the server 120 , according to elements of the information display system, but is not limited thereto.
- the transparent display device 100 and the external device 110 may transmit and/or receive at least one of size information thereof, owner information thereof, and information sharable with other devices, through a short distance communication method such as a near field communication (NFC).
- the size information of the device may be represented as, for example, (width × length × thickness) mm, but is not limited thereto.
- Screen information may include screen size information and screen type information, and is not limited thereto.
- the screen size information may be represented as, for example, A4, B5, 7 inches, or 5.5 inches, and is not limited thereto.
- the screen type information may represent whether the screen is a touch screen or a non-touch screen, and is not limited thereto.
- the screen type information may represent whether the screen is a liquid crystal display (LCD) panel or an active-matrix organic light-emitting diode (AMOLED) panel.
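Device information of the kinds listed above (size, screen size, screen type) could be exchanged over NFC in many forms; the key/value wire format below is purely an assumption made for illustration:

```python
def parse_device_info(payload):
    """Parse a device-info record of the kind the devices might exchange
    over a short distance channel such as NFC. The semicolon-separated
    key=value wire format is an assumption; the patent only says size,
    owner, and screen information may be shared."""
    info = {}
    for field in payload.split(";"):
        key, _, value = field.partition("=")
        info[key.strip()] = value.strip()
    if "size_mm" in info:
        # Convert "width x length x thickness" into a tuple of integers.
        w, l, t = (int(v) for v in info["size_mm"].split("x"))
        info["size_mm"] = (w, l, t)
    return info

info = parse_device_info("size_mm=123x59x8; screen=5.5 inches; screen_type=touch")
```

The receiving device could then use the parsed size and screen information, for example, to scale the traced contour of the sending device.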
- the transparent display device 100 may display the information about the external device 110 , which is transmitted from the external device 110 via a short distance communication method, such as the NFC, as information about the external device 110 based on the augmented reality service.
- the transparent display device 100 may display the information about the external device 110 on a display area adjacent to the external device 110 that is seen through the transparent display device 100 .
- the display area is a part of a screen of the transparent display device 100 .
- the external device 110 that is seen through the transparent display device 100 may be referred to as the external device 110 that is seen via the screen of the transparent display device 100 .
- the transparent display device 100 is a device having a transparent display.
- the transparent display device 100 may be a mobile phone having a transparent display, a smartphone having a transparent display, a notebook computer having a transparent display, a tablet PC having a transparent display, a handheld PC having a transparent display, an electronic book terminal having a transparent display, a digital broadcasting terminal having a transparent display, a personal digital assistant (PDA) having a transparent display, a portable multimedia player (PMP) having a transparent display, a navigation device having a transparent display, a smart TV having a transparent display, a consumer electronic (CE) device having a transparent display (for example, a refrigerator having a transparent display, an air conditioner having a transparent display, a dish washing machine having a transparent display, etc.), and an iOS-convertible device having a transparent display, but is not limited thereto.
- the transparent display may be applied to various fields such as high added-value glass, glass as a functional car element, car dashboard, navigators, security electronic devices, solar batteries, electronic devices for military, game consoles, toys, and show windows, as well as smart windows.
- the screen of the transparent display device 100 may be referred to as a screen on the transparent display.
- the transparent display device 100 may provide application execution function, communication function, media player function, web-browsing function, word-processing function, e-mail transmission function, messenger function, and/or data storage function, but is not limited thereto.
- the transparent display device 100 requests information related to at least one object that is displayed on the external device 110 and seen through the transparent display device 100 , based on a touch input.
- the transparent display device 100 displays the received information.
- the external device 110 is a device that is seen through the transparent display device 100 , through the screen of the transparent display device 100 , or through the transparent display of the transparent display device 100 .
- the external device 110 may be referred to as another device.
- the external device 110 may not include a transparent display.
- the external device 110 may be a mobile phone, a smartphone, a notebook computer, a tablet PC, a handheld PC, an electronic book terminal, a digital broadcasting terminal, a PDA, a PMP, a navigation device, a smart TV, a CE device (for example, a refrigerator, an air conditioner, a dishwashing machine having a display panel, etc.), and an iOS convertible device, but is not limited thereto. That is, the external device 110 may include a transparent display.
- the external device 110 may provide application execution function, communication function, media player function, web-browsing function, word-processing function, e-mail transmission function, messenger function, and/or data storage function, but is not limited thereto.
- the external device 110 selects the requested object and transmits information related to the requested object to the transparent display device 100 .
- FIG. 2 is a flowchart illustrating a method of displaying information to be performed by the transparent display device 100 , according to an exemplary embodiment.
- the transparent display device 100 receives a first touch input and a second touch input.
- the first touch input represents reference information of the external device 110 that is seen through the transparent display device 100 .
- the reference information is used to detect a display location of the object on the external device 110 , wherein the object is selected by the second touch input in the transparent display device 100 .
- the reference information may be referred to as first position information of the external device 110 .
- FIGS. 3A through 3H are diagrams showing examples of the first touch input.
- the transparent display device 100 has a size that is greater than that of the external device 110 , and the external device 110 is seen through the transparent display device 100 as shown in FIG. 3A .
- a result of sensing the first touch input may or may not be displayed on the transparent display device 100 .
- FIG. 3B shows an example in which the first touch input is drawn along a contour line of the external device 110 that is seen through the transparent display device 100 .
- the contour line of the external device 110 may be referred to as a boundary of the screen of the external device 110 .
- the first touch input may be referred to as a first input that identifies the boundary of the screen of the external device 110 .
- the first touch input shown in FIG. 3B is based on a drawing operation from a point S on the external device 110 that is seen through the transparent display device 100 along the contour line of the external device 110 to a point E.
- the point S denotes a start point of the touch operation, that is, a drawing operation along the contour line of the external device 110 .
- the point E denotes an end point of the touch operation along the contour line of the external device 110 .
- the point S and the point E may have the same display location (or xy coordinates). Alternatively, the point S and the point E may merely be adjacent to each other, so that a closed area may still be set according to the touch operation for drawing along the contour line of the external device 110 .
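The closed-area condition described above, that the end point E coincides with or is adjacent to the start point S, can be sketched as follows; the pixel tolerance is an assumption:

```python
def contour_is_closed(points, tolerance=15):
    """Return True if a drawn contour forms (approximately) a closed area:
    the end point E must coincide with, or lie within `tolerance` pixels
    of, the start point S. The default tolerance is an assumed value."""
    (sx, sy) = points[0]   # start point S of the drawing operation
    (ex, ey) = points[-1]  # end point E of the drawing operation
    return ((ex - sx) ** 2 + (ey - sy) ** 2) ** 0.5 <= tolerance
```

A contour whose end point drifts too far from its start point would not set a closed area under this check.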
- the point S is a left uppermost corner in the contour line of the external device 110 , but is not limited thereto. That is, the point S may be an arbitrary point on the contour line of the external device 110 .
- the point E is determined depending on the point S.
- the first touch input is based on independent touch operations at a first point and a second point on the transparent display device 100 .
- the first point and the second point are in a diagonal relationship on the contour line of the external device 110 that is seen through the transparent display device 100 .
- the first point is a left uppermost point P1 on the contour line of the external device 110
- the second point is a right lowermost point P2 on the contour line of the external device 110 .
- the transparent display device 100 may trace the contour line of the external device 110 that is seen through the transparent display device 100 based on information about xy coordinates of the point P1 and the point P2 on the transparent display device 100 .
- the (x, y) coordinate information of the right uppermost point and the left lowermost point of the contour line, which are not touched, is derived from the (x, y) coordinate information of the point P1 and the point P2, and the four points are connected to one another to trace the contour line of the external device 110 .
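Deriving the two untouched corners from the touched diagonal points P1 and P2 can be sketched as follows (screen coordinates with y growing downward are assumed):

```python
def rect_from_diagonal(p1, p2):
    """Given two diagonally opposite touch points on the contour line of
    the external device (e.g. P1 upper-left and P2 lower-right, or
    equally P3 lower-left and P4 upper-right), derive all four corners.
    Returns (upper-left, upper-right, lower-right, lower-left)."""
    (x1, y1), (x2, y2) = p1, p2
    left, right = min(x1, x2), max(x1, x2)
    top, bottom = min(y1, y2), max(y1, y2)  # y grows downward on screens
    return ((left, top), (right, top), (right, bottom), (left, bottom))
```

Because only minima and maxima are taken, the same contour is traced regardless of which of the two diagonal corners is touched first, matching the interchangeable P1/P2 and P3/P4 cases.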
- the first point is a left lowermost point P3 on the contour line of the external device 110 and the second point is a right uppermost point P4 on the contour line of the external device 110 .
- the transparent display device 100 may trace the contour line of the external device 110 that is seen through the transparent display device 100 based on the xy coordinate information of the point P3 and the point P4 on the transparent display device 100 .
- the tracing of the contour line may be performed in the same way as described with reference to FIG. 3C .
- FIGS. 3E through 3H show examples where the first touch input is based on a touch-and-drag operation connecting the first point and the second point to each other on the transparent display device 100 .
- the first point and the second point are in the diagonal relationship with each other based on the contour line of the external device 110 that is seen through the transparent display device 100 .
- the first point may denote a start point S and the second point may denote an end point E.
- the point S is a left uppermost point on the contour line of the external device 110 and the point E is a right lowermost point on the contour line of the external device 110 . Since the touch-and-drag operation is performed toward the point E after touching the point S, the transparent display device 100 may trace the contour line of the external device 110 .
- that is, as shown in FIG. 3E , the transparent display device 100 may display an arrow or a block setting based on the current touch location to show the variation of the touched location according to the dragging.
- the transparent display device 100 may end the arrow or the block setting display, and may display the contour line of the external device 110 . Otherwise, the arrow or the block display status may be maintained.
- FIG. 3F shows a case where the start point S of the touch-and-drag operation is the right uppermost point on the contour line of the external device 110 and the end point E is the left lowermost point on the contour line of the external device 110 .
- FIG. 3G shows a case where the start point S of the touch-and-drag operation is the left lowermost point on the contour line of the external device 110 and the end point E is the right uppermost point on the contour line of the external device 110 .
- FIG. 3H shows a case where the start point S of the touch-and-drag operation is the right lowermost point on the contour line of the external device 110 and the end point E is the left uppermost point on the contour line of the external device 110 .
- FIGS. 4A through 4E are diagrams showing other examples of the first touch input.
- the transparent display device 100 is larger than the external device 110 , and as shown in FIG. 4A , the external device 110 is seen through the transparent display device 100 .
- a result of sensing the first touch input may or may not be displayed on the transparent display device 100 .
- the first touch input is based on a touch-based operation for adjusting a region with respect to guide information displayed on the transparent display device 100
- the adjustable range of the guide information based on the touch operation is based on the contour line of the external device 110 that is seen through the transparent display device 100 .
- the guide information may be, for example, camera focusing range information.
- the guide information may be displayed according to a request of a user of the transparent display device 100 .
- the request of the user may include a request for displaying guide information for executing the information display method according to the exemplary embodiment, or a request for executing the information display method.
- the transparent display device 100 displays guide information G1 as shown in FIG. 4B .
- the guide information G1 may be displayed on the transparent display device 100 according to a command of a user of the transparent display device 100 .
- the guide information is displayed on the transparent display device 100 .
- the transparent display device 100 may trace the contour line of the external device 110 according to adjusted (x, y) coordinate values of the four points P5, P6, P7, and P8 of the guide information G1.
- the tracing of the contour line may be performed by connecting the changed (x, y) coordinate values of the points P5, P6, P7, and P8, but is not limited thereto.
- the changed (x, y) coordinate value of each point may be obtained by adding a variation amount according to the dragging operation to the original (x, y) coordinate value, but is not limited thereto.
- the original (x, y) coordinate values of the points may be updated to the (x, y) coordinate values of the second touched points.
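The coordinate update described above can be sketched as follows; a minimal illustration (the function name is hypothetical), showing that adding the drag variation to the original coordinate is equivalent to updating the point to the second touched location when the drag starts on the point itself:

```python
def adjust_guide_point(original, drag_start, drag_end):
    """Update one corner of the guide information by adding the
    variation amount of the dragging operation (drag_end - drag_start)
    to the point's original (x, y) coordinate value."""
    dx = drag_end[0] - drag_start[0]
    dy = drag_end[1] - drag_start[1]
    return (original[0] + dx, original[1] + dy)
```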
- FIGS. 4D and 4E are diagrams showing examples of the region adjusting operation of the guide information G1.
- the user touches the left uppermost point P5 of the guide information G1 and then moves the touched point to the left uppermost point of the contour line of the external device 110 .
- the left uppermost point in the contour line of the external device 110 is a corner of the external device 110 , which corresponds to the point P5 of the guide information G1.
- when the touched point is dragged to the right lowermost point in the contour line of the external device 110 , the points P6, P7, and P8 of the guide information G1 are moved while the left uppermost point P5 remains fixed, and accordingly, the display state of the guide information G1 is changed as shown in FIG. 4C .
- the touched point is moved to a corresponding corner of the external device 110 , and then, the diagonal point of the guide information G1 is touched and dragged to the corresponding corner of the external device 110 so as to change a display location of the guide information G1 or adjust displayed size of the guide information G1.
- the one point and the diagonal point in the guide information G1 are not limited to the examples shown in FIGS. 4D and 4E .
- the point P6 of the guide information G1 is touched and dragged to the corresponding corner in the contour line of the external device 110 , and then, the point P7 that is in a diagonal relation with the point P6 is touched and dragged so that the other points P5, P7, and P8 of the guide information G1 may be moved to the corresponding corners in the contour line of the external device 110 .
- FIGS. 5A through 5E are diagrams showing examples of the first touch input in a case where the transparent display device 100 is smaller than the external device 110 . That is, as shown in FIG. 5A , when the transparent display device 100 is smaller than the external device 110 , the first touch input may be based on a touch operation of drawing along the contour line of the external device 110 that overlaps the transparent display device 100 .
- a direction of the touch operation, that is, the drawing direction along the contour line, may not be limited to one direction.
- the transparent display device 100 may be smaller than the external device 110 . Accordingly, when an object to be selected is displayed at a location adjacent to a center on a screen of the external device 110 so that the first touch input shown in FIGS. 5B through 5E is not applied, the transparent display device 100 may reduce a size of the external device 110 by using a zoom-out function of a camera to receive a first touch input and a second touch input. Here, the transparent display device 100 may detect a screen size of the external device 110 according to a zoom-out magnification.
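The screen-size detection according to the zoom-out magnification mentioned above can be sketched as follows; this is an assumed reading (function name and parameters are illustrative): zooming out by a magnification m shrinks the apparent size by 1/m, so the actual size is recovered by undoing the zoom:

```python
def estimate_screen_size(apparent_size, zoom_out_magnification):
    """Estimate the external device's screen size from its apparent
    (zoomed-out) size on the transparent display by multiplying the
    apparent width and height by the zoom-out magnification."""
    w, h = apparent_size
    return (w * zoom_out_magnification, h * zoom_out_magnification)
```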
- the first touch input may be received when a touch operation of touching a start point S and dragging to an end point E along the contour line of the external device 110 that is seen through the transparent display device 100 is performed.
- the first touch input may be based on a touch operation for selecting screen information of the external device 110 , which is included in a menu 910 shown in FIG. 9B and will be described later.
- the screen information may include at least one of screen size information of the external device 110 and screen type information of the external device 110 as described above.
- the screen size information may represent, for example, whether the screen size of the transparent display device 100 is equal to a screen size of the external device, or certain size information such as A4, B5, 7 inches, 4 inches, etc. as shown in FIG. 9B , but is not limited thereto.
- the first touch input may be based on a touch operation for selecting corresponding screen size from among the pieces of the screen size information.
- the transparent display device 100 may change the (x, y) coordinate information on the transparent display device 100 according to the first touch input and the (x, y) coordinate information on the transparent display device 100 according to the second touch input into information according to the screen size of the external device 110 .
- the transparent display device 100 may change the coordinate information of the first touch input on the transparent display device 100 and the coordinate information of the second touch input on the transparent display device into coordinate information on the screen size of 7 inches, by using a function of converting the coordinate information of the screen size of 4 inches into coordinate information of the screen size of 7 inches.
- the transparent display device 100 may use relational information between the (x, y) coordinate information on the transparent display device 100 according to the first touch input and the (x, y) coordinate information on the transparent display device 100 according to the second touch input (for example, difference information between the coordinate information).
- the transparent display device 100 may change the coordinate information of the first touch input on the transparent display device 100 and the coordinate information of the second touch input on the transparent display device 100 into coordinate information on the screen size of 4 inches, by using a function of converting the coordinate information of the 10-inch screen size into the coordinate information of 4-inch screen size.
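The coordinate-converting function described above can be sketched as a simple proportional scaling between two screen sizes; this is one plausible realization, not the patent's definitive implementation, and the function name is illustrative:

```python
def convert_point(point, src_size, dst_size):
    """Convert an (x, y) touch coordinate from a source screen size to a
    destination screen size (widths and heights in the same unit) by
    scaling each axis proportionally."""
    sx = dst_size[0] / src_size[0]
    sy = dst_size[1] / src_size[1]
    return (point[0] * sx, point[1] * sy)
```

For example, the same function converts coordinates either way between a 4-inch-class and a 7-inch-class screen by swapping the source and destination sizes.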
- the above described function of converting the coordinate information according to the screen size may be included in the external device 110 .
- the transparent display device 100 may transmit the (x, y) coordinate information on the transparent display device 100 according to the first touch input, the (x, y) coordinate information on the transparent display device 100 according to the second touch input, and the screen size information of the transparent display device 100 to the external device 110 .
- the screen type information may include information representing whether the screen type of the external device 110 is a touch type or a non-touch type. If the screen of the external device 110 is the touch type screen, the external device 110 may recognize a region where the transparent display device 100 and the external device 110 overlap each other . Accordingly, the first touch input may not include information relating to the contour line of the external device 110 , but may only include the information representing that the screen of the external device 110 is the touch type screen.
- the second touch input is an input for selecting at least one object displayed on the external device 110 .
- the object that is displayed on the external device 110 is seen through the transparent display device 100 .
- the input for selecting the at least one object displayed on the external device 110 may be referred to as an input for selecting at least one position of the screen of the external device 110 .
- the second touch input may be based on at least one of a touch operation of touching an arbitrary point on a contour line of an object that is seen through the transparent display device 100 and dragging the touched location along the contour line of the object, and a touch operation of writing along the object (for example, text) that is seen through the transparent display device 100 .
- the second touch input may be referred to as a touch input on the screen of the transparent display device 100 indicating position information (or second position information) of the object displayed on the screen of the external device 110 .
- the second touch input may be referred to as an input that selects a position of the screen of the external device 110 viewable through the transparent display device 100 .
- the object is displayed on the screen of the external device 110 at the position.
- the position comprises one of a coordinate position of the screen of the external device 110 viewable through the transparent display device 100 and an area of the screen of the external device 110 viewable through the transparent display device 100 .
- FIGS. 6A through 6C are diagrams showing examples of the second touch input according to the exemplary embodiment.
- FIG. 6A shows the second touch input based on a touch operation of drawing along a contour line of an object and the first touch input based on a touch operation of drawing along the contour line of the external device 110 .
- the first touch input is received according to the touch operation of touching the point S and drawing a line to the point E along the contour line of the external device 110
- the second touch input is received according to the touch operation of drawing a line along a contour line of an icon.
- the touch operation of drawing a line along the contour line of the icon is performed by touching a point S1 and drawing a line to a point E1 along the contour line of the icon that is an object, and accordingly, the second touch input is received.
- the start point and the end point of the touch operation for drawing along the contour line of the icon are not limited to the examples shown in FIG. 6A .
- the start point is an arbitrary point in the contour line of the icon, and the end point is determined according to the start point as described above.
- the object displayed on the external device 110 is an icon, but the object displayed on the external device 110 may be another type of object, as discussed below.
- the touch operation between the start point and the end point of the touch operation for drawing along the contour line of the object may be performed continuously or discontinuously. If the touch operation is performed discontinuously, the end point of the touch operation for drawing along the contour line of the object may be changed.
- the touch operation for drawing along the contour line of the object in FIG. 6A starts from the start point S1 and stops at a left lowermost point of the object, and then, the touch operation starts again from the start point S1 or the end point E1 to the left lowermost point of the object.
- the end point is the left lowermost point of the object
- the end point E1 may be a connection point for connecting the contour line according to the touch operation.
- the point where the touch operation stops is not limited to the above example. That is, the touch operation may be stopped at an arbitrary point on the contour line of the object, or at a plurality of points on the contour line of the object.
- FIG. 6B shows an example where the second touch input is received based on a writing touch operation along the object (text). That is, FIG. 6B shows a second touch input based on the touch operation for writing an alphabet character P.
- the icon displayed on the external device 110 in FIG. 6A constituted the object
- the alphabet character P is the object displayed on the external device 110 in FIG. 6B .
- the second touch input based on the object writing touch operation may be performed by touching an arbitrary point in the text, and then, writing along the text. For example, after touching a point 601 , a writing touch operation along the object (text) may be performed in a direction denoted by an arrow of FIG. 6B .
- the start point of the object writing touch operation is not limited to the example shown in FIG. 6B , that is, an arbitrary point of the object may be the start point.
- the object writing touch operation may be performed continuously or discontinuously.
- at least one connection point as described above may be included between the start point and the end point.
- FIGS. 7A and 7B are diagrams showing examples of a screen for describing the first touch input and the second touch input, in a case where the transparent display device 100 has a screen that is larger than that of the external device 110 . That is, as shown in FIG. 7A , when the external device 110 is seen through the transparent display device 100 , the first touch input is based on the touch operation for drawing along the contour line of the external device 110 and the second touch input is based on the touch operation for drawing along the contour line of the object displayed on the external device 110 .
- FIGS. 8A through 8G are diagrams showing other examples for illustrating the first touch input and the second touch input in a case where the transparent display device 100 is smaller than the external device 110 .
- FIGS. 8A through 8G show examples in which pieces of an object that is displayed on the external device 110 are arranged by adjusting the overlapping locations of the transparent display device 100 and the external device 110 . Therefore, in the examples shown in FIGS. 8A through 8G , the transparent display device 100 displays information of sensing the second touch input on the transparent display device 100 .
- the transparent display device 100 overlaps the external device 110 as shown in FIG. 8B .
- the first touch input is based on a touch operation for drawing along the contour line of the external device 110 ( 801 )
- the second touch input is based on a touch operation for drawing along a contour line of the object ( 802 ).
- information of sensing the second touch input ( 802 ) is displayed on the transparent display device 100 .
- the transparent display device 100 detects relational information between (x, y) coordinate information on the transparent display device 100 according to the first touch input and (x, y) coordinate information on the transparent display device 100 according to the second touch input in FIG. 8B , and stores the detected information.
- the relational information detected by the transparent display device 100 may include a difference between the (x, y) coordinate information on the transparent display device 100 according to the first touch input and the (x, y) coordinate information on the transparent display device 100 according to the second touch input.
- the external device 110 may recognize the object selected by the transparent display device 100 in FIG. 8B according to the (x, y) coordinate information according to the first touch input, the (x, y) coordinate information according to the second touch input, and the above relational information.
- the transparent display device 100 may obtain relational information from coordinates (x(1)-x(i), y(1)-y(i)) to (x(1+m)-x(i+j), y(1+m)-y(i+j)).
- m, i, and j are natural numbers that are equal to or greater than 2.
- the transparent display device 100 may detect the above relational information by sampling the coordinate information obtained by the first touch input and the coordinate information obtained by the second touch input.
- a target to be sampled may be determined according to the display location thereof.
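The relational information and sampling described above can be sketched as follows; this is an assumed reading (function names are illustrative) in which the relational information is expressed as difference information between the sampled second-touch coordinates and a reference coordinate of the first touch input:

```python
def sample(points, step):
    """Keep every step-th (x, y) coordinate to reduce the amount of
    touch data that must be stored and compared."""
    return points[::step]

def relational_info(first_touch_pts, second_touch_pts):
    """Express each second-touch (object) coordinate as a difference
    from the first contour coordinate of the first touch input, so the
    external device can locate the selected object relative to its own
    contour."""
    ox, oy = first_touch_pts[0]
    return [(x - ox, y - oy) for (x, y) in second_touch_pts]
```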
- the first touch input is based on a touch operation for drawing along the contour line of the external device 110 ( 803 ) and the second touch input is based on a touch operation for drawing along the contour line of the object ( 804 ).
- information of sensing the second touch input ( 804 ) is displayed on the transparent display device 100 .
- the image of the object displayed on the transparent display device 100 may include the information of sensing the second touch input ( 802 ) in FIG. 8B , as shown in FIG. 8C .
- the transparent display device 100 detects the coordinate information according to the first touch input and the second touch input and the relational information between the coordinate information in FIG. 8C and stores the detected information as described with reference to FIG. 8B .
- the transparent display device 100 stores the coordinate information and the relational information detected in the process of FIG. 8B separately from the coordinate information and the relational information detected in the process of FIG. 8C , so that the information detected in each process may be distinguished.
- the first touch input is based on a touch operation for drawing along the contour line of the external device 110 ( 805 ) and the second touch input is based on a touch operation for drawing along the contour line of the object ( 806 ).
- the second touch input may further include a touch operation for filling inside the contour line of the object.
- Information of sensing the second touch input ( 806 ) is displayed on the transparent display device 100 .
- the image of the object displayed on the transparent display device 100 may include an image including the information of sensing the second touch input in the processes shown in FIGS. 8B and 8C .
- the transparent display device 100 detects and stores the coordinate information on the transparent display device 100 according to the first touch input and the second touch input in the process shown in FIG. 8D and the relational information between the coordinate information, as described with reference to FIG. 8B .
- the transparent display device 100 stores the detected coordinate information and the relational information to be distinguished from the coordinate information and the relational information obtained in the processes shown in FIGS. 8B and 8C .
- the first touch input is based on a touch operation for drawing along the contour line of the external device 110 ( 807 )
- the second touch input is based on a touch operation for drawing along the contour line of the object ( 808 ).
- the second touch input may further include a touch operation for filling inside the contour line of the object.
- Information of sensing the second touch input ( 808 ) is displayed on the transparent display device 100 . Accordingly, the image of the object displayed on the transparent display device 100 may be the image including all the information of sensing the second touch input in processes shown in FIGS. 8B, 8C, and 8D .
- the transparent display device 100 obtains coordinate information on the transparent display device 100 according to the first touch input ( 807 ) and the second touch input ( 808 ) in FIG. 8E and the relational information between the coordinate information, and stores the detected information.
- the transparent display device 100 stores the detected coordinate information and the relational information obtained in the process of FIG. 8E to be distinguished from the coordinate information and the relational information obtained in the processes shown in FIGS. 8B through 8D .
- the first touch input is based on a touch operation for drawing along the contour line of the external device 110 ( 809 ) and the second touch input is based on a touch operation for writing along a text "RABBIT" ( 810 ).
- information of sensing the second touch input ( 810 ) is displayed on the transparent display device 100 .
- the image of the object displayed on the transparent display device 100 is an image including all the information of sensing the second touch inputs in the processes shown in FIGS. 8B through 8E .
- the transparent display device 100 obtains coordinate information on the transparent display device 100 according to the first touch input and the second touch input in FIG. 8F and the relational information between the coordinate information, and stores the detected information as shown in FIGS. 8B through 8E .
- the information of sensing the second touch inputs is displayed on the transparent display device 100 as shown in FIG. 8G .
- a display location for the object information to be transmitted from the external device 110 may be determined in advance.
- the processes shown in FIGS. 8B through 8F may be performed after changing a location of the transparent display device 100 or moving the transparent display device 100 to arrange pieces of the object displayed on the external device 110 . Therefore, the transparent display device 100 may clearly distinguish the first touch input and the second touch input from each other in each screen. For example, after receiving the first touch input and the second touch input in FIG. 8B , the transparent display device 100 changes its location or moves, and then, receives the first touch input and the second touch input according to the process of FIG. 8C to select the object displayed on the external device 110 as shown in FIG. 8C . Therefore, the first and second touch inputs in the process of FIG. 8B and the first and second touch inputs in the process of FIG. 8C may be distinguished from each other via sensing of the location variation or the moving of the transparent display device 100 .
- FIGS. 9A through 9C are diagrams showing examples of screens for describing the first and second touch inputs.
- the transparent display device 100 and the external device 110 have the same size.
- a first touch input operation is performed based on a menu 910 displayed on the transparent display device 100
- a second touch input is based on a touch operation for setting a closed region with respect to a sun, a tap-based touch operation with respect to a cloud, and a touch operation for drawing a contour line of a flower.
- the closed region shown in FIG. 9B is not limited thereto.
- the closed region may be set as various types of closed loops in the transparent display device 100 .
- FIGS. 10A through 10D are diagrams showing examples of the screens for describing a first touch input and a second touch input based on an augmented reality service.
- FIG. 10A shows a case where information about the external device 110 based on the augmented reality service is displayed adjacent to the external device 110 that is seen through the transparent display device 100 .
- the information about the external device 110 based on the augmented reality service may be provided from the external device 110 , another external device, or a server, based on the physical locations of the transparent display device 100 and the external device 110 .
- the information about the external device 110 based on the augmented reality service may be provided using an access point.
- physical locations of the transparent display device 100 and the external device 110 may be estimated by using an indoor sensor capable of estimating a physical location of a device such as a geomagnetic sensor, an acceleration sensor, a gyro sensor, and an altitude sensor mounted in the device.
- the information about the external device 110 based on the augmented reality service may be provided from the above described other external device or the server according to the estimated physical locations.
- the transparent display device 100 receives or reads, from the external device 110 , information that is necessary for receiving information based on the augmented reality service about the external device 110 (for example, mark information for recognizing the external device 110 ) using short distance communication such as NFC, and then, collects and displays the information based on the augmented reality service about the external device 110 from the server or the above described other external device.
- the information about the external device 110 seen through the transparent display device 100 based on the augmented reality service may include a name of the device, a name of the owner, and contents of the external device, which may be shared with other devices, as shown in FIG. 10A , but is not limited thereto.
- the first touch input may be based on an operation of setting a touch-based closed region about the external device 110 as shown in FIG. 10B .
- the touch-based closed region is not limited to the example shown in FIG. 10B .
- information about a shared folder may be displayed on the transparent display device 100 as shown in FIG. 10C .
- the information about the shared folder may be information based on the augmented reality service, or information that is received from the external device 110 when the first touch input is transmitted to the external device 110 .
- the screen displayed on the external device 110 may not display the information about the shared folder.
- the transparent display device 100 may perform the second touch input operation by an operation of setting a touch-based closed region on a desired picture from among the available pictures shown in FIG. 10D .
- the transparent display device 100 requests the external device 110 for information about at least one selected object, based on the first and second touch inputs.
- a signal requesting the information about the object may include the coordinate information on the transparent display device 100 according to the first and second touch inputs and/or relational information between the coordinate information.
- the signal requesting the information related to the object may include coordinate information on the external device 110 according to the first and second touch inputs, wherein the coordinate information is converted by using the coordinate information converting function of the transparent display device 100 , and/or relational information between the coordinate information.
- the coordinate information on the external device 110 according to the second touch input may be coordinate information of the object that is displayed on the external device 110 .
- the signal requesting the information related to the object may further include a signal requesting relation information with the object.
- the signal requesting relation information with the object may include, for example, information for requesting a folder and objects included in the folder, when the object selected according to the second touch input is the folder.
- the objects included in the folder may be referred to as objects that are not displayed on the external device 110 .
- the signal requesting the information related to the object may include coordinate information on the transparent display device 100 according to the first and second touch inputs, and screen size information of the transparent display device 100 .
- the external device 110 may detect coordinate information on the external device 110 according to the first and second inputs based on the information transmitted from the transparent display device 100 and the screen information of the external device 110 .
- the coordinate information on the external device 110 may be detected by the processes described with reference to FIGS. 8B through 8F , but is not limited thereto.
- the signal requesting the information related to the object may include various pieces of information that may be estimated by the examples of the first and second touch inputs described with reference to FIGS. 3 through 10D .
- the transparent display device 100 receives information related to the selected object from the external device 110 , and in operation S 204 , the transparent display device 100 displays the received information related to the object on the transparent display device 100 .
- the information related to the object may include at least one other object having the same display type as that of the object selected by the second touch input.
- the other object has a different display location on the external device 110 from that of the selected object. That is, as shown in FIG. 6B , when the second touch input is received based on the touch operation for writing the text P, the transparent display device 100 may receive all of the text Ps that are displayed at different locations on the external device 110 from the external device 110 , and displays the received text.
- the display locations of the received information on the transparent display device 100 may similarly correspond to the display locations on the external device 110 . If there are a plurality of pieces of received information, the transparent display device 100 receives information about display coordinates on the external device 110 , detects information about display coordinates on the transparent display device 100 by using the screen size information of the transparent display device 100 and the display coordinate information transmitted from the external device 110 , and displays the plurality of objects by using the detected coordinate information.
- the coordinate information may be detected by the coordinate information converting operation that is described above.
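The coordinate-converting operation described above amounts to a proportional mapping between the two screen sizes. The sketch below is a minimal illustration; the function name and the tuple encodings are hypothetical and not part of the embodiment:

```python
def convert_coordinates(point, src_size, dst_size):
    """Map a display coordinate from one screen onto another by
    scaling proportionally to the two screen sizes."""
    x, y = point
    src_w, src_h = src_size
    dst_w, dst_h = dst_size
    return (x * dst_w / src_w, y * dst_h / src_h)

# A point at (100, 50) on a 400x300 external device maps to
# (200.0, 100.0) on an 800x600 transparent display.
print(convert_coordinates((100, 50), (400, 300), (800, 600)))
```

Either device could run this conversion, which is why the embodiment allows the detection to happen on either side.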
- alternatively, the external device 110 may detect information about the coordinates on the transparent display device 100 by using the screen size information of the transparent display device 100 and the display coordinates of the plurality of pieces of object information on the external device 110, and may transmit the detected coordinate information and the object information to the transparent display device 100. Then, the transparent display device 100 may display the objects based on the received coordinate information.
- the information about the selected object transmitted from the external device 110 may also include additional information relating to the selected object.
- the information relating to the object may include information that is not displayed on the external device 110 (for example, information about objects included in a folder) as described above.
- displaying the received information on the transparent display device 100 may include displaying the received information at similar locations as those of the external device 110 as shown in FIGS. 6C, 7C, and 9C .
- pictures Pic1, Pic5, and Pic6, which are the selected objects, are received.
- the transparent display device 100 may display the received pictures Pic1, Pic5, and Pic6 sequentially or on locations based on the screen shown in FIG. 10D .
- the received information may be stored in a clipboard in the transparent display device 100, or may be displayed on a clipboard after the clipboard is generated.
- FIG. 11 is a flowchart illustrating a method of displaying information in a transparent display device according to another exemplary embodiment.
- the method illustrated in FIG. 11 includes an editing function.
- the transparent display device 100 receives a first touch input and a second touch input.
- the first and second touch inputs are the same as those described with reference to FIGS. 2 through 10 .
- the transparent display device 100 requests the external device 110 for information related to an object based on the first and second touch inputs.
- the request for the information related to the object is the same as that described in operation S 201 of FIG. 2 .
- the transparent display device 100 receives information corresponding to the request from the external device 110 .
- the information related to the object that is received is the same as that described in operation S 203 .
- the transparent display device 100 displays the received information on the transparent display device 100.
- the transparent display device 100 edits the received information that is displayed on the transparent display device 100 according to a user input.
- the transparent display device 100 displays a screen on which the objects are combined as shown in FIG. 7D .
- the user inputs 701 and 702 may be received via various touch-based operations.
- the user inputs 701 and 702 may be performed as various touch-based operations, for example: long-touching the object to be moved and dragging it to a desired location on the object to be combined; long-touching the object to be moved and then long-touching the desired location on the object to be combined; setting a touch-based closed region on the object to be moved and long-touching the desired location on the object to be combined; or setting a touch-based closed region on the object to be moved, setting another touch-based closed region on the desired location on the object to be combined, and connecting the two closed regions.
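One way to picture the gesture handling above is as pattern matching on the sequence of touch operations. The operation names and the encoding below are hypothetical, offered only as a sketch of the dispatch logic:

```python
# Each combination gesture is encoded as the sequence of touch
# operations it comprises; the names here are illustrative only.
COMBINE_GESTURES = {
    ("long_touch", "drag_to_target"),
    ("long_touch", "long_touch_target"),
    ("closed_region", "long_touch_target"),
    ("closed_region", "closed_region_target", "connect"),
}

def is_combine_gesture(operations):
    """Return True if the observed touch operations match one of the
    object-combination gestures."""
    return tuple(operations) in COMBINE_GESTURES

print(is_combine_gesture(["long_touch", "drag_to_target"]))  # True
print(is_combine_gesture(["drag_to_target"]))                # False
```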
- the editing operation in operation S 1105 is not limited to the combination of the objects as shown in FIGS. 7C and 7D .
- the editing may include various edits on the object, such as change in the shape of the object or change in the content of the object, and an edit on the screen including the object.
- the transparent display device 100 may perform the above editing operation based on an interaction with the external device 110 . Accordingly, the information displayed on the external device 110 may reflect the editing result in the transparent display device 100 in real-time.
- the editing result may be stored in the external device 110 only, in the transparent display device 100 only, or in both the devices 100 and 110 , according to the user input in the transparent display device 100 .
- FIG. 12 is a flowchart illustrating a method of displaying information in a transparent display device according to another exemplary embodiment.
- the transparent display device 100 is flexible, and a front portion and a rear portion of the transparent display device 100 may be transformed or deformed according to a touch-based input.
- the external device 110 has a touch screen.
- a touch input for selecting an object displayed on the external device 110 that is seen through the transparent display device 100 is received.
- the received touch input may correspond to the second touch input described with reference to FIGS. 2 through 10 .
- FIGS. 13A and 13B are side views showing a relation between the transparent display device 100 that is flexible and has the front and rear surface portions 1301 and 1302 that are deformed together according to the touch-based input, and the external device 110 .
- FIG. 13A is a side view showing the transparent display device 100 and the external device 110 overlapping each other before the touch input is received.
- FIG. 13B shows a case where the front and rear surface portions 1301 and 1302 of the transparent display device 100 are transformed together to touch a touch screen 1303 of the external device 110 according to the touch-based user input to the front surface portion 1301 of the transparent display device 100 .
- the rear surface portion 1302 of the transparent display device 100 may be configured as a constant voltage type so that the touch screen of the external device 110 may recognize a contact portion of the rear surface portion 1302 of the transparent display device 100 as a touch-based input; however, the present invention is not limited thereto. That is, the rear surface portion 1302 may be configured according to a touch sensing type of the touch screen 1303 in the external device 110 .
- the transparent display device 100 receives information related to the selected object from the external device 110 based on the touch input due to the contact between the rear surface portion 1302 of the transparent display device 100 and the external device 110 .
- the transparent display device 100 displays the received information.
- FIG. 14 is a flowchart illustrating a method of displaying information in a transparent display device 100 according to another exemplary embodiment.
- FIG. 14 shows a case where the information related to the object displayed on the external device 110 and the screen size information are transmitted based on a local area wireless communication between the transparent display device 100 and the external device 110 .
- the transparent display device 100 receives the information related to the object displayed on the external device 110 and the screen size information of the external device 110 via the local area wireless network.
- the local area wireless communication may include NFC, Bluetooth communication, Wi-Fi Direct communication, and infrared data association (IrDA) communication, but is not limited thereto.
- the transparent display device 100 checks whether the transparent display device 100 overlaps the external device 110 .
- the checking in operation S 1402 may include checking whether the user intends to display, on the transparent display device 100, the object displayed on the external device 110 that is seen through the transparent display device 100, according to a touch input to the transparent display device 100.
- the intention of the user may be interpreted as the intention to select an object to be displayed on the transparent display device 100 .
- the checking operation may be performed by disposing a contact sensor on the rear surface portion of the transparent display device 100 or transmitting a sensing result sensed by a contact sensor disposed on a front surface portion of the external device 110 to the transparent display device 100 via the local area wireless communication, but is not limited thereto.
- the transparent display device 100 and the external device 110 may overlap so that the external device 110 is included within the screen of the transparent display device 100 when the external device 110 is smaller, as shown in FIG. 3A. If the transparent display device 100 is smaller than the external device 110, as shown in the example of FIG. 4A, only a part of the external device 110 may overlap the transparent display device 100. When the transparent display device 100 and the external device 110 have equal sizes, as shown in FIG. 9A, the overlapping surfaces of the two devices may coincide; however, the overlapping is not limited thereto.
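The geometric side of these overlap cases can be sketched as an axis-aligned rectangle intersection, assuming both screens can be placed in a shared coordinate space (the function names and coordinates are illustrative, not from the embodiment):

```python
def overlap_area(a, b):
    """Return the overlapping area of two axis-aligned screens,
    each given as (x, y, width, height) in a shared coordinate space."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    w = min(ax + aw, bx + bw) - max(ax, bx)
    h = min(ay + ah, by + bh) - max(ay, by)
    return max(w, 0) * max(h, 0)

def fully_contains(outer, inner):
    """True when the inner screen lies entirely within the outer one,
    as in FIG. 3A where the external device is the smaller screen."""
    return overlap_area(outer, inner) == inner[2] * inner[3]

# An external device at (100, 100) sized 200x100 lies fully inside
# a transparent display at (0, 0) sized 800x600.
print(fully_contains((0, 0, 800, 600), (100, 100, 200, 100)))  # True
```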
- the transparent display device 100 displays the information related to the object displayed on the external device 110 by using the information transmitted via the local area wireless communication according to the user input in operation S 1403 .
- the user input in the operation S 1403 may include a request for displaying the object displayed on the external device 110 that is seen through the transparent display device 100 , but is not limited thereto.
- FIG. 15 is a flowchart illustrating a method of displaying information in a transparent display device according to another exemplary embodiment.
- FIG. 15 shows a case where information obtained by photographing the external device 110 using a camera function of the transparent display device 100 is displayed on the transparent display device 100 according to a user input.
- the transparent display device 100 photographs an object displayed on the external device 110 by using the camera function.
- the transparent display device 100 determines whether the transparent display device 100 and the external device 110 overlap each other. Determining whether the transparent display device 100 and the external device 110 overlap each other may be performed in the same manner as that of operation S 1402 described above.
- the transparent display device 100 displays the object displayed on the external device 110 that is photographed according to the user input.
- the user input may include a request for outputting the object displayed on the photographed external device 110 , but is not limited thereto.
- FIG. 16 is a functional block diagram of the transparent display device 100 according to an exemplary embodiment.
- the transparent display device 100 may include a transparent display 1610 , a storage 1620 , a communication interface 1630 , a processor 1640 , and a sensor 1650 .
- the transparent display device 100 may further include additional components other than those shown in FIG. 16 .
- the transparent display device 100 may include an interface, such as a universal serial bus (USB) or a camera module.
- the transparent display 1610 is configured so that the object displayed on a screen of the external device 110 may be seen through the transparent display 1610 and may be configured to receive a touch-based input.
- the transparent display 1610 may be formed in various types, for example, a transparent liquid crystal display (LCD) type, a transparent thin-film electroluminescent panel (TFEL) type, a transparent OLED type, or a projection type.
- the transparent LCD type is a transparent display device formed by removing a backlight unit from a currently used LCD device and using a pair of polarization plates, an optical film, a transparent thin film transistor (TFT), and a transparent electrode.
- the transparent display device may be referred to as a transparent display.
- in the transparent LCD type, transmittance is degraded by the polarization plates and the optical film, and optical efficiency is reduced because ambient light is used instead of a backlight unit; however, a large-size transparent display may be realized.
- the transparent TFEL type is a transparent display device using an alternating current (AC) type inorganic thin film EL display (AC-TFEL) including a transparent electrode, an inorganic phosphor, and an insulating film.
- the AC-TFEL emits light when accelerated electrons pass through the inorganic phosphor to excite the phosphor.
- the processor 1640 may adjust the location to which the electrons are projected, thereby determining the location at which information is displayed. Since the inorganic phosphor and the insulating film are transparent, a transparent display may be easily obtained.
- the transparent OLED type is a transparent display device using an OLED that emits light by itself. Since an organic emission layer is transparent, the OLED may serve as the transparent display device provided that both electrodes are realized as transparent electrodes. In the OLED, electrons and holes are injected from both sides of the organic emission layer to be combined in the organic emission layer and emit light. The transparent OLED device may display the information by injecting the electrons and holes to desired locations.
- FIG. 17 is a diagram showing a detailed structure of the transparent display 1610 that is formed as the transparent OLED type.
- the transparent display 1610 is not limited to the example shown in FIG. 17 .
- the transparent display 1610 includes a transparent substrate 1702 , a transparent transistor layer 1703 , a first transparent electrode 1704 , a transparent organic emission layer 1705 , a second transparent electrode 1706 , and a connection electrode 1707 .
- the transparent substrate 1702 may be formed of a polymer material that is transparent such as plastic, or a glass material.
- the material forming the transparent substrate 1702 may be determined according to environment in which the transparent display device 100 is used.
- the polymer material is light and flexible, and thus may be applied to a portable display device.
- the glass material may be applied to show windows or general windows.
- the transparent transistor layer 1703 is a layer including a transistor that is fabricated by replacing opaque silicon used in a conventional TFT with an organic material such as transparent zinc oxide or titanium oxide.
- a source, a gate, a drain, and various dielectric layers 1708 and 1709 are formed, and the connection electrode 1707 for electrically connecting the drain to the first transparent electrode 1704 may be formed.
- the transparent transistor layer 1703 includes a plurality of transparent transistors that are distributed throughout the entire display surface of the transparent display device 100 .
- the processor 1640 applies a control signal to the gate in each of the transistors in the transparent transistor layer 1703 to drive the corresponding transparent transistor and display information.
- the first transparent electrode 1704 and the second transparent electrode 1706 are disposed at opposite sides to each other while the transparent organic emission layer 1705 is interposed.
- the first transparent electrode 1704 , the transparent organic emission layer 1705 , and the second transparent electrode 1706 form an organic light-emitting diode (OLED).
- the transparent OLED may be classified into a passive matrix OLED (PMOLED) type and an active matrix OLED (AMOLED) type according to the driving method thereof.
- the PMOLED has a structure in which cross points between the first and second transparent electrodes 1704 and 1706 form pixels.
- in the AMOLED, a TFT is disposed to drive each of the pixels.
- Each of the first and second transparent electrodes 1704 and 1706 includes a plurality of line electrodes, arranged perpendicularly to each other. For example, if the line electrodes of the first transparent electrode 1704 are arranged in a transverse direction, the line electrodes of the second transparent electrode 1706 are arranged in a longitudinal direction. Accordingly, there are a plurality of crossing areas formed between the first and second transparent electrodes 1704 and 1706 . The transparent transistor is connected to each of the crossing areas.
- the processor 1640 generates a potential difference in each of the crossing areas by using the transparent transistor.
- the electrons and holes are induced to the transparent organic emission layer 1705 from the first and second electrodes 1704 and 1706 within the crossing area where the potential difference is generated, and then, combined with each other to emit light.
- a crossing area where no potential difference is generated does not emit light, and accordingly, the background behind the display is seen through as it is.
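The crossing-area behavior of the passive matrix can be summarized in a few lines: a crossing emits light only where a driven row line electrode meets a driven column line electrode, and every other crossing stays transparent. This is a toy model of the addressing, not the actual driving scheme of the embodiment:

```python
def lit_crossings(driven_rows, driven_cols):
    """In a passive matrix, a crossing area emits light only where a
    potential difference exists, i.e. where a driven row line electrode
    crosses a driven column line electrode; all other crossings stay
    transparent and show the background behind the display."""
    return {(r, c) for r in driven_rows for c in driven_cols}

# Driving rows {1, 2} and column {3} lights exactly the two crossings
# (1, 3) and (2, 3); the rest of the panel remains see-through.
print(sorted(lit_crossings({1, 2}, {3})))  # [(1, 3), (2, 3)]
```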
- the first and second transparent electrodes 1704 and 1706 may be formed of a transparent conductive material such as indium tin oxide (ITO) or graphene.
- Graphene is a transparent material having a honeycomb-shaped planar structure in which carbon atoms are connected to each other.
- the transparent organic emission layer 1705 may be formed of various materials.
- the transparent display 1610 may be formed as the projection type, as well as the transparent LCD type, the transparent TFEL type, and the transparent OLED type.
- the projection type is a method of displaying an image by projecting the image to a transparent screen such as a head-up display.
- the transparent display 1610 may be a dual-touchable touch screen, or may be a touch screen, a front surface of which is only touchable.
- the transparent display 1610 displays information including the object processed in the transparent display device 100 .
- the information may include information other than the object.
- information other than the object denotes information that is displayed but cannot be selected by a user input.
- the transparent display 1610 is formed as a transparent device, and a transparency of the transparent display 1610 may be adjusted by adjusting light transmittance of the transparent device or by adjusting RGB value of each pixel.
- the transparent display 1610 may have a structure in which an OLED and an LCD are combined.
- the OLED may be located adjacent to a front surface input portion
- the LCD may be located adjacent to a rear surface input portion.
- the transparent display 1610 maintains a transparent state, like glass, while powered off; when power is applied, the LCD blocks light so that the transparent display 1610 becomes opaque.
- the transparent display 1610 receives a touch input of the user through the front surface input portion.
- the screen displayed on the transparent display 1610 may include a user interface (UI) or a graphic user interface (GUI).
- the transparent display 1610 may receive the information related to the object from the external device 110 and display it, according to the touch input (the first and second touch inputs) of the user on the object displayed on the external device 110 that is seen through the transparent display 1610.
- the storage 1620 stores at least one program that is configured to execute the information display method in the transparent display 1610 .
- the storage 1620 may include a high-speed random access memory, or a non-volatile memory such as a magnetic disk storage device, a flash memory, or another non-volatile semiconductor memory.
- FIG. 18 is a diagram illustrating software layers stored in the storage 1620 of the transparent display device 100 according to an exemplary embodiment.
- the software layer may include a storage module 1810 , a sensor and recognition module 1820 , a communication module 1830 , an input/output module 1860 , and a legend module 1870 , but is not limited thereto.
- the storage module 1810 includes a system database 1811 that is a storage for storing general data such as address book and environmental information, and a touch mode data region 1812 for storing setting values for touch modes of the object that will be displayed on the transparent display 1610 .
- the sensor recognition module 1820 includes a module 1821 for sensing a touch on the transparent display 1610 , and a module 1822 for classifying the input touch.
- the module 1822 for classifying the input touch may classify the touch input into a front input mode 1823 for transferring an input on the front surface input interface to an event processor X11, a rear input mode 1824 for transferring an input on the rear surface input interface to the event processor X11, and a dual mode 1825 for transferring a dual-touch input (a simultaneous touch input on both the front surface input interface and the rear surface input interface) to the event processor X11.
- alternatively, the sensor recognition module 1820 may be configured with an input mode that transfers only inputs on the front surface of the transparent display 1610 to the event processor X11.
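The three input modes amount to routing a touch by which surface (or surfaces) report contact. A minimal sketch follows, with hypothetical mode names mirroring modules 1823 through 1825:

```python
def classify_touch(front_touched, rear_touched):
    """Route a touch to an input mode by which surfaces report contact:
    both surfaces -> dual mode, otherwise front or rear input mode."""
    if front_touched and rear_touched:
        return "dual_mode"          # cf. dual mode 1825
    if front_touched:
        return "front_input_mode"   # cf. front input mode 1823
    if rear_touched:
        return "rear_input_mode"    # cf. rear input mode 1824
    return None  # nothing to transfer to the event processor

print(classify_touch(True, True))   # dual_mode
print(classify_touch(False, True))  # rear_input_mode
```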
- the communication module 1830 may include a telephony module 1840 and a messaging module 1850 , but is not limited thereto.
- the telephony module 1840 includes an information collection module 1842 for connecting a phone call, and a voice service module 1841 for transmitting voice over the Internet based on voice over Internet protocol (VoIP).
- the messaging module 1850 includes an instant messaging module 1851 for conversation between users over an Internet connection, a module 1852 for short message service (SMS) text messages and multimedia messages, and a module 1853 for emailing.
- the input/output module 1860 includes a UI & graphic module 1861 , and a multimedia module 1865 .
- the UI & graphic module 1861 includes an X11 module 1862 for receiving a touch input by a window manager, a module 1863 that outputs all objects seen by a user on a screen, and an evaluation module 1864 regarding a mode setting value stored for each object and a current touch input.
- the multimedia module 1865 includes a moving picture reproducing module 1866 , a moving picture and still image capturing module 1867 , and a voice reproducing module 1868 .
- the programs for executing the information display method according to the exemplary embodiments may be stored in the storage module 1871 .
- the storage module 1871 may store various applications.
- the storage 1620 may store programs of various configurations, and is not limited to the example shown in FIG. 18 .
- the communication interface 1630 may communicate with at least one of the external device 110, the server 120, and the AP 130. To perform communication, the communication interface 1630 may be configured to transmit/receive data via a wireless communication network such as wireless Internet, wireless Intranet, a wireless phone network, a wireless local area network (LAN), a Wi-Fi network, a Wi-Fi Direct (WFD) network, a 3G network, a 4G Long Term Evolution (LTE) network, a Bluetooth network, an infrared data association (IrDA) network, a radio frequency identification (RFID) network, an ultra-wideband (UWB) network, a Zigbee network, or a near field communication (NFC) network; however, the present invention is not limited thereto.
- the communication interface 1630 may include a global positioning system (GPS) module.
- the processor 1640 may perform operations according to the above described exemplary embodiments by executing the programs stored in the storage 1620 .
- the processor 1640 receives a first touch input representing reference information with respect to the external device 110 that is seen through the transparent display 1610 , and a second touch input representing a selection on an object displayed on the external device 110 .
- the processor 1640 requests, via the communication interface 1630, the information related to the object from the external device 110 based on the received first and second touch inputs.
- the processor 1640 displays the received information on the transparent display 1610 .
- Operations of the processor 1640 regarding the information display method according to the exemplary embodiments may be performed as described with reference to the flowcharts in FIGS. 2, 11, 12, 14, and 15, and FIG. 21 that will be described later.
- the sensor 1650 senses a current status of the transparent display device 100 such as location of the transparent display device 100 , contact of the user on the transparent display device 100 , orientation of the transparent display device 100 , and acceleration or deceleration of the transparent display device 100 and generates a sensing signal for controlling operations of the transparent display device 100 .
- the sensor 1650 may generate a sensing signal regarding the location of the transparent display device 100 in order to receive information based on the augmented reality service described with reference to FIGS. 10A through 10D .
- FIG. 19 is a functional block diagram of the transparent display device 100 according to an exemplary embodiment.
- the transparent display device 100 may include a transparent display 1901 , a user input interface 1902 , a sensor 1903 , a camera 1904 , a storage 1905 , a communication interface 1906 , a port 1907 , an audio input interface 1908 , an audio signal processor 1909 , an audio output interface 1910 , a power supply 1911 , and a processor 1912 , but is not limited thereto. That is, the transparent display device 100 may include fewer components than those of FIG. 19 , or may include additional components other than those of FIG. 19 .
- the transparent display 1901 may be referred to as a touch screen.
- the transparent display 1901 may display objects, and may receive a touch-based user input.
- the transparent display 1901 may receive the touch-based user input via at least one of a front surface and a rear surface of the transparent display 1901 .
- the transparent display 1901 includes at least one touch sensor.
- the touch sensor may recognize the user input based on (x, y) coordinates.
- the touch sensor may include a sensor for recognizing a direct-touch, or a sensor for recognizing a proximity-touch.
- the user input may be generated according to a request of a user based on gestures of the user, or user's selection.
- the gesture of the user may be variously defined by combinations of the number of touches, touch patterns, touch area, and touch intensity.
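A gesture table keyed by such combinations of touch attributes might be sketched as follows; the `Touch` fields, thresholds, and gesture names are all hypothetical illustrations, not values from the embodiment:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Touch:
    count: int        # number of simultaneous touch points
    pattern: str      # e.g. "tap", "long_touch", "closed_region"
    area: float       # contact area, arbitrary units
    intensity: float  # touch pressure, arbitrary units

def recognize_gesture(touch):
    """Map a combination of touch attributes (count, pattern, area,
    intensity) to a gesture name; thresholds are purely illustrative."""
    if touch.count == 2 and touch.pattern == "tap":
        return "two_finger_tap"
    if touch.pattern == "long_touch" and touch.intensity > 0.5:
        return "firm_long_touch"
    if touch.pattern == "closed_region" and touch.area > 10.0:
        return "set_closed_region"
    return "unrecognized"

print(recognize_gesture(Touch(1, "long_touch", 2.0, 0.8)))  # firm_long_touch
```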
- the transparent display 1901 is formed as a transparent device, and a transparency of the transparent display 1901 may be adjusted by adjusting light transmittance of the transparent device or by adjusting RGB value of each pixel.
- the transparent display 1901 may have a structure in which an OLED and an LCD are combined.
- the OLED may be located adjacent to a front surface of the transparent display 1901
- the LCD may be located adjacent to a rear surface of the transparent display 1901 .
- the transparent display 1901 may display a screen respectively responding to a touch-based user input through at least one of the front and rear surfaces thereof, a user input based on the sensor 1903 , a user input via the camera 1904 , and a user input via the audio input interface 1908 .
- the screen displayed on the transparent display 1901 may include a UI or a GUI screen.
- the transparent display 1901 may have a physical structure like that of the transparent display 1610 described with reference to FIG. 16. Two or more transparent displays 1901 may be provided according to the type of the transparent display device 100.
- the user input interface 1902 generates input data (or control data) for controlling operations of the transparent display device 100 according to a user input.
- the user input interface 1902 may include a keypad, a dome switch, a touch pad that is used instead of a mouse, a jog wheel, a jog switch, and a hardware (H/W) button.
- the sensor 1903 senses a current status of the transparent display device 100 such as location of the transparent display device 100 , contact of the user on the transparent display device 100 , orientation of the transparent display device 100 , and acceleration or deceleration of the transparent display device 100 and generates a sensing signal for controlling operations of the transparent display device 100 .
- the sensor 1903 may include sensors other than the sensors for sensing the direct touch or the proximity touch described with regard to the transparent display 1901.
- the sensor 1903 may include a proximity sensor.
- the proximity sensor detects whether an object approaches a previously set detection surface, or whether an external object is present nearby, by using the force of an electromagnetic field or an infrared ray without an actual physical touch. Examples of the proximity sensor include a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, an infrared proximity sensor, etc.
- the camera 1904 processes an image frame such as a still image or a moving image obtained from an image sensor in a conference call mode or a photographing mode.
- the processed image frame may be displayed on the transparent display 1901 .
- the image frame processed by the camera 1904 may be stored in the storage 1905 or may be transmitted to another device through the communication interface 1906 or the port 1907 .
- the device receiving the transmitted image frame may include at least one of the external device 110 , the server 120 , and the AP 130 , but is not limited thereto.
- the camera 1904 may also be configured to receive the user input to the front and rear surfaces of the transparent display 1901 or to photograph the object.
- the number of cameras 1904 may be two or more according to a structure of the transparent display device 100 .
- the camera 1904 may be used as an input apparatus that recognizes a user's spatial gesture.
- the storage 1905 stores at least one program configured to be executed by the processor 1912 , which will be described later, and a resource.
- the at least one program includes a program that executes an information display method, an operating system (OS) program of the transparent display device 100 , applications set in the transparent display device 100 , and a program necessary for performing various functions (for example, communication function and display function) of the transparent display device 100 .
- the resource includes information necessary for executing the above-described programs, user interface screen information for performing the information display method mentioned in embodiments of the present invention, and the user input information recognized by the first and second touch inputs.
- the user input information recognized as the first and second touch inputs may be set based on the examples described with reference to FIGS. 3 through 10 , but is not limited thereto.
- the storage 1905 may be configured to independently include a storage that stores at least one program necessary for performing various functions of the transparent display device 100 and an operating system program, and a storage that stores one or more programs, resources, and various applications that execute the information display method.
- the storage 1905 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, an SD or XD memory), a read only memory (ROM), an electronically erasable programmable read-only memory (EEPROM), a programmable read only memory (PROM), a magnetic memory, and an optical disk.
- the communication interface 1906 may be configured to transmit data to and receive data from at least one of the external device 110 , the server 120 , and the AP 130 via a wireless communication network such as wireless Internet, wireless Intranet, a wireless phone network, a wireless local area network (LAN), a Wi-Fi network, a Wi-Fi direct (WFD) network, a 3G network, a 4G Long Term Evolution (LTE) network, a Bluetooth network, an infrared data association (IrDA) network, a radio frequency identification (RFID) network, an ultra wideband (UWB) network, a Zigbee network, or a near field communication (NFC) network, but is not limited thereto.
- the communication interface 1906 may include at least one of a broadcasting reception module, a mobile communication module, a wireless Internet module, a wired Internet module, a short distance communication module, and a location information module but is not limited thereto.
- the broadcasting reception module receives a broadcasting signal and/or broadcasting related information from an external broadcasting management server through a broadcasting channel.
- the broadcasting channel may include a satellite channel and a terrestrial channel.
- the mobile communication module transmits and receives a wireless signal to and from at least one of a base station, the external device 110 , and the server 120 over a mobile communication network.
- the wireless signal may include various types of data according to a voice call signal, a conference phone call, or transmission/reception of a text/multimedia message.
- the wireless Internet module is a module for a wireless Internet connection.
- the wired Internet module is a module for a wired Internet connection.
- the short distance communication module is a module for short distance communication.
- Short distance communication technologies may use Bluetooth, RFID, IrDA, UWB, Zigbee, WFD, NFC, etc.
- the information about the object displayed on the external device 110 and the screen size information may be received via the short distance communication module.
- the transparent display device 100 may read or receive the above information from the external device 110 .
- the location information module is a module for identifying or obtaining the location of the transparent display device 100 .
- a GPS module may be used.
- the GPS module receives location information from a plurality of satellites.
- the location information may include coordinate information represented by latitude and longitude.
- the port 1907 may transmit and receive data to and from the external device 110 by using a plug and play interface such as a USB port.
- the plug and play interface is a module that automatically detects an external device when it is plugged into the transparent display device 100 and enables its use (i.e., play).
- the device is not limited to the external device 110 .
- the audio input interface 1908 receives an input of an external sound signal in a call mode, a recording mode, or a voice recognition mode, etc.
- the audio input interface 1908 may be configured as, for example, a microphone.
- the audio input interface 1908 may be configured to include various noise removal algorithms for removing noise that occurs during the process of receiving the input of the external sound signal.
- the sound signal input by using the audio input interface 1908 may be a user's input representing a selection of the object displayed on the external device 110 that is seen through the transparent display 1901 , according to an exemplary embodiment.
- the sound signal input by using the audio input interface 1908 may be stored in the storage 1905 or may be transmitted to the outside through the communication interface 1906 or the port 1907 .
- the outside may include the external device 110 , other external devices (not shown), the server 120 , and the AP 130 .
- the audio signal processing unit 1909 provides an interface between the audio input interface 1908 and the processor 1912 and between the audio output interface 1910 and the processor 1912 . That is, the audio signal processing unit 1909 converts the sound signal received from the audio input interface 1908 into audio data that may be processed by the processor 1912 and transmits the audio data to the processor 1912 . The audio signal processing unit 1909 converts the audio data transmitted from the processor 1912 into an electrical sound signal and transmits the electrical sound signal to the audio output interface 1910 .
- the audio output interface 1910 outputs the sound signal or the audio signal received from the audio signal processing unit 1909 in the call mode or an audio reproduction mode.
- the audio output interface 1910 may be configured as a speaker.
- the audio input interface 1908 and the audio output interface 1910 may be integrally configured as a headset.
- the transparent display 1901 , the user input interface 1902 , the sensor 1903 , the camera 1904 , and the audio input interface 1908 may be referred to as input apparatuses or input/output apparatuses according to a function of a user interface between the transparent display device 100 and the user.
- the function of the user interface between the transparent display device 100 and the user includes a touch screen function, a sound recognition function, and a spatial gesture recognition function.
- the user input interface 1902 , the sensor 1903 , the camera 1904 , and the audio input interface 1908 may be referred to as the input apparatuses.
- the transparent display 1901 may be referred to as the input/output apparatus.
- the power supply 1911 supplies power to various elements of the transparent display device 100 .
- the power supply 1911 includes one or more power sources such as a battery and an alternating current (AC) power source.
- the transparent display device 100 may not include the power supply 1911 but may include a connection unit (not shown) that may be connected to an external power supply (not shown).
- the processor 1912 may be referred to as one or more processors that control a general operation of the transparent display device 100 . Although the processor 1912 is implemented as a single chip in FIG. 19 , the processor 1912 may be divided into a plurality of processors according to a function of the transparent display device 100 .
- the processor 1912 may generally control the transparent display 1901 , the user input interface 1902 , the sensor 1903 , the camera 1904 , the storage 1905 , the communication interface 1906 , the port 1907 , the audio input interface 1908 , the audio signal processing unit 1909 , and the audio output interface 1910 .
- the processor 1912 may be referred to as a controller, a microprocessor, a digital signal processor, etc.
- the processor 1912 may also process a user's input received through the transparent display 1901 , the user input interface 1902 , the sensor 1903 , the camera 1904 , and the audio input interface 1908 , which correspond to input apparatuses, and may provide a user interface based on the transparent display 1901 .
- the processor 1912 may execute at least one program related to the information display method according to the exemplary embodiments.
- the processor 1912 may execute the program by reading the program from the storage 1905 or downloading the program from an external apparatus such as an application providing server (not shown) or a market server (not shown) through the communication interface 1906 .
- the processor 1912 may be understood to include an interface function unit interfacing between various functional modules and the processor 1912 of the transparent display device 100 .
- the operation of the processor 1912 related to the information display method according to the exemplary embodiments may be performed as shown in flowcharts of FIGS. 2, 11, 12, 14, 15, and 21 that will be described later.
- FIG. 20 is a flowchart illustrating operations of the external device 110 according to an exemplary embodiment.
- the external device 110 receives a request for information related to at least one object displayed on the external device 110 from the transparent display device 100 .
- the request for information may be transmitted via at least one of the direct communication between the devices, the communication via a server, and the communication via a repeater.
- the request for information related to the object may be input based on the first touch input and the second touch input to the transparent display device 100 .
- the first touch input is a user input to the transparent display device 100 for representing the reference information about the external device 110 that is seen through the transparent display device 100 .
- the second touch input is a user input to the transparent display device 100 for selecting at least one object displayed on the external device 110 that is seen through the transparent display device 100 .
- the request for information related to the object may include the displayed location information (coordinate information) of the selected object on the external device 110 as described in the above exemplary embodiments, the screen size of the transparent display device 100 , and the coordinate information of the first and second touch inputs on the transparent display device 100 , but is not limited thereto.
- the request for information related to the object may be based on the touch input corresponding to the second touch input.
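The contents of such a request can be sketched as a simple data structure. This is an illustrative sketch only; the field and class names below are hypothetical and do not appear in the patent, which describes the request contents only in prose (object coordinates, the screen size of the transparent display device, and first/second touch-input coordinates).

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ObjectInfoRequest:
    # displayed location(s) of the selected object on the external device
    object_coords: List[Tuple[int, int]]
    # screen size of the transparent display device (width, height)
    screen_size: Tuple[int, int]
    # first-touch-input coordinates on the transparent display device
    first_touch: List[Tuple[int, int]]
    # second-touch-input coordinates on the transparent display device
    second_touch: List[Tuple[int, int]]

# Example request: a device screen outlined by two first-touch points and
# one object selected by a second-touch tap.
req = ObjectInfoRequest(
    object_coords=[(40, 120)],
    screen_size=(1280, 720),
    first_touch=[(10, 10), (1270, 710)],
    second_touch=[(200, 300)],
)
```

As the text notes, the request is not limited to these fields; additional information could be carried depending on the communication method used.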
- the external device 110 selects an object in response to the received request for the information related to the object. For example, if the requested object is an icon, the external device 110 selects the icon of the requested object and the application program connected to the icon. If the requested object is a folder, the external device 110 selects the requested folder and files or data located at a lower layer of the folder. If the requested object is an object included in one screen, the external device 110 selects the object by using the coordinate information included in the received request. If the requested object is a plurality of objects included in one screen, the external device 110 respectively selects the plurality of objects by using the coordinate information of the object included in the received request.
- the external device 110 transmits information related to the selected object to the transparent display device 100 .
- the information related to the object is transmitted to the transparent display device 100 in the same manner as the request for the information is received, but is not limited thereto.
- the request for the information may be received via the direct communication between the devices, and the information related to the object selected in response to the request may be transmitted to the transparent display device 100 via the repeater or the server.
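The selection step on the external device can be sketched as a dispatch on object type. This is a minimal illustration under hypothetical names for the object registry and its fields; the rules mirror the description above: an icon is selected together with the application program connected to it, a folder together with the files at its lower layer, and a plain on-screen object by its coordinate information.

```python
def select_objects(registry, coords):
    """Select objects on the external device for the received request."""
    selected = []
    for coord in coords:
        obj = registry.get(coord)
        if obj is None:
            continue                                 # nothing displayed at that location
        if obj["type"] == "icon":
            selected.append(obj["name"])
            selected.append(obj["linked_app"])       # application connected to the icon
        elif obj["type"] == "folder":
            selected.append(obj["name"])
            selected.extend(obj["children"])         # files/data at the folder's lower layer
        else:
            selected.append(obj["name"])             # object included in one screen
    return selected

# Hypothetical registry mapping display coordinates to displayed objects.
registry = {
    (40, 120): {"type": "icon", "name": "gallery_icon", "linked_app": "gallery_app"},
    (40, 240): {"type": "folder", "name": "photos", "children": ["a.jpg", "b.jpg"]},
}
```

A request naming several coordinates would simply yield the union of the per-coordinate selections, matching the case of a plurality of objects included in one screen.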
- FIG. 21 is a flowchart illustrating a method of displaying information on a transparent display device according to another exemplary embodiment.
- FIG. 21 shows a case where the second touch input mentioned with reference to FIG. 2 is used.
- the transparent display device 100 receives a touch input for selecting an object displayed on the external device 110 that is seen through the transparent display device 100 .
- the touch input corresponds to the second touch input mentioned in FIGS. 2 through 10 .
- the information about the external device 110 that is seen through the transparent display device 100 may be the same as that mentioned in FIGS. 1 through 10 .
- the transparent display device 100 requests the external device 110 for information related to the object selected based on the touch input.
- the signal transmitted to the external device 110 for requesting information related to the object selected based on the touch input includes a signal for requesting information related to the object selected based on the second touch input mentioned in FIGS. 2 through 10 .
- the transparent display device 100 receives information about the selected object from the external device 110 .
- the received information corresponds to the request signal in operation S 2102 , and may be the same as the information received in operation S 203 shown in FIG. 2 .
- the transparent display device 100 displays the received information.
- the received information may be displayed in the same manner as that of operation S 204 shown in FIG. 2 .
- the flowchart shown in FIG. 21 may be modified to include the operation S 1105 shown in FIG. 11 so that the object displayed on the transparent display device 100 may be edited based on the interaction between the transparent display device 100 and the external device 110 .
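The FIG. 21 flow on the transparent display device side can be sketched as follows, with a stubbed communication link and display (all names here are illustrative, not from the patent): receive a touch selecting the object seen through the screen, request the external device for the related information, receive it, and display it.

```python
class StubLink:
    """Stands in for direct, server-based, or repeater-based communication."""
    def __init__(self, external_info):
        self.external_info = external_info
        self.last_request = None

    def request(self, payload):
        self.last_request = payload
        return self.external_info            # the external device's response

def display_object_info(link, touch_coords, screen):
    request = {"selected_coords": touch_coords}  # touch input selecting the seen object
    info = link.request(request)                 # request and receive the related information
    screen.append(info)                          # display the received information
    return info

screen = []
link = StubLink({"object": "photo", "detail": "related info"})
result = display_object_info(link, [(200, 300)], screen)
```

An editing operation like the one in FIG. 11 would slot in after the display step, modifying `screen` through further interaction between the two devices.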
- the information display method may also be embodied as computer readable codes on a computer readable recording medium.
- the computer readable medium may be any recording apparatus capable of storing data that is read by a computer system, e.g., a read-only memory (ROM), a random access memory (RAM), a compact disc (CD)-ROM, a magnetic tape, a floppy disk, an optical data storage device, and so on.
- the computer readable medium may be distributed among computer systems that are interconnected through a network, and the present invention may be stored and implemented as computer readable code in the distributed system.
Abstract
Description
- This application claims priority from Korean Patent Application No. 10-2012-00104156, filed on Sep. 19, 2012, and Korean Patent Application No. 10-2013-00106227, filed on Sep. 4, 2013, in the Korean Intellectual Property Office, the disclosures of which are incorporated herein by reference in their entireties.
- 1. Field
- Methods and apparatuses consistent with exemplary embodiments relate to displaying information, and more particularly, to a system and method for displaying information related to an external object or an external device on a transparent display device.
- 2. Description of the Related Art
- Transparent display devices are considered next generation display devices. A transparent display device has a degree of transparency that enables a user to see an external object or an external device through the transparent display device.
- However, a transparent display device does not display information related to the external object or the external device.
- Exemplary embodiments provide a system, a method, and an apparatus for displaying information related to an external device seen through a screen of a transparent display device on the screen of the transparent display device, and a recording medium thereof.
- Exemplary embodiments also provide a system, a method, and an apparatus for displaying information related to an object displayed on a screen of an external device seen through a screen of a transparent display device on the screen of the transparent display device, and a recording medium thereof.
- Exemplary embodiments also provide a system, a method, and an apparatus for displaying information related to an external object seen through a screen of a transparent display device on the screen of the transparent display device, and a recording medium thereof.
- According to an aspect of an exemplary embodiment, there is provided a method of displaying information on a transparent display device, the method including: receiving a touch input on the transparent display device that selects an object displayed on an external device that is viewable through a screen of the transparent display device; requesting the external device for information related to the object; receiving the information related to the object from the external device; and displaying the received information on the screen of the transparent display device.
- The touch input may indicate a contour line of the object that is viewable through the screen, may be a tap-based touch indicating a location on the screen at which the object is viewable through the screen, or may indicate a closed region on the screen at which the object is viewable through the screen.
- The information related to the object indicates at least one other object having a type that is the same as a type of the object, and a display location on a screen of the external device of the at least one other object differs from that of the object.
- The information related to the object indicates information that is not displayed on a screen of the external device.
- The displaying comprises displaying the received information at a display location on the screen of the transparent display device that corresponds to a display location of the object on a screen of the external device.
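This corresponding-location display can be sketched as a linear mapping, under the assumption that the first touch input (or the screen size information) has established where the external device's screen appears within the transparent display's screen as a bounding rectangle. The function name and the rectangle representation are illustrative.

```python
def map_to_transparent_display(obj_xy, ext_screen_size, seen_rect):
    """Map an object's display location on the external device's screen to
    the corresponding location on the transparent display's screen.

    seen_rect is ((left, top), (right, bottom)): where the external device's
    screen is seen through the transparent display.
    """
    ox, oy = obj_xy
    ew, eh = ext_screen_size
    (left, top), (right, bottom) = seen_rect
    # Scale the object's coordinates into the seen rectangle.
    x = left + ox * (right - left) / ew
    y = top + oy * (bottom - top) / eh
    return (x, y)
```

For example, the center of a 640x480 external screen seen at the rectangle (100, 100)-(420, 340) maps to the center of that rectangle, so the received information is overlaid where the object itself is seen.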
- The method may further include editing the information that is displayed on the screen of the transparent display device based on an interaction between the transparent display device and the external device.
- The method may further include displaying information related to the external device based on an augmented reality service on the screen of the transparent display device.
- The requesting and the receiving the information are performed based on one of a direct communication between devices, a communication via a server, and a communication via a repeater.
- According to another aspect of an exemplary embodiment, there is provided a transparent display device including: a transparent display configured to receive a touch input that selects an object displayed on an external device that is viewable through the transparent display; a communication unit configured to communicate with an external device that is viewable through the transparent display; and a processor configured to request the external device for information related to the object based on the touch input, via the communication unit, receive information related to the object from the external device in response to the request, via the communication unit, and control the transparent display to display the received information.
- According to another aspect of an exemplary embodiment, there is provided a method of displaying information on a transparent display device, the method including: receiving a first touch input on a screen of the transparent display device indicating first position information of an external device that is viewable through the screen of the transparent display device and receiving a second touch input on the screen of the transparent display device indicating second position information of an object displayed on a screen of the external device viewable through the screen of the transparent display device; requesting the external device for information related to the object based on the first position information and the second position information; receiving information related to the object from the external device in response to the requesting; and displaying the received information on the screen of the transparent display device.
- The first position information indicates a contour line of the external device viewable through the screen of the transparent display device.
- The first touch input may be independent touch operations on a first point and a second point on the screen of the transparent display device that indicate a contour line of the external device that is viewable through the screen of the transparent display device.
- The first touch input may be a touch-and-drag operation for connecting a first point and a second point on the screen of the transparent display device that indicates a contour line of the external device that is viewable through the screen of the transparent display device.
- The first touch input may indicate a touch-based region adjusting operation for guide information displayed on the screen of the transparent display device, and a range related to the touch-based region adjusting operation for the guide information may be based on a contour line of the external device that is viewable through the screen of the transparent display device.
- The first touch input may be a touch operation for selecting screen information of the external device, wherein the screen information may be included in a selectable screen information menu item about the external device, which is displayed on the screen of the transparent display device, and the screen information may include at least one of screen size information and screen type information.
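For the two-point variants above (two independent touches, or the endpoints of a touch-and-drag), the reference rectangle for the external device's contour line can be derived by normalizing the two points into corners. This is a sketch; the normalization to a (left, top)-(right, bottom) rectangle is an assumption about how the reference information would be stored.

```python
def contour_rect(p1, p2):
    """Derive the contour-line rectangle of the seen external device from a
    first touch input given as two points on the transparent display's screen."""
    (x1, y1), (x2, y2) = p1, p2
    left, right = min(x1, x2), max(x1, x2)
    top, bottom = min(y1, y2), max(y1, y2)
    return ((left, top), (right, bottom))
```

The same rectangle could equally come from the touch-based region adjusting operation on guide information, or directly from screen size information chosen from a menu item.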
- The second position information may indicate a contour line of the object that is viewable through the screen of the transparent display device.
- The second touch input may be a tap-based touch indicating a location on the screen of the transparent display device at which the object is viewable through the screen of the transparent display device.
- The second touch input may indicate a closed region on the screen of the transparent display device through which the object is viewable on the screen of the transparent display device.
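One illustrative way to distinguish the second-touch variants above is by the shape of the touch trace: a single point is a tap at a location, a trace that returns near its start indicates a closed region, and any other trace indicates a contour line. The distance threshold is an assumption for the sketch.

```python
def classify_second_touch(points, close_threshold=20):
    """Classify a second touch input from its trace on the screen."""
    if len(points) == 1:
        return "tap"                       # tap-based touch at a location
    (x1, y1) = points[0]
    (x2, y2) = points[-1]
    dist = ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
    # A trace whose end returns near its start encloses a region.
    return "closed_region" if dist <= close_threshold else "contour_line"
```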
- The information related to the object may indicate at least one other object having a type that is the same as a type of the object, and a display location on the screen of the external device of the at least one other object may differ from that of the object.
- The method may further include editing the information that is displayed on the screen of the transparent display device based on an interaction between the transparent display device and the external device.
- According to another aspect of an exemplary embodiment, there is provided a transparent display device including: a transparent display configured to receive a touch input indicating first position information of an external device that is viewable through the transparent display, and to receive a second touch input indicating second position information of an object displayed on a screen of the external device viewable through the transparent display; a communication unit configured to communicate with the external device; and a processor configured to request the external device for information related to the object based on the first position information and the second position information, via the communication unit, receive information related to the object from the external device in response to the request, via the communication unit, and display the received information on the transparent display.
- According to another aspect of an exemplary embodiment, there is provided a method of displaying information on a screen of a transparent display device, the method including: receiving from the transparent display device a request for information related to at least one object displayed on the screen of an external device that is viewable through a screen of the transparent display device; selecting the at least one object in response to the request; and transmitting the information related to the selected object to the transparent display device, wherein the request for information related to the object comprises first position information of the external device indicated by a first touch input on the transparent display device and second position information of the object displayed on the screen of the external device indicated by a second touch input on the transparent display device.
- According to another aspect of an exemplary embodiment, there is provided a non-transitory computer-readable recording medium having embodied thereon a program for implementing the above-described methods of displaying information on the transparent display device.
- The above and other aspects will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings in which:
- FIG. 1A through FIG. 1C are block diagrams of an information display system according to an exemplary embodiment;
- FIG. 2 is a flowchart illustrating a method of displaying information in a transparent display device, according to an exemplary embodiment;
- FIGS. 3A through 3H are diagrams showing examples of a first touch input according to exemplary embodiments;
- FIGS. 4A through 4E are diagrams showing other examples of a first touch input according to exemplary embodiments;
- FIGS. 5A through 5E are diagrams showing other examples of a first touch input according to exemplary embodiments;
- FIGS. 6A through 6C are diagrams showing a first touch input, a second touch input, and a screen displayed on a transparent display device according to the first and second touch inputs, according to exemplary embodiments;
- FIGS. 7A through 7D are diagrams showing screens for illustrating the first touch input, the second touch input, and editing processes according to the exemplary embodiments;
- FIGS. 8A through 8G are diagrams showing screens for illustrating the first touch input and the second touch input according to the exemplary embodiments;
- FIGS. 9A through 9C are diagrams showing screens for illustrating the first touch input and the second touch input according to the exemplary embodiments in a case where a transparent display device and an external device have equal size;
- FIGS. 10A through 10D are diagrams showing examples of the first touch input and the second touch input according to the exemplary embodiments;
- FIG. 11 is a flowchart illustrating a method of displaying information to be performed by a transparent display device, according to another exemplary embodiment;
- FIG. 12 is a flowchart illustrating a method of displaying information to be performed by a transparent display device, according to another exemplary embodiment;
- FIGS. 13A and 13B are side views of the transparent display device and the external device shown in FIG. 12;
- FIG. 14 is a flowchart illustrating a method of displaying information to be performed by a transparent display device, according to another exemplary embodiment;
- FIG. 15 is a flowchart illustrating a method of displaying information to be performed by a transparent display device, according to another exemplary embodiment;
- FIG. 16 is a functional block diagram of a transparent display device according to an exemplary embodiment;
- FIG. 17 is a diagram showing an example of a transparent display unit shown in FIG. 16;
- FIG. 18 is a diagram illustrating a software layer stored in a storage unit of a transparent display device, according to an exemplary embodiment;
- FIG. 19 is a functional block diagram of a transparent display device according to another exemplary embodiment;
- FIG. 20 is a flowchart illustrating a method of displaying information to be performed by an external device, according to an exemplary embodiment; and
- FIG. 21 is a flowchart illustrating a method of displaying information to be performed by a transparent display device according to another exemplary embodiment.
- As the exemplary embodiments allow for various changes and numerous embodiments, particular exemplary embodiments will be illustrated in the drawings and described in detail in the written description. However, this is not intended to limit the exemplary embodiments to particular modes of practice, and it is to be appreciated that all changes, equivalents, and substitutes that do not depart from the spirit and technical scope of the disclosure are encompassed. In the description, certain explanations of well known related art are omitted.
- While such terms as "first," "second," etc., may be used to describe various components, such components must not be limited to the above terms. The above terms are used only to distinguish one component from another.
- The terms used in the present specification are merely used to describe particular exemplary embodiments, and are not intended as limiting. All terms including descriptive or technical terms used herein should be construed as having meanings that would be understood to one of ordinary skill in the art. However, the terms may have different meanings according to an intention of one of ordinary skill in the art, precedent cases, or the appearance of new technologies. Also, some terms may be arbitrarily selected by the applicant, and in this case, the meaning of the selected terms will be described in detail in the detailed description. Thus, the terms used herein have to be defined based on the meaning of the terms together with the description throughout the specification. Screens suggested in the present application are used only for descriptive purposes, and are not intended as limiting.
- An expression used in the singular encompasses the expression of the plural, unless it has a clearly different meaning in the context. In the present specification, it is to be understood that the terms such as "including" or "having," etc., are intended to indicate the existence of the features, numbers, steps, actions, components, parts, or combinations thereof disclosed in the specification, and are not intended to preclude the possibility that one or more other features, numbers, steps, actions, components, parts, or combinations thereof may exist or may be added.
- As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items. Expressions such as "at least one of," when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list.
- An object denotes a component or information displayed on an external device or a screen of the external device of a transparent display device. For example, an object may include an image, an image included in another image, an icon, a folder icon, an icon included in a folder icon, text, a pop-up window, an application execution window, a content included in an application execution window, a list, an item, a content, and a file included in a list; however, the present invention is not limited thereto. Examples of an object will be described in detail in various examples of screens that will be described later. The object may be referred to as an external object of the transparent display device.
- Throughout the entire specification, a touch input denotes input information of a user input through a touch-based gesture using a finger of the user or a touch tool. The touch tool may be referred to as an external input device, a stylus, or a stylus pen.
- The touch-based gesture may be variously defined. In other words, examples of the touch-based gesture may include touch-based motions on a touch screen, such as tap, touch-and-hold, double tap, drag, touch-and-drag, panning, flick, drag-and-drop, sweep, and swipe, but the touch-based gesture is not limited thereto.
- The touch input may be replaced by a gesture based on an image captured by a camera, according to the input that the touch is intended to represent. For example, if the touch input is an input for selecting an object displayed on an external device, the touch input may be replaced by a gesture or operation according to a moving direction or sign of the hand captured by the camera. The camera may be configured based on an image sensor or an optical sensor.
- The touch input may be replaced by a user voice signal based on natural language, according to the input that the touch is intended to represent. For example, if a touch input is an input for selecting an object including a certain letter or a name displayed on an external device, the touch input may be replaced by a user voice signal based on natural language representing the certain letter or the name of the object.
- Hereinafter, exemplary embodiments will be described in detail with reference to the accompanying drawings, wherein like reference numerals denote like elements, and repeated descriptions thereof are omitted.
-
FIG. 1A is a block diagram of an information display system according to an exemplary embodiment. - Referring to
FIG. 1A, the information display system includes a transparent display device 100 and an external device 110. However, the information display system is not limited to the example shown in FIG. 1A. That is, the information display system may further include other components, in addition to the components shown in FIG. 1A. - For example, as shown in
FIG. 1B, the information display system may further include a server 120. In this case, the transparent display device 100 and the external device 110 may transmit and/or receive information via the server 120, and the transparent display device 100 may receive information based on an augmented reality service about the external device 110 from the server 120. The communication through the server 120 may be performed via the wired or wireless Internet, but is not limited thereto. The server 120 may include at least one of a cloud server, an information supply server, and a service server. The server 120 may manage and provide information based on the augmented reality service. - The information display system may further include an
access point 130, as shown in FIG. 1C. In this case, the transparent display device 100 and the external device 110 may transmit and/or receive information via the access point 130. The communication method via the access point 130 may be, for example, a wireless LAN communication method of infrastructure mode (or WiFi), but is not limited thereto. - When the information display system is configured as shown in
FIG. 1A, the transparent display device 100 and the external device 110 may transmit and/or receive information through device-to-device direct communication. The device-to-device direct communication method may use, for example, a local area wireless communication method such as wireless LAN communication of ad-hoc mode (e.g., WiFi Direct), Bluetooth communication, ultra wideband (UWB) communication, and Zigbee communication, but is not limited thereto. - Alternatively, the
transparent display device 100 and the external device 110 may be connected to each other via a wire. For example, the transparent display device 100 and the external device 110 may be connected to each other via a universal serial bus (USB) or a universal asynchronous receiver/transmitter (UART) to transmit/receive data. The device-to-device direct communication method may be referred to as a machine-to-machine (M2M) communication method, a device-to-device (D2D) communication method, or a peer-to-peer (P2P) communication method. - Therefore, the communication between the
transparent display device 100 and the external device 110 may be performed based on one of the direct communication between devices, the communication method via the access point 130, and the communication method via the server 120, according to the elements of the information display system, but is not limited thereto. - The
transparent display device 100 and the external device 110 may transmit and/or receive at least one of size information thereof, owner information thereof, and information sharable with other devices, through a short distance communication method such as near field communication (NFC). - The size information of the device may be represented as, for example, (width×length×thickness) mm, but is not limited thereto. Screen information may include screen size information and screen type information, but is not limited thereto. The screen size information may be represented as, for example, A4, B5, 7 inches, or 5.5 inches, but is not limited thereto. The screen type information may represent whether the screen is a touch screen or a non-touch screen, but is not limited thereto. For example, the screen type information may represent whether the screen is a liquid crystal display (LCD) panel or an active-matrix organic light-emitting diode (AMOLED) panel.
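The specification does not prescribe a concrete data format for the information exchanged over NFC; as an illustrative sketch only, the size, owner, and screen information described above might be modeled as follows (all field names and example values here are hypothetical, not part of the disclosure):

```python
from dataclasses import dataclass

@dataclass
class ScreenInfo:
    size: str       # e.g., "7 inches", "A4", "B5"
    is_touch: bool  # touch screen vs. non-touch screen
    panel: str      # e.g., "LCD", "AMOLED"

@dataclass
class DeviceInfo:
    dimensions_mm: tuple  # (width, length, thickness) in mm
    owner: str
    screen: ScreenInfo

# Hypothetical payload describing a 7-inch touch-screen external device
info = DeviceInfo(dimensions_mm=(94, 167, 8), owner="user",
                  screen=ScreenInfo(size="7 inches", is_touch=True, panel="AMOLED"))
```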
- The
transparent display device 100 may display the information about the external device 110, which is transmitted from the external device 110 via a short distance communication method such as NFC, as information about the external device 110 based on the augmented reality service. For example, the transparent display device 100 may display the information about the external device 110 on a display area adjacent to the external device 110 that is seen through the transparent display device 100. The display area is a part of a screen of the transparent display device 100. The external device 110 that is seen through the transparent display device 100 may be referred to as the external device 110 that is seen via the screen of the transparent display device 100. - The
transparent display device 100 is a device having a transparent display. For example, the transparent display device 100 may be a mobile phone having a transparent display, a smartphone having a transparent display, a notebook computer having a transparent display, a tablet PC having a transparent display, a handheld PC having a transparent display, an electronic book terminal having a transparent display, a digital broadcasting terminal having a transparent display, a personal digital assistant (PDA) having a transparent display, a portable multimedia player (PMP) having a transparent display, a navigation device having a transparent display, a smart TV having a transparent display, a consumer electronics (CE) device having a transparent display (for example, a refrigerator having a transparent display, an air conditioner having a transparent display, a dish washing machine having a transparent display, etc.), or an iOS-convertible device having a transparent display, but is not limited thereto. The transparent display may be applied to various fields such as high value-added glass, glass as a functional car element, car dashboards, navigators, security electronic devices, solar batteries, electronic devices for the military, game consoles, toys, and show windows, as well as smart windows. The screen of the transparent display device 100 may be referred to as a screen on the transparent display. - The
transparent display device 100 may provide an application execution function, a communication function, a media player function, a web-browsing function, a word-processing function, an e-mail transmission function, a messenger function, and/or a data storage function, but is not limited thereto. - The
transparent display device 100 requests information related to at least one object that is displayed on the external device 110 and seen through the transparent display device 100, based on a touch input. When receiving the information related to the object from the external device 110, the transparent display device 100 displays the received information. - The
external device 110 is a device that is seen through the transparent display device 100, through the screen of the transparent display device 100, or through the transparent display of the transparent display device 100. The external device 110 may be referred to as another device. The external device 110 may not include a transparent display. For example, the external device 110 may be a mobile phone, a smartphone, a notebook computer, a tablet PC, a handheld PC, an electronic book terminal, a digital broadcasting terminal, a PDA, a PMP, a navigation device, a smart TV, a CE device (for example, a refrigerator, an air conditioner, a dishwashing machine having a display panel, etc.), or an iOS-convertible device, but is not limited thereto. Alternatively, the external device 110 may include a transparent display. - The
external device 110 may provide an application execution function, a communication function, a media player function, a web-browsing function, a word-processing function, an e-mail transmission function, a messenger function, and/or a data storage function, but is not limited thereto. - When the
transparent display device 100 requests the information related to at least one object that is displayed, the external device 110 selects the requested object and transmits the information related to the requested object to the transparent display device 100. -
FIG. 2 is a flowchart illustrating a method of displaying information, performed by the transparent display device 100, according to an exemplary embodiment. - In operation S201, the
transparent display device 100 receives a first touch input and a second touch input. The first touch input represents reference information of the external device 110 that is seen through the transparent display device 100. The reference information is used to detect a display location of the object on the external device 110, wherein the object is selected by the second touch input in the transparent display device 100. The reference information may be referred to as first position information of the external device 110. -
FIGS. 3A through 3H are diagrams showing examples of the first touch input. In FIGS. 3A through 3H, the transparent display device 100 has a size that is greater than that of the external device 110, and the external device 110 is seen through the transparent display device 100 as shown in FIG. 3A. In the examples shown in FIGS. 3A through 3H, a result of sensing the first touch input may or may not be displayed on the transparent display device 100. -
FIG. 3B shows an example in which the first touch input is drawn along a contour line of the external device 110 that is seen through the transparent display device 100. The contour line of the external device 110 may be referred to as a boundary of the screen of the external device 110. The first touch input may be referred to as a first input that identifies the boundary of the screen of the external device 110. The first touch input shown in FIG. 3B is based on a drawing operation from a point S on the external device 110 that is seen through the transparent display device 100 along the contour line of the external device 110 to a point E. The point S denotes a start point of the touch operation, that is, a drawing operation along the contour line of the external device 110. The point E denotes an end point of the touch operation along the contour line of the external device 110. The point S and the point E may have the same display location (or xy coordinates). Alternatively, the point S and the point E may be adjacent to each other so that a closed area may be set according to the touch operation for drawing along the contour line of the external device 110. - In
FIG. 3B, the point S is a left uppermost corner in the contour line of the external device 110, but is not limited thereto. That is, the point S may be an arbitrary point on the contour line of the external device 110. The point E is determined depending on the point S. - In
FIGS. 3C and 3D, the first touch input is based on independent touch operations at a first point and a second point on the transparent display device 100. In FIGS. 3C and 3D, the first point and the second point are in a diagonal relationship on the contour line of the external device 110 that is seen through the transparent display device 100. - Referring to
FIG. 3C, the first point is a left uppermost point P1 on the contour line of the external device 110, and the second point is a right lowermost point P2 on the contour line of the external device 110. Because the point P1 and the point P2 are independently touched, the transparent display device 100 may trace the contour line of the external device 110 that is seen through the transparent display device 100 based on information about the xy coordinates of the point P1 and the point P2 on the transparent display device 100. That is, (x, y) coordinate information of a right uppermost point and a left lowermost point of the contour line, which are not touched, is detected based on the (x, y) coordinate information of the point P1 and the point P2, and the detected points are connected to each other to trace the contour line of the external device 110. - Referring to
FIG. 3D, the first point is a left lowermost point P3 on the contour line of the external device 110 and the second point is a right uppermost point P4 on the contour line of the external device 110. When the point P3 and the point P4 are touched, the transparent display device 100 may trace the contour line of the external device 110 that is seen through the transparent display device 100 based on the xy coordinate information of the point P3 and the point P4 on the transparent display device 100. The tracing of the contour line may be performed in the same way as described with reference to FIG. 3C. -
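Assuming the contour line of the external device 110 appears as an axis-aligned rectangle on the screen of the transparent display device 100 (with y increasing downward), the detection of the two untouched corners described for FIGS. 3C and 3D can be sketched as follows; the function name and coordinate values are illustrative, not part of the disclosure:

```python
def trace_contour(p1, p2):
    """Derive the four corners of the rectangular contour line of the
    external device from two diagonally opposite touched (x, y) points,
    as in FIGS. 3C and 3D."""
    (x1, y1), (x2, y2) = p1, p2
    left, right = min(x1, x2), max(x1, x2)
    top, bottom = min(y1, y2), max(y1, y2)
    # The two untouched corners are (right, top) and (left, bottom);
    # connecting all four points traces the contour line.
    return [(left, top), (right, top), (right, bottom), (left, bottom)]

# FIG. 3C: P1 = left uppermost point, P2 = right lowermost point
corners = trace_contour((10, 20), (110, 220))
```

Because only the minimum and maximum of each coordinate are used, the same result is obtained when the touched pair is P3 and P4 of FIG. 3D.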
FIGS. 3E through 3H show examples where the first touch input is based on a touch-and-drag operation connecting the first point and the second point to each other on the transparent display device 100. In FIGS. 3E through 3H, the first point and the second point are in the diagonal relationship with each other based on the contour line of the external device 110 that is seen through the transparent display device 100. In the examples shown in FIGS. 3E through 3H, the first point may denote a start point S and the second point may denote an end point E. - Referring to
FIG. 3E, the point S is a left uppermost point on the contour line of the external device 110 and the point E is a right lowermost point on the contour line of the external device 110. Since the touch-and-drag operation is performed toward the point E after touching the point S, the transparent display device 100 may trace the contour line of the external device 110. That is, as shown in FIG. 3E, when a touch point according to the touch-and-drag operation is t1, (x, y) coordinate information of a point t2 and a point t3 on the contour line, wherein the points t2 and t3 are not touched, is detected based on the (x, y) coordinate information of the point S and the (x, y) coordinate information of the point t1, and then, the contour line of the external device 110 may be traced by connecting the point S and the points t1, t2, and t3 based on the (x, y) information thereof. - The
transparent display device 100 may display an arrow or a block setting shown in FIG. 3E based on a current touching location to show the variation of the touched location according to the dragging. When the touch-and-drag operation finishes at the point E, the transparent display device 100 may end the arrow or the block setting display, and may display the contour line of the external device 110. Otherwise, the arrow or the block display status may be maintained. -
FIG. 3F shows a case where the start point S of the touch-and-drag operation is the right uppermost point on the contour line of the external device 110 and the end point E is the left lowermost point on the contour line of the external device 110. FIG. 3G shows a case where the start point S of the touch-and-drag operation is the left lowermost point on the contour line of the external device 110 and the end point E is the right uppermost point on the contour line of the external device 110. FIG. 3H shows a case where the start point S of the touch-and-drag operation is the right lowermost point on the contour line of the external device 110 and the end point E is the left uppermost point on the contour line of the external device 110. -
FIGS. 4A through 4E are diagrams showing other examples of the first touch input. In FIGS. 4A through 4E, the transparent display device 100 is larger than the external device 110, and as shown in FIG. 4A, the external device 110 is seen through the transparent display device 100. In the examples shown in FIGS. 4A through 4E, a result of sensing the first touch input may or may not be displayed on the transparent display device 100. - In
FIGS. 4A through 4E, the first touch input is based on a touch-based operation for adjusting a region with respect to guide information displayed on the transparent display device 100, and the adjustable range of the guide information based on the touch operation is based on the contour line of the external device 110 that is seen through the transparent display device 100. - The guide information may be, for example, camera focusing range information. The guide information may be displayed according to a request of a user of the
transparent display device 100. For example, the request of the user may include a request for displaying the guide information for executing the information display method according to the exemplary embodiment, or a request for executing the information display method. - As shown in
FIG. 4A, when the external device 110 is seen through the transparent display device 100, the transparent display device 100 displays guide information G1 as shown in FIG. 4B. The guide information G1 may be displayed on the transparent display device 100 according to a command of a user of the transparent display device 100. When the region adjusting operation of the guide information G1 is performed toward the four corners of the contour line of the external device 110 in a state of contacting four points P5, P6, P7, and P8, as shown in FIG. 4C, the guide information, the region of which is adjusted, is displayed on the transparent display device 100. - Accordingly, the
transparent display device 100 may trace the contour line of the external device 110 according to the adjusted (x, y) coordinate values of the four points P5, P6, P7, and P8 of the guide information G1. The tracing of the contour line may be performed by connecting the changed (x, y) coordinate values of the points P5, P6, P7, and P8, but is not limited thereto. The changed (x, y) coordinate value of each point may be obtained by adding a variation amount according to the dragging operation to the original (x, y) coordinate value, but is not limited thereto. That is, according to two touch operations, that is, touching the four points P5, P6, P7, and P8 of the guide information G1 and touching the points representing the desired region, the original (x, y) coordinate values of the points may be updated to the (x, y) coordinate values of the second touched points. -
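The two ways of updating a corner coordinate of the guide information G1 described above (adding the drag variation to the original coordinate value, or replacing it with the coordinate value of the second touched point) might be sketched as follows; the helper names and coordinate values are hypothetical, not part of the disclosure:

```python
def drag_corner(corners, index, dx, dy):
    """Update one corner of the guide information G1 by adding the drag
    variation (dx, dy) to its original (x, y) coordinate value."""
    x, y = corners[index]
    corners[index] = (x + dx, y + dy)
    return corners

def retouch_corner(corners, index, point):
    """Two-touch variant: the original coordinate value is updated to the
    coordinate value of the second touched point."""
    corners[index] = point
    return corners

# Hypothetical guide region with corners P5, P6, P7, and P8
g1 = [(40, 40), (200, 40), (200, 300), (40, 300)]
g1 = drag_corner(g1, 0, -15, -10)       # drag P5 toward a corner of the device
g1 = retouch_corner(g1, 2, (230, 330))  # retouch P7 at the corresponding corner
```

Connecting the four updated corners then yields the traced contour line of the external device 110.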
FIGS. 4D and 4E are diagrams showing examples of the region adjusting operation of the guide information G1. Referring to FIG. 4D, the user moves the touched point from the left uppermost point P5 of the guide information G1 to the left uppermost point of the contour line of the external device 110 after touching the point P5. The left uppermost point in the contour line of the external device 110 is a corner of the external device 110, which corresponds to the point P5 of the guide information G1. - Next, after touching the right lowermost point P8 of the guide information G1, the touched point is dragged to the right lowermost point in the contour line of the
external device 110, the region of the guide information G1 is moved from the points P6, P7, and P8 except for the left uppermost point P5, and accordingly, the display state of the guide information G1 is changed as shown in FIG. 4C. - According to the examples shown in
FIGS. 4D, 4E, and 4C, after touching one point of the guide information G1, the touched point is moved to a corresponding corner of the external device 110, and then, the diagonal point of the guide information G1 is touched and dragged to the corresponding corner of the external device 110 so as to change a display location of the guide information G1 or adjust the displayed size of the guide information G1. - The one point and the diagonal point in the guide information G1 are not limited to the examples shown in
FIGS. 4D and 4E. For example, the point P6 of the guide information G1 is touched and dragged to the corresponding corner in the contour line of the external device 110, and then, the point P7 that is in a diagonal relation with the point P6 is touched and dragged so that the other points P5, P7, and P8 of the guide information G1 may be moved to the corresponding corners in the contour line of the external device 110. -
FIGS. 5A through 5E are diagrams showing examples of the first touch input in a case where the transparent display device 100 is smaller than the external device 110. That is, as shown in FIG. 5A, when the transparent display device 100 is smaller than the external device 110, the first touch input may be based on a touch operation of drawing along the contour line of the external device 110 that overlaps the transparent display device 100. Here, a direction of the touch operation, that is, a drawing direction along the contour line, may not be limited to one direction. - As shown in
FIG. 5A, the transparent display device 100 may be smaller than the external device 110. Accordingly, when an object to be selected is displayed at a location adjacent to a center of the screen of the external device 110 so that the first touch input shown in FIGS. 5B through 5E cannot be applied, the transparent display device 100 may reduce a size of the external device 110 by using a zoom-out function of a camera to receive a first touch input and a second touch input. Here, the transparent display device 100 may detect a screen size of the external device 110 according to a zoom-out magnification. - Meanwhile, when the
external device 110 is seen through the transparent display device 100 as shown in FIG. 6A, the first touch input may be received through a touch operation of touching a start point S and dragging to an end point E along the contour line of the external device 110 that is seen through the transparent display device 100. - Also, the first touch input may be based on a touch operation for selecting screen information of the
external device 110, which is included in a menu 910 shown in FIG. 9B and will be described later. The screen information may include at least one of screen size information of the external device 110 and screen type information of the external device 110, as described above. - The screen size information may represent, for example, whether the screen size of the
transparent display device 100 is equal to a screen size of the external device, or certain size information such as A4, B5, 7 inches, 4 inches, etc., as shown in FIG. 9B, but is not limited thereto. The first touch input may be based on a touch operation for selecting a corresponding screen size from among the pieces of the screen size information. - If the screen size of the
transparent display device 100 is different from the screen size of the external device 110, the transparent display device 100 may change the (x, y) coordinate information on the transparent display device 100 according to the first touch input and the (x, y) coordinate information on the transparent display device 100 according to the second touch input into information according to the screen size of the external device 110. - For example, when the
transparent display device 100 has a screen size (length, width, area, etc.) of 4 inches and the external device 110 has a screen of 7 inches, the transparent display device 100 may change the coordinate information of the first touch input on the transparent display device 100 and the coordinate information of the second touch input on the transparent display device 100 into coordinate information on the screen size of 7 inches, by using a function of converting the coordinate information of the screen size of 4 inches into coordinate information of the screen size of 7 inches. Here, the transparent display device 100 may use relational information between the (x, y) coordinate information on the transparent display device 100 according to the first touch input and the (x, y) coordinate information on the transparent display device 100 according to the second touch input (for example, difference information between the pieces of coordinate information). - Also, if the
transparent display device 100 has a screen size of 10 inches and the external device 110 has a screen size of 4 inches, the transparent display device 100 may change the coordinate information of the first touch input on the transparent display device 100 and the coordinate information of the second touch input on the transparent display device 100 into coordinate information on the screen size of 4 inches, by using a function of converting the coordinate information of the 10-inch screen size into coordinate information of the 4-inch screen size. - The above-described function of converting the coordinate information according to the screen size may be included in the
external device 110. When the external device 110 has the function of converting the coordinate information, the transparent display device 100 may transmit the (x, y) coordinate information on the transparent display device 100 according to the first touch input, the (x, y) coordinate information on the transparent display device 100 according to the second touch input, and the screen size information of the transparent display device 100 to the external device 110. - The screen type information may include information representing whether the screen type of the
external device 110 is a touch type or a non-touch type. If the screen of the external device 110 is a touch type screen, the external device 110 may recognize a region where the transparent display device 100 and the external device 110 overlap. Accordingly, the first touch input may not include information relating to the contour line of the external device 110, but may include only the information representing that the screen of the external device 110 is a touch type screen. - In addition, in operation S201 of
FIG. 2, the second touch input is an input for selecting at least one object displayed on the external device 110. The object that is displayed on the external device 110 is seen through the transparent display device 100. The input for selecting the at least one object displayed on the external device 110 may be referred to as an input for selecting at least one position of the screen of the external device 110. - The second touch input may be based on at least one of a touch operation, that is, touching an arbitrary point on a contour line of an object that is seen through the
transparent display device 100 and dragging the touched location along the contour line of the object, and a touch operation of writing along the object (for example, text) that is seen through the transparent display device 100. The second touch input may be referred to as a touch input on the screen of the transparent display device 100 indicating position information (or second position information) of the object displayed on the screen of the external device 110. The second touch input may also be referred to as an input that selects a position of the screen of the external device 110 viewable through the transparent display device 100. The object is displayed on the screen of the external device 110 at the position. The position comprises one of a coordinate position of the screen of the external device 110 viewable through the transparent display device 100 and an area of the screen of the external device 110 viewable through the transparent display device 100. -
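The screen-size conversion function described earlier (for example, mapping 4-inch touch coordinates onto a 7-inch screen) is not specified in detail; a simple proportional mapping, under the assumption that coordinates scale linearly with screen width and height and that both screens share the same origin, might look like this (the function name and pixel dimensions are illustrative):

```python
def convert_coordinates(points, src_size, dst_size):
    """Scale (x, y) coordinates measured on a screen of src_size
    (width, height) in pixels to the corresponding positions on a
    screen of dst_size."""
    sx = dst_size[0] / src_size[0]
    sy = dst_size[1] / src_size[1]
    return [(x * sx, y * sy) for (x, y) in points]

# e.g., map a touch point from a 480x800 panel to a 960x1600 panel
mapped = convert_coordinates([(48, 80)], (480, 800), (960, 1600))
```

The same function applied in the opposite direction covers the 10-inch to 4-inch case described above.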
FIGS. 6A through 6C are diagrams showing examples of the second touch input according to the exemplary embodiment. -
FIG. 6A shows the second touch input based on a touch operation of drawing along a contour line of an object and the first touch input based on a touch operation of drawing along the contour line of the external device 110. - That is, referring to
FIG. 6A, the first touch input is received according to the touch operation of touching the point S and drawing a line to the point E along the contour line of the external device 110, and the second touch input is received according to the touch operation of drawing a line along a contour line of an icon. The touch operation of drawing a line along the contour line of the icon is performed by touching a point S1 and drawing a line to a point E1 along the contour line of the icon that is an object, and accordingly, the second touch input is received. The start point and the end point of the touch operation for drawing along the contour line of the icon are not limited to the examples shown in FIG. 6A. That is, the start point may be an arbitrary point on the contour line of the icon, and the end point is determined according to the start point as described above. In FIG. 6A, the object displayed on the external device 110 is an icon, but the object displayed on the external device 110 may be another type of object, as discussed below. - Also, the touch operation for drawing along the contour line of the object between the start point and the end point may be performed continuously or discontinuously. If the touch operation is performed discontinuously, the end point of the touch operation for drawing along the contour line of the object may be changed. - For example, the touch operation for drawing along the contour line of the object in FIG. 6A starts from the start point S1 and stops at a left lowermost point of the object, and then, the touch operation starts again from the start point S1 or the end point E1 to the left lowermost point of the object. In this case, the end point is the left lowermost point of the object, and the end point E1 may be a connection point for connecting the contour line according to the touch operation. The point where the touch operation stops is not limited to the above example. That is, the touch operation may be stopped at an arbitrary point on the contour line of the object, or at a plurality of points on the contour line of the object. -
FIG. 6B shows an example where the second touch input is received based on a writing touch operation along the object (text). That is, FIG. 6B shows a second touch input based on the touch operation for writing an alphabet character P. As such, whereas the icon displayed on the external device 110 in FIG. 6A constituted the object, the alphabet character P is the object displayed on the external device 110 in FIG. 6B. The second touch input based on the object writing touch operation may be performed by touching an arbitrary point in the text, and then, writing along the text. For example, after touching a point 601, a writing touch operation along the object (text) may be performed in a direction denoted by an arrow of FIG. 6B. However, the start point of the object writing touch operation is not limited to the example shown in FIG. 6B, that is, an arbitrary point of the object may be the start point. In addition, the object writing touch operation may be performed continuously or discontinuously. When the object writing touch operation is performed discontinuously, at least one connection point as described above may be included between the start point and the end point. -
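The joining of a discontinuously performed drawing or writing touch operation at connection points, as described above, could be sketched as follows (the tolerance value and function name are assumptions, not part of the disclosure):

```python
def join_strokes(strokes, tolerance=10):
    """Join discontinuously drawn touch strokes (lists of (x, y) points)
    into a single contour, treating the point where one stroke ends and
    the next begins as a connection point when the two points lie within
    `tolerance` pixels of each other."""
    contour = list(strokes[0])
    for stroke in strokes[1:]:
        (ex, ey), (sx, sy) = contour[-1], stroke[0]
        if (ex - sx) ** 2 + (ey - sy) ** 2 <= tolerance ** 2:
            contour.extend(stroke[1:])  # skip the duplicated connection point
        else:
            contour.extend(stroke)
    return contour

# A contour drawn in two strokes that stop and resume at the same point
path = join_strokes([[(0, 0), (50, 0)], [(50, 0), (50, 50), (0, 50)]])
```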
FIGS. 7A and 7B are diagrams showing examples of a screen for describing the first touch input and the second touch input, in a case where the transparent display device 100 has a screen that is larger than that of the external device 110. That is, as shown in FIG. 7A, when the external device 110 is seen through the transparent display device 100, the first touch input is based on the touch operation for drawing along the contour line of the external device 110 and the second touch input is based on the touch operation for drawing along the contour line of the object displayed on the external device 110. -
FIGS. 8A through 8G are diagrams showing other examples for illustrating the first touch input and the second touch input in a case where the transparent display device 100 is smaller than the external device 110. FIGS. 8A through 8G show examples in which pieces of an object that is displayed on the external device 110 are arranged by adjusting the overlapping locations of the transparent display device 100 and the external device 110. Therefore, in the examples shown in FIGS. 8A through 8G, the transparent display device 100 displays information of sensing the second touch input on the transparent display device 100. - That is, when the screen displayed on the
external device 110 is shown in FIG. 8A, the transparent display device 100 overlaps the external device 110 as shown in FIG. 8B. Here, the first touch input is based on a touch operation for drawing along the contour line of the external device 110 (801), and the second touch input is based on a touch operation for drawing along a contour line of the object (802). Here, information of sensing the second touch input (802) is displayed on the transparent display device 100. - The
transparent display device 100 detects relational information between the (x, y) coordinate information on the transparent display device 100 according to the first touch input and the (x, y) coordinate information on the transparent display device 100 according to the second touch input in FIG. 8B, and stores the detected information. The relational information detected by the transparent display device 100 may include a difference between the (x, y) coordinate information on the transparent display device 100 according to the first touch input and the (x, y) coordinate information on the transparent display device 100 according to the second touch input. The external device 110 may recognize the object selected by the transparent display device 100 in FIG. 8B according to the (x, y) coordinate information according to the first touch input, the (x, y) coordinate information according to the second touch input, and the above relational information. - For example, when the coordinate information according to the first touch input (801) includes coordinates from (x(1), y(1)) to (x(1+m), y(1+m)), the coordinate information according to the second touch input (802) includes coordinates from (x(i), y(i)) to (x(i+j), y(i+j)), and the number of pieces of the coordinate information obtained by the first touch input and the number of pieces of the coordinate information obtained by the second touch input are equal to each other, the
transparent display device 100 may obtain relational information from coordinates (x(1)-x(i), y(1)-y(i)) to (x(1+m)-x(i+j), y(1+m)-y(i+j)). Here, m, i, and j are natural numbers that are equal to or greater than 2. - However, the number of pieces of coordinate information on the
transparent display device 100 according to the first touch input and the number of pieces of coordinate information on the transparent display device 100 according to the second touch input may be different from each other. In this case, the transparent display device 100 may detect the above relational information by sampling the coordinate information obtained by the first touch input and the coordinate information obtained by the second touch input. A target to be sampled may be determined according to the display location thereof. - Next, as shown in
FIG. 8C, when the transparent display device 100 overlaps the external device 110, the first touch input is based on a touch operation for drawing along the contour line of the external device 110 (803), and the second touch input is based on a touch operation for drawing along the contour line of the object (804). Here, information of sensing the second touch input (804) is displayed on the transparent display device 100. - Accordingly, the image of the object displayed on the
transparent display device 100 may include the information of sensing the second touch input (802) in FIG. 8B, as shown in FIG. 8C. The transparent display device 100 detects the coordinate information according to the first touch input and the second touch input and the relational information between the coordinate information in FIG. 8C, and stores the detected information as described with reference to FIG. 8B. Here, the transparent display device 100 stores the coordinate information and relational information detected in the process of FIG. 8B separately from the coordinate information and relational information detected in the process of FIG. 8C, so that the information detected in each process may be distinguished. - As shown in
FIG. 8D, when the transparent display device 100 overlaps the external device 110, the first touch input is based on a touch operation for drawing along the contour line of the external device 110 (805), and the second touch input is based on a touch operation for drawing along the contour line of the object (806). Here, the second touch input may further include a touch operation for filling inside the contour line of the object. Information of sensing the second touch input (806) is displayed on the transparent display device 100. Accordingly, the image of the object displayed on the transparent display device 100 may include an image including the information of sensing the second touch input in the processes shown in FIGS. 8B and 8C. - The
transparent display device 100 detects and stores the coordinate information on the transparent display device 100 according to the first touch input and the second touch input in the process shown in FIG. 8D and the relational information between the coordinate information, as described with reference to FIG. 8B. When storing the information, the transparent display device 100 stores the detected coordinate information and the relational information to be distinguished from the coordinate information and the relational information obtained in the processes shown in FIGS. 8B and 8C. - As shown in
FIG. 8E, when the transparent display device 100 overlaps the external device 110, the first touch input is based on a touch operation for drawing along the contour line of the external device 110 (807), and the second touch input is based on a touch operation for drawing along the contour line of the object (808). Here, the second touch input may further include a touch operation for filling inside the contour line of the object. Information of sensing the second touch input (808) is displayed on the transparent display device 100. Accordingly, the image of the object displayed on the transparent display device 100 may be the image including all the information of sensing the second touch input in the processes shown in FIGS. 8B, 8C, and 8D. - The
transparent display device 100 obtains coordinate information on the transparent display device 100 according to the first touch input (807) and the second touch input (808) in FIG. 8E and the relational information between the coordinate information, and stores the detected information. Here, the transparent display device 100 stores the coordinate information and the relational information obtained in the process of FIG. 8E to be distinguished from the coordinate information and the relational information obtained in the processes shown in FIGS. 8B through 8D. - When the
transparent display device 100 and the external device 110 overlap each other as shown in FIG. 8F, the first touch input is based on a touch operation for drawing along the contour line of the external device 110 (809), and the second touch input is based on a touch operation for writing along a text "RABBIT" (810). Here, information of sensing the second touch input (810) is displayed on the transparent display device 100. Accordingly, the image of the object displayed on the transparent display device 100 is an image including all the information of sensing the second touch inputs in the processes shown in FIGS. 8B through 8E. The transparent display device 100 obtains coordinate information on the transparent display device 100 according to the first touch input and the second touch input in FIG. 8F and the relational information between the coordinate information, and stores the detected information as shown in FIGS. 8B through 8E. - According to displaying information of sensing the second touch inputs in the processes shown in
FIGS. 8B through 8F on the transparent display device 100, the information of sensing the second touch inputs is displayed on the transparent display device 100 as shown in FIG. 8G. As described above, by displaying the information of sensing the second touch inputs on the transparent display device 100 when the second touch input is received, a display location for the object to be transmitted from the external device 110 may be determined in advance. - Also, the processes shown in
FIGS. 8B through 8F may be performed after changing a location of the transparent display device 100 or moving the transparent display device 100 to arrange pieces of the object displayed on the external device 110. Therefore, the transparent display device 100 may clearly distinguish the first touch input and the second touch input from each other in each screen. For example, after receiving the first touch input and the second touch input in FIG. 8B, the transparent display device 100 changes its location or moves, and then receives the first touch input and the second touch input according to the process of FIG. 8C to select the object displayed on the external device 110 as shown in FIG. 8C. Therefore, the first and second touch inputs in the process of FIG. 8B and the first and second touch inputs in the process of FIG. 8C may be distinguished from each other via sensing of the location variation or the moving of the transparent display device 100. -
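The relational information and sampling described with reference to FIGS. 8B through 8F can be sketched as simple coordinate arithmetic. In the following Python sketch (the function names and the even-spacing resampling strategy are illustrative assumptions, not from the embodiment), the longer of the two coordinate traces is sampled down so that each first-touch coordinate has exactly one second-touch counterpart, and the per-pair differences form the relational information:

```python
def resample(trace, n):
    """Pick n roughly evenly spaced points from a touch trace
    (a list of (x, y) tuples)."""
    if len(trace) == n:
        return list(trace)
    if n <= 1:
        return [trace[0]]
    step = (len(trace) - 1) / (n - 1)
    return [trace[round(i * step)] for i in range(n)]

def relational_info(first_trace, second_trace):
    """Pair the first-touch and second-touch traces point-by-point and
    return their (dx, dy) differences. When the traces have different
    lengths, the longer one is sampled down so that each first-touch
    point has exactly one second-touch counterpart."""
    n = min(len(first_trace), len(second_trace))
    first = resample(first_trace, n)
    second = resample(second_trace, n)
    return [(x1 - x2, y1 - y2) for (x1, y1), (x2, y2) in zip(first, second)]

# Contour of the external device (first touch) and of the object (second touch)
first = [(0, 0), (10, 0), (10, 10), (0, 10)]
second = [(2, 2), (4, 2), (4, 4), (2, 4)]
print(relational_info(first, second))  # [(-2, -2), (6, -2), (6, 6), (-2, 6)]
```

Stored per overlap arrangement, such difference lists are what lets the external device locate the selected object relative to its own contour.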
FIGS. 9A through 9C are diagrams showing examples of screens for describing the first and second touch inputs. - Referring to
FIG. 9A, the transparent display device 100 and the external device 110 have the same size. Referring to FIG. 9B, a first touch input operation is performed based on a menu 910 displayed on the transparent display device 100, and a second touch input is based on a touch operation for setting a closed region with respect to a sun, a tap-based touch operation with respect to a cloud, and a touch operation for drawing a contour line of a flower. The closed region shown in FIG. 9B is not limited thereto. For example, the closed region may be set as various types of closed loops in the transparent display device 100. -
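Determining which displayed objects fall inside a touch-based closed region, as in FIG. 9B, amounts to a point-in-polygon test. The following Python sketch uses the standard ray-casting rule; the function name and the sample coordinates are illustrative, not from the embodiment:

```python
def point_in_polygon(point, polygon):
    """Ray-casting test: is `point` inside the closed region `polygon`
    (a list of (x, y) vertices)? Counts how many polygon edges a
    horizontal ray from the point crosses; an odd count means inside."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the point's horizontal line
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# A touch-drawn closed region and the centers of two displayed objects
region = [(0, 0), (100, 0), (100, 100), (0, 100)]
print(point_in_polygon((50, 50), region))   # True  -> object selected
print(point_in_polygon((150, 50), region))  # False -> object not selected
```

The same test works for any closed loop the user draws, since the loop is just recorded as its sequence of touch coordinates.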
FIGS. 10A through 10D are diagrams showing examples of the screens for describing a first touch input and a second touch input based on an augmented reality service. -
FIG. 10A shows a case where information about the external device 110 based on the augmented reality service is displayed adjacent to the external device 110 that is seen through the transparent display device 100. The information about the external device 110 based on the augmented reality service may be provided from the external device 110, another external device, or a server, based on the physical locations of the transparent display device 100 and the external device 110. - The information about the
external device 110 based on the augmented reality service may be provided using an access point. When the transparent display device 100 and the external device 110 are located within a communication area of the same access point, the physical locations of the transparent display device 100 and the external device 110 may be estimated by using an indoor sensor capable of estimating a physical location of a device, such as a geomagnetic sensor, an acceleration sensor, a gyro sensor, or an altitude sensor mounted in the device. Thus, the information about the external device 110 based on the augmented reality service may be provided from the above-described other external device or the server according to the estimated physical locations. - Otherwise, the
transparent display device 100 receives or reads, from the external device 110, information that is necessary for receiving information based on the augmented reality service about the external device 110 (for example, mark information for recognizing the external device 110) using short distance communication such as NFC, and then collects and displays the information based on the augmented reality service about the external device 110 from the server or the above-described other external device. - The information about the
external device 110 seen through the transparent display device 100 based on the augmented reality service may include a name of the device, a name of the owner, and contents of the external device, which may be shared with other devices, as shown in FIG. 10A, but is not limited thereto. - When displaying the information about the
external device 110 based on the augmented reality service as shown in FIG. 10A, the first touch input may be based on an operation of setting a touch-based closed region about the external device 110 as shown in FIG. 10B. The touch-based closed region is not limited to the example shown in FIG. 10B. - According to receipt of the first touch input, information about a shared folder may be displayed on the
transparent display device 100 as shown in FIG. 10C. The information about the shared folder may be information based on the augmented reality service, or information that is received from the external device 110 when the first touch input is transmitted to the external device 110. Here, the screen displayed on the external device 110 may not display the information about the shared folder. - When the second touch input according to the operation of setting the touch-based closed region or the tap-based touch operation on a desired folder is received based on the information about the shared folder displayed on the
transparent display device 100 as shown in FIG. 10C, available pictures may be displayed as shown in FIG. 10D. The screen of the external device 110 may not display the available pictures shown in FIG. 10D. - The
transparent display device 100 may perform the second touch input operation by an operation of setting a touch-based closed region on a desired picture from among the available pictures shown in FIG. 10D. - Meanwhile, in operation S202 of
FIG. 2, the transparent display device 100 requests the external device 110 for information about at least one selected object, based on the first and second touch inputs. A signal requesting the information about the object may include the coordinate information on the transparent display device 100 according to the first and second touch inputs and/or relational information between the coordinate information. - Otherwise, the signal requesting the information related to the object may include coordinate information on the
external device 110 according to the first and second touch inputs, wherein the coordinate information is converted by using the coordinate information converting function of the transparent display device 100, and/or relational information between the coordinate information. The coordinate information on the external device 110 according to the second touch input may be coordinate information of the object that is displayed on the external device 110. - The signal requesting the information related to the object may further include a signal requesting relation information with the object. The signal requesting relation information with the object may include, for example, information for requesting a folder and objects included in the folder, when the object selected according to the second touch input is the folder. The objects included in the folder may be referred to as objects that are not displayed on the
external device 110. - The signal requesting the information related to the object may include coordinate information on the
transparent display device 100 according to the first and second touch inputs, and screen size information of the transparent display device 100. In this case, the external device 110 may detect coordinate information on the external device 110 according to the first and second touch inputs based on the information transmitted from the transparent display device 100 and the screen information of the external device 110. The coordinate information on the external device 110 may be detected by the processes described with reference to FIGS. 8B through 8F, but is not limited thereto. - The signal requesting the information related to the object may include various pieces of information that may be estimated by the examples of the first and second touch inputs described with reference to
FIGS. 3 through 10D. - In operation S203, the
transparent display device 100 receives information related to the selected object from the external device 110, and in operation S204, the transparent display device 100 displays the received information related to the object on the transparent display device 100. - The information related to the object may include at least one other object having the same display type as that of the object selected by the second touch input. The other object has a different display location on the
external device 110 from that of the selected object. That is, as shown in FIG. 6B, when the second touch input is received based on the touch operation for writing the text P, the transparent display device 100 may receive, from the external device 110, all of the text Ps that are displayed at different locations on the external device 110, and display the received text. - Here, the display locations of the received information on the
transparent display device 100 may similarly correspond to the display locations on the external device 110. If there are a plurality of pieces of received information, the transparent display device 100 receives information about display coordinates on the external device 110, detects information about display coordinates on the transparent display device 100 by using the screen size information of the transparent display device 100 and the display coordinate information transmitted from the external device 110, and displays the plurality of objects by using the detected coordinate information. The coordinate information may be detected by the coordinate information converting operation that is described above. - However, the
external device 110 may detect information about coordinates on the transparent display device 100 by using the screen size information of the transparent display device 100 and the information about the display coordinates of the plurality of pieces of the object information on the external device 110, and may transmit the detected coordinate information and the object information to the transparent display device 100. Then, the transparent display device 100 may display the objects based on the received coordinate information. - In operation S203, the information about the selected object transmitted from the
external device 110 may include information relating to the selected object. The information relating to the object may include information that is not displayed on the external device 110 (for example, information about objects included in a folder) as described above. - In operation S204, displaying the received information on the
transparent display device 100 may include displaying the received information at similar locations as those of the external device 110 as shown in FIGS. 6C, 7C, and 9C. However, when the second touch input is received as shown in FIG. 10D, pictures Pic1, Pic5, and Pic6, which are the selected objects, are received. Thus, the transparent display device 100 may display the received pictures Pic1, Pic5, and Pic6 sequentially or at locations based on the screen shown in FIG. 10D. The received information may be stored in a clipboard in the transparent display device 100, or may be displayed on a clipboard after generating the clipboard. -
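The coordinate handling in operations S202 through S204 relies on mapping points between the two screens using their size information. Assuming the screens are aligned and differ only in size (an illustrative simplification; the names and sizes below are not from the embodiment), the conversion in both directions is a single scaling:

```python
def scale_point(point, src_size, dst_size):
    """Map a point from one screen's coordinate space to another's,
    assuming the two screens are aligned and differ only in size."""
    (x, y), (sw, sh), (dw, dh) = point, src_size, dst_size
    return (x * dw / sw, y * dh / sh)

ext_size = (1080, 720)   # external device screen (illustrative)
disp_size = (540, 360)   # transparent display screen (illustrative)

# Request side: a touch at (200, 100) on the transparent display is
# reported to the external device in the external device's coordinates.
print(scale_point((200, 100), disp_size, ext_size))  # (400.0, 200.0)

# Display side: an object at (810, 540) on the external device is shown
# at the corresponding location on the transparent display.
print(scale_point((810, 540), ext_size, disp_size))  # (405.0, 270.0)
```

Either device can perform this conversion, which is why the embodiment allows the screen size information to travel with the request or with the response.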
FIG. 11 is a flowchart illustrating a method of displaying information in a transparent display device according to another exemplary embodiment. The method illustrated in FIG. 11 includes an editing function. - In operation S1101, the
transparent display device 100 receives a first touch input and a second touch input. The first and second touch inputs are the same as those described with reference to FIGS. 2 through 10. - In operation S1102, the
transparent display device 100 requests the external device 110 for information related to an object based on the first and second touch inputs. The request for the information related to the object is the same as that described in operation S202 of FIG. 2. - In operation S1103, the
transparent display device 100 receives information corresponding to the request from the external device 110. The information related to the object that is received is the same as that described in operation S203. - In operation S1104, the
transparent display device 100 displays the received information on the transparent display device 100. - In operation S1105, the
transparent display device 100 edits the received information that is displayed on the transparent display device 100 according to a user input. - That is, when the received object is displayed on the
transparent display device 100 as shown in FIG. 7C and a touch-based user input for combining the objects (701 and 702) is received, the transparent display device 100 displays a screen on which the objects are combined as shown in FIG. 7D. - The
user inputs for combining the objects are not limited to the example shown in FIG. 7C. - The editing operation in operation S1105 is not limited to the combination of the objects as shown in
FIGS. 7C and 7D. The editing may include various edits on the object, such as a change in the shape of the object or a change in the content of the object, and an edit on the screen including the object. - The
transparent display device 100 may perform the above editing operation based on an interaction with the external device 110. Accordingly, the information displayed on the external device 110 may reflect the editing result in the transparent display device 100 in real time. The editing result may be stored in the external device 110 only, in the transparent display device 100 only, or in both devices. -
FIG. 12 is a flowchart illustrating a method of displaying information in a transparent display device according to another exemplary embodiment. In FIG. 12, the transparent display device 100 is flexible, and a front portion and a rear portion of the transparent display device 100 may be transformed or deformed according to a touch-based input. The external device 110 has a touch screen. - In operation S1201, a touch input for selecting an object displayed on the
external device 110 that is seen through the transparent display device 100 is received. Here, the received touch input may correspond to the second touch input described with reference to FIGS. 2 through 10. - In operation S1202, front and rear surface portions of the
transparent display device 100 at which the touch input is received are deformed to protrude toward the external device 110. -
FIGS. 13A and 13B are side views showing a relation between the transparent display device 100, which is flexible and has the front and rear surface portions 1301 and 1302, and the external device 110. -
FIG. 13A is a side view showing the transparent display device 100 and the external device 110 overlapping each other before the touch input is received. -
FIG. 13B shows a case where the front and rear surface portions 1301 and 1302 of the transparent display device 100 are transformed together to touch a touch screen 1303 of the external device 110 according to the touch-based user input to the front surface portion 1301 of the transparent display device 100. The rear surface portion 1302 of the transparent display device 100 may be configured as a constant voltage type so that the touch screen of the external device 110 may recognize a contact portion of the rear surface portion 1302 of the transparent display device 100 as a touch-based input; however, the present invention is not limited thereto. That is, the rear surface portion 1302 may be configured according to a touch sensing type of the touch screen 1303 in the external device 110. - Meanwhile, in operation S1203, the
transparent display device 100 receives information related to the selected object from the external device 110 based on the touch input due to the contact between the rear surface portion 1302 of the transparent display device 100 and the external device 110. - In operation S1204, the
transparent display device 100 displays the received information. -
FIG. 14 is a flowchart illustrating a method of displaying information in a transparent display device 100 according to another exemplary embodiment. FIG. 14 shows a case where the information related to the object displayed on the external device 110 and the screen size information are transmitted based on a local area wireless communication between the transparent display device 100 and the external device 110. - In operation S1401, the
transparent display device 100 receives the information related to the object displayed on the external device 110 and the screen size information of the external device 110 via the local area wireless network. The local area wireless communication may include NFC, Bluetooth communication, Wi-Fi Direct communication, and IrDA communication, but is not limited thereto. - In operation S1402, the
transparent display device 100 checks whether the transparent display device 100 overlaps the external device 110. The checking in operation S1402 may include checking the intention of the user to display, on the transparent display device 100 according to the touch input to the transparent display device 100, the object displayed on the external device 110 that is seen through the transparent display device 100. The intention of the user may be interpreted as the intention to select an object to be displayed on the transparent display device 100. - The checking operation may be performed by disposing a contact sensor on the rear surface portion of the
transparent display device 100 or transmitting a sensing result sensed by a contact sensor disposed on a front surface portion of the external device 110 to the transparent display device 100 via the local area wireless communication, but is not limited thereto. - Also, in operation S1402, the
transparent display device 100 and the external device 110 may overlap so that the external device 110 may be included within the screen of the transparent display device 100 when the external device 110 is smaller, as shown in FIG. 3A, but the overlap is not limited thereto. If the transparent display device 100 is smaller than the external device 110, as shown in the example of FIG. 4A, a part of the external device 110 may overlap the transparent display device 100, but the overlap is not limited thereto. When the transparent display device 100 and the external device 110 have equal sizes, as shown in FIG. 9A, the overlapping surfaces of the transparent display device 100 and the external device 110 may be the same as each other. - In operation S1402, if it is determined that the
transparent display device 100 and the external device 110 overlap each other, the transparent display device 100 displays the information related to the object displayed on the external device 110 by using the information transmitted via the local area wireless communication according to the user input in operation S1403. The user input in operation S1403 may include a request for displaying the object displayed on the external device 110 that is seen through the transparent display device 100, but is not limited thereto. -
FIG. 15 is a flowchart illustrating a method of displaying information in a transparent display device according to another exemplary embodiment. FIG. 15 shows a case where information obtained by photographing the external device 110 using a camera function of the transparent display device 100 is displayed on the transparent display device 100 according to a user input. - In operation S1501, the
transparent display device 100 photographs an object displayed on the external device 110 by using the camera function. - In operation S1502, the
transparent display device 100 determines whether the transparent display device 100 and the external device 110 overlap each other. Determining whether the transparent display device 100 and the external device 110 overlap each other may be performed in the same manner as in operation S1402 described above. - In operation S1503, if it is determined that the
transparent display device 100 and the external device 110 overlap each other, the transparent display device 100 displays the photographed object of the external device 110 according to the user input. The user input may include a request for outputting the object displayed on the photographed external device 110, but is not limited thereto. -
FIG. 16 is a functional block diagram of the transparent display device 100 according to an exemplary embodiment. - Referring to
FIG. 16, the transparent display device 100 may include a transparent display 1610, a storage 1620, a communication interface 1630, a processor 1640, and a sensor 1650. However, the transparent display device 100 may further include additional components other than those shown in FIG. 16. For example, the transparent display device 100 may include an interface, such as a universal serial bus (USB) interface, or a camera module. - The
transparent display 1610 is configured so that the object displayed on a screen of the external device 110 may be seen through the transparent display 1610, and may be configured to receive a touch-based input. The transparent display 1610 may be formed in various types, for example, a transparent liquid crystal display (LCD) type, a transparent thin-film electroluminescent panel (TFEL) type, a transparent OLED type, or a projection type. Hereinafter, examples of the structure of the transparent display 1610 will be described. - The transparent LCD type is a transparent display device formed by removing the backlight unit from a currently used LCD device and using a pair of polarization plates, an optical film, a transparent thin film transistor (TFT), and a transparent electrode. Such a transparent display device may be referred to as a transparent display. In the case of the transparent LCD device, transmittance is degraded due to the polarization plates or the optical film, and optical efficiency is reduced since peripheral light is used instead of a backlight unit; however, a large-size transparent display may be realized. - The transparent TFEL type is a transparent display device using an alternating current (AC) type inorganic thin-film EL display (AC-TFEL) including a transparent electrode, an inorganic phosphor, and an insulating film. The AC-TFEL emits light when accelerated electrons pass through the inorganic phosphor and excite the phosphor. If the
transparent display 1610 is the transparent TFEL type, the processor 1640 may adjust the electrons to be projected to an appropriate location to determine a location for displaying the information. Since the inorganic phosphor and the insulating film are transparent, a transparent display may be easily obtained. - The transparent OLED type is a transparent display device using an OLED that emits light by itself. Since the organic emission layer is transparent, the OLED may serve as a transparent display device provided that both electrodes are realized as transparent electrodes. In the OLED, electrons and holes are injected from both sides of the organic emission layer and combine in the organic emission layer to emit light. The transparent OLED device may display information by injecting the electrons and holes at desired locations. -
FIG. 17 is a diagram showing a detailed structure of the transparent display 1610 that is formed as the transparent OLED type. However, the transparent display 1610 is not limited to the example shown in FIG. 17. - Referring to
FIG. 17, the transparent display 1610 includes a transparent substrate 1702, a transparent transistor layer 1703, a first transparent electrode 1704, a transparent organic emission layer 1705, a second transparent electrode 1706, and a connection electrode 1707. - The
transparent substrate 1702 may be formed of a transparent polymer material such as plastic, or of a glass material. The material forming the transparent substrate 1702 may be determined according to the environment in which the transparent display device 100 is used. For example, the polymer material is light and flexible, and thus may be applied to a portable display device, whereas the glass material may be applied to show windows or general windows. - The
transparent transistor layer 1703 is a layer including a transistor that is fabricated by replacing the opaque silicon used in a conventional TFT with a transparent material such as transparent zinc oxide or titanium oxide. In the transparent transistor layer 1703, a source, a gate, a drain, various dielectric layers, and a connection electrode 1707 for electrically connecting the drain to the first transparent electrode 1704 may be formed. The transparent transistor layer 1703 includes a plurality of transparent transistors that are distributed throughout the entire display surface of the transparent display device 100. The processor 1640 applies a control signal to the gate of each of the transistors in the transparent transistor layer 1703 to drive the corresponding transparent transistor and display information. - The first
transparent electrode 1704 and the second transparent electrode 1706 are disposed opposite each other with the transparent organic emission layer 1705 interposed therebetween. The first transparent electrode 1704, the transparent organic emission layer 1705, and the second transparent electrode 1706 form an organic light-emitting diode (OLED). - The transparent OLED may be classified as a passive matrix OLED (PMOLED) or an active matrix OLED (AMOLED) according to its driving method. The PMOLED has a structure in which the light-emitting portions are formed at the cross points between the first and second transparent electrodes 1704 and 1706. - Each of the first and second transparent electrodes 1704 and 1706 includes a plurality of line electrodes. When the line electrodes of the first transparent electrode 1704 are arranged in a transverse direction, the line electrodes of the second transparent electrode 1706 are arranged in a longitudinal direction. Accordingly, a plurality of crossing areas are formed between the first and second transparent electrodes 1704 and 1706. - The
processor 1640 generates a potential difference in each of the crossing areas by using the transparent transistors. The electrons and holes are induced to the transparent organic emission layer 1705 from the first and second electrodes 1704 and 1706, and combine there to emit light. -
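The passive-matrix addressing just described can be sketched in code. This is an illustrative model only, not part of the patent: the class name and its fields are assumptions, and a pixel is addressed by driving one transverse line electrode and one longitudinal line electrode so that light is emitted at their crossing area.

```python
class PassiveMatrixPanel:
    """Toy model of the PMOLED addressing described above: line electrodes
    of the first transparent electrode run transversely (rows), those of
    the second run longitudinally (columns), and a crossing area emits
    light while a potential difference is applied across it."""

    def __init__(self, rows, cols):
        self.rows = rows
        self.cols = cols
        self.emitting = set()  # crossing areas currently emitting light

    def drive(self, row, col):
        """Apply a potential difference across one crossing area."""
        if not (0 <= row < self.rows and 0 <= col < self.cols):
            raise ValueError("crossing area outside the panel")
        self.emitting.add((row, col))

    def release(self, row, col):
        """Remove the potential difference; the crossing area goes dark."""
        self.emitting.discard((row, col))


panel = PassiveMatrixPanel(rows=4, cols=6)
panel.drive(1, 3)    # light the crossing area of row line 1 and column line 3
panel.drive(2, 5)
panel.release(1, 3)  # only (2, 5) keeps emitting
```

In a real panel the driver scans the rows continuously; the model above only captures which crossings currently have a potential difference applied.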
transparent electrodes organic emission layer 1705 may be formed of various materials. - In addition, as described above, the
transparent display 1610 may be formed as the projection type, as well as the transparent LCD type, the transparent TFEL type, and the transparent OLED type. The projection type displays an image by projecting it onto a transparent screen, as in a head-up display. - Also, the
transparent display 1610 may be a dual-touchable touch screen, or a touch screen of which only the front surface is touchable. - The
transparent display 1610 displays information, including the object, processed in the transparent display device 100. The information may also include information other than the object; such information is displayed but cannot be selected by a user input. - The
transparent display 1610 is formed as a transparent device, and the transparency of the transparent display 1610 may be adjusted by adjusting the light transmittance of the transparent device or by adjusting the RGB value of each pixel. - Also, the
transparent display 1610 may have a structure in which an OLED and an LCD are combined. In this case, the OLED may be located adjacent to a front surface input portion, and the LCD may be located adjacent to a rear surface input portion. With this combined structure, the transparent display 1610 remains transparent, like glass, while powered off; when power is applied, the LCD blocks the light so that the transparent display 1610 becomes opaque. - The
transparent display 1610 receives a touch input of the user through the front surface input unit. The screen displayed on the transparent display 1610 may include a user interface (UI) or a graphic user interface (GUI). Also, the transparent display 1610 may receive and display the information related to the object from the external device 110 according to the touch inputs (the first and second touch inputs) of the user on the object displayed on the external device 110 that is seen through the transparent display 1610. - The
storage 1620 stores at least one program that is configured to execute the information display method on the transparent display 1610. The storage 1620 may include a high speed random access memory, a magnetic disk storage device, or a non-volatile memory such as a flash memory or other non-volatile semiconductor memories. -
FIG. 18 is a diagram illustrating software layers stored in the storage 1620 of the transparent display device 100 according to an exemplary embodiment. - Referring to
FIG. 18, the software layer may include a storage module 1810, a sensor and recognition module 1820, a communication module 1830, an input/output module 1860, and a legend module 1870, but is not limited thereto. - The
storage module 1810 includes a system database 1811, which is a storage for general data such as an address book and environmental information, and a touch mode data region 1812 for storing setting values for the touch modes of the objects that will be displayed on the transparent display 1610. - The
sensor and recognition module 1820 includes a module 1821 for sensing a touch on the transparent display 1610, and a module 1822 for classifying the input touch. The module 1822 may classify the touch input into a front input mode 1823 that transfers an input on the front surface input interface to an event processor X11, a rear input mode 1824 that transfers an input on the rear surface input interface to the event processor X11, and a dual mode 1825 that transfers a dual-touch input (a simultaneous touch on both the front surface input interface and the rear surface input interface) to the event processor X11. However, the sensor and recognition module 1820 may instead be configured with only an input mode that transfers input on the front surface of the transparent display 1610 to the event processor X11. - The
communication module 1830 may include a telephony module 1840 and a messaging module 1850, but is not limited thereto. - The
telephony module 1840 includes an information collection module 1842 for connecting a phone call, and a voice service module 1841 for transmitting voice over the Internet based on the voice over Internet protocol (VoIP). - The
messaging module 1850 includes an instant messaging module 1851 for conversations between users through an Internet connection, a module 1852 for short message service (SMS) text messages and multimedia messages, and a module 1853 for emailing. - The input/
output module 1860 includes a UI & graphic module 1861 and a multimedia module 1865. - The UI &
graphic module 1861 includes an X11 module 1862 for receiving a touch input through a window manager, a module 1863 that outputs all objects seen by a user on a screen, and an evaluation module 1864 for the mode setting value stored for each object and the current touch input. - The
multimedia module 1865 includes a moving picture reproducing module 1866, a moving picture and still image capturing module 1867, and a voice reproducing module 1868. - The programs for executing the information display method according to the exemplary embodiments may be stored in the
storage module 1871. The storage module 1871 may store various applications. - As described above, the
storage 1620 may store programs of various configurations, and is not limited to the example shown in FIG. 18. - The
communication interface 1630 may communicate with at least one of the external device 110, the server 120, and the AP 130. To perform communication, the communication interface 1630 may be configured to transmit/receive data via a wireless communication network such as wireless Internet, wireless Intranet, a wireless phone network, a wireless local area network (LAN), a Wi-Fi network, a Wi-Fi direct (WFD) network, a 3G network, a 4G Long Term Evolution (LTE) network, a Bluetooth network, an infrared data association (IrDA) network, a radio frequency identification (RFID) network, an ultra wideband (UWB) network, a Zigbee network, or a near field communication (NFC) network; however, the present invention is not limited thereto. In particular, the communication interface 1630 may include a global positioning system (GPS) module. - The
processor 1640 may perform operations according to the above-described exemplary embodiments by executing the programs stored in the storage 1620. The processor 1640 receives a first touch input representing reference information with respect to the external device 110 that is seen through the transparent display 1610, and a second touch input representing a selection of an object displayed on the external device 110. The processor 1640 requests the information related to the object from the external device 110, based on the first and second touch inputs, via the communication interface 1630. When receiving the information related to the object from the external device 110 via the communication interface 1630, the processor 1640 displays the received information on the transparent display 1610. - Operations of the
processor 1640 regarding the information display method according to the exemplary embodiments may be performed as described with reference to the flowcharts in FIGS. 2, 11, 12, 14, 15, and 21 that will be described later. - The
sensor 1650 senses a current status of the transparent display device 100, such as its location, a contact of the user on it, its orientation, and its acceleration or deceleration, and generates a sensing signal for controlling operations of the transparent display device 100. In particular, the sensor 1650 may generate a sensing signal regarding the location of the transparent display device 100 in order to receive information based on the augmented reality service described with reference to FIGS. 10A through 10D. -
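The status sensing described above can be sketched as a small function that turns raw device status into a sensing signal. The field names and the acceleration threshold are illustrative assumptions, not values defined in the patent.

```python
def make_sensing_signal(status):
    """Build a sensing signal from the device status described above:
    location, user contact, orientation, and acceleration/deceleration.
    Field names are illustrative; real hardware would report sensor values."""
    return {
        "location": status.get("location"),            # e.g. (latitude, longitude)
        "user_contact": bool(status.get("touching")),  # contact of the user
        "orientation": status.get("orientation", "portrait"),
        # arbitrary illustrative threshold for "device is accelerating"
        "accelerating": abs(status.get("acceleration", 0.0)) > 0.5,
    }


signal = make_sensing_signal({
    "location": (37.5665, 126.9780),
    "touching": True,
    "orientation": "landscape",
    "acceleration": 1.2,
})
```

Such a signal could then be used, for example, to attach the device location to a request for augmented-reality information.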
FIG. 19 is a functional block diagram of the transparent display device 100 according to an exemplary embodiment. - Referring to
FIG. 19, the transparent display device 100 may include a transparent display 1901, a user input interface 1902, a sensor 1903, a camera 1904, a storage 1905, a communication interface 1906, a port 1907, an audio input interface 1908, an audio signal processor 1909, an audio output interface 1910, a power supply 1911, and a processor 1912, but is not limited thereto. That is, the transparent display device 100 may include fewer components than those of FIG. 19, or may include additional components other than those of FIG. 19. - The
transparent display 1901 may be referred to as a touch screen. The transparent display 1901 may display objects, and may receive a touch-based user input via at least one of its front surface and its rear surface. To do this, the transparent display 1901 includes at least one touch sensor. The touch sensor may recognize the user input based on (x, y) coordinates, and may include a sensor for recognizing a direct touch or a sensor for recognizing a proximity touch. - The user input may be generated according to a request of the user, based on the user's gesture or selection. A gesture of the user may be variously defined by combinations of the number of touches, touch pattern, touch area, and touch intensity.
- As described above with reference to the
transparent display 1610 of FIG. 16, the transparent display 1901 is formed as a transparent device, and the transparency of the transparent display 1901 may be adjusted by adjusting the light transmittance of the transparent device or by adjusting the RGB value of each pixel. Also, the transparent display 1901 may have a structure in which an OLED and an LCD are combined. In this case, the OLED may be located adjacent to a front surface of the transparent display 1901, and the LCD may be located adjacent to a rear surface of the transparent display 1901. - The
transparent display 1901 may display a screen that responds to each of a touch-based user input through at least one of its front and rear surfaces, a user input based on the sensor 1903, a user input via the camera 1904, and a user input via the audio input interface 1908. The screen displayed on the transparent display 1901 may include a UI or a GUI screen. - The
transparent display 1901 may have a physical structure like that of the transparent display 1610 described with reference to FIG. 16. Two or more transparent displays 1901 may be provided according to the type of the transparent display device 100. - The
user input interface 1902 generates input data (or control data) and a user input for controlling operations of the transparent display device 100. The user input interface 1902 may include a keypad, a dome switch, a touch pad used instead of a mouse, a jog wheel, a jog switch, and a hardware (H/W) button. - The
sensor 1903, like the sensor 1650 shown in FIG. 16, senses a current status of the transparent display device 100, such as its location, a contact of the user on it, its orientation, and its acceleration or deceleration, and generates a sensing signal for controlling operations of the transparent display device 100. - The
sensor 1903 may include sensors other than the sensors for sensing the direct touch or the proximity touch described with regard to the transparent display 1901. For example, the sensor 1903 may include a proximity sensor. The proximity sensor detects whether an object approaches a previously set detection surface, or whether an external object is present nearby, by using the force of an electromagnetic field or an infrared ray without any actual physical touch. Examples of the proximity sensor include a transparent photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a high frequency oscillation photoelectric sensor, a capacitive photoelectric sensor, a magnetic photoelectric sensor, an infrared photoelectric sensor, etc. - The
camera 1904 processes an image frame, such as a still image or a moving image, obtained from an image sensor in a conference call mode or a photographing mode. The processed image frame may be displayed on the transparent display 1901. The image frame processed by the camera 1904 may be stored in the storage 1905 or may be transmitted to another device through the communication interface 1906 or the port 1907. The device receiving the transmitted image frame may include at least one of the external device 110, the server 120, and the AP 130, but is not limited thereto. - The
camera 1904 may also be configured to receive the user input on the front and rear surfaces of the transparent display 1901 or to photograph the object. The number of cameras 1904 may be two or more according to the structure of the transparent display device 100. The camera 1904 may be used as an input apparatus that recognizes a user's spatial gesture. - The
storage 1905 stores at least one program configured to be executed by the processor 1912, which will be described later, and a resource. The at least one program includes a program that executes the information display method, an operating system (OS) program of the transparent display device 100, applications set in the transparent display device 100, and a program necessary for performing various functions (for example, a communication function and a display function) of the transparent display device 100. -
FIGS. 3 through 10 , but is not limited thereto. - The
storage 1905 may be configured to independently include a storage that stores at least one program necessary for performing the various functions of the transparent display device 100 together with an operating system program, and a storage that stores one or more programs, resources, and various applications that execute the information display method. - The
storage 1905 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, an SD or XD memory), a read only memory (ROM), an electronically erasable programmable read-only memory (EEPROM), a programmable read only memory (PROM), a magnetic memory, and an optical disk. - The
communication interface 1906 may be configured to transmit data to and receive data from at least one of the external device 110, the server 120, and the AP 130 via a wireless communication network such as wireless Internet, wireless Intranet, a wireless phone network, a wireless local area network (LAN), a Wi-Fi network, a Wi-Fi direct (WFD) network, a 3G network, a 4G Long Term Evolution (LTE) network, a Bluetooth network, an infrared data association (IrDA) network, a radio frequency identification (RFID) network, an ultra wideband (UWB) network, a Zigbee network, or a near field communication (NFC) network, but is not limited thereto. - The
communication interface 1906 may include at least one of a broadcasting reception module, a mobile communication module, a wireless Internet module, a wired Internet module, a short distance communication module, and a location information module, but is not limited thereto. - The broadcasting reception module receives a broadcasting signal and/or broadcasting related information from an external broadcasting management server through a broadcasting channel. The broadcasting channel may include a satellite channel and a terrestrial channel. The mobile communication module transmits and receives a wireless signal to and from at least one of a base station, the
external device 110, and the server 120 over a mobile communication network. The wireless signal may include various types of data according to a voice call signal, a conference phone call, or transmission/reception of a text/multimedia message. The wireless Internet module is a module for a wireless Internet connection, and the wired Internet module is a module for a wired Internet connection. - The short distance communication module is a module for short distance communication. Short distance communication technologies may use Bluetooth, RFID, IrDA, UWB, Zigbee, WFD, NFC, etc. As in the exemplary embodiment shown in
FIG. 14, the information about the object displayed on the external device 110 and the screen size information may be received via the short distance communication module. For example, when the NFC communication method is used and the distance between the external device 110 and the transparent display device 100 is within the short distance communication radius of NFC, the transparent display device 100 may read or receive the above information from the external device 110. - The
transparent display device 100. As an example, a GPS module may be used. The GPS module receives location information from a plurality of satellites. The location information may include coordinate information represented by latitude and longitude. - The
port 1907 may transmit data to and receive data from the external device 110 by using a plug and play interface such as a USB port. The plug and play interface is a module that automatically detects the external device 110 when it is plugged into the transparent display device 100 and enables its use (i.e., play). The device is not limited to the external device 110. - The
audio input interface 1908 receives an input of an external sound signal in a call mode, a recording mode, a voice recognition mode, etc. The audio input interface 1908 may be configured as, for example, a microphone. The audio input interface 1908 may be configured to include various noise removal algorithms for removing the noise that occurs while receiving the input of the external sound signal. - The sound signal input by using the
audio input interface 1908 may be a user's input representing a selection of the object displayed on the external device 110 that is seen through the transparent display 1901, according to an exemplary embodiment. The sound signal input by using the audio input interface 1908 may be stored in the storage 1905 or may be transmitted to the outside through the communication interface 1906 or the port 1907. The outside may include the external device 110, other external devices (not shown), the server 120, and the AP 130. - The audio
signal processor 1909 provides an interface between the audio input interface 1908 and the processor 1912, and between the audio output interface 1910 and the processor 1912. That is, the audio signal processor 1909 converts the sound signal received from the audio input interface 1908 into audio data that may be processed by the processor 1912, and transmits the audio data to the processor 1912. The audio signal processor 1909 also converts the audio data transmitted from the processor 1912 into an electrical sound signal and transmits the electrical sound signal to the audio output interface 1910. - The
audio output interface 1910 outputs the sound signal or the audio signal received from the audio signal processor 1909 in the call mode or an audio reproduction mode. The audio output interface 1910 may be configured as a speaker. The audio input interface 1908 and the audio output interface 1910 may be integrally configured, like a headset. - The
transparent display 1901, the user input interface 1902, the sensor 1903, the camera 1904, and the audio input interface 1908 may be referred to as input apparatuses or input/output apparatuses according to the function of the user interface between the transparent display device 100 and the user. For example, in a case where the function of the user interface between the transparent display device 100 and the user includes a touch screen function, a sound recognition function, and a spatial gesture recognition function, the user input interface 1902, the sensor 1903, the camera 1904, and the audio input interface 1908 may be referred to as the input apparatuses, and the transparent display 1901 may be referred to as the input/output apparatus. - The
power supply 1911 supplies power to the various elements of the transparent display device 100. The power supply 1911 includes one or more power sources, such as a battery and an alternating current (AC) power source. Alternatively, the transparent display device 100 may not include the power supply 1911 but may include a connection unit (not shown) that may be connected to an external power supply (not shown). - The
processor 1912 may be one or more processors that control the general operation of the transparent display device 100. Although the processor 1912 is implemented as a single chip in FIG. 19, the processor 1912 may be divided into a plurality of processors according to the functions of the transparent display device 100. - The
processor 1912 may generally control the transparent display 1901, the user input interface 1902, the sensor 1903, the camera 1904, the storage 1905, the communication interface 1906, the port 1907, the audio input interface 1908, the audio signal processor 1909, and the audio output interface 1910. Thus, the processor 1912 may be referred to as a controller, a microprocessor, a digital signal processor, etc. - The
processor 1912 may also provide a user interface based on the transparent display 1901 for the user's input received through the transparent display 1901, the user input interface 1902, the sensor 1903, the camera 1904, and the audio input interface 1908, which correspond to input apparatuses. - The
processor 1912 may execute at least one program related to the information display method according to the exemplary embodiments. The processor 1912 may execute the program by reading it from the storage 1905, or by downloading it from an external apparatus such as an application providing server (not shown) or a market server (not shown) through the communication interface 1906. - The
processor 1912 may be understood to include an interface function unit that interfaces between the various functional modules of the transparent display device 100 and the processor 1912. The operation of the processor 1912 related to the information display method according to the exemplary embodiments may be performed as shown in the flowcharts of FIGS. 2, 11, 12, 14, 15, and 21 that will be described later. -
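The overall flow the processor executes (receive a reference touch and a selection touch, request the related information from the external device, then display the reply) can be sketched as follows. The function and callback names are assumptions made for illustration; `request_info` and `display` stand in for the communication interface and the transparent display.

```python
def run_information_display(first_touch, second_touch, request_info, display):
    """Sketch of the processor's flow described above: the first touch gives
    reference information about the external device seen through the display,
    the second touch selects an object displayed on it; the related
    information is then requested from the external device and displayed
    when it arrives."""
    request = {"reference": first_touch, "selection": second_touch}
    info = request_info(request)   # ask the external device via the network
    if info is not None:
        display(info)              # show whatever came back
    return info


shown = []
result = run_information_display(
    first_touch=(5, 5),
    second_touch=(40, 80),
    # hypothetical reply from the external device, for illustration only
    request_info=lambda req: {"object": "icon-3", "detail": "calendar app"},
    display=shown.append,
)
```

Passing the communication and display steps in as callables keeps the control flow testable independently of any actual hardware or network.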
FIG. 20 is a flowchart illustrating operations of the external device 110 according to an exemplary embodiment. - In operation S2001, the
external device 110 receives, from the transparent display device 100, a request for information related to at least one object displayed on the external device 110. The request for information may be transmitted via at least one of direct communication between the devices, communication via a server, and communication via a repeater. - The
transparent display device 100. The first touch input is a user input to thetransparent display device 100 for representing the reference information about theexternal device 110 that is seen through thetransparent display device 100. The second touch input is a user input to thetransparent display device 100 for selecting at least one object displayed on theexternal device 110 that is seen through thetransparent display device 100. - The request for information related to the object may include the displayed location information (coordinate information) of the selected object on the
external device 110 as described in the above exemplary embodiments, the screen size of thetransparent display device 100, and the coordinate information of the first and second touch inputs on thetransparent display device 100, but is not limited thereto. - However, when the
transparent display device 100 operates according to the flowchart shown in FIG. 21, the request for information related to the object may be based on a touch input corresponding to the second touch input. - In operation S2002, the
external device 110 selects an object in response to the received request for the information related to the object. For example, if the requested object is an icon, the external device 110 selects the icon and the application program connected to the icon. If the requested object is a folder, the external device 110 selects the requested folder and the files or data located at a lower layer of the folder. If the requested object is an object included in one screen, the external device 110 selects the object by using the coordinate information included in the received request. If the request concerns a plurality of objects included in one screen, the external device 110 respectively selects the plurality of objects by using the coordinate information of each object included in the received request. - In operation S2003, the
external device 110 transmits the information related to the selected object to the transparent display device 100. The information related to the object is transmitted to the transparent display device 100 in the same manner as the request for the information was received, but is not limited thereto. For example, the request for the information may be received via direct communication between the devices, and the information related to the object selected in response to the request may be transmitted to the transparent display device 100 via the repeater or the server. -
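The external device's side of FIG. 20 (receive a request carrying coordinates, select the matching object, and return its related information) can be sketched like this. The data shapes, field names, and bounds convention are assumptions made for illustration, not structures defined in the patent.

```python
def handle_object_request(screen_objects, request):
    """Sketch of operations S2001-S2003 described above: find the object
    whose on-screen bounds contain the requested coordinates (S2002) and
    return its related information (S2003). Returns None if no object
    matches, so the caller can reply with an empty result."""
    x, y = request["coordinates"]
    for obj in screen_objects:
        left, top, right, bottom = obj["bounds"]
        if left <= x <= right and top <= y <= bottom:
            return {"object_id": obj["id"], "related_info": obj["info"]}
    return None


# Hypothetical objects displayed on the external device, for illustration.
objects = [
    {"id": "icon-7", "bounds": (0, 0, 50, 50), "info": "calendar application"},
    {"id": "folder-2", "bounds": (60, 0, 110, 50), "info": "photo folder contents"},
]
reply = handle_object_request(objects, {"coordinates": (75, 25)})
```

A real implementation would also apply the screen-size and touch-coordinate mapping from the request before the hit test, since the coordinates arrive in the transparent display device's coordinate space.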
FIG. 21 is a flowchart illustrating a method of displaying information on a transparent display device according to another exemplary embodiment. FIG. 21 shows a case where the second touch input mentioned with reference to FIG. 2 is used. - In operation S2101, the
transparent display device 100 receives a touch input for selecting an object displayed on the external device 110 that is seen through the transparent display device 100. Here, the touch input corresponds to the second touch input mentioned in FIGS. 2 through 10. The information about the external device 110 that is seen through the transparent display device 100 may be the same as that mentioned in FIGS. 1 through 10. - In operation S2102, the
transparent display device 100 requests the external device 110 for information related to the object selected based on the touch input. The signal transmitted to the external device 110 for requesting this information includes a signal for requesting information related to the object selected based on the second touch input mentioned in FIGS. 2 through 10. - In operation S2103, the
transparent display device 100 receives the information about the selected object from the external device 110. The received information corresponds to the request signal in operation S2102, and may be the same as the information received in operation S203 shown in FIG. 2. - In operation S2104, the
transparent display device 100 displays the received information. The received information may be displayed in the same manner as in operation S204 shown in FIG. 2. - The flowchart shown in
FIG. 21 may be modified to include the operation S1105 shown in FIG. 11 so that the object displayed on the transparent display device 100 may be edited based on the interaction between the transparent display device 100 and the external device 110. - The information display method according to the exemplary embodiments may also be embodied as computer readable code on a computer readable recording medium. The computer readable medium may be any recording apparatus capable of storing data that is read by a computer system, e.g., a read-only memory (ROM), a random access memory (RAM), a compact disc (CD)-ROM, a magnetic tape, a floppy disk, an optical data storage device, and so on. The computer readable medium may be distributed among computer systems that are interconnected through a network, and the present invention may be stored and implemented as computer readable code in the distributed system.
- While the exemplary embodiments have been particularly shown and described, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the following claims.
Claims (67)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/002,829 US10788977B2 (en) | 2012-09-19 | 2018-06-07 | System and method for displaying information on transparent display device |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR20120104156 | 2012-09-19 | ||
KR10-2012-00104156 | 2012-09-19 | ||
KR10-2012-0104156 | 2012-09-19 | ||
KR1020130106227A KR102255832B1 (en) | 2012-09-19 | 2013-09-04 | System and method for displaying information on transparent display device |
KR10-2013-00106227 | 2013-09-04 | ||
KR10-2013-0106227 | 2013-09-04 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/002,829 Continuation US10788977B2 (en) | 2012-09-19 | 2018-06-07 | System and method for displaying information on transparent display device |
Publications (3)
Publication Number | Publication Date |
---|---|
US20140078089A1 US20140078089A1 (en) | 2014-03-20 |
US10007417B2 US10007417B2 (en) | 2018-06-26 |
US20180196581A9 true US20180196581A9 (en) | 2018-07-12 |
Family
ID=49231289
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/031,483 Active 2034-11-28 US10007417B2 (en) | 2012-09-19 | 2013-09-19 | System and method for displaying information on transparent display device |
US16/002,829 Active US10788977B2 (en) | 2012-09-19 | 2018-06-07 | System and method for displaying information on transparent display device |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/002,829 Active US10788977B2 (en) | 2012-09-19 | 2018-06-07 | System and method for displaying information on transparent display device |
Country Status (5)
Country | Link |
---|---|
US (2) | US10007417B2 (en) |
EP (1) | EP2711826A1 (en) |
CN (1) | CN104641328B (en) |
TW (1) | TWI637312B (en) |
WO (1) | WO2014046456A1 (en) |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20130092281A (en) * | 2012-02-10 | 2013-08-20 | Samsung Electronics Co., Ltd. | Method for updating the latest service category table in device and the device therefor
US9261989B2 (en) | 2012-09-13 | 2016-02-16 | Google Inc. | Interacting with radial menus for touchscreens |
US9195368B2 (en) * | 2012-09-13 | 2015-11-24 | Google Inc. | Providing radial menus with touchscreens |
EP2966560B1 (en) * | 2014-07-08 | 2020-01-22 | Nokia Technologies Oy | Determination of an apparatus display region |
KR20160015843A (en) * | 2014-07-31 | 2016-02-15 | Samsung Electronics Co., Ltd. | Display apparatus and method for controlling the apparatus thereof |
CN105528023B (en) * | 2014-10-27 | 2019-06-25 | Lenovo (Beijing) Co., Ltd. | Display control method, display device and electronic equipment |
US9665697B2 (en) * | 2015-03-17 | 2017-05-30 | International Business Machines Corporation | Selectively blocking content on electronic displays |
US9980304B2 (en) | 2015-04-03 | 2018-05-22 | Google Llc | Adaptive on-demand tethering |
CN104965241B (en) * | 2015-07-17 | 2017-10-24 | Yang Yi | A kind of discoloration projection eyeglass and the HUD with the eyeglass |
CN105589653A (en) * | 2015-12-25 | 2016-05-18 | Guangzhou Shirui Electronics Co., Ltd. | Control method and control system for head-up display |
US10254577B2 (en) | 2016-02-09 | 2019-04-09 | Lim Industries LLC | Electronic panel having a see-through mode |
US11243735B2 (en) | 2016-02-09 | 2022-02-08 | Lim Industries LLC | Electronic panel having multiple display devices and a multi-state device operable with a processor to control a see-through mode and a plurality of display modes |
CN107728349B (en) * | 2016-08-12 | 2021-04-27 | Shenzhen Royole Technologies Co., Ltd. | Interactive device and method capable of switching display content |
KR20180056174A (en) | 2016-11-18 | 2018-05-28 | Samsung Electronics Co., Ltd. | Method for contents processing and electronic device supporting the same |
TWI629675B (en) | 2017-08-18 | 2018-07-11 | Industrial Technology Research Institute | Image recognition system and information displaying method thereof |
JP6959529B2 (en) * | 2018-02-20 | 2021-11-02 | Fujitsu Limited | Input information management program, input information management method, and information processing device |
CN112384972A (en) * | 2018-03-27 | 2021-02-19 | Vizetto Inc. | System and method for multi-screen display and interaction |
TWI650581B (en) * | 2018-08-08 | 2019-02-11 | National Central University | High contrast double transparent display |
KR20210014813A (en) | 2019-07-30 | 2021-02-10 | Samsung Display Co., Ltd. | Display device |
CN110989913A (en) * | 2019-11-22 | 2020-04-10 | Vivo Mobile Communication Co., Ltd. | Control method and electronic equipment |
WO2021171915A1 (en) * | 2020-02-28 | 2021-09-02 | Panasonic Intellectual Property Corporation of America | Smart window device, video display method, and program |
TWI745955B (en) * | 2020-05-06 | 2021-11-11 | Acer Inc. | Augmented reality system and anchor display method thereof |
CN113703161B (en) * | 2020-05-22 | 2023-07-25 | Acer Inc. | Augmented reality system and anchoring display method thereof |
Family Cites Families (61)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5583946A (en) * | 1993-09-30 | 1996-12-10 | Apple Computer, Inc. | Method and apparatus for recognizing gestures on a computer system |
KR100474724B1 (en) * | 2001-08-04 | 2005-03-08 | Samsung Electronics Co., Ltd. | Apparatus having touch screen and external display device using method therefor |
US6980202B2 (en) * | 2001-12-21 | 2005-12-27 | International Business Machines Corporation | Method and system for creating and accessing hyperlinks from annotations relating to a physical document |
US7743348B2 (en) * | 2004-06-30 | 2010-06-22 | Microsoft Corporation | Using physical objects to adjust attributes of an interactive display application |
JP4537403B2 (en) * | 2004-08-04 | 2010-09-01 | Hitachi, Ltd. | Image processing device |
JP4517827B2 (en) | 2004-11-22 | 2010-08-04 | Hitachi, Ltd. | Screen sharing system and information processing apparatus |
US9030497B2 (en) * | 2005-06-30 | 2015-05-12 | Nec Display Solutions, Ltd. | Display device and arrangement method of OSD switches |
DE102005043310B4 (en) * | 2005-09-12 | 2007-10-04 | Siemens Ag | Display system, in particular for an industrial automation device |
KR100765789B1 (en) * | 2006-06-27 | 2007-10-12 | Samsung Electronics Co., Ltd. | Method and apparatus for displaying information on external device, and computer readable medium recording program performing the method |
US7796802B2 (en) * | 2006-09-26 | 2010-09-14 | The Boeing Company | System for recording and displaying annotated images of object features |
US20120113028A1 (en) * | 2010-06-28 | 2012-05-10 | Cleankeys Inc. | Method for detecting and locating keypress-events on touch- and vibration-sensitive flat surfaces |
EP2283421B1 (en) * | 2008-05-20 | 2019-08-14 | Citrix Systems, Inc. | Methods and systems for using external display devices with a mobile computing device |
US8314859B2 (en) * | 2008-05-29 | 2012-11-20 | Lg Electronics Inc. | Mobile terminal and image capturing method thereof |
CN102077161B (en) * | 2008-06-30 | 2017-08-01 | NEC Corporation | Message processing device, display control method and recording medium |
JP5080401B2 (en) * | 2008-08-25 | 2012-11-21 | PFU Limited | Information processing apparatus, transparent display element control method, and program |
US8279174B2 (en) * | 2008-08-27 | 2012-10-02 | Lg Electronics Inc. | Display device and method of controlling the display device |
KR20100062158A (en) * | 2008-12-01 | 2010-06-10 | Samsung Electronics Co., Ltd. | Display apparatus and method of displaying |
US8565829B2 (en) | 2009-03-02 | 2013-10-22 | Lg Electronics Inc. | Mobile terminal with detachably coupled sub-device |
KR101587102B1 (en) | 2009-03-02 | 2016-01-20 | LG Electronics Inc. | Mobile terminal |
JP5428436B2 (en) * | 2009-03-25 | 2014-02-26 | Sony Corporation | Electronic device, display control method and program |
US9241062B2 (en) * | 2009-05-20 | 2016-01-19 | Citrix Systems, Inc. | Methods and systems for using external display devices with a mobile computing device |
US8823743B2 (en) * | 2009-10-02 | 2014-09-02 | Sony Corporation | Image processing device and method, and program |
KR101657565B1 (en) * | 2010-04-21 | 2016-09-19 | LG Electronics Inc. | Augmented Remote Controller and Method of Operating the Same |
KR101694159B1 (en) * | 2010-04-21 | 2017-01-09 | LG Electronics Inc. | Augmented Remote Controller and Method of Operating the Same |
KR20110118421A (en) * | 2010-04-23 | 2011-10-31 | LG Electronics Inc. | Augmented remote controller, augmented remote controller controlling method and the system for the same |
KR20110081040A (en) * | 2010-01-06 | 2011-07-13 | Samsung Electronics Co., Ltd. | Method and apparatus for operating content in a portable terminal having transparent display panel |
KR101087479B1 (en) | 2010-01-29 | 2011-11-25 | Pantech Co., Ltd. | Multi display device and method for controlling the same |
US8621365B2 (en) | 2010-04-06 | 2013-12-31 | Asustek Computer Inc. | File sharing method and system |
IT1400746B1 (en) * | 2010-06-30 | 2013-07-02 | Damian S R L | AUTOMATIC PRODUCT DISTRIBUTOR MACHINE |
US8362992B2 (en) * | 2010-07-21 | 2013-01-29 | Delphi Technologies, Inc. | Dual view display system using a transparent display |
KR101688942B1 (en) | 2010-09-03 | 2016-12-22 | LG Electronics Inc. | Method for providing user interface based on multiple display and mobile terminal using this method |
EP2431895B1 (en) * | 2010-09-16 | 2017-07-26 | LG Electronics Inc. | Transparent display device and method for providing information using the same |
KR20120029228A (en) * | 2010-09-16 | 2012-03-26 | LG Electronics Inc. | Transparent display device and method for providing object information |
WO2012054063A1 (en) * | 2010-10-22 | 2012-04-26 | Hewlett-Packard Development Company L.P. | An augmented reality display system and method of display |
US20120102439A1 (en) * | 2010-10-22 | 2012-04-26 | April Slayden Mitchell | System and method of modifying the display content based on sensor input |
US20120102438A1 (en) * | 2010-10-22 | 2012-04-26 | Robinson Ian N | Display system and method of displaying based on device interactions |
US9007277B2 (en) * | 2010-10-28 | 2015-04-14 | Microsoft Technology Licensing, Llc | Transparent display assembly |
US20120105428A1 (en) * | 2010-10-28 | 2012-05-03 | Microsoft Corporation | Transparent display configuration modes |
US8941683B2 (en) * | 2010-11-01 | 2015-01-27 | Microsoft Corporation | Transparent display interaction |
US8605048B2 (en) * | 2010-11-05 | 2013-12-10 | Bluespace Corporation | Method and apparatus for controlling multimedia contents in realtime fashion |
DE102010052244A1 (en) * | 2010-11-23 | 2012-05-24 | Pierre-Alain Cotte | Method and device for displaying a graphical user interface of a portable computing unit on an external display device |
EP2500814B1 (en) * | 2011-03-13 | 2019-05-08 | LG Electronics Inc. | Transparent display apparatus and method for operating the same |
US8698771B2 (en) * | 2011-03-13 | 2014-04-15 | Lg Electronics Inc. | Transparent display apparatus and method for operating the same |
EP2500816B1 (en) * | 2011-03-13 | 2018-05-16 | LG Electronics Inc. | Transparent display apparatus and method for operating the same |
KR101226519B1 (en) * | 2011-04-14 | 2013-01-25 | Cheil Worldwide Inc. | System for advertising products and method for advertising the same |
JP5830987B2 (en) * | 2011-07-06 | 2015-12-09 | Sony Corporation | Display control apparatus, display control method, and computer program |
US9465507B2 (en) * | 2011-10-19 | 2016-10-11 | Microsoft Technology Licensing, Llc | Techniques to facilitate asynchronous communication |
KR101952170B1 (en) * | 2011-10-24 | 2019-02-26 | LG Electronics Inc. | Mobile device using the searching method |
US9063566B2 (en) * | 2011-11-30 | 2015-06-23 | Microsoft Technology Licensing, Llc | Shared collaboration using display device |
TWI526888B (en) * | 2011-12-20 | 2016-03-21 | Jin Yu Enterprise Co., Ltd. | Vending machine and operating system and operating method thereof |
US9734633B2 (en) * | 2012-01-27 | 2017-08-15 | Microsoft Technology Licensing, Llc | Virtual environment generating system |
KR20130094095A (en) * | 2012-02-15 | 2013-08-23 | Samsung Display Co., Ltd. | Transparent display device and operating method thereof |
JP5137150B1 (en) * | 2012-02-23 | 2013-02-06 | Wacom Co., Ltd. | Handwritten information input device and portable electronic device provided with handwritten information input device |
WO2013162583A1 (en) * | 2012-04-26 | 2013-10-31 | Intel Corporation | Augmented reality computing device, apparatus and system |
US20130316767A1 (en) * | 2012-05-23 | 2013-11-28 | Hon Hai Precision Industry Co., Ltd. | Electronic display structure |
US9152226B2 (en) * | 2012-06-15 | 2015-10-06 | Qualcomm Incorporated | Input method designed for augmented reality goggles |
US20130342427A1 (en) * | 2012-06-25 | 2013-12-26 | Hon Hai Precision Industry Co., Ltd. | Monitoring through a transparent display |
US20130342696A1 (en) * | 2012-06-25 | 2013-12-26 | Hon Hai Precision Industry Co., Ltd. | Monitoring through a transparent display of a portable device |
US20140035877A1 (en) * | 2012-08-01 | 2014-02-06 | Hon Hai Precision Industry Co., Ltd. | Using a display device with a transparent display to capture information concerning objectives in a screen of another display device |
TW201407431A (en) * | 2012-08-03 | 2014-02-16 | Novatek Microelectronics Corp | Portable apparatus |
US9250783B2 (en) * | 2012-08-21 | 2016-02-02 | Apple Inc. | Toggle gesture during drag gesture |
2013
- 2013-09-14 TW TW102133303A patent/TWI637312B/en not_active IP Right Cessation
- 2013-09-17 WO PCT/KR2013/008399 patent/WO2014046456A1/en active Application Filing
- 2013-09-17 CN CN201380048814.6A patent/CN104641328B/en not_active Expired - Fee Related
- 2013-09-18 EP EP13185054.7A patent/EP2711826A1/en not_active Withdrawn
- 2013-09-19 US US14/031,483 patent/US10007417B2/en active Active
2018
- 2018-06-07 US US16/002,829 patent/US10788977B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN104641328A (en) | 2015-05-20 |
US10007417B2 (en) | 2018-06-26 |
TW201419124A (en) | 2014-05-16 |
US20190278452A9 (en) | 2019-09-12 |
EP2711826A1 (en) | 2014-03-26 |
CN104641328B (en) | 2017-12-22 |
US20180292967A1 (en) | 2018-10-11 |
TWI637312B (en) | 2018-10-01 |
US20140078089A1 (en) | 2014-03-20 |
WO2014046456A1 (en) | 2014-03-27 |
US10788977B2 (en) | 2020-09-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10788977B2 (en) | System and method for displaying information on transparent display device | |
KR102164453B1 (en) | Object control method performed in device including transparent display, the device, and computer readable recording medium thereof | |
US10296127B2 (en) | Object control method performed in device including transparent display, the device, and computer readable recording medium thereof | |
US10470538B2 (en) | Portable terminal and display method thereof | |
CN105452811B (en) | User terminal device for displaying map and method thereof | |
US10299110B2 (en) | Information transmission method and system, device, and computer readable recording medium thereof | |
US10205873B2 (en) | Electronic device and method for controlling a touch screen of the electronic device | |
US20150067590A1 (en) | Method and apparatus for sharing objects in electronic device | |
KR20140010596A (en) | Control method for terminal using touch and gesture input and terminal thereof | |
US9658762B2 (en) | Mobile terminal and method for controlling display of object on touch screen | |
KR20150004713A (en) | Method and apparatus for managing application in a user device | |
US10331340B2 (en) | Device and method for receiving character input through the same | |
KR102183445B1 (en) | Portable terminal device and method for controlling the portable terminal device thereof | |
EP2808777B1 (en) | Method and apparatus for gesture-based data processing | |
KR102255832B1 (en) | System and method for displaying information on transparent display device | |
AU2015200541A1 (en) | Object control method performed in device including transparent display, the device, and computer readable recording medium thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEE, CHANG-SOO;KANG, KYUNG-A;REEL/FRAME:031241/0671. Effective date: 20130910 |
FEPP | Fee payment procedure | Free format text: PETITION RELATED TO MAINTENANCE FEES GRANTED (ORIGINAL EVENT CODE: PTGR); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STCF | Information on status: patent grant | Free format text: PATENTED CASE |
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4 |