EP2800025B1 - Portable terminal and method for protecting a displayed object - Google Patents
- Publication number
- EP2800025B1 (European patent granted on application EP14166857.4A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- screen
- portable terminal
- area
- input
- controller
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Not-in-force
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/22—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of characters or indicia using display control signals derived from coded signals representing the characters or indicia, e.g. with a character-code memory
- G09G5/30—Control of display attribute
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/20—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/37—Details of the operation on graphic patterns
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2221/00—Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/03—Indexing scheme relating to G06F21/50, monitoring users, programs or devices to maintain the integrity of platforms
- G06F2221/032—Protect output to user by software means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2221/00—Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/21—Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/2105—Dual mode as a secondary aspect
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
- G09G2320/066—Adjustment of display parameters for control of contrast
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
- G09G2320/0686—Adjustment of display parameters with two or more screen areas displaying information with different brightness or colours
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2358/00—Arrangements for display data security
Definitions
- EP 1 681 841 A1 teaches a message display method for a mobile communication terminal comprising changing the colors of a letter of a message displayed on a screen or a color of a background of the message or both to be similar to or the same as each other to protect sensitive information contained in text messages from curious eyes. Based on the position of a cursor, only letters on the line of the cursor or a certain number of letters located in front of the cursor are displayed normally, while the remainder of the text message is displayed with the color of the background or the background is modified to hide the remainder of the text message.
- WO 99/47990 discloses an electronic privacy screen for a computer display, wherein a portion of the screen image outside the privacy viewer window is obscured by a privacy mask, which modifies the font of displayed text.
- the location of the privacy viewer window is determined relative to an active position indicator, such as a mouse pointer or text cursor.
- Korean patent KR 10-0765953 B1 discloses a privacy protection function for part of a text message, wherein the font size of all of the text message but the word, where the cursor is located, is reduced.
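The prior-art approaches above share one idea: render only the text at the cursor normally and hide the rest. The following Python sketch is purely illustrative (the function name and the use of blank characters to stand in for background-coloured text are assumptions, not anything disclosed in the cited documents):

```python
# Hypothetical sketch of cursor-based message masking: only the word
# containing the cursor stays visible; every other word is replaced by
# spaces, modelling text drawn in the background colour.

def mask_message(message: str, cursor_index: int) -> str:
    """Show only the word containing `cursor_index`; blank out the rest."""
    words = message.split(" ")
    pos = 0
    out = []
    for word in words:
        end = pos + len(word)
        if pos <= cursor_index <= end:
            out.append(word)                # word under the cursor: visible
        else:
            out.append(" " * len(word))     # other words: hidden
        pos = end + 1                       # account for the separating space
    return " ".join(out)
```

Because hidden words are replaced character-for-character, the layout of the message is preserved while its content is concealed.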
- the present invention is designed to address at least the problems and/or disadvantages described above and to provide at least the advantages described below.
- an aspect of the present invention is to protect a screen of a portable terminal from unwanted viewing by others.
- Another aspect of the present invention is to maintain security of displayed content by brightly displaying only a required partial area of the screen and dimly displaying the remaining areas.
- a computer-readable storage medium having instructions for protecting an object displayed on a screen of a portable terminal according to claim 10 is provided.
- Preferred embodiments are subject of the dependent claims.
- first, second, etc. can be used for describing various elements, but the elements are not restricted by these terms. The terms are only used to distinguish one element from another. For example, without departing from the scope of the present invention, a first structural element may be named a second structural element. Similarly, the second structural element may also be named the first structural element. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
- Portable terminal: a mobile terminal that is portable and provides data transmission/reception and voice and video calls, and may include at least one screen (or at least one touch screen).
- Examples of a portable terminal include a smart phone, a tablet Personal Computer (PC), a 3-Dimensional (3D) TeleVision (TV), a smart TV, a Light Emitting Diode (LED) TV, a Liquid Crystal Display (LCD) TV, and any other terminal that can communicate with a neighboring device or another terminal located at a remote place.
- Input device: an electronic pen or a stylus pen that can provide a command or an input to the portable terminal in a screen contact state or even in a noncontact state, such as hovering.
- Object: something which is displayed or can be displayed on a screen of a portable terminal.
- Examples of an object include a document, a writing, a widget, a picture, a map, a dynamic image, an email, a Short Message Service (SMS) message, a Multimedia Messaging Service (MMS) message, a shortcut icon, a thumbnail image, a folder storing at least one object in the portable terminal, etc. Accordingly, the object may be executed, deleted, canceled, stored, or changed by a touch input or hovering input using the input device.
- a portable terminal 100 may be connected with an external device (not shown) by using one of a mobile communication module 120, a sub-communication module 130, a connector 165, and an earphone connecting jack 167.
- the external device may include various devices detached from and attached to the portable terminal 100 by a wire, such as an earphone, an external speaker, a Universal Serial Bus (USB) memory, a charger, a cradle/dock, a Digital Multimedia Broadcasting (DMB) antenna, a mobile payment related device, a health management device (blood sugar tester or the like), a game machine, a car navigation device and the like.
- the external device may include a Bluetooth communication device, a Near Field Communication (NFC) device, a WiFi Direct communication device, and a wireless Access Point (AP) which can wirelessly access a network.
- the portable terminal may be connected with other devices, e.g., a mobile phone, a smart phone, a tablet PC, a desktop PC, and a server in a wired or wireless manner.
- the sub communication module 130 includes a wireless Local Area Network (LAN) module 131 and a short range communication module 132.
- the multimedia module 140 includes a broadcasting communication module 141, an audio reproduction module 142, and a video reproduction module 143.
- the input/output module 160 includes a button 161, a microphone 162, a speaker 163, a vibration motor 164, a connector 165, and a keypad 166.
- the controller 110 includes a Central Processing Unit (CPU) 111, a Read Only Memory (ROM) 112 which stores control programs for controlling the portable terminal 100, and a Random Access Memory (RAM) 113 which stores signals or data input from the outside of the portable terminal 100 or is used as a memory region for an operation executed in the portable terminal 100.
- the CPU 111 may include a single core, a dual core, a triple core, or a quadruple core.
- the CPU 111, the ROM 112 and the RAM 113 may be connected to each other through internal buses.
- the controller 110 controls the mobile communication module 120, the sub communication module 130, the multimedia module 140, the camera module 150, the GPS module 157, the input/output module 160, the sensor module 170, the storage unit 175, the power supplier 180, the screen 190, and the screen controller 195.
- the controller 110 determines whether hovering is recognized when an input unit 168, such as an electronic pen, approaches an object displayed on the screen 190, and identifies the object corresponding to the position where the hovering occurs.
- the controller 110 detects a distance between the portable terminal 100 and the input unit, and detects the hovering according to the distance.
- the hovering may include at least one of a press of a button formed on the input unit, a tap on the input unit, a movement of the input unit at a speed faster than a predetermined speed, and a touch of the object.
- the controller 110 may form an area corresponding to a position of the detected input to protect a displayed object and display a part of the object corresponding to the formed area differently from other parts of the object corresponding to other areas.
- the controller 110 may control such that a display within the area is different from a display in other areas, and control a partial area of the screen 190.
- the controller 110 may control a display within the area by applying different attributes to the area and other areas to display an object output in the area different from an object output in at least a part of the other areas except for the area.
- the attribute may include at least one of a size, a color, a brightness, and a gradation effect.
- the area may be differently controlled according to a touched position or a distance between a finger or an input unit performing the input and the screen 190.
- the object output within the area and the object output within the other areas may be differently controlled according to a touched position or a distance between a finger and/or an input unit performing the input and the screen 190.
- the object or a part of the object output within the area may be more brightly output than the object or a part of the object output within the other areas.
- the area may be formed with a predetermined radius from the touch position.
- a size of the area may be variably controlled according to a distance between a position where the hovering is detected and the screen 190 detecting the input.
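The two sizing rules above (a predetermined radius for a touch, a distance-dependent radius for hovering) can be sketched as follows. This is a hypothetical model: the function name, pixel values, and the linear distance-to-radius mapping are illustrative assumptions, not part of the claims:

```python
# Illustrative sketch of sizing the protected (brightly displayed) area.
# For a touch the radius is fixed; for hovering it grows with the distance
# between the input unit and the screen, up to a maximum. All constants
# are arbitrary example values.

def privacy_radius(input_type: str, hover_distance_mm: float = 0.0,
                   base_radius_px: int = 80, scale_px_per_mm: int = 20,
                   max_radius_px: int = 240) -> int:
    """Return the radius (in pixels) of the brightly displayed area."""
    if input_type == "touch":
        # A touch forms an area with a predetermined radius around the point.
        return base_radius_px
    if input_type == "hover":
        # For hovering, the area is variably sized by the hover distance.
        return min(base_radius_px + int(hover_distance_mm * scale_px_per_mm),
                   max_radius_px)
    raise ValueError(f"unknown input type: {input_type!r}")
```

A linear mapping is only one choice; any monotone function of the hover distance would fit the behaviour described.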
- the screen 190 may include a layer for controlling a display of a displayed object.
- the layer may be a virtual layer which can be implemented by software.
- the controller 110 may control a display of the object displayed on the screen 190 by controlling the layer.
- Another area may include at least partial areas except for the area.
- the controller 110 may transparently display the area and display the other areas more translucently than the area or opaquely.
- the controller 110 may control a color of a part of the object to be different from a color of other parts of the object.
- An area having a predetermined radius from the touch position may be formed, when the input is the touch input, and an area may be formed based on a distance between a position where the hovering is detected and the screen 190, when the input is the hovering input, under a control of the controller 110.
- a position of the area may be changed in accordance with a movement of the input.
- the controller 110 may display a trace formed in response to the detected input to protect a displayed object, and control the display such that the displayed trace sequentially disappears.
- the displayed trace may sequentially disappear according to an input time. That is, the controller 110 may control the screen 190 such that a formed trace sequentially disappears a predetermined time after first being displayed.
- the controller 110 may control a Red, Green, and Blue (RGB) value applied to each pixel of the trace to make the trace sequentially disappear. Further, when a touch input or a hovering input is detected on the disappearing trace, the controller 110 may re-display the disappearing trace in response to the detection of the input.
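A minimal sketch of how a trace could be made to sequentially disappear by decaying each pixel's RGB value toward the background. The class, function, and timing constants are hypothetical illustrations of the behaviour described, not a disclosed implementation:

```python
# Sketch of fading a handwriting trace: after a delay, each trace point's
# colour is blended linearly toward the background colour, so earlier
# points vanish before later ones.

from dataclasses import dataclass

@dataclass
class TracePoint:
    x: int
    y: int
    drawn_at: float                     # timestamp when the point appeared
    rgb: tuple = (255, 255, 255)

def faded_rgb(point: TracePoint, now: float, delay: float = 1.0,
              fade_duration: float = 2.0,
              background: tuple = (0, 0, 0)) -> tuple:
    """Blend the point's colour toward the background once `delay` seconds
    have passed, finishing after `fade_duration` further seconds."""
    elapsed = now - point.drawn_at - delay
    if elapsed <= 0:
        return point.rgb                    # still fully visible
    t = min(elapsed / fade_duration, 1.0)   # 0.0 -> 1.0 over the fade
    return tuple(round(c * (1 - t) + b * t)
                 for c, b in zip(point.rgb, background))
```

Re-displaying a disappearing trace on a new touch or hover, as the text describes, would amount to resetting `drawn_at` for the affected points.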
- a gradation effect may be applied to the translucently or opaquely displayed trace.
- the controller 110 may form an area to display the disappearing trace, based on a position at which the touch or hovering input is detected, and display the disappearing trace on the formed area.
- the translucently or opaquely displayed trace may be included in an area formed, based on a position at which the input is detected or a distance from a position where the hovering is detected to the screen 190.
- the controller 110 may control the screen 190 such that the partial area is more brightly displayed than the other areas in response to an input of writing using the touch or the hovering.
- the controller 110 may apply an object attribute, which is different from an object attribute applied to the other areas except for the area, to the area formed in response to the detected input.
- the controller 110 may change the attribute of at least one of the area and the other areas.
- the attribute may include at least one of a size, a color, and a gradation effect.
- At least one of the size and the object output of the partial area may be controlled in response to the touch or hovering input, and the partial area may be formed based on a position at which the input is received or a position having the shortest distance from a position at which the hovering is detected.
- a display of the object may be controlled in accordance with a security mode in which the display of the object is controlled.
- the security mode may be configured using at least one of a current position of the portable terminal, a time, an ambient situation, and ambient brightness, or may be configured by a selection by the user. Further, the security mode may configure at least one of a size of the partial area, an object output, and a time at which a trace input by a touch or a hovering is sequentially transparent or translucent. In addition, the security mode may be automatically configured through pieces of information collected by various sensors included in the portable terminal.
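The automatic configuration described above can be sketched with a small decision rule over collected sensor readings. The thresholds, field names, and the specific heuristic are illustrative assumptions only:

```python
# Hypothetical sketch of configuring the security mode from sensor data:
# protection is enabled in public places or dim surroundings, where a
# bright screen is easiest for bystanders to read.

from dataclasses import dataclass

@dataclass
class SecurityMode:
    enabled: bool
    area_radius_px: int      # size of the brightly displayed partial area
    fade_delay_s: float      # time before an input trace starts to disappear

def configure_security_mode(ambient_lux: float,
                            in_public_place: bool) -> SecurityMode:
    """Derive a security-mode configuration from ambient brightness and a
    (hypothetical) position-based public-place flag."""
    if in_public_place or ambient_lux < 50.0:
        return SecurityMode(enabled=True, area_radius_px=80, fade_delay_s=1.0)
    return SecurityMode(enabled=False, area_radius_px=0, fade_delay_s=0.0)
```

In the terminal itself, `ambient_lux` would come from the illumination sensor and `in_public_place` from position information, consistent with the sensor module described later.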
- the aforementioned input according to the present disclosure may include at least one of a touch and a hovering, and a shape of the partial area may be variably controlled.
- the controller 110 may detect at least one input of a touch and a hovering on the screen 190 displaying at least one object and form an area for controlling an attribute of the screen 190, based on a position at which the input is detected.
- the controller 110 may control the screen 190 such that an object within the formed area is more brightly displayed than an object within at least a partial area of the other areas.
- the controller 110 may control a display of the object displayed on the screen 190 by using an input detected on the screen 190.
- the screen 190 may include at least two layers.
- a first layer of the at least two layers may display an object and a second layer may form the area.
- An attribute of the second layer may be controlled in accordance with at least one of a position at which the input is detected and a distance of the hovering, and may exist on the first layer.
- the second layer may exist below the first layer.
- the controller 110 may control the attribute of the formed area differently from the attribute of the other areas on the screen 190.
- the attribute may include at least one of a color of a part of the object displayed within the area, a brightness of the area, and a gradation effect applied to a boundary between the area and the other areas outside of the area.
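The brightness and gradation attributes can be illustrated with a per-pixel mask: full brightness inside the area, a dimmed level outside, and a blended band at the boundary so the edge is not abrupt. The band width and dim factor below are arbitrary example values, not disclosed parameters:

```python
# Sketch of a brightness mask with a gradation effect at the boundary
# between the protected area and the other areas.

import math

def brightness_at(px: int, py: int, cx: int, cy: int,
                  radius: float, band: float = 20.0,
                  dim: float = 0.2) -> float:
    """Return a brightness factor in [dim, 1.0] for pixel (px, py), given a
    protected area of the given radius centred on (cx, cy)."""
    d = math.hypot(px - cx, py - cy)
    if d <= radius:
        return 1.0                      # inside the area: full brightness
    if d >= radius + band:
        return dim                      # other areas: dimly displayed
    t = (d - radius) / band             # gradation across the boundary band
    return 1.0 + (dim - 1.0) * t
```

Multiplying each pixel's colour by this factor yields the behaviour claimed: the object within the area is displayed more brightly than the object in the remaining areas.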
- the mobile communication module 120 connects the portable terminal 100 to the external device through mobile communication by using one or more antennas according to a control of the controller 110.
- the mobile communication module 120 transmits/receives a wireless signal for a voice call, a video call, an SMS message, or an MMS message to/from a mobile phone (not illustrated), a smart phone (not illustrated), a tablet PC, or another device (not illustrated), which has a phone number input into the portable terminal 100.
- the sub-communication module 130 includes the wireless LAN module 131 and the short-range communication module 132.
- the sub-communication module 130 may include only the wireless LAN module 131, or only the short-range communication module 132.
- the wireless LAN module 131 may connect to the Internet via a wireless Access Point (AP) (not shown), under a control of the controller 110.
- the wireless LAN module 131 supports a wireless LAN standard (IEEE 802.11x) of the Institute of Electrical and Electronics Engineers (IEEE).
- the short range communication module 132 may perform short range communication wirelessly between the portable terminal 100 and an image forming apparatus (not illustrated) according to the control of the control unit 110.
- a short-range communication scheme may include a Bluetooth communication scheme, an Infrared Data Association (IrDA) communication scheme, a WiFi-Direct communication scheme, a Near Field Communication (NFC) scheme, etc.
- the portable terminal 100 includes the mobile communication module 120, the wireless LAN module 131, and the short range communication module 132.
- the portable terminal 100 may include a combination of the mobile communication module 120, the wireless LAN module 131, and the short range communication module 132.
- at least one or a combination of the mobile communication module 120, the wireless LAN module 131, and the short range communication module 132 is referred to as "a transceiver," without limiting the scope of the present disclosure.
- the multimedia module 140 includes the broadcasting communication module 141, the audio reproduction module 142, and the video reproduction module 143.
- the broadcasting communication module 141 receives a broadcasting signal (e.g., a TV broadcasting signal, a radio broadcasting signal or a data broadcasting signal) or additional broadcasting information (e.g., an Electronic Program Guide (EPG) or an Electronic Service Guide (ESG)), which are transmitted from a broadcasting station, through a broadcasting communication antenna (not illustrated), under the control of the controller 110.
- the audio reproduction module 142 may reproduce a stored or received digital audio file (for example, a file having a file extension of mp3, wma, ogg, or wav) under a control of the controller 110.
- the video reproduction module 143 may reproduce a stored or received digital video file (for example, a file of which the file extension is mpeg, mpg, mp4, avi, mov, or mkv) under the control of the controller 110.
- the video reproduction module 143 may reproduce a digital audio file.
- the multimedia module 140 may include different combinations of the broadcasting communication module 141, the audio reproduction module 142, and the video reproduction module 143.
- the multimedia module 140 may include the audio reproduction module 142 and the video reproduction module 143, but not the broadcasting communication module 141.
- the audio reproduction module 142 or the video reproduction module 143 of the multimedia module 140 may be included in the controller 110.
- the first camera 151 may be disposed on a front surface of the portable terminal 100, and the second camera 152 may be disposed on a rear surface of the portable terminal 100.
- the first camera 151 and the second camera 152 may be disposed to be adjacent to each other (e.g., an interval between the first camera 151 and the second camera 152 is larger than 1 cm and smaller than 8 cm) to photograph a three-dimensional still image or a three-dimensional dynamic image.
- Each of the first and second cameras 151 and 152 includes a lens system, an image sensor and the like.
- the first and second cameras 151 and 152 convert optical signals input (or taken) through the lens system into electric image signals, and output the electric image signals to the controller 110.
- a user photographs a dynamic image or a still image through the first and second cameras 151 and 152.
- the GPS module 157 receives radio waves from a plurality of GPS satellites (not illustrated) in Earth's orbit and calculates a position of the portable terminal 100 by using Time of Arrival information from the GPS satellites to the portable terminal 100.
- the input/output module 160 includes a button 161, the microphone 162, the speaker 163, the vibration motor 164, the connector 165, the keypad 166, the earphone connecting jack 167, and the input unit 168.
- the input/output module 160 is not limited thereto, and a mouse, a trackball, a joystick, or a cursor control such as cursor direction keys may be provided to control a movement of the cursor on the screen 190.
- the input/output module 160 may include different combinations of the button 161, the microphone 162, the speaker 163, the vibration motor 164, the connector 165, the keypad 166, the earphone connecting jack 167, and the input unit 168.
- the button 161 may be formed on the front surface, side surfaces or the rear surface of the housing of the portable terminal 100 and may include at least one of a power/lock button (not illustrated), a volume button (not illustrated), a menu button, a home button, a back button, and a search button.
- the microphone 162 receives a voice or a sound to generate an electrical signal under the control of the controller 110.
- the speaker 163 outputs sounds corresponding to various signals of the mobile communication module 120, the sub-communication module 130, the multimedia module 140, and the camera module 150 (for example, a radio signal, a broadcast signal, a digital audio file, a digital video file, or photographing) to the outside of the portable terminal 100 under the control of the controller 110.
- the speaker 163 may output a sound (for example, button tone corresponding to phone communication, ringing tone, and a voice of another user) corresponding to a function performed by the portable terminal 100.
- One or more speakers 163 may be formed on a suitable position or positions of the housing of the portable terminal 100.
- the vibration motor 164 converts an electric signal into a mechanical vibration under a control of the controller 110. For example, when the portable terminal 100 in a vibration mode receives a voice call from any other device (not illustrated), the vibration motor 164 operates.
- One or more vibration motors 164 may be provided in the housing of the portable terminal 100. The vibration motor 164 may operate in response to a touch action of the user on the screen 190 and successive motions of touches on the screen 190.
- the connector 165 may be used as an interface for connecting the portable terminal 100 with an external device (not shown) or a power source (not shown).
- the portable terminal 100 may transmit or receive data stored in the storage unit 175 of the portable terminal 100 to or from an external device (not shown) through a wired cable connected to the connector 165 according to a control of the controller 110. Further, the portable terminal 100 may receive power from the power source through the wired cable connected to the connector 165 or charge a battery (not shown) by using the power source.
- the keypad 166 may receive a key input from a user for control of the portable terminal 100.
- the keypad 166 includes a physical keypad (not shown) formed in the portable terminal 100 or a virtual keypad (not shown) displayed on the display unit 190.
- the physical keypad (not illustrated) formed on the portable terminal 100 may be omitted according to the capability or configuration of the portable terminal 100.
- Earphones may be inserted into the earphone connecting jack 167 according to an embodiment to be connected to the portable terminal 100, and the input unit 168 may be inserted into and kept in the portable terminal 100 and may be extracted or detached from the portable terminal 100 when being used.
- an attachment/detachment recognition switch 169 operating in response to attachment or detachment of the input unit 168 is provided at one area within the portable terminal 100 into which the input unit 168 is inserted, and provides a signal corresponding to the attachment or detachment of the input unit 168 to the controller 110.
- the attachment/detachment recognition switch 169 is located at one area into which the input unit 168 is inserted to directly or indirectly contact the input unit 168 when the input unit 168 is mounted. Accordingly, the attachment/detachment recognition switch 169 generates a signal corresponding to the attachment or the detachment of the input unit 168 based on the direct or indirect contact with the input unit 168 and then provides the generated signal to the controller 110.
- the sensor module 170 includes at least one sensor for detecting a state of the portable terminal 100.
- the sensor module 170 may include a proximity sensor that detects a user's proximity to the portable terminal 100, an illumination sensor (not illustrated) that detects a quantity of light around the portable terminal 100, a motion sensor (not illustrated) that detects a motion (e.g., rotation of the portable terminal 100 and acceleration or a vibration applied to the portable terminal 100) of the portable terminal 100, a geo-magnetic sensor (not illustrated) that detects a point of a compass by using Earth's magnetic field, a gravity sensor that detects an action direction of gravity, and an altimeter that detects an altitude through measuring an atmospheric pressure.
- At least one sensor may detect the state, and may generate a signal corresponding to the detection to transmit the generated signal to the controller 110.
- the sensor of the sensor module 170 may be added or omitted according to a capability of the portable terminal 100.
- the storage unit 175 stores signals or data input/output in response to the operations of the mobile communication module 120, the sub-communication module 130, the multimedia module 140, the camera module 150, the GPS module 157, the input/output module 160, the sensor module 170, and the screen 190 according to the control of the controller 110.
- the storage unit 175 can store a control program and applications for controlling the portable terminal 100 or the controller 110.
- the term “storage unit” includes the storage unit 175, the ROM 112 and the RAM 113 within the controller 110, or a memory card (not shown) (for example, an SD card or a memory stick) installed in the portable terminal 100. Further, the storage unit may include a nonvolatile memory, a volatile memory, a Hard Disk Drive (HDD), or a Solid State Drive (SSD).
- the storage unit 175 may store applications having various functions such as a navigation function, a video call function, a game function, and a time based alarm function, images for providing a Graphical User Interface (GUI) related to the applications, databases or data related to a method of processing user information, a document, and a touch input, background images (a menu screen, an idle screen or the like) or operating programs required for driving the portable terminal 100, and images photographed by the camera module 150.
- the storage unit 175 is a machine (for example, computer)-readable medium.
- the term "machine-readable medium” may be defined as a medium capable of providing data to the machine so that the machine performs a specific function.
- the machine-readable medium may be a storage medium.
- the storage unit 175 may include a non-volatile medium and a volatile medium. All of these media should be of a type that allows the commands transferred by the media to be detected by a physical instrument with which the machine reads the commands.
- the machine-readable medium includes at least one of a floppy disk, a flexible disk, a hard disk, a magnetic tape, a Compact Disc Read-Only Memory (CD-ROM), an optical disk, a punch card, a paper tape, a Random Access Memory (RAM), a Programmable Read-Only Memory (PROM), an Erasable PROM (EPROM), and a flash-EPROM, but is not limited thereto.
- the power supplier 180 supplies power to one or more batteries (not illustrated) disposed in the housing of the portable terminal 100 under the control of the controller 110.
- the one or more batteries (not illustrated) supply power to the portable terminal 100.
- the power supplier 180 may supply, to the portable terminal 100, the power input from an external power source (not illustrated) through a wired cable connected with the connector 165.
- the power supplier 180 may supply power wirelessly input from the external power source through a wireless charging technology to the portable terminal 100.
- the portable terminal 100 includes a screen 190 providing user interfaces corresponding to various services (for example, a phone call, data transmission, broadcasting, and photography) to the user.
- the portable terminal 100 may include multiple screens 190.
- Each of the screens 190 may transmit, to a corresponding screen controller, an analog signal corresponding to at least one touch input to a user interface.
- the portable terminal 100 may include a plurality of screens 190, and each of the screens 190 may include a screen controller 195 that receives an analog signal corresponding to a touch.
- the screens may be connected to a plurality of housings through a hinge, respectively, or may be located in a single housing without a hinge connection.
- the screen 190 may detect at least one of the touch and the hovering through a touch device (for example, a stylus pen, an electronic pen, a finger, etc.).
- the screen 190 includes a touch panel 191 that can detect a touch input using an input device and a hovering panel 192 that can detect a hovering input using an input device.
- the hovering panel 192 may include an Electro-Magnetic Resonance (EMR) type panel which can detect a distance between the input unit or the finger and the screen 190 through a magnetic field.
- the screen 190 may receive successive motions of the touch or the hovering using the input unit or the finger and include a display panel (not shown) that can display a trace formed by the successive motions.
- the panel in the EMR type may be configured below the display panel and may detect the hovering.
- the display panel may be a panel such as an LCD, an AMOLED or the like and may display various images and a plurality of objects according to various operation statuses of the portable terminal 100, an execution of an application and a service. Further, the display panel (not shown) may include a layer for controlling an output of an object.
- the screen 190 may separately include panels (for example, touch panel 191 and hovering panel 192) that can detect the touch and the hovering using the finger or the input unit, respectively, or may detect the touch and the hovering through only one panel. Values detected by the respective panels may be different and may be provided to the screen controller. Further, the screen controller may differently recognize the values of the touch and the hovering input from the respective panels to determine whether the input from the screen 190 is an input by a user's body or an input by a touchable input unit.
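The two-panel scheme above, in which the screen controller decides whether an input came from the user's body or from a touchable input unit based on which panel reported it, can be sketched as follows. The `PanelEvent` type and the panel names are illustrative assumptions, not part of the patent:

```python
from dataclasses import dataclass

@dataclass
class PanelEvent:
    source: str      # "touch_panel" (capacitive) or "hovering_panel" (EMR)
    value: float     # raw analog value reported by the panel

def classify_input(event: PanelEvent) -> str:
    """Decide whether the event is an input by a user's body or by a
    touchable input unit, based on which panel detected it."""
    if event.source == "touch_panel":
        return "body"          # the capacitive panel reacts to a finger
    if event.source == "hovering_panel":
        return "input_unit"    # the EMR panel reacts to the pen's coil
    raise ValueError(f"unknown panel: {event.source}")
```

In practice the two panels may also report different value ranges for the same gesture, which is why the sketch keeps the raw value alongside the source.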
- the screen 190 may display one or more objects on the display panel.
- the object may include a trace formed by a movement of the finger or the input unit as well as a picture, a video, and a document.
- the screen 190 or the display panel may include a layer for controlling a display of an object. The layer may be a virtual layer implemented by software.
- the screen 190 may determine a distance between the detected position and the finger or the input unit having provided the input. An interval which can be detected by the screen 190 may be changed according to a capability or a structure of the portable terminal 100.
- the screen 190 is configured to distinguish the touch by the contact with the user's body or the touch input unit and the input in a proximity state (for example, hovering) and configured to output different values (for example, including voltage values or current values as analog values) detected by the touch and the hovering. Further, it is preferable that the screen 190 outputs a different detected value (for example, a current value or the like) according to a distance between a position in the air where the hovering is generated and the screen 190.
- the screen 190 may be implemented as, for example, a resistive type, a capacitive type, an infrared type, or an acoustic wave type.
- the screen 190 includes a plurality of pixels and displays an image through the pixels.
- the screen 190 may use an LCD, an OLED, or an LED.
- the screen 190 includes a plurality of sensors detecting, when the input unit 168 touches a surface of the screen 190 or is placed within a predetermined distance from the screen 190, a position of the input unit 168.
- the plurality of sensors may be formed with a coil structure, and in a sensor layer formed of the plurality of sensors, the sensors are arranged in a predetermined pattern and form a plurality of electrode lines.
- the screen 190 transmits, to the controller 110, a detection signal caused by capacitance.
- a predetermined distance between the input unit 168 and the screen 190 may be detected through an intensity of a magnetic field generated by a coil 430.
- the screen 190 may detect the input using the input device 168 and display a screen in which an output of an object is controlled in a partial area of the screen 190 according to the detected input under a control of the controller 110.
- the object may be output after a brightness or a color of the object is controlled or a transparency effect is applied to the object.
- the input may include at least one of the touch and the hovering.
- the screen 190 may display an object output in the partial area differently from an object output in other areas. Further, the screen 190 may display such that a trace of input writing sequentially disappears in response to an input of writing using a touch or hovering. When a touch or hovering using the input device 168 is generated on the disappearing trace, the screen 190 may translucently or opaquely display the disappearing trace.
- the translucently or opaquely displayed trace may be included in an area formed based on the touch position of the input device 168 or a position having a proximity (or shortest) distance from the input unit. Further, the screen 190 may display the partial area more brightly than other areas in response to the input of writing using the touch or hovering. Other areas may also be displayed translucently or opaquely.
- the screen may apply a gradation effect between the partial area and other areas.
- the screen 190 may display such that at least one of a size, an object output, and a color of the partial area is different from that in other areas in response to the detection of the touch or hovering using the input unit or finger in a state where a predetermined object is displayed. At least one of the size of the partial area and an object output of the remaining areas may be controlled in accordance with a touch between the input device 168 and the screen 190 or a distance of the hovering.
- the screen controller 195 converts the analog signal received from the screen 190 to a digital signal (for example, X and Y coordinates) and then transmits the digital signal to the controller 110.
- the controller 110 can control the screen 190 by using the digital signal received from the screen controller 195.
- the controller 110 allows a short-cut icon (not shown) or an object displayed on the screen 190 to be selected or executed in response to a touch or hovering.
- the screen controller 195 may be included in the controller 110.
- the screen controller 195 may identify a distance between a position in the air where the hovering is generated and the screen 190 by detecting a value (for example, a current value or the like) output through the screen 190, convert the identified distance value to a digital signal (for example, a Z coordinate), and then provide the converted digital signal to the controller 110.
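The screen controller's conversion from the detected analog value to a Z coordinate can be sketched roughly as below. The assumption that the detected current falls off linearly with hovering height, and all constants, are purely illustrative; the real relationship is device-specific and not given in the patent:

```python
def current_to_z(current: float, max_current: float = 1.0,
                 max_height_mm: float = 30.0) -> int:
    """Map a detected current value to a hovering height (Z coordinate).
    A full-strength current means the pen is touching (Z = 0); a vanishing
    current means the pen is at the maximum detectable height."""
    current = max(0.0, min(current, max_current))       # clamp raw reading
    distance = (1.0 - current / max_current) * max_height_mm
    return round(distance)                              # digital Z coordinate
```

The controller 110 would then consume this digital Z coordinate alongside the X and Y coordinates produced from the touch position.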
- FIG. 2 is a front perspective view of a portable terminal according to an embodiment of the present invention
- FIG. 3 is a rear perspective view of a portable terminal according to an embodiment of the present invention.
- the screen 190 is disposed in a center of a front surface 100a of the portable terminal 100.
- the screen 190 may have a large size that occupies most of the front surface 100a of the portable terminal 100.
- FIG. 2 illustrates an example where a main home screen is displayed on the screen 190.
- the main home screen is a first screen displayed on the screen 190 when power of the portable terminal 100 is turned on. Further, when the portable terminal 100 has different home screens of several pages, the main home screen may be a first home screen of the home screens of several pages.
- Short-cut icons 191-1, 191-2, and 191-3 for executing frequently used applications, a main menu switching key 191-4, time, weather, etc., are displayed on the home screen.
- the main menu switch key 191-4 displays a menu screen on the screen 190.
- a status bar 192 indicating a status of the portable terminal 100 such as a battery charge status, an intensity of a received signal and a current time is formed at a top end of the screen 190.
- a home button 161a, a menu button 161b, and a back button 161c are formed at a lower portion of the screen 190.
- the home button 161a displays the main home screen on the screen 190. For example, when the home key 161a is touched while a home screen different from the main home screen or the menu screen is displayed on the screen 190, the main home screen is displayed on the screen 190. Further, when the home button 161a is touched while applications are executed on the screen 190, the main home screen illustrated in FIG. 2 is displayed on the screen 190. Further, the home button 161a may be used to display recently used applications or a task manager on the screen 190.
- the menu button 161b provides a connection menu that may be used on the screen 190.
- the connection menu may include a widget addition menu, a background change menu, a search menu, an editing menu, an environment setup menu, etc.
- the back button 161c may be used for displaying the screen which was executed just before the currently executed screen or terminating the most recently used application.
- the first camera 151, an illumination sensor 170a, and a proximity sensor 170b are disposed on edges of the front side 100a of the portable terminal 100.
- the second camera 152, the flash 153, and the speaker 163 are disposed on a rear surface 100c of the portable terminal 100.
- a power/reset button 160a, a volume button 161b, a terrestrial DMB antenna 141a for receiving a broadcast, and one or more microphones 162 are disposed on a side surface 100b of the portable terminal 100.
- the DMB antenna 141a may be fixed to the portable terminal 100 or may be formed to be detachable from the portable terminal 100.
- the portable terminal 100 has the connector 165 arranged on a lower side surface.
- a plurality of electrodes are formed in the connector 165, and the connector 165 may be connected to an external device through a wire.
- the earphone connecting jack 167 is formed on an upper side surface of the portable terminal 100. Earphones may be inserted into the earphone connecting jack 167.
- the input unit 168 such as a stylus, may be received in a lower side surface of the portable terminal 100.
- the input unit 168 may be inserted into the portable terminal 100 to be stored in the portable terminal 100, and withdrawn and detached from the portable terminal 100 when being used.
- FIGs. 4A and 4B illustrate a configuration of a plurality of layers for controlling a display of an object according to an embodiment of the present disclosure. Specifically, FIG. 4A illustrates an example separately including a layer for controlling a display of an object according to an embodiment of the present invention, and FIG. 4B illustrates an example which does not separately include a layer for controlling a display of an object according to an embodiment of the present invention.
- An embodiment of the present invention can be applied to various applications for displaying objects, such as a picture, and various applications in which a writing input can be made, such as a drawing, a text input, a diary, etc.
- the application may include one or more layers according to a function or attribute thereof.
- the application is an application for displaying or writing objects, such as a writing input, a drawing, a text input, or certain pictures.
- the application may include various menus such as a menu that receives a selection from the user to display the objects, a menu that configures an environment of the application, and a menu that displays the objects.
- the application of the present invention may include a plurality of layers for displaying the objects, and each of the plurality of layers is allocated to each menu.
- a first layer 410 of the application may be located at the bottom, a second layer 420 may be located on the first layer 410, and a third layer 430 may be located on the second layer 420.
- the first layer 410 is not directly shown to the user but serves as a container that contains other layers, and may be called a layout.
- the layout may include a frame layout, a relative layout, and a linear layout.
- the second layer 420 may display an object such as a picture, writing, or a document.
- the third layer 430 controls at least a part of the display of the object displayed on the second layer 420 and may be opaque or translucent.
- An area for displaying an object in response to a touch or hover input may be formed on the third layer 430.
- the area may be formed with a predetermined radius from a position of the touch or hover input.
- a size of the area may be variably controlled according to a distance between a position where the hovering is detected (in the air) and the screen. Further, a position of the area may be changed in accordance with a movement of the input.
- the controller 110 may control an attribute of the third layer 430 such that a part of the object corresponding to the formed area is displayed differently from other parts of the object.
- the attribute may include at least one of a size of the area, a brightness of the area, a gradation effect of the area, and a color of the object corresponding to the area. Further, as illustrated in FIG. 4B , even when the third layer 430 is not included, the controller may control the display of the object by controlling an attribute of the second layer 420.
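The role of the third layer 430 as an attribute mask over the object layer, revealing the object clearly inside the formed area and suppressing it elsewhere, can be sketched as follows. The class and field names are hypothetical:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Area:
    cx: float      # center of the area (touch/hover position)
    cy: float
    radius: float  # may vary with the hovering distance

@dataclass
class MaskLayer:
    """Sketch of the third layer of FIG. 4A: it does not hold the object
    itself, but decides how visibly the second layer's object is shown."""
    area: Optional[Area] = None
    outside_alpha: float = 0.1   # object outside the area is nearly hidden

    def object_alpha(self, x: float, y: float) -> float:
        """Alpha applied to the object pixel at (x, y): fully visible
        inside the area, outside_alpha elsewhere."""
        if self.area is None:
            return self.outside_alpha
        dx, dy = x - self.area.cx, y - self.area.cy
        inside = dx * dx + dy * dy <= self.area.radius ** 2
        return 1.0 if inside else self.outside_alpha
```

As FIG. 4B suggests, the same effect could instead be achieved by adjusting the second layer's own attributes, without a separate mask layer.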
- FIG. 5 is a block diagram illustrating an input device according to an embodiment of the present invention.
- the input device 168 (for example, a touch pen) includes a penholder 500, a nib 430 disposed at an end of the penholder, and a button 420 that may change an electromagnetic induction value generated by a coil 510 included inside the penholder, e.g., adjacent to the nib 430.
- the input device 168 also includes a vibration device 520, a controller 530 that generally controls the input unit 168, a short-range communication unit 540 that performs short-range communication with the portable terminal 100, and a battery 550 that supplies power to the input unit 168.
- the input unit 168 illustrated in FIG. 5 supports the EMR type. Accordingly, when a magnetic field is formed on a predetermined position of the screen 190 by the coil 510, the screen 190 may recognize a touch position by detecting a location of the corresponding magnetic field.
- the portable terminal activates a security mode of the screen in step S610.
- the security mode controls an object output (or display) of the screen, such that the user can view the screen, but other people adjacent to the user cannot easily view the screen.
- the object output may be controlled in a partial area by activating the security mode or the object output may be controlled in a partial area in response to detecting a hovering input, without the activation of the security mode.
- In order to control the object output of the screen using the security mode, the portable terminal may automatically activate the security mode by analyzing an ambient situation of the portable terminal, or may manually activate the security mode in response to a user input or setting. Further, the security mode may configure at least one of a size and an object output of an area through which the object is displayed on the screen, and a time for which an input trace is displayed before sequentially becoming transparent or translucent.
- the security mode is automatically activated through an analysis of an ambient situation of the portable terminal or may be activated when a condition configured by the user is applied.
- the ambient situation and the condition include at least one of a current position of the portable terminal, a time, an ambient situation, ambient brightness, and ambient noise. Further, the ambient situation and the condition can be controlled or configured by the user.
- the security mode may be automatically executed based on at least one of a current position of the portable terminal, a time, an ambient situation, ambient brightness, and ambient noise.
- the portable terminal controls the object output in a partial area corresponding to the detected input in step S614.
- the controller 110 detects an input of a hovering on the screen by the input device.
- the controller 110 controls the display such that the object output in the partial area is displayed differently from object outputs in other areas.
- the controller 110 may control the screen so that the object output or an object color in the partial area of the screen is displayed differently from that in the remaining areas except for the partial area.
- the object in the partial area may be output more brightly than objects in other areas and the other areas may be the remaining areas except for the partial area.
- the size of the partial area and/or the object output may be controlled in accordance with a distance between the position at which the hover input is detected and the screen, and the partial area may be formed based on a position having a proximity (or shortest) distance from the position at which the hovering is detected.
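Deriving the partial-area size from the hovering distance might be sketched as below. The linear mapping, the constants, and even the direction of the mapping (whether a closer pen yields a smaller or a larger area) are assumptions for illustration; the patent only states that the size depends on the distance:

```python
def partial_area_radius(hover_distance_mm: float,
                        min_radius: float = 20.0,
                        max_radius: float = 120.0,
                        max_distance_mm: float = 30.0) -> float:
    """Radius of the revealed partial area as a function of the hovering
    height: smallest when the pen touches, largest at the maximum
    detectable height (an assumed linear interpolation)."""
    d = max(0.0, min(hover_distance_mm, max_distance_mm))  # clamp to range
    return min_radius + (max_radius - min_radius) * d / max_distance_mm
```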
- the controller 110 may control the display such that a trace of writing made using a touch or hover input sequentially disappears according to an illustrative example not forming part of the present invention. Further, the controller 110 may control the object output such that the disappeared trace is translucently redisplayed in response to the generation of the hovering using the input device on the disappearing trace.
- the translucently displayed trace may be included in the area formed based on the touched position by the input device or the position having the proximity (shortest) distance from the position where the hovering is detected.
- the controller 110 may control the object in the partial area to be brighter than objects in other areas in response to the input of the writing using the touch or hovering on the screen.
- the other areas may be translucently or opaquely displayed.
- the portable terminal displays a result of the control in step S616.
- the screen may display the object output in the partial area differently from object outputs in other areas, under a control of the controller 110.
- FIG. 7 is a flowchart illustrating a trace display controlling method on a screen of a portable terminal according to an illustrative example not part of the present invention.
- FIGs. 8A to 8D illustrate examples of controlling a display of a trace input on the screen of the portable terminal according to an illustrative example.
- FIG. 8A illustrates an example of inputting a trace on the screen of the portable terminal according to an illustrative example
- FIG. 8B illustrates an example of controlling to make a trace input on the screen of the portable terminal sequentially disappear, after a predetermined time, according to an illustrative example
- FIG 8C illustrates another example of controlling to make a trace input on the screen of the portable terminal sequentially disappear, after a predetermined time, according to an illustrative example
- FIG. 8D illustrates an example of controlling a display of the disappearing trace in accordance with a touch or hovering input on the trace input on the screen of the portable terminal according to an illustrative example.
- the portable terminal activates a security mode of the screen in step S710.
- the controller 110 may analyze an ambient situation of the portable terminal to control an object output on the screen, or configure and/or activate the security mode in response to an input by a user.
- the portable terminal controls a display of the detected trace after a predetermined threshold time in step S714, and displays a result of the control in step S716. More specifically, a writing input is detected, and the controller 110 displays a trace of the input writing on the screen. Thereafter, the displayed trace is sequentially output as a transparent or translucent trace after a predetermined time, according to a configuration of the security mode.
- the predetermined time may be configured by the user or may be variably controlled by at least one of a current position of the portable terminal, a time, an ambient situation, ambient brightness, and ambient noise.
- the term "sequentially" corresponds to a time when the trace is input.
- the controller 110 displays the trace. After a predetermined time elapses from a first portion of the trace being input, the trace is sequentially displayed as transparent or translucent. For example, the trace 811 is displayed such that an object output gradually disappears (that is, becomes transparent) according to an input time order of the traces, a first trace 811a, a second trace 811b, and a third trace 811c.
- Although the object outputs of the first to third traces in FIG. 8A are shown as different segments, this is only for ease of description, and the traces may be displayed with gradation.
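The sequential, input-time-ordered fading of a trace can be modeled as a per-segment alpha function. The hold and fade durations below are illustrative stand-ins for the user-configurable time the surrounding text describes:

```python
def trace_alpha(input_time: float, now: float,
                hold_s: float = 3.0, fade_s: float = 2.0) -> float:
    """Alpha of one trace segment: fully visible for hold_s seconds after
    it was drawn, then fading linearly to transparent over fade_s seconds.
    Because older segments entered the fade earlier, the trace disappears
    sequentially in input-time order (first 811a, then 811b, then 811c)."""
    age = now - input_time
    if age <= hold_s:
        return 1.0                       # still fully visible
    if age >= hold_s + fade_s:
        return 0.0                       # fully disappeared
    return 1.0 - (age - hold_s) / fade_s # linear fade between the two
```

Rendering each segment with its own `trace_alpha` naturally produces the gradation effect mentioned above, rather than visibly discrete segments.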
- Although FIGs. 8A to 8D illustrate a trace of writing input into a memo pad, the illustrative example can be applied to various applications in which a writing input can be made, such as a drawing, a text input, a diary, etc.
- a trace of writing using the input device 850 is input into the screen 820, and after a predetermined time elapses, the trace sequentially disappears (for example, trace is transparently displayed) from the screen, like a pre-input trace 821.
- a trace 822 which is input after the trace 821, is less transparent than the trace 821, and an object output of the trace 822 may be also controlled according to an input time. That is, an object output of the trace 822 may gradually disappear (that is, become transparent) according to an input time sequence, like a trace 822a and a trace 822b.
- the traces according to the present invention may be displayed with gradation, which is sequentially applied according to an input time.
- the time by which the traces sequentially disappear may be configured by the user in the security mode or may be automatically configured by at least one of a current position of the portable terminal, a time, an ambient situation, ambient brightness, and ambient noise.
- a trace is input into the screen 830, and after a predetermined time elapses, the input trace becomes sequentially translucent, like a trace 831.
- the trace 831 may be displayed to sequentially disappear as the time elapses.
- the trace 832 may gradually disappear (that is, become transparent) according to an input time sequence, like a trace 832a and a trace 832b.
- Although the object outputs of the trace 832a and the trace 832b in FIG. 8C are illustrated as separate segments, this is only for ease of description, and the traces according to the present invention may be displayed with gradation sequentially applied according to an input time.
- the time by which the traces sequentially become translucent may be configured by the user in the security mode or may be automatically configured by at least one of a current position of the portable terminal, a time, an ambient situation, ambient brightness, and ambient noise.
- the screen may display the input traces to be sequentially transparent or translucent after a predetermined time, under a control of the controller.
- the portable terminal displays a trace included in a partial area corresponding to the detected touch or hovering, while controlling an output of the trace in step S720.
- the controller 110 controls the screen so that the disappeared trace is translucently or opaquely re-displayed.
- the translucently or opaquely displayed trace may be a trace included in a partial area formed based on a touched position by the input device or a position having a proximity (or shortest) distance between the device and the screen.
- a size of the partial area may be configured in the security mode or controlled by the user.
- the screen 840 displays traces of area 841 and traces of area 842 of writings sequentially input.
- the controller 110 controls the screen such that the input traces sequentially disappear.
- the controller 110 translucently or opaquely re-displays the trace in a partial area 841, in response to the detection of the touch or the hovering.
- the translucently or opaquely displayed trace in the partial area 841 may be re-displayed such that the trace gradually disappears while moving away from a touched position of the input or a position having a proximity (or shortest) distance from a position where a hovering is detected.
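The gradual falloff of the re-displayed trace away from the touched or hovered position might be modeled as below; the area radius, the maximum alpha, and the linear falloff are illustrative assumptions:

```python
import math

def redisplay_alpha(seg_x: float, seg_y: float,
                    touch_x: float, touch_y: float,
                    area_radius: float = 80.0,
                    max_alpha: float = 0.6) -> float:
    """Translucent re-display of a disappeared trace segment: strongest at
    the touched/hovered position, fading toward the edge of the partial
    area 841, and zero outside it."""
    d = math.hypot(seg_x - touch_x, seg_y - touch_y)
    if d >= area_radius:
        return 0.0                         # outside the partial area
    return max_alpha * (1.0 - d / area_radius)
```

The `area_radius` here would itself be controlled by the pen-to-screen distance or the security-mode configuration, as the next paragraph notes.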
- the size of the partial area 841 may be controlled according to a distance between the input device 850 and the screen 840, or controlled according to a configuration of the security mode.
- the trace of the area 842 made after the trace of the partial area 841 may be displayed differently from the trace of the partial area 841 according to an input time.
- the trace of the area 842 may gradually disappear (that is, become transparent) according to an input time sequence, like a trace 842a and a trace 842b.
- the trace input by the touch or the hovering is displayed in the other area 842 and the trace within the area 842 is transparent or opaque.
- FIG. 9 is a flowchart illustrating a method of controlling an object output of a screen, based on a position at which writing is input into the screen of the portable terminal, according to an illustrative example.
- FIGs. 10A and 10B illustrate an example of a process of controlling an object output of the screen, based on a position at which writing is input into the screen of the portable terminal, according to an illustrative example.
- FIG. 10A illustrates an example of controlling an object output in a partial area that is formed, based on a position at which writing is input into the screen, according to an illustrative example
- FIG. 10B illustrates another example of controlling an object output in a partial area that is formed, based on a position at which writing is input into the screen, according to an illustrative example.
- the portable terminal activates a security mode of the screen in step S910.
- the operation of step S910 is the same as steps S610 and S710 described above. Accordingly, a repetitive description of step S910 is not provided.
- the controller 110 controls an output of the writing according to the elapse of time of the input writing in step S914 and displays a result of the control in step S916. More specifically, the controller 110 may detect a position where the writing is currently made on the screen and may control the display such that an object in a partial area that is formed, based on the detected position, is differently displayed than an object in the remaining areas, i.e., outside of the partial area. The controller 110 may control object outputs in the remaining areas, based on the position at which the writing is currently made on the screen.
- the controller 110 may control the display such that the object output in the partial area that is formed, based on a position at which the writing is currently made on the screen, is displayed as being transparent or translucent.
- the screen may display a result of the control. For example, at least one of a size of the partial area and object output of the remaining areas may be controlled according to a distance between the input device and the screen and a touched position of the input device.
- the controller 110 may detect a touched position between an input device 1030 and the screen 1010 to form a partial area 1012, which is based on the touched position, and control the screen to display the made writing in the partial area 1012.
- the controller 110 may control the screen 1010 to opaquely display the remaining area 1013, i.e., the area outside the partial area 1012.
- the controller 110 may control the display of the object. As described above, the controller 110 may control the partial area 1012 based on the position at which the touch or the hovering by the input device 1030 is detected. Due to such a control, the user may recognize that a gradation effect is provided to the partial area 1012 or an effect of distinguishing the areas is provided.
- the partial area 1012 is divided into a plurality of areas 1012a to 1012d by a transparency difference or a brightness difference.
- a first partial area 1012a has a radius r1, a second partial area 1012b has a radius r2, a third partial area 1012c has a radius r3, and a fourth partial area 1012d has a radius r4.
- the partial area 1012 may be divided into the plurality of areas 1012a to 1012d based on the position where the touch or the hovering by the input unit or the finger is detected, or the partial area 1012 may not be divided into the first to fourth areas 1012a to 1012d by applying gradation to the first to fourth partial areas.
- the controller 110 may control the brightness by increasing the transparency in order to more clearly output an object in an area having a shorter radius (for example, the first area 1012a among the first to fourth areas 1012a to 1012d) and by reducing the transparency in order to make the object opaque or translucent in an area having a larger radius (for example, the fourth area 1012d among the first to fourth areas 1012a to 1012d).
- the screen may be controlled such that the other area 1013 is shown to be opaque.
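The stepped transparency of the concentric areas 1012a to 1012d, clearest in the innermost ring and progressively more hidden outward, can be sketched as follows. The radii r1 to r4 and the alpha steps are illustrative values, not taken from the patent:

```python
def ring_transparency(distance: float,
                      rings=((15.0, 1.0), (30.0, 0.75),
                             (45.0, 0.5), (60.0, 0.25)),
                      outside: float = 0.0) -> float:
    """Alpha for an object pixel at a given distance from the touch/hover
    position: each (radius, alpha) pair is one concentric area 1012a-1012d
    with r1 < r2 < r3 < r4; beyond r4 the object is hidden (area 1013)."""
    for radius, alpha in rings:
        if distance <= radius:
            return alpha
    return outside
```

Replacing the discrete steps with a continuous interpolation would give the gradation variant the text describes, in which the four areas are not visually distinguished.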
- the controller 110 may control the screen 1020 to translucently display the remaining area 1023.
- the controller 110 may control the display of the object. As described above, the controller 110 may control the display of the object in the partial area 1022 based on the position at which the touch or the hovering by the input device 1030 is detected. Due to such a control, a gradation effect may be provided to the partial area 1022, or the user may recognize an effect of distinguishing the areas.
- the partial area 1022 is divided into a plurality of areas 1022a to 1022d.
- a first partial area 1022a has a radius r1
- a second partial area 1022b has a radius r2
- a third partial area 1022c has a radius r3
- a fourth partial area 1022d has a radius r4.
- the partial area 1022 may be divided into the plurality of areas 1022a to 1022d based on the position where the touch or the hovering by the input device 1030 is detected, or the partial area 1022 may not be divided into the first to fourth areas 1022a to 1022d and a uniform gradation may instead be applied.
- the controller 110 may control the screen by setting the transparency of the first area 1022a to be higher, so that an object is output more clearly in an area having a shorter radius (for example, the first area 1022a among the first to fourth areas 1022a to 1022d), and by setting the transparency to be lower than that of the first area 1022a in an area having a larger radius (for example, the fourth area 1022d among the first to fourth areas 1022a to 1022d).
- the other area 1023, i.e., outside the partial area 1022, may be controlled to be opaque.
- FIGs. 12A to 12F illustrate examples of controlling an object output of a screen, based on a position at which an input is detected on the screen displaying an object, according to embodiments of the present invention.
- FIG. 12A illustrates an example of controlling an object output in a partial area that is formed, based on a position at which a hovering is detected on a screen displaying an object, according to an embodiment of the present invention
- FIG. 12B illustrates another example of controlling an object output in a partial area that is formed based on a position at which a hovering is detected on a screen displaying an object, according to an embodiment of the present invention
- the controller controls the display of an object output in a partial area that is based on a position where the input is detected in step S1114 and displays a result of the control in step S1116.
- the object displayed on the screen, which is confidential or which the user wants to keep from being exposed to other people, may include a document, a widget, a picture, a news article, a map, a diary, a video, an email, an SMS message, and an MMS message.
- the controller may control an object output such that a partial area corresponding to the touch or the hovering is transparently or translucently displayed.
- the controller may control an object output such that the remaining area except for the partial area corresponding to the touch or the hovering is translucently or opaquely displayed.
- a gradation effect may be applied between the partial area and the remaining area, and a size of the partial area may be variably controlled according to a distance between the screen and the input device or preset in the security mode.
- the controller 110 may detect a touched position between the input device 1270 and the screen 1210, and form a partial area 1211 based on the touched position. Further, the controller 110 may control an attribute of the formed partial area 1211 to display the object in the formed partial area 1211 differently from the object in the remaining area 1212, i.e., the area outside of the partial area 1211.
- At least one of a size of the partial area 1211 and object outputs in the partial area 1211 and the remaining area 1212 may be configured in the security mode.
- the security mode may configure gradation between the partial area 1211 and the remaining area 1212.
- the controller 110 may configure the object output in the partial area by using at least one of a current position of the portable terminal, a time, ambient brightness, an ambient situation, and ambient noise, or the object output in the partial area may be controlled by the user.
- the controller 110 may detect a touched position between the input device 1270 and the screen 1220, and form a partial area 1221 that is based on the touched position.
- the controller 110 may control an object output by controlling the screen or an attribute of the remaining areas to translucently display the remaining area 1222 and display a result of the control on the screen 1220.
- the partial area 1221 may be divided into the first to fourth areas 1022a to 1022d as illustrated in FIG. 10B or may have the gradation effect applied.
- At least one of the object outputs in the partial area 1221 and the remaining area 1222 may be configured in the security mode.
- the security mode may configure gradation between the partial area 1221 and the remaining area 1222.
- the controller 110 may configure at least one of a size of the partial area and the object output by using at least one of a current position of the portable terminal, a time, ambient brightness, an ambient situation, and ambient noise, or at least one of the size of the partial area and the object output may be controlled by the user.
- the controller 110 may detect a touched position or a hovering distance between the input device 1270 and the screen 1230, and form a partial area 1231 based on the touched position or the hovering position.
- the controller 110 may control an object output to opaquely display the remaining area 1232, i.e., outside the formed partial area 1231, and display the partial area 1231 having a controlled size on the screen 1230.
- the size of the partial area 1231 may become smaller as the input device 1270 is closer to the screen and become larger as the input device 1270 is further away from the screen.
- the partial area 1231 illustrated in FIG. 12C is smaller than the partial area illustrated in FIG. 12A because a hovering distance between the screen 1230 and the input device 1270 in FIG. 12C is shorter than a hovering distance between the screen 1210 and the input device 1270 in FIG. 12A .
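The relationship described above, where the partial area grows with the hovering distance of the input device, can be sketched as a simple linear mapping. The limits d_max, r_min, and r_max below are illustrative assumptions, not values from this description.

```python
def partial_area_radius(hover_mm, r_min=20, r_max=120, d_max=30.0):
    """Map the pen-to-screen hovering distance (in mm) to the radius of
    the partial area.

    Per the description, the area shrinks as the input device nears the
    screen and grows as it moves away; the distance is clamped to the
    assumed sensing range [0, d_max].
    """
    d = max(0.0, min(hover_mm, d_max))
    return r_min + (r_max - r_min) * (d / d_max)
```

The security mode could invert or rescale this mapping, since the description also allows the opposite behavior (a larger area when the device is closer).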
- the partial area 1231 may be divided into the first to fourth areas 1012a to 1012d or may have the gradation effect applied.
- At least one of a size of the partial area 1231 and object outputs in the partial area 1231 and the remaining area 1232 may be configured in the security mode.
- the security mode may configure gradation between the partial area 1231 and the remaining area 1232.
- the controller 110 may configure at least one of the size of the partial area and the object output by using at least one of a current position of the portable terminal, a time, ambient brightness, an ambient situation, and ambient noise, or at least one of the size of the partial area and the object output may be controlled by the user.
- the controller 110 may detect or measure a touched position or a hovering distance between the input device 1270 and the screen 1240 and form a partial area 1241 based on the touched position or the hovering position. Further, the controller 110 may control an object output to translucently display the remaining area 1242, i.e., the area outside the partial area 1241.
- the partial area 1241 may be divided into the first to fourth areas 1022a to 1022d as illustrated in FIG. 10B or may have the gradation effect applied.
- At least one of the size of the partial area 1241 and object outputs in the partial area 1241 and the remaining area 1242 may be configured in the security mode.
- the security mode may configure gradation between the partial area 1241 and the remaining area 1242.
- the controller 110 may configure at least one of the size of the partial area and the object output by using at least one of a current position of the portable terminal, a time, ambient brightness, an ambient situation, and ambient noise, or at least one of the size of the partial area and the object output may be controlled by the user.
- the controller 110 may detect or measure a touched position or a hovering distance between the input device 1270 and the screen 1250.
- the controller 110 may form a partial area 1252 based on the touched position or the hovering position by using a result of the measurement and perform filtering of changing a color of the partial area 1252 to differentiate an image of the formed partial area 1252 from an image of the remaining area 1251, i.e., outside the partial area 1252.
- the filtering may include a conversion to represent the partial area 1252 with a contour line, to blur the partial area 1252, to change a color of the partial area 1252, or to sharpen the partial area 1252.
- the controller 110 may convert a color of the partial area 1252 to black and white or an image of the partial area 1252 to a contour line, and display the converted partial area 1252 on the screen 1250.
- a size of the partial area 1252 may be displayed according to the hovering distance between the screen 1250 and the input device 1270 or the finger. For example, the size of the partial area 1252 may become smaller as the hovering distance becomes shorter and larger as the hovering distance becomes longer. Alternatively, the size of the partial area 1252 may become larger as the hovering distance becomes shorter and smaller as the hovering distance becomes longer. In addition, at least one of the size and shape of the partial area 1252 may be configured in the security mode or selectively applied by the user, and is variable.
- the shape of the partial area 1252 may include various forms of figures, such as a circle, an oval, a square, a rectangle and the like.
- the color of the partial area 1252 can be changed to black and white as described above; conversely, when a black and white picture is displayed, an image of the partial area 1252 formed by the input device 1270 or the finger may be converted to a color image.
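The black-and-white filtering of the partial area can be sketched as follows. This is an assumed illustration: the image is modeled as rows of (R, G, B) tuples, and pixels inside the circular partial area are converted to grayscale with the standard BT.601 luma weights while the remaining area keeps its colors.

```python
def filter_partial_area(image, cx, cy, radius):
    """Convert pixels inside the partial area to black and white, leaving
    the remaining area in color (one of the filtering modes described).

    `image` is a list of rows, each row a list of (r, g, b) tuples;
    (cx, cy) is the touched or hovered position, `radius` the area size.
    """
    out = []
    for y, row in enumerate(image):
        new_row = []
        for x, (r, g, b) in enumerate(row):
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
                # ITU-R BT.601 luma weighting for the grayscale value
                gray = int(0.299 * r + 0.587 * g + 0.114 * b)
                new_row.append((gray, gray, gray))
            else:
                new_row.append((r, g, b))
        out.append(new_row)
    return out
```

The inverse mode (restoring color inside the area of a black-and-white picture) would swap the two branches and keep a reference to the original color image.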
- the controller 110 may configure at least one of the size and the color of the partial area by using at least one of a current position of the portable terminal, a time, ambient brightness, an ambient situation, and ambient noise, or at least one of the size and the color of the partial area may be controlled by the user.
- the controller 110 may change a color of the displayed object. For example, when the displayed object is a color image, the controller 110 may change the color image to a black and white image and display the black and white image on the screen 1260.
- the controller 110 may change the black and white image to a color image and display the color image on the screen 1260.
- the controller 110 may apply different color concentrations according to the hovering distance. For example, a deeper color is applied as the hovering distance becomes shorter, and a lighter color is applied as the hovering distance becomes longer. Further, the controller 110 may configure the color change by using at least one of a current position of the portable terminal, a time, ambient brightness, an ambient situation, and ambient noise, or the color change may be controlled by the user.
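The distance-dependent color concentration described above can be sketched as a simple linear scale: full depth when the input device touches or nearly touches the screen, fading to zero at the edge of the sensing range. The maximum sensing distance d_max is an assumption for illustration.

```python
def color_depth(hover_mm, d_max=30.0):
    """Return the color concentration to apply, in [0.0, 1.0].

    Deeper (1.0) when the input device is close to the screen, lighter
    toward 0.0 as the hovering distance approaches the assumed maximum
    sensing range d_max.
    """
    d = max(0.0, min(hover_mm, d_max))
    return 1.0 - d / d_max
```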
- any such software may be stored, for example, in a volatile or non-volatile storage device such as a ROM, a memory such as a RAM, a memory chip, a memory device, or an IC, or in an optical or magnetic recordable and machine-readable (e.g., computer-readable) medium such as a CD, a DVD, a magnetic disk, or a magnetic tape, regardless of its ability to be erased or its ability to be re-recorded.
- a memory, which may be incorporated in a portable terminal, may be an example of a machine-readable storage medium which is suitable for storing a program or programs including commands to implement the exemplary embodiments of the present disclosure.
- the present invention includes a program that includes a code for implementing an apparatus or a method defined in any claim in the present specification and a machine-readable storage medium that stores such a program.
- the program may be electronically transferred by a predetermined medium such as a communication signal transferred through a wired or wireless connection, and the present disclosure appropriately includes equivalents of the program.
- a program providing apparatus may include a program including instructions controlling the screen by the portable terminal, a memory for storing information required for controlling the screen, a communication unit for performing wired or wireless communication with the portable terminal, and a controller transmitting the corresponding program to a host device automatically or by a request of the portable terminal.
Description
- The present disclosure relates to a portable terminal, and more particularly to a portable terminal and a method for controlling a screen to protect a displayed object.
- With the widespread use of portable terminals, users have become accustomed to using their portable terminals without regard to time and place. Further, with the increased popularity and convenience of touch screen portable terminals, user demand to produce and share content is also gradually increasing. Additionally, screen sizes of portable terminals have gradually increased to display and compose various content.
- However, existing technologies for protecting displayed content from other people are insufficient for these larger screen portable terminals, potentially exposing users' information to other people.
-
EP 1 681 841 A1 teaches a message display method for a mobile communication terminal comprising changing the colors of a letter of a message displayed on a screen or a color of a background of the message or both to be similar to or the same as each other to protect sensitive information contained in text messages from curious eyes. Based on the position of a cursor, only letters on the line of the cursor or a certain number of letters located in front of the cursor are displayed normally, while the remainder of the text message is displayed with the color of the background or the background is modified to hide the remainder of the text message. -
WO 99/47990 and Korean patent KR 10-0765953 B1 are further related prior art documents. - The present invention is designed to address at least the problems and/or disadvantages described above and to provide at least the advantages described below.
- Accordingly, an aspect of the present invention is to protect a screen of a portable terminal from unwanted viewing by others.
- Another aspect of the present invention is to maintain security of displayed content by brightly displaying only a required partial area of the screen and dimly displaying the remaining areas.
- In accordance with an aspect of the present invention, a method is provided for protecting a displayed object on a screen of a portable terminal according to claim 1.
- In accordance with another aspect of the present invention, a portable terminal for protecting an object displayed on a screen according to claim 7 is provided.
- In accordance with another aspect of the present invention, a computer-readable storage medium having instructions for protecting an object displayed on a screen of a portable terminal according to claim 10 is provided. Preferred embodiments are the subject of the dependent claims.
- The above and other aspects, features, and advantages of certain embodiments of the present invention will be more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
-
FIG. 1 is a block diagram illustrating a portable terminal according to an embodiment of the present invention; -
FIG. 2 is a front perspective view illustrating a portable device according to an embodiment of the present invention; -
FIG. 3 is a rear perspective view illustrating a portable device according to an embodiment of the present invention; -
FIG. 4A illustrates an example separately including a layer for controlling a display of an object according to an embodiment of the present invention; -
FIG. 4B illustrates an example which does not separately include a layer for controlling a display of an object according to an embodiment of the present invention; -
FIG. 5 illustrates an input device according to an embodiment of the present invention; -
FIG. 6 is a flowchart illustrating a method of controlling a screen of a portable terminal according to an embodiment of the present invention; -
FIG. 7 is a flowchart illustrating a method of controlling a display of a trace by a touch or a hovering on a screen of a portable terminal according to an illustrative example not being part of the present invention; -
FIG. 8A illustrates an example of inputting a trace into a screen of a portable terminal according to an illustrative example; -
FIG. 8B illustrates an example of controlling to make a trace input into a screen of a portable terminal sequentially disappear after a predetermined time according to an illustrative example; -
FIG. 8C illustrates another example of controlling to make a trace input into a screen of a portable terminal sequentially disappear after a predetermined time according to an embodiment of the present invention; -
FIG. 8D illustrates an example of controlling a display of a disappearing trace in response to an input of a touch or a hovering on a trace input into a screen of a portable terminal according to an illustrative example; -
FIG. 9 is a flowchart illustrating a method of controlling an object output on a screen, based on a position at which writing is input into the screen of a portable terminal, according to an illustrative example; -
FIG. 10A illustrates an example of controlling an object output in a partial area formed, based on a position at which writing is input into a screen, according to an illustrative example; -
FIG. 10B illustrates another example of controlling an object output in a partial area formed, based on a position at which writing is input into a screen, according to an illustrative example; -
FIG. 11 is a flowchart illustrating a method of controlling an object output, based on a position at which a hovering is detected on a screen displaying a predetermined object, according to an embodiment of the present invention; -
FIG. 12A illustrates an example of controlling an object output in a partial area formed, based on a position at which a hovering is detected on a screen displaying a predetermined object, according to an embodiment of the present invention; -
FIG. 12B illustrates another example of controlling an object output in a partial area formed, based on a position at which a hovering is detected on a screen displaying a predetermined object, according to an embodiment of the present invention; -
FIG. 12C illustrates an example of controlling a size of a partial area formed, based on a position at which a hovering is detected on a screen displaying a predetermined object, according to an embodiment of the present invention; -
FIG. 12D illustrates another example of controlling a size of a partial area formed, based on a position at which a hovering is detected on a screen displaying a predetermined object, according to an embodiment of the present invention; -
FIG. 12E illustrates an example of converting a partial area formed, based on a position at which a hovering is detected on a screen displaying a predetermined object, by applying a particular image filtering effect, according to an embodiment of the present invention; and -
FIG. 12F illustrates an example of converting an object displayed in response to a detection of a hovering on a screen displaying a predetermined object, by applying a particular image filtering effect, according to an embodiment of the present invention. - Various embodiments of the present invention will now be described in detail with reference to the accompanying drawings. In the following description, specific details such as detailed configuration and components are merely provided to assist the overall understanding of these embodiments of the present invention. Therefore, it should be apparent to those skilled in the art that various changes and modifications of the embodiments described herein can be made without departing from the scope of the present invention. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness.
- Although the terms including an ordinal number such as first, second, etc. can be used for describing various elements, the elements are not restricted by the terms. The terms are only used to distinguish one element from another element. For example, without departing from the scope of the present invention, a first structural element may be named a second structural element. Similarly, the second structural element also may be named the first structural element. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
- The terms used in this application are for the purpose of describing particular embodiments only and are not intended to limit the disclosure. Singular forms are intended to include plural forms unless the context clearly indicates otherwise. In the description, it should be understood that the terms "include" or "have" indicate the existence of a feature, a number, a step, an operation, a structural element, parts, or a combination thereof, and do not preclude the existence or addition of one or more other features, numbers, steps, operations, structural elements, parts, or combinations thereof.
- Unless defined differently, all terms used herein, including technical or scientific terminologies, have the same meaning as commonly understood by a person skilled in the art to which the present disclosure belongs. Terms identical to those defined in general dictionaries should be interpreted as having the meaning they carry in the context of the related technique, and should not be interpreted ideally or excessively as a formal meaning.
- Terms to be used in the present disclosure will be defined as follows.
- Portable terminal: a mobile terminal that is portable and provides data transmission/reception and a voice and video call, and may include at least one screen (or at least one touch screen). Examples of a portable terminal include a smart phone, a tablet Personal Computer (PC), a 3-Dimensional (3D)-TeleVision (TV), a smart TV, a Light Emitting Diode (LED) TV, a Liquid Crystal Display (LCD) TV, and any other terminal that can communicate with a neighboring device or another terminal located at a remote place.
- Input device: at least one of an electronic pen and a stylus pen that can provide a command or an input to the portable terminal in a screen contact state or even in a noncontact state, such as hovering.
- Object: something which is displayed or can be displayed on a screen of a portable terminal. Examples of an object include a document, a writing, a widget, a picture, a map, a dynamic image, an email, a Short Message Service (SMS) message, a Multimedia Messaging Service (MMS) message, a shortcut icon, a thumbnail image, a folder storing at least one object in the portable terminal etc. Accordingly, the object may be executed, deleted, canceled, stored, or changed by a touch input or hovering input using the input device.
-
FIG. 1 is a block diagram schematically illustrating a portable terminal according to an embodiment of the present invention. - Referring to
FIG. 1, a portable terminal 100 may be connected with an external device (not shown) by using one of a mobile communication module 120, a sub-communication module 130, a connector 165, and an earphone connecting jack 167. The external device may include various devices detached from and attached to the portable terminal 100 by a wire, such as an earphone, an external speaker, a Universal Serial Bus (USB) memory, a charger, a cradle/dock, a Digital Multimedia Broadcasting (DMB) antenna, a mobile payment related device, a health management device (a blood sugar tester or the like), a game machine, a car navigation device, and the like. Further, the external device may include a Bluetooth communication device, a Near Field Communication (NFC) device, a WiFi Direct communication device, and a wireless Access Point (AP) which can wirelessly access a network. The portable terminal may be connected with other devices, e.g., a mobile phone, a smart phone, a tablet PC, a desktop PC, and a server, in a wired or wireless manner. - Referring to
FIG. 1, the portable terminal 100 includes at least one screen 190, at least one screen controller 195, a controller 110, a mobile communication module 120, a sub-communication module 130, a multimedia module 140, a camera module 150, a Global Positioning System (GPS) module 157, an input/output module 160, a sensor module 170, a storage unit 175, and a power supply unit 180. The portable terminal 100 according to the present disclosure can protect an object displayed on the screen 190 through at least one device or module which has been described above. - The
sub-communication module 130 includes a wireless Local Area Network (LAN) module 131 and a short-range communication module 132. The multimedia module 140 includes a broadcasting communication module 141, an audio reproduction module 142, and a video reproduction module 143. - The
camera module 150 includes a first camera 151 and a second camera 152. Further, the camera module 150 includes a barrel 155 for zooming in/zooming out the first and/or second cameras 151 and 152, a motor 154 for controlling a motion of the barrel 155 to zoom in/zoom out the barrel 155, and a flash 153 for providing a light source for photographing according to a main purpose of the portable terminal 100. - The input/
output module 160 includes a button 161, a microphone 162, a speaker 163, a vibration motor 164, a connector 165, and a keypad 166. - The
controller 110 includes a Central Processing Unit (CPU) 111, a Read Only Memory (ROM) 112 which stores control programs for controlling the portable terminal 100, and a Random Access Memory (RAM) 113 which stores signals or data input from the outside of the portable terminal 100 or is used as a memory region for an operation executed in the portable terminal 100. For example, the CPU 111 may include a single core, a dual core, a triple core, or a quadruple core. The CPU 111, the ROM 112 and the RAM 113 may be connected to each other through internal buses. - The
controller 110 controls the mobile communication module 120, the sub-communication module 130, the multimedia module 140, the camera module 150, the GPS module 157, the input/output module 160, the sensor module 170, the storage unit 175, the power supply unit 180, the screen 190, and the screen controller 195. - Further, the
controller 110 determines whether the hovering is recognized when an input device 168, such as an electronic pen, approaches an object displayed on the screen 190, and identifies the object corresponding to the position where the hovering occurs. The controller 110 detects a distance between the portable terminal 100 and the input device and the hovering according to the distance. The hovering may include at least one of a press of a button formed on the input device, a tap on the input device, a movement of the input device at a speed faster than a predetermined speed, and a touch of the object. - When an input using a touch or a hovering is detected on the
screen 190, the controller 110 may form an area corresponding to a position of the detected input to protect a displayed object and display a part of the object corresponding to the formed area differently from other parts of the object corresponding to other areas. For example, the controller 110 may control such that a display within the area is different from a display in the other areas, and control a partial area of the screen 190. The controller 110 may control a display within the area by applying different attributes to the area and the other areas to display an object output in the area differently from an object output in at least a part of the other areas. The attribute may include at least one of a size, a color, a brightness, and a gradation effect. Further, the area, the object output within the area, and the object output within the other areas may each be differently controlled according to a touched position or a distance between the screen 190 and a finger or an input device performing the input. The object or a part of the object output within the area may be more brightly output than the object or a part of the object output within the other areas. - When the input is a touch, the area may be formed with a predetermined radius from the touch position.
- When the input is a hovering, a size of the area may be variably controlled according to a distance between a position where the hovering is detected and the
screen 190 detecting the input. - Further, the
screen 190 may include a layer for controlling a display of a displayed object. The layer may be a virtual layer which can be implemented by software. The controller 110 may control a display of the object displayed on the screen 190 by controlling the layer. Another area may include at least partial areas except for the area. When an input is detected while an object is displayed on the screen 190, the controller 110 may transparently display the area and display the other areas more translucently than the area, or opaquely. When an input is detected while an object (for example, a picture) is displayed, the controller 110 may control a color of a part of the object to be different from a color of other parts of the object. An area having a predetermined radius from the touch position may be formed when the input is the touch input, and an area may be formed based on a distance between a position where the hovering is detected and the screen 190 when the input is the hovering input, under a control of the controller 110. A position of the area may be changed in accordance with a movement of the input. - When a touch input or a hovering input is detected on the
screen 190, the controller 110 may display a trace formed in response to the detected input to protect a displayed object, and control the display such that the displayed trace sequentially disappears. For example, the displayed trace may sequentially disappear according to an input time. That is, the controller 110 may control the screen 190 such that a formed trace sequentially disappears a predetermined time after first being displayed. - The
controller 110 may control a Red, Green, and Blue (RGB) value applied to each pixel of the trace to make the trace sequentially disappear. Further, when a touch input or a hovering input is detected on the disappearing trace, the controller 110 may re-display the disappearing trace in response to the detection of the input.
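The sequential disappearance of a trace can be sketched as a per-point opacity that decays with the time elapsed since the point was drawn, so that older stroke points fade first. The delay and fade durations below are illustrative assumptions, not values from this description.

```python
def trace_alpha(elapsed_s, delay_s=2.0, fade_s=1.5):
    """Opacity of one stroke point of a 'sequentially disappearing' trace.

    The point stays fully visible for delay_s seconds after being drawn,
    then fades linearly to 0.0 over fade_s seconds; applying this per
    point makes the trace vanish in the order it was written.
    """
    if elapsed_s <= delay_s:
        return 1.0
    t = (elapsed_s - delay_s) / fade_s
    return max(0.0, 1.0 - t)
```

Re-displaying a fading trace, as described above, would amount to resetting each point's elapsed time when a new touch or hovering is detected on it.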
- The
controller 110 may form an area to display the disappearing trace, based on a position at which the touch or hovering input is detected, and display the disappearing trace on the formed area. The translucently or opaquely displayed trace may be included in an area formed, based on a position at which the input is detected or a distance from a position where the hovering is detected to thescreen 190. - Further, the
controller 110 may control the screen 190 such that the partial area is more brightly displayed than the other areas in response to an input of writing using the touch or the hovering. When a touch or hovering input is detected on the screen 190 displaying an object, the controller 110 may apply, to the area formed in response to the detected input, an object attribute which is different from the object attribute applied to the other areas except for the area. The controller 110 may change the attribute of at least one of the area and the other areas. The attribute may include at least one of a size, a color, and a gradation effect. At least one of the size and the object output of the partial area may be controlled in response to the touch or hovering input, and the partial area may be formed based on a position at which the input is received or a position having the shortest distance from a position at which the hovering is detected. - A display of the object may be controlled in accordance with a security mode in which the display of the object is controlled. The security mode may be configured using at least one of a current position of the portable terminal, a time, an ambient situation, and ambient brightness, or may be configured by a selection of the user. Further, the security mode may configure at least one of a size of the partial area, an object output, and a time at which a trace input by a touch or a hovering sequentially becomes transparent or translucent. In addition, the security mode may be automatically configured through pieces of information collected by various sensors included in the portable terminal. The aforementioned input according to the present disclosure may include at least one of a touch and a hovering, and a shape of the partial area may be variably controlled.
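The formation of the partial area from a touch radius or a hovering distance can be sketched as follows. The base radius, the maximum hovering distance, and the attribute labels are assumptions for illustration, not values given in the specification.

```python
import math

TOUCH_RADIUS = 60.0   # px, base radius of the partial area (assumed value)
MAX_HOVER_MM = 50.0   # mm, hover distance at which the area vanishes (assumed value)

def area_radius(input_type, hover_distance_mm=0.0):
    """Radius of the partial area in which the object is shown normally."""
    if input_type == "touch":
        return TOUCH_RADIUS
    # Hovering: the area shrinks as the input moves away from the screen.
    factor = max(0.0, 1.0 - hover_distance_mm / MAX_HOVER_MM)
    return TOUCH_RADIUS * factor

def pixel_attribute(pixel, center, radius):
    """'clear' inside the partial area, 'dimmed' in the other areas."""
    return "clear" if math.dist(pixel, center) <= radius else "dimmed"
```

Moving the input simply moves `center`, which matches the statement that the position of the area changes in accordance with a movement of the input.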
- The
controller 110 may detect at least one input of a touch and a hovering on the screen 190 displaying at least one object and form an area for controlling an attribute of the screen 190, based on a position at which the input is detected. The controller 110 may control the screen 190 such that an object within the formed area is more brightly displayed than an object within at least a partial area of the other areas. - The
controller 110 may control a display of the object displayed on the screen 190 by using an input detected on the screen 190. - The
screen 190 may include at least two layers. For example, a first layer of the at least two layers may display an object and a second layer may form the area. An attribute of the second layer may be controlled in accordance with at least one of a position at which the input is detected and a distance of the hovering, and the second layer may exist on the first layer. - Further, in accordance with an embodiment of the present invention, the second layer may exist below the first layer. The
controller 110 may control the attribute of the formed area differently from the attribute of the other areas on the screen 190. The attribute may include at least one of a color of a part of the object displayed within the area, a brightness of the area, and a gradation effect applied to a boundary between the area and the other areas outside of the area. - The
mobile communication module 120 connects the portable terminal 100 to the external device through mobile communication by using one or more antennas according to a control of the controller 110. The mobile communication module 120 transmits/receives a wireless signal for a voice call, a video call, an SMS, or a Multimedia Message Service (MMS) to/from a mobile phone (not illustrated), a smart phone (not illustrated), a tablet PC, or another device (not illustrated), which has a phone number input into the portable terminal 100. - The
sub-communication module 130 includes the wireless LAN module 131 and the short-range communication module 132. Alternatively, the sub-communication module 130 may include only the wireless LAN module 131, or only the short-range communication module 132. - The
wireless LAN module 131 may connect to the Internet via a wireless Access Point (AP) (not shown), under a control of the controller 110. The wireless LAN module 131 supports a wireless LAN standard (IEEE 802.11x) of the Institute of Electrical and Electronics Engineers (IEEE). - The short
range communication module 132 may perform short-range communication wirelessly between the portable terminal 100 and an image forming apparatus (not illustrated) according to the control of the controller 110. For example, a short-range communication scheme may include a Bluetooth communication scheme, an Infrared Data Association (IrDA) communication scheme, a WiFi-Direct communication scheme, a Near Field Communication (NFC) scheme, etc. - In
FIG. 1 , the portable terminal 100 includes the mobile communication module 120, the wireless LAN module 131, and the short-range communication module 132. However, according to desired performance, the portable terminal 100 may include a combination of the mobile communication module 120, the wireless LAN module 131, and the short-range communication module 132. Herein, at least one or a combination of the mobile communication module 120, the wireless LAN module 131, and the short-range communication module 132 is referred to as "a transceiver," without limiting the scope of the present disclosure. - The
multimedia module 140 includes the broadcasting communication module 141, the audio reproduction module 142, and the video reproduction module 143. The broadcasting communication module 141 receives a broadcasting signal (e.g., a TV broadcasting signal, a radio broadcasting signal, or a data broadcasting signal) or additional broadcasting information (e.g., an Electronic Program Guide (EPG) or an Electronic Service Guide (ESG)), which are transmitted from a broadcasting station, through a broadcasting communication antenna (not illustrated), under the control of the controller 110. - The
audio reproduction module 142 may reproduce a stored or received digital audio file (for example, a file having a file extension of mp3, wma, ogg, or wav) under a control of the controller 110. - The
video reproduction module 143 may reproduce a stored or received digital video file (for example, a file of which the file extension is mpeg, mpg, mp4, avi, mov, or mkv) under the control of the controller 110. - The
video reproduction module 143 may also reproduce a digital audio file. - Alternatively, the
multimedia module 140 may include different combinations of the broadcasting communication module 141, the audio reproduction module 142, and the video reproduction module 143. For example, the multimedia module 140 may include the audio reproduction module 142 and the video reproduction module 143, but not the broadcasting communication module 141. Further, the audio reproduction module 142 or the video reproduction module 143 of the multimedia module 140 may be included in the controller 110. - The
camera module 150 includes the first camera 151 and the second camera 152, which photograph still images or video under control of the controller 110. Further, the camera module 150 includes the barrel 155 performing a zoom-in/out for photographing a subject, the motor 154 controlling a movement of the barrel 155, and a flash 153 providing an auxiliary light source required for photographing the subject. Alternatively, the camera module 150 may include different combinations of the first camera 151, the second camera 152, the barrel 155, the motor 154, and the flash 153. - The
first camera 151 may be disposed on a front surface of the portable terminal 100, and the second camera 152 may be disposed on a rear surface of the portable terminal 100. Alternatively, the first camera 151 and the second camera 152 may be disposed to be adjacent to each other (e.g., an interval between the first camera 151 and the second camera 152 is larger than 1 cm and smaller than 8 cm) to photograph a three-dimensional still image or a three-dimensional dynamic image. - Each of the first and
second cameras 151 and 152 may operate under the control of the controller 110. A user photographs a dynamic image or a still image through the first and second cameras 151 and 152. - The
GPS module 157 receives radio waves from a plurality of GPS satellites (not illustrated) in Earth's orbit and calculates a position of the portable terminal 100 by using Time of Arrival information from the GPS satellites to the portable terminal 100. - The input/
output module 160 includes a button 161, the microphone 162, the speaker 163, the vibration motor 164, the connector 165, the keypad 166, the earphone connecting jack 167, and the input unit 168. The input/output module is not limited thereto, and a mouse, a trackball, a joystick, or a cursor control such as cursor direction keys may be provided to control a movement of the cursor on the screen 190. Alternatively, the input/output module 160 may include different combinations of the button 161, the microphone 162, the speaker 163, the vibration motor 164, the connector 165, the keypad 166, the earphone connecting jack 167, and the input unit 168. - The button 161 (or multiple buttons) may be formed on the front surface, side surfaces or the rear surface of the housing of the
portable terminal 100 and may include at least one of a power/lock button (not illustrated), a volume button (not illustrated), a menu button, a home button, a back button, and a search button. - The
microphone 162 receives a voice or a sound to generate an electrical signal under the control of the controller 110. - The
speaker 163 outputs sounds corresponding to various signals of the mobile communication module 120, the sub-communication module 130, the multimedia module 140, and the camera module 150 (for example, a radio signal, a broadcast signal, a digital audio file, a digital video file, or photographing) to the outside of the portable terminal 100 under the control of the controller 110. The speaker 163 may output a sound (for example, a button tone corresponding to phone communication, a ringing tone, or a voice of another user) corresponding to a function performed by the portable terminal 100. One or more speakers 163 may be formed on a suitable position or positions of the housing of the portable terminal 100. - The
vibration motor 164 converts an electric signal into a mechanical vibration under a control of the controller 110. For example, when the portable terminal 100 in a vibration mode receives a voice call from any other device (not illustrated), the vibration motor 164 operates. One or more vibration motors 164 may be provided in the housing of the portable terminal 100. The vibration motor 164 may operate in response to a touch action of the user on the screen 190 and successive motions of touches on the screen 190. - The
connector 165 may be used as an interface for connecting the portable terminal 100 with an external device (not shown) or a power source (not shown). The portable terminal 100 may transmit or receive data stored in the storage unit 175 of the portable terminal 100 to or from an external device (not shown) through a wired cable connected to the connector 165 according to a control of the controller 110. Further, the portable terminal 100 may receive power from the power source through the wired cable connected to the connector 165 or charge a battery (not shown) by using the power source. - The
keypad 166 may receive a key input from a user for control of the portable terminal 100. The keypad 166 includes a physical keypad (not shown) formed in the portable terminal 100 or a virtual keypad (not shown) displayed on the screen 190. The physical keypad (not illustrated) formed on the portable terminal 100 may be omitted according to the capability or configuration of the portable terminal 100. - Earphones (not shown) may be inserted into the
earphone connecting jack 167 according to an embodiment to be connected to the portable terminal 100, and the input unit 168 may be inserted into and stored in the portable terminal 100 and may be extracted or detached from the portable terminal 100 when being used. - In addition, an attachment/
detachment recognition switch 169 operating in response to attachment or detachment of the input unit 168 is provided at one area within the portable terminal 100 into which the input unit 168 is inserted, and provides a signal corresponding to the attachment or detachment of the input unit 168 to the controller 110. The attachment/detachment recognition switch 169 is located at one area into which the input unit 168 is inserted so as to directly or indirectly contact the input unit 168 when the input unit 168 is mounted. Accordingly, the attachment/detachment recognition switch 169 generates a signal corresponding to the attachment or the detachment of the input unit 168 based on the direct or indirect contact with the input unit 168 and then provides the generated signal to the controller 110. - The
sensor module 170 includes at least one sensor for detecting a state of the portable terminal 100. For example, the sensor module 170 may include a proximity sensor that detects a user's proximity to the portable terminal 100, an illumination sensor (not illustrated) that detects a quantity of light around the portable terminal 100, a motion sensor (not illustrated) that detects a motion (e.g., rotation of the portable terminal 100 and acceleration or a vibration applied to the portable terminal 100) of the portable terminal 100, a geo-magnetic sensor (not illustrated) that detects a point of a compass by using Earth's magnetic field, a gravity sensor that detects an action direction of gravity, and an altimeter that detects an altitude by measuring an atmospheric pressure. At least one sensor may detect the state, and may generate a signal corresponding to the detection to transmit the generated signal to the controller 110. A sensor of the sensor module 170 may be added or omitted according to a capability of the portable terminal 100. - The
storage unit 175 stores signals or data input/output in response to the operations of the mobile communication module 120, the sub-communication module 130, the multimedia module 140, the camera module 150, the GPS module 157, the input/output module 160, the sensor module 170, and the screen 190 according to the control of the controller 110. The storage unit 175 can store a control program and applications for controlling the portable terminal 100 or the controller 110. - The term "storage unit" includes the
storage unit 175, the ROM 112 and the RAM 113 within the controller 110, or a memory card (not shown) (for example, an SD card or a memory stick) installed in the portable terminal 100. Further, the storage unit may include a nonvolatile memory, a volatile memory, a Hard Disk Drive (HDD), or a Solid State Drive (SSD). - The
storage unit 175 may store applications having various functions such as a navigation function, a video call function, a game function, and a time-based alarm function, images for providing a Graphical User Interface (GUI) related to the applications, databases or data related to a method of processing user information, a document, and a touch input, background images (a menu screen, an idle screen or the like) or operating programs required for driving the portable terminal 100, and images photographed by the camera module 150. The storage unit 175 is a machine (for example, computer)-readable medium. The term "machine-readable medium" may be defined as a medium capable of providing data to the machine so that the machine performs a specific function. The machine-readable medium may be a storage medium. The storage unit 175 may include a non-volatile medium and a volatile medium. All of these media should be of a type that allows commands transferred by the media to be detected by the physical instrument with which the machine reads the commands. - The machine-readable medium includes at least one of a floppy disk, a flexible disk, a hard disk, a magnetic tape, a Compact Disc Read-Only Memory (CD-ROM), an optical disk, a punch card, a paper tape, a Random Access Memory (RAM), a Programmable Read-Only Memory (PROM), an Erasable PROM (EPROM), and a flash-EPROM, but is not limited thereto.
- The
power supplier 180 supplies power to one or more batteries (not illustrated) disposed in the housing of the portable terminal 100 under the control of the controller 110. The one or more batteries (not illustrated) supply power to the portable terminal 100. In addition, the power supplier 180 may supply, to the portable terminal 100, the power input from an external power source (not illustrated) through a wired cable connected with the connector 165. The power supplier 180 may also supply power wirelessly input from the external power source through a wireless charging technology to the portable terminal 100. - As described above, the
portable terminal 100 includes a screen 190 providing user interfaces corresponding to various services (for example, a phone call, data transmission, broadcasting, and photography) to the user. Alternatively, the portable terminal 100 may include multiple screens 190. Each of the screens 190 may transmit an analog signal corresponding to at least one touch input to a user interface to a corresponding screen controller. - As described above, the
portable terminal 100 may include a plurality of screens 190, and each of the screens 190 may include a screen controller 195 that receives an analog signal corresponding to a touch. The screens may be connected to a plurality of housings through a hinge, respectively, or may be located in a single housing without a hinge connection. - The
screen 190 may detect at least one of the touch and the hovering through a touch device (for example, a stylus pen, an electronic pen, a finger, etc.). - Accordingly, the
screen 190 includes a touch panel 191 that can detect a touch input using an input device and a hovering panel 192 that can detect a hovering input using an input device. The hovering panel 192 may include an Electromagnetic Resonance (EMR) type panel which can detect a distance between the input unit or the finger and the screen 190 through a magnetic field. The screen 190 may receive successive motions of the touch or the hovering using the input unit or the finger and include a display panel (not shown) that can display a trace formed by the successive motions. The EMR type panel may be configured below the display panel and may detect the hovering. - The display panel (not shown) may be a panel such as an LCD, an AMOLED or the like and may display various images and a plurality of objects according to various operation statuses of the
portable terminal 100, an execution of an application, and a service. Further, the display panel (not shown) may include a layer for controlling an output of an object. - The
screen 190 may separately include panels (for example, the touch panel 191 and the hovering panel 192) that can detect the touch and the hovering using the finger or the input unit, respectively, or may detect the touch and the hovering through only one panel. Values detected by the respective panels may be different and may be provided to the screen controller. Further, the screen controller may differently recognize the values of the touch and the hovering input from the respective panels to determine whether the input from the screen 190 is an input by a user's body or an input by a touchable input unit. The screen 190 may display one or more objects on the display panel. The object may include a trace formed by a movement of the finger or the input unit as well as a picture, a video, and a document. Further, the screen 190 or the display panel may include a layer for controlling a display of an object. The layer may be a virtual layer implemented by software. - When the touch or hovering input is detected, the
screen 190 may determine a distance between the detected position and the finger or the input unit having provided the input. An interval which can be detected by the screen 190 may be changed according to a capability or a structure of the portable terminal 100. Particularly, the screen 190 is configured to distinguish between a touch by contact with the user's body or the touch input unit and an input in a proximity state (for example, hovering), and configured to output different values (for example, voltage values or current values as analog values) detected for the touch and the hovering. Further, it is preferable that the screen 190 outputs a different detected value (for example, a current value or the like) according to a distance between a position in the air where the hovering is generated and the screen 190. - The
screen 190 may be implemented as, for example, a resistive type, a capacitive type, an infrared type, or an acoustic wave type. - The
screen 190 includes a plurality of pixels and displays an image through the pixels. The screen 190 may use an LCD, an OLED, or an LED. - Further, the
screen 190 includes a plurality of sensors detecting, when the input unit 168 touches a surface of the screen 190 or is placed within a predetermined distance from the screen 190, a position of the input unit 168. The plurality of sensors may be formed with a coil structure, and in a sensor layer formed of the plurality of sensors, the sensors are arranged in a predetermined pattern and form a plurality of electrode lines. When the touch or the hovering is generated on the screen 190 through the input unit 168, a detection signal of which a waveform is changed due to a magnetic field between the sensor layer and the input unit is generated by such a structure, and the screen 190 transmits the generated detection signal to the controller 110. - Further, when the contact on the
screen 190 occurs through the finger, the screen 190 transmits, to the controller 110, a detection signal caused by capacitance. - A predetermined distance between the
input unit 168 and the screen 190 may be detected through an intensity of a magnetic field generated by a coil 430. - The
screen 190 may detect the input using the input device 168 and display a screen in which an output of an object is controlled in a partial area of the screen 190 according to the detected input under a control of the controller 110. The object may be output after a brightness or a color of the object is controlled or a transparency effect is applied to the object. The input may include at least one of the touch and the hovering. The screen 190 may display an object output in the partial area differently from an object output in other areas. Further, the screen 190 may display such that a trace of input writing sequentially disappears in response to an input of writing using a touch or hovering. When a touch or hovering using the input device 168 is generated on the disappearing trace, the screen 190 may translucently or opaquely display the disappearing trace. - As described above, the translucently or opaquely displayed trace may be included in an area formed based on the touch position by the input device 168 or a position having a proximity distance (or shortest distance) from the input unit. Further, the
screen 190 may more brightly display the partial area than other areas in response to the input of writing using the touch or hovering. Other areas may also be displayed translucently or opaquely. - As described above, the screen may apply a gradation effect between the partial area and other areas. Further, the
screen 190 may display such that at least one of a size, an object output, and a color of the partial area is different from that in other areas in response to the detection of the touch or hovering using the input unit or finger in a state where a predetermined object is displayed. At least one of the size of the partial area and an object output of the remaining areas may be controlled in accordance with a touch between the input device 168 and the screen 190 or a distance of the hovering. - The
screen controller 195 converts the analog signal received from the screen 190 to a digital signal (for example, X and Y coordinates) and then transmits the digital signal to the controller 110. The controller 110 can control the screen 190 by using the digital signal received from the screen controller 195. For example, the controller 110 allows a short-cut icon (not shown) or an object displayed on the screen 190 to be selected or executed in response to a touch or hovering. Further, the screen controller 195 may be included in the controller 110. - Moreover, the
screen controller 195 may identify a distance between a position in the air where the hovering is generated and the screen 190 by detecting a value (for example, a current value or the like) output through the screen 190, convert the identified distance value to a digital signal (for example, a Z coordinate), and then provide the converted digital signal to the controller 110. -
FIG. 2 is a front perspective view of a portable terminal according to an embodiment of the present invention, and FIG. 3 is a rear perspective view of a portable terminal according to an embodiment of the present invention. - Referring to
FIGs. 2 and 3, the screen 190 is disposed in a center of a front surface 100a of the portable terminal 100. The screen 190 may have a large size that occupies most of the front surface 100a of the portable terminal 100. -
FIG. 2 illustrates an example where a main home screen is displayed on the screen 190. The main home screen is a first screen displayed on the screen 190 when power of the portable terminal 100 is turned on. Further, when the portable terminal 100 has different home screens of several pages, the main home screen may be a first home screen of the home screens of several pages. Short-cut icons 191-1, 191-2, and 191-3 for executing frequently used applications, a main menu switching key 191-4, time, weather, etc., are displayed on the home screen. The main menu switching key 191-4 displays a menu screen on the screen 190. - A
status bar 192 indicating a status of the portable terminal 100 such as a battery charge status, an intensity of a received signal, and a current time is formed at a top end of the screen 190. - A
home button 161a, a menu button 161b, and a back button 161c are formed at a lower portion of the screen 190. - The
home button 161a displays the main home screen on the screen 190. For example, when the home button 161a is touched while a home screen different from the main home screen or the menu screen is displayed on the screen 190, the main home screen is displayed on the screen 190. Further, when the home button 161a is touched while applications are executed on the screen 190, the main home screen illustrated in FIG. 2 is displayed on the screen 190. Further, the home button 161a may be used to display recently used applications or a task manager on the screen 190. - The
menu button 161b provides a connection menu that may be used on the screen 190. The connection menu may include a widget addition menu, a background change menu, a search menu, an editing menu, an environment setup menu, etc. - The
back button 161c may be used to display the screen that was executed just before the currently executed screen, or to terminate the most recently used application. - The
first camera 151, an illumination sensor 170a, and a proximity sensor 170b are disposed on edges of the front surface 100a of the portable terminal 100. The second camera 152, the flash 153, and the speaker 163 are disposed on a rear surface 100c of the portable terminal 100. - For example, a power/reset button 160a, a
volume button 161b, a terrestrial DMB antenna 141a for receiving a broadcast, and one or more microphones 162 are disposed on a side surface 100b of the portable terminal 100. The DMB antenna 141a may be fixed to the portable terminal 100 or may be formed to be detachable from the portable terminal 100. - Further, the
portable terminal 100 has the connector 165 arranged on a lower side surface. - A plurality of electrodes are formed in the
connector 165, and the connector 165 may be connected to an external device through a wire. - The
earphone connecting jack 167 is formed on an upper side surface of the portable terminal 100. Earphones may be inserted into the earphone connecting jack 167. - The
input unit 168, such as a stylus, may be received in a lower side surface of the portable terminal 100. For example, the input unit 168 may be inserted into the portable terminal 100 to be stored in the portable terminal 100, and withdrawn and detached from the portable terminal 100 when being used. -
FIGs. 4A and 4B illustrate a configuration of a plurality of layers for controlling a display of an object according to an embodiment of the present disclosure. Specifically, FIG. 4A illustrates an example separately including a layer for controlling a display of an object according to an embodiment of the present invention, and FIG. 4B illustrates an example which does not separately include a layer for controlling a display of an object according to an embodiment of the present invention. - The present invention can be applied to various applications for displaying objects, such as a picture, and various applications in which a writing input can be made, such as a drawing, a text input, a diary, etc. An application may include one or more layers according to a function or attribute thereof. For example, if the application is an application for displaying or writing objects, such as a writing input, a drawing, a text input, or certain pictures, the application may include various menus such as a menu that receives a selection from the user to display the objects, a menu that configures an environment of the application, and a menu that displays the objects. The application of the present invention may include a plurality of layers for displaying the objects, and each of the plurality of layers is allocated to each menu. Referring to
FIGs. 4A and 4B, when an application displaying an object on the screen 190 is executed, a first layer 410 of the application may be located at the bottom, a second layer 420 may be located on the first layer 410, and a third layer 430 may be located on the second layer 420. The first layer 410 is not directly shown to the user but serves as a container that contains other layers, and may be called a layout. The layout may include a frame layout, a relative layout, and a linear layout. The second layer 420 may display an object such as a picture, writing, or a document. The third layer 430 controls at least a part of the display of the object displayed on the second layer 420 and may be opaque or translucent. An area for displaying an object in response to a touch or hover input may be formed on the third layer 430. The area may be formed with a predetermined radius from a position of the touch or hover input. As described above, when the input is the hovering, a size of the area may be variably controlled according to a distance between a position where the hovering is detected (in the air) and the screen. Further, a position of the area may be changed in accordance with a movement of the input. - The
controller 110 may control an attribute of the third layer 430 such that a part of the object corresponding to the formed area is displayed differently from other parts of the object. The attribute may include at least one of a size of the area, a brightness of the area, a gradation effect of the area, and a color of the object corresponding to the area. Further, as illustrated in FIG. 4B, even when the third layer 430 is not included, the controller may control the display of the object by controlling an attribute of the second layer 420. -
FIG. 5 is a block diagram illustrating an input device according to an embodiment of the present invention. - Referring to
FIG. 5, the input device 168 (for example, a touch pen) includes a penholder 500, a nib 430 disposed at an end of the penholder, and a button 420 that may change an electromagnetic induction value generated by a coil 510 included inside the penholder, e.g., adjacent to the nib 430. The input device 168 also includes a vibration device 520, a controller 530 that generally controls the input unit 168, a short-range communication unit 540 that performs short-range communication with the portable terminal 100, and a battery 550 that supplies power to the input unit 168. - The
input unit 168 illustrated in FIG. 5 supports the EMR type. Accordingly, when a magnetic field is formed at a predetermined position of the screen 190 by the coil 510, the screen 190 may recognize a touch position by detecting a location of the corresponding magnetic field. -
FIG. 6 is a flowchart illustrating a method of controlling a screen of a portable terminal according to an embodiment of the present invention. - Referring to
FIG. 6, the portable terminal activates a security mode of the screen in step S610. The security mode controls an object output (or display) of the screen, such that the user can view the screen, but other people adjacent to the user cannot easily view the screen. As described above, in accordance with an embodiment of the present invention, the object output may be controlled in a partial area by activating the security mode, or the object output may be controlled in a partial area in response to detecting a hovering input, without the activation of the security mode. - In order to control the object output of the screen using the security mode, the portable terminal may automatically activate the security mode by analyzing an ambient situation of the portable terminal or may manually activate the security mode in response to a user input or setting. Further, the security mode may configure at least one of a size and an object output of an area through which the object is displayed on the screen, and a time for which a trace input is displayed before it sequentially transitions to transparent or translucent.
- As indicated above, the security mode is automatically activated through an analysis of an ambient situation of the portable terminal or may be activated when a condition configured by the user is applied. The ambient situation and the condition include at least one of a current position of the portable terminal, a time, an ambient situation, ambient brightness, and ambient noise. Further, the ambient situation and the condition can be controlled or configured by the user. Basically, the security mode may be automatically executed based on at least one of a current position of the portable terminal, a time, an ambient situation, ambient brightness, and ambient noise.
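- The automatic activation described above can be sketched as a simple policy function. Everything here is an illustrative assumption (the function name, the place list, and the thresholds); the embodiment does not prescribe specific values.

```python
def should_activate_security_mode(position, hour, brightness, noise_db,
                                  public_places=("subway", "cafe", "office")):
    """Hypothetical policy: activate the security mode when the ambient
    inputs suggest the terminal is being used in a public setting."""
    in_public = position in public_places
    commute_time = 7 <= hour <= 9 or 17 <= hour <= 19
    dark = brightness < 50           # lux, assumed sensor scale
    noisy = noise_db > 60
    # One strong signal, or a commute-time combination of weaker ones.
    return in_public or (commute_time and (dark or noisy))

assert should_activate_security_mode("subway", 8, 200, 70) is True
assert should_activate_security_mode("home", 14, 300, 30) is False
```

The point is only that the listed ambient inputs (position, time, brightness, noise) feed a boolean activation decision; any weighting of them would serve equally.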
- When an input on the screen is detected in step S612, the portable terminal controls the object output in a partial area corresponding to the detected input in step S614. The controller 110 detects an input of a hovering on the screen by the input device. The
controller 110 controls the display such that the object output in the partial area is displayed differently from object outputs in other areas. - Further, the
controller 110 may control the screen so that the object output or an object color in the partial area of the screen is displayed differently from that in the remaining areas outside the partial area. - Further, the object in the partial area may be output more brightly than objects in the other areas, i.e., the remaining areas outside the partial area.
- As described above, for a hovering input, the size of the partial area and/or the object output may be controlled in accordance with a distance between the position at which the hover input is detected and the screen, and the partial area may be formed based on a position having a proximity (or shortest) distance from the position at which the hovering is detected.
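- The distance-dependent sizing can be sketched as a clamped linear mapping; the function name, scale factor, and limits below are illustrative assumptions rather than values from the embodiment.

```python
def partial_area_radius(hover_distance_mm, base_radius=20.0, scale=4.0,
                        min_radius=10.0, max_radius=120.0):
    """Map the hover distance between the input device and the screen to the
    radius of the revealed partial area; a touch (distance 0) gives the base
    radius, and the result is clamped to a sensible range."""
    radius = base_radius + scale * hover_distance_mm
    return max(min_radius, min(max_radius, radius))

assert partial_area_radius(0) == 20.0      # touching the screen
assert partial_area_radius(10) == 60.0     # hovering 10 mm away
assert partial_area_radius(100) == 120.0   # clamped at the maximum
```

The sign of the relationship could equally be inverted (smaller area when further away); the embodiment describes both directions in later figures.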
- Again, the
controller 110 may control the display such that a trace of writing made using a touch or hover input sequentially disappears according to an illustrative example not forming part of the present invention. Further, thecontroller 110 may control the object output such that the disappeared trace is translucently redisplayed in response to the generation of the hovering using the input device on the disappearing trace. The translucently displayed trace may be included in the area formed based on the touched position by the input device or the position having the proximity (shortest) distance from the position where the hovering is detected. - Further, the
controller 110 may control the object in the partial area to be brighter than objects in other areas in response to the input of the writing using the touch or hovering on the screen. The other areas may be translucently or opaquely displayed. - The portable terminal displays a result of the control in step S616. For example, the screen may display the object output in the partial area differently from object outputs in other areas, under a control of the
controller 110. -
FIG. 7 is a flowchart illustrating a trace display controlling method on a screen of a portable terminal according to an illustrative example not part of the present invention. -
FIGs. 8A to 8D illustrate examples of controlling a display of a trace input on the screen of the portable terminal according to an illustrative example. Specifically, FIG. 8A illustrates an example of inputting a trace on the screen of the portable terminal according to an illustrative example, FIG. 8B illustrates an example of controlling to make a trace input on the screen of the portable terminal sequentially disappear, after a predetermined time, according to an illustrative example, FIG. 8C illustrates another example of controlling to make a trace input on the screen of the portable terminal sequentially disappear, after a predetermined time, according to an illustrative example, and FIG. 8D illustrates an example of controlling a display of the disappearing trace in accordance with a touch or hovering input on the trace input on the screen of the portable terminal according to an illustrative example. - Referring to
FIG. 7, the portable terminal activates a security mode of the screen in step S710. As described above, the controller 110 may analyze an ambient situation of the portable terminal to control an object output on the screen, or configure and/or activate the security mode in response to an input by a user. - When a trace on the screen by writing is detected in step S712, the portable terminal controls a display of the detected trace after a predetermined threshold time in step S714, and displays a result of the control in step S716. More specifically, a writing input is detected, and the
controller 110 displays a trace of the input writing on the screen. Thereafter, the displayed trace is sequentially output as a transparent or translucent trace, after a predetermined time, according to a configuration of the security mode. The predetermined time may be configured by the user or may be variably controlled by at least one of a current position of the portable terminal, a time, an ambient situation, ambient brightness, and ambient noise. The term "sequentially" refers to the order in which the trace was input. - Referring to
FIG. 8A, when a trace 811 is written using a touch or a hovering input of an input device 850 into a memo pad displayed on the screen 810, the controller 110 displays the trace. After a predetermined time elapses from a first portion of the trace being input, the trace is sequentially displayed as transparent or translucent. For example, the trace 811 is displayed such that an object output gradually disappears (that is, becomes transparent) according to an input time order of the traces, a first trace 811a, a second trace 811b, and a third trace 811c. - Although object outputs of the first trace to the third trace in
FIG. 8A are shown as different segments, this is only for ease of description, and the traces may be displayed with gradation. Further, although FIGs. 8A to 8D illustrate that the trace by writing is input into the memo pad, the illustrative example can be applied to various applications in which the writing input can be made, such as a drawing, a text input, a diary, etc. - Referring to
FIG. 8B, a trace of writing using the input device 850 is input into the screen 820, and after a predetermined time elapses, the trace sequentially disappears (for example, the trace is transparently displayed) from the screen, like a pre-input trace 821. Further, a trace 822, which is input after the trace 821, is less transparent than the trace 821, and an object output of the trace 822 may also be controlled according to an input time. That is, an object output of the trace 822 may gradually disappear (that is, become transparent) according to an input time sequence, like a trace 822a and a trace 822b. Although object outputs of the trace 822a and the trace 822b in FIG. 8B are illustrated as separate segments, this is only for ease of description, and the traces according to the present invention may be displayed with gradation, which is sequentially applied according to an input time. As described above, the time by which the traces sequentially disappear may be configured by the user in the security mode or may be automatically configured by at least one of a current position of the portable terminal, a time, an ambient situation, ambient brightness, and ambient noise. - Referring to
FIG. 8C, a trace is input into the screen 830, and after a predetermined time elapses, the input trace becomes sequentially translucent, like a trace 831. The trace 831 may be displayed to sequentially disappear as the time elapses. For example, the trace 832 may gradually disappear (that is, become transparent) according to an input time sequence, like a trace 832a and a trace 832b. Again, although object outputs of the trace 832a and the trace 832b in FIG. 8C are illustrated as separate segments, this is only for ease of description, and the traces according to the present invention may be displayed with gradation sequentially applied according to an input time. As described above, the time by which the traces sequentially become translucent may be configured by the user in the security mode or may be automatically configured by at least one of a current position of the portable terminal, a time, an ambient situation, ambient brightness, and ambient noise. - As described above, the screen may display the input traces to be sequentially transparent or translucent after a predetermined time, under a control of the controller.
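- The sequential disappearance described for FIGs. 8A to 8C amounts to a per-point opacity that depends on the age of each stroke point. A minimal sketch, with assumed hold and fade times:

```python
def trace_alpha(input_time, now, hold_time=3.0, fade_time=2.0):
    """Opacity of one stroke point: fully visible for hold_time seconds
    after it was written, then fading linearly to transparent over
    fade_time seconds."""
    age = now - input_time
    if age <= hold_time:
        return 1.0
    if age >= hold_time + fade_time:
        return 0.0
    return 1.0 - (age - hold_time) / fade_time

# Points entered earlier fade first, so the trace disappears sequentially
# in the order it was written, like traces 811a, 811b, and 811c.
stroke = [0.0, 1.0, 2.0]                 # input times of three stroke points
alphas = [trace_alpha(t, now=5.0) for t in stroke]
assert alphas[0] == 0.0                  # oldest point already gone
assert alphas[1] == 0.5                  # middle point half faded
assert alphas[2] == 1.0                  # newest point still fully visible
```

Replacing the final opacity 0.0 with a small nonzero value would give the translucent variant of FIG. 8C instead of the fully transparent one.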
- Referring again to
FIG. 7, when a touch or hovering input is detected in step S718, the portable terminal displays a trace included in a partial area corresponding to the detected touch or hovering, while controlling an output of the trace in step S720. For example, when a hovering between the input device and the screen is detected, the controller 110 controls the screen so that the disappeared trace is translucently or opaquely re-displayed. The translucently or opaquely displayed trace may be a trace included in a partial area formed based on a touched position by the input device or a position having a proximity (or shortest) distance between the device and the screen. A size of the partial area may be configured in the security mode or controlled by the user. - Referring to
FIG. 8D, the screen 840 displays traces of area 841 and traces of area 842 of writings sequentially input. After a predetermined time elapses, the controller 110 controls the screen such that the input traces sequentially disappear. Thereafter, when a touch or hovering input is detected on the disappeared trace of the area 841, the controller 110 translucently or opaquely re-displays the trace in a partial area 841, in response to the detection of the touch or the hovering. The translucently or opaquely displayed trace in the partial area 841 may be re-displayed such that the trace gradually disappears while moving away from a touched position of the input or a position having a proximity (or shortest) distance from a position where a hovering is detected. - As described above, the size of the
partial area 841 may be controlled according to a distance between the input device 850 and the screen 840 or controlled according to a configuration of the security mode. - The trace of the
area 842 made after the trace of the partial area 841 may be displayed differently from the trace of the partial area 841 according to an input time. For example, the trace of the area 842 may gradually disappear (that is, become transparent) according to an input time sequence, like a trace 842a and a trace 842b. - The trace input by the touch or the hovering is displayed in the
other area 842 and the trace within the area 842 is transparent or opaque. -
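The re-display behaviour of FIG. 8D can be sketched as a distance test against the hover position; the function name, radius handling, and translucency level below are illustrative assumptions, not values from the embodiment.

```python
import math

def redisplay_alpha(point, hover_pos, radius, translucent=0.5):
    """Hypothetical re-display rule: a faded stroke point inside the partial
    area around the hover position becomes translucent again, shading back
    toward transparent at the edge of the area (the gradual disappearance
    while moving away from the detected position)."""
    d = math.dist(point, hover_pos)
    if d >= radius:
        return 0.0                       # outside the area: stays hidden
    return translucent * (1.0 - d / radius)

assert redisplay_alpha((0, 0), (0, 0), 40) == 0.5     # at the hover point
assert redisplay_alpha((40, 0), (0, 0), 40) == 0.0    # on the boundary
assert 0.0 < redisplay_alpha((20, 0), (0, 0), 40) < 0.5
```

-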
FIG. 9 is a flowchart illustrating a method of controlling an object output of a screen, based on a position at which writing is input into the screen of the portable terminal, according to an illustrative example. -
FIGs. 10A and 10B illustrate an example of a process of controlling an object output of the screen, based on a position at which writing is input into the screen of the portable terminal, according to an illustrative example. Specifically, FIG. 10A illustrates an example of controlling an object output in a partial area that is formed, based on a position at which writing is input into the screen, according to an illustrative example, and FIG. 10B illustrates another example of controlling an object output in a partial area that is formed, based on a position at which writing is input into the screen, according to an illustrative example. - Referring to
FIG. 9, the portable terminal activates a security mode of the screen in step S910. The operation of step S910 is the same as described above in steps S610 and S710. Accordingly, a repetitive description of step S910 will not be provided. - When writing is being input in step S912, the
controller 110 controls an output of the writing according to the elapse of time of the input writing in step S914 and displays a result of the control in step S916. More specifically, the controller 110 may detect a position where the writing is currently made on the screen and may control the display such that an object in a partial area that is formed, based on the detected position, is displayed differently from an object in the remaining areas, i.e., outside of the partial area. The controller 110 may control object outputs in the remaining areas, based on the position at which the writing is currently made on the screen. - Further, the
controller 110 may control the display such that the object output in the partial area that is formed, based on a position at which the writing is currently made on the screen, is displayed as being transparent or translucent. - Thereafter, in step S916, the screen may display a result of the control. For example, at least one of a size of the partial area and object output of the remaining areas may be controlled according to a distance between the input device and the screen and a touched position of the input device.
- Referring to
FIG. 10A, when the writing is made on a screen 1010, e.g., by using a touch or a hovering input, the controller 110 may detect a touched position between an input device 1030 and the screen 1010 to form a partial area 1012, which is based on the touched position, and control the screen to display the made writing in the partial area 1012. - Further, the
controller 110 may control the screen 1010 to opaquely display the remaining area 1013, i.e., the area outside the partial area 1012. - When the
partial area 1012 is formed, the controller 110 may control the display of the object. As described above, the controller 110 may control the partial area 1012 based on the position at which the touch or the hovering by the input device 1030 is detected. Due to such a control, the user may recognize that a gradation effect is provided to the partial area 1012 or an effect of distinguishing the areas is provided. - In
FIG. 10A, the partial area 1012 is divided into a plurality of areas 1012a to 1012d by a transparency difference or a brightness difference. For example, a first partial area 1012a has a radius r1, a second partial area 1012b has a radius r2, a third partial area 1012c has a radius r3, and a fourth partial area 1012d has a radius r4. Further, the partial area 1012 may be divided into the plurality of areas 1012a to 1012d based on the position where the touch or the hovering by the input unit or the finger is detected, or the partial area 1012 may not be divided into the first to fourth areas 1012a to 1012d by applying gradation to the first to fourth partial areas. For example, the controller 110 may control the brightness by increasing the transparency in order to more clearly output an object in an area having a shorter radius (for example, the first area 1012a among the first to fourth areas 1012a to 1012d) and by reducing the transparency in order to make the object opaque or translucent in an area having a larger radius (for example, the fourth area 1012d among the first to fourth areas 1012a to 1012d). Further, the screen may be controlled such that the other area 1013 is shown to be opaque. - Referring to
FIG. 10B, when the writing is made on a screen 1020, e.g., by using a touch or a hovering, the controller 110 may detect a touched position between the input device 1030 and the screen 1020 to form a partial area 1022 based on the touched position and control the screen to display the made writing in the partial area 1022. - Further, the
controller 110 may control the screen 1020 to translucently display the remaining area 1023. - When the
partial area 1022 is formed, the controller 110 may control the display of the object. As described above, the controller 110 may control the display of the object in the partial area 1022 that is based on the position at which the touch or the hovering by the input device 1030 is detected. Due to such a control, a gradation effect may be provided to the partial area 1022 or the user may recognize an effect of distinguishing the areas. - In
FIG. 10B, the partial area 1022 is divided into a plurality of areas 1022a to 1022d. For example, a first partial area 1022a has a radius r1, a second partial area 1022b has a radius r2, a third partial area 1022c has a radius r3, and a fourth partial area 1022d has a radius r4. Further, the partial area 1022 may be divided into the plurality of areas 1022a to 1022d based on the position where the touch or the hovering by the input device 1030 is detected, or the partial area 1022 may not be divided into the first to fourth areas 1022a to 1022d, but uniform gradation may be applied. For example, the controller 110 may control the screen by controlling the transparency of the first area 1022a to be higher in order to more clearly output an object in an area having a shorter radius (for example, the first area 1022a among the first to fourth areas 1022a to 1022d) and controlling the transparency to be lower than that of the first area 1022a in an area having a larger radius (for example, the fourth area 1022d among the first to fourth areas 1022a to 1022d). Further, the other area 1023, i.e., outside the partial area 1022, may be controlled to be opaque. -
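The division of the partial area into concentric areas of radii r1 to r4 can be sketched as a lookup from distance to transparency; the radii and alpha values below are illustrative assumptions, not values from the embodiment.

```python
def band_alpha(distance, radii=(10, 20, 30, 40), alphas=(1.0, 0.75, 0.5, 0.25)):
    """Transparency per concentric band around the detected position: the
    innermost area (radius r1) shows the object most clearly; outer bands
    are progressively more masked; beyond r4 the remaining area is fully
    masked (alpha 0)."""
    for r, a in zip(radii, alphas):
        if distance <= r:
            return a
    return 0.0

assert band_alpha(5) == 1.0      # first area: clearest
assert band_alpha(25) == 0.5     # third area
assert band_alpha(60) == 0.0     # remaining area: opaque mask
```

Interpolating between the band values instead of returning them directly would give the undivided, uniform-gradation variant described above.
-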
FIG. 11 is a flowchart illustrating a method of controlling an object output of a screen that is based on a position at which a hovering is detected on a screen displaying an object, according to an embodiment of the present invention. -
FIGs. 12A to 12F illustrate examples of controlling an object output of a screen, based on a position at which an input is detected on the screen displaying an object, according to embodiments of the present invention. Specifically, FIG. 12A illustrates an example of controlling an object output in a partial area that is formed, based on a position at which a hovering is detected on a screen displaying an object, according to an embodiment of the present invention, FIG. 12B illustrates another example of controlling an object output in a partial area that is formed based on a position at which a hovering is detected on a screen displaying an object, according to an embodiment of the present invention, FIG. 12C illustrates an example of controlling a size of a partial area formed based on a position where a hovering is detected on a screen displaying a predetermined object according to an embodiment of the present invention, FIG. 12D illustrates another example of controlling a size of a partial area that is formed based on a position at which a hovering is detected on a screen displaying an object, according to an embodiment of the present invention, FIG. 12E illustrates an example of converting a partial area that is formed based on a position where a hovering is detected on the screen displaying an object by applying a particular image filtering effect, according to an embodiment of the present invention, and FIG. 12F illustrates an example of converting an object displayed in response to a detection of a hovering on a screen displaying an object by applying a particular image filtering effect according to an embodiment of the present invention. - Referring to
FIG. 11, the portable terminal activates a security mode of the screen in step S1110. The operation of step S1110 is the same as described above in steps S610 and S710. Accordingly, a repetitive description of step S1110 will not be provided. - When an input using a touch or a hovering is detected while an object is displayed in step S1112, the controller controls the display of an object output in a partial area that is based on a position where the input is detected in step S1114 and displays a result of the control in step S1116. For example, the object displayed on the screen, which is confidential or which the user wants to keep from being exposed to other people, includes a document, a widget, a picture, a news article, a map, a diary, a video, an email, an SMS message, and an MMS message. When a touch or a hovering using an input device is detected on the screen displaying the object, the
controller 110 may form a partial area based on a position at which the touch or the hovering is detected, and display a part of the object in the formed partial area. Further, the controller 110 may control at least one of the object outputs in the partial area and the remaining area, i.e., the area outside of the partial area. - For example, when a touch or a hovering by an input device is detected while an object is opaquely displayed, the controller may control an object output such that a partial area corresponding to the touch or the hovering is transparently or translucently displayed.
- Alternatively, when a touch or a hovering by an input device is detected while an object is transparently displayed, the controller may control an object output such that the remaining area except for the partial area corresponding to the touch or the hovering is translucently or opaquely displayed.
- Further, a gradation effect may be applied between the partial area and the remaining area, and a size of the partial area may be variably controlled according to a distance between the screen and the input device or preset in the security mode.
- Further, the
controller 110 may apply different colors to the partial area, which is formed in response to the detection of the touch or the hovering by the input device while the object is displayed on the screen, and to the remaining area. For example, the controller 110 may detect a touch or a hovering using an input device on the screen displaying a color picture, convert a color of a partial area of the object to black and white, and display the converted object. - Referring to
FIG. 12A, when a touch or a hovering using an input device 1270 is detected on an object displayed on the screen 1210, the controller 110 may detect a touched position between the input device 1270 and the screen 1210, and form a partial area 1211 based on the touched position. Further, the controller 110 may control an attribute of the formed partial area 1211 to display the object in the formed partial area 1211 differently from the object in the remaining area 1212, i.e., the area outside of the partial area 1211. - In addition, the
controller 110 may display the object output by controlling the screen to translucently or transparently display the object included in the partial area 1211 and opaquely display the object included in the remaining area 1212. The partial area 1211 may be divided into the first to fourth areas 1012a to 1012d, as illustrated in FIG. 10A, or may have the gradation effect applied. - At least one of a size of the
partial area 1211 and object outputs in the partial area 1211 and the remaining area 1212 may be configured in the security mode. The security mode may configure gradation between the partial area 1211 and the remaining area 1212. The controller 110 may configure the object output in the partial area by using at least one of a current position of the portable terminal, a time, ambient brightness, an ambient situation, and ambient noise, or the object output in the partial area may be controlled by the user. - Referring to
FIG. 12B, when a touch or a hovering using the input device 1270 is detected on an object displayed on the screen 1220, the controller 110 may detect a touched position between the input device 1270 and the screen 1220, and form a partial area 1221 that is based on the touched position. The controller 110 may control an object output by controlling the screen or an attribute of the remaining areas to translucently display the remaining area 1222 and display a result of the control on the screen 1220. The partial area 1221 may be divided into the first to fourth areas 1022a to 1022d as illustrated in FIG. 10B or may have the gradation effect applied. - At least one of the object outputs in the
partial area 1221 and the remaining area 1222 may be configured in the security mode. The security mode may configure gradation between the partial area 1221 and the remaining area 1222. The controller 110 may configure at least one of a size of the partial area and the object output by using at least one of a current position of the portable terminal, a time, ambient brightness, an ambient situation, and ambient noise, or at least one of the size of the partial area and the object output may be controlled by the user. - Referring to
FIG. 12C, when a touch or a hovering using the input device 1270 is detected on an object displayed on the screen 1230, the controller 110 may detect a touched position or a hovering distance between the input device 1270 and the screen 1230, and form a partial area 1231 based on the touched position or the hovering position. The controller 110 may control an object output to opaquely display the remaining area 1232, i.e., outside the formed partial area 1231, and display the partial area 1231 having a controlled size on the screen 1230. The size of the partial area 1231 may become smaller as the input unit 1270 is closer to the screen and become larger as the input unit 1270 is further away from the screen. For example, the partial area 1231 illustrated in FIG. 12C is smaller than the partial area illustrated in FIG. 12A because a hovering distance between the screen 1230 and the input device 1270 in FIG. 12C is shorter than a hovering distance between the screen 1210 and the input device 1270 in FIG. 12A. - Further, the
partial area 1231 may be divided into the first to fourth areas 1012a to 1012d or may have the gradation effect applied. At least one of a size of the partial area 1231 and object outputs in the partial area 1231 and the remaining area 1232 may be configured in the security mode. The security mode may configure gradation between the partial area 1231 and the remaining area 1232. The controller 110 may configure at least one of the size of the partial area and the object output by using at least one of a current position of the portable terminal, a time, ambient brightness, an ambient situation, and ambient noise, or at least one of the size of the partial area and the object output may be controlled by the user. - Referring to
FIG. 12D, when a touch or a hovering using the input device 1270 is detected on an object displayed on the screen 1240, the controller 110 may detect or measure a touched position or a hovering distance between the input device 1270 and the screen 1240 and form a partial area 1241 based on the touched position or the hovering position. Further, the controller 110 may control an object output to translucently display the remaining area 1242. - In addition, the
controller 110 controls a size of the partial area 1241 and displays the partial area 1241 having the controlled size on the screen 1240. The size of the partial area 1241 may become smaller as the hovering distance is shorter and become larger as the hovering distance is longer. For example, the partial area 1241 illustrated in FIG. 12D is larger than the partial area illustrated in FIG. 12B because the hovering distance between the screen 1240 and the input device 1270 in FIG. 12D is longer than the hovering distance between the screen 1220 and the input device 1270 in FIG. 12B. - The
partial area 1241 may be divided into the first to fourth areas 1022a to 1022d as illustrated in FIG. 10B or may have the gradation effect applied. - At least one of the size of the
partial area 1241 and object outputs in the partial area 1241 and the remaining area 1242 may be configured in the security mode. The security mode may configure gradation between the partial area 1241 and the remaining area 1242. The controller 110 may configure at least one of the size of the partial area and the object output by using at least one of a current position of the portable terminal, a time, ambient brightness, an ambient situation, and ambient noise, or at least one of the size of the partial area and the object output may be controlled by the user. - Referring to
FIG. 12E, when a touch or a hovering using the input device 1270 is detected on an object (for example, a picture) displayed on the screen 1250, the controller 110 may detect or measure a touched position or a hovering distance between the input device 1270 and the screen 1250. - Further, the
controller 110 may form a partial area 1252 based on the touched position or the hovering position by using a result of the measurement and perform filtering of changing a color of the partial area 1252 to differentiate an image of the formed partial area 1252 from an image of the remaining area 1251, i.e., outside the partial area 1252. -
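The colour filtering of FIG. 12E can be sketched per pixel; the grayscale conversion below uses the standard ITU-R BT.601 luma weights, while the function name and structure are illustrative assumptions rather than the embodiment's implementation.

```python
def filter_pixel(rgb, inside_area, mode="grayscale"):
    """Apply the described image filtering to one pixel: pixels inside the
    partial area are converted (here, to black and white); pixels outside
    keep their original colour."""
    if not inside_area:
        return rgb
    r, g, b = rgb
    if mode == "grayscale":
        # ITU-R BT.601 luma weights for RGB-to-grayscale conversion.
        y = round(0.299 * r + 0.587 * g + 0.114 * b)
        return (y, y, y)
    raise ValueError("unsupported filter: " + mode)

assert filter_pixel((255, 0, 0), inside_area=False) == (255, 0, 0)
grey = filter_pixel((255, 0, 0), inside_area=True)
assert grey[0] == grey[1] == grey[2] == 76    # 0.299 * 255 rounds to 76
```

Swapping the roles (converting the remaining area instead, or colourising a black-and-white picture inside the area) covers the inverse variants discussed below.
-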
partial area 1252 with a contour line, to blur thepartial area 1252, to change a color of thepartial area 1252, or to make thepartial area 1252 clear. For example, when the displayed object corresponds to a color picture, thecontroller 110 may convert a color of thepartial area 1252 to black and white or an image of thepartial area 1252 to a contour line, and display the convertedpartial area 1252 on thescreen 1250. - A size of the
partial area 1252 may be determined according to the hovering distance between the screen 1250 and the input unit 1270 or the finger. For example, the size of the partial area 1252 may become smaller as the hovering distance is shorter and become larger as the hovering distance is longer. Alternatively, the size of the partial area 1252 may become larger as the hovering distance is shorter and become smaller as the hovering distance is longer. In addition, at least one of the size and shape of the partial area 1252 may be configured in the security mode or selectively applied by the user, and is variable. The shape of the partial area 1252 may include various forms of figures, such as a circle, an oval, a square, a rectangle, and the like. - For example, the color of the
partial area 1252 can be changed to black and white as described above, and, when a black and white picture is displayed, an image of the partial area 1252 formed by the input unit 1270 or the finger may be converted to a color image. The controller 110 may configure at least one of the size and the color of the partial area by using at least one of a current position of the portable terminal, a time, ambient brightness, an ambient situation, and ambient noise, or at least one of the size and the color of the partial area may be controlled by the user. - Referring to
FIG. 12F, when a touch or a hovering using the input device 1270 is detected on an object 1261 (for example, a picture) displayed on the screen 1260, the controller 110 may change the color of the displayed object. For example, when the displayed object is a color image, the controller 110 may change the color image to a black and white image and display the black and white image on the screen 1260. - Alternatively, when the displayed object is a black and white image, the
controller 110 may change the black and white image to a color image and display the color image on the screen 1260. - Further, when the color image is changed to the black and white image or the black and white image to the color image, the
controller 110 may apply different color concentrations according to the hovering distance. For example, a deeper color is applied as the hovering distance becomes shorter, and a lighter color is applied as the hovering distance becomes longer. Further, the controller 110 may configure the color change by using at least one of a current position of the portable terminal, a time, ambient brightness, an ambient situation, and ambient noise, or the color change may be controlled by the user. - It may be appreciated that the embodiments of the present disclosure may be implemented in software, hardware, or a combination thereof. Any such software may be stored, for example, in a volatile or non-volatile storage device such as a ROM, a memory such as a RAM, a memory chip, a memory device, or an IC, or an optical or magnetic recordable and machine (e.g., computer) readable medium such as a CD, a DVD, a magnetic disk, or a magnetic tape, regardless of its ability to be erased or re-recorded. It will be appreciated that a memory, which may be incorporated in a portable terminal, may be an example of a machine-readable storage medium suitable for storing a program or programs including commands to implement the exemplary embodiments of the present disclosure. Accordingly, the present invention includes a program that includes code for implementing an apparatus or a method defined in any claim in the present specification, and a machine-readable storage medium that stores such a program. Further, the program may be electronically transferred by a predetermined medium, such as a communication signal transferred through a wired or wireless connection, and the present disclosure appropriately includes equivalents of the program.
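The two hovering-distance mappings described above, a partial-area size that grows or shrinks with distance and a color concentration that deepens as the input unit approaches the screen, can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: the 0-30 mm distance range, the radius bounds, and all function names are assumptions.

```python
# Hypothetical sketch of the distance-dependent behaviours described in the
# text above. All names, units, and ranges are invented for illustration.

def partial_area_radius(distance_mm, min_r=20, max_r=120,
                        max_distance_mm=30.0, grow_with_distance=True):
    """Map a hovering distance to a partial-area radius (in pixels).

    The description allows either direction: the area may become larger
    or smaller as the input unit moves away from the screen, so the
    direction is selectable via grow_with_distance.
    """
    d = max(0.0, min(distance_mm, max_distance_mm))
    t = d / max_distance_mm              # normalised distance in [0, 1]
    if not grow_with_distance:
        t = 1.0 - t
    return min_r + (max_r - min_r) * t

def apply_concentration(rgb, distance_mm, max_distance_mm=30.0):
    """Blend a colour toward white as the hovering distance grows, so the
    deepest colour appears at distance 0 and the lightest at the limit."""
    d = max(0.0, min(distance_mm, max_distance_mm))
    lightness = d / max_distance_mm      # 0 = deepest, 1 = lightest
    return tuple(int(c + (255 - c) * lightness) for c in rgb)
```

A linear mapping is the simplest choice consistent with the text; a real implementation could equally use a stepped or non-linear curve.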
- Moreover, the above-described portable terminal can receive the program from a program provision device connected thereto in a wired or wireless manner, and store the program. The program provision device may include a program including instructions for controlling the screen of the portable terminal, a memory for storing information required for controlling the screen, a communication unit for performing wired or wireless communication with the portable terminal, and a controller for transmitting the corresponding program to a host device automatically or at the request of the portable terminal.
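A minimal sketch of the program provision flow described above: the provider stores a program and returns it when the terminal requests it. The class name, request string, and program bytes are all invented for illustration.

```python
# Toy sketch (invented names) of the program provision apparatus: a stored
# program (memory) is handed out via a request handler (communication unit).

class ProgramProvider:
    def __init__(self, program_bytes):
        # The stored program, standing in for the provider's memory.
        self.program = program_bytes

    def handle_request(self, request):
        """Stand-in for the communication unit: return the screen-control
        program for a matching request, None otherwise."""
        if request == "GET_SCREEN_CONTROL_PROGRAM":
            return self.program
        return None

provider = ProgramProvider(b"screen-control-program")
payload = provider.handle_request("GET_SCREEN_CONTROL_PROGRAM")
```

In the patent's terms, the same transfer could also happen automatically (a push) rather than on request; only the request path is sketched here.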
- According to the above-described embodiments of the present invention, when a user creates an object or views an already created object by using an input device, an area at which a touch or a hovering is detected is formed, and the formed area is displayed differently from the other areas. As a result, it is possible to protect privacy and also to prevent light pollution affecting nearby people in a dark place, thereby increasing user convenience.
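The core mechanism summarised above, forming an area at the detected position and rendering it differently from the rest, can be sketched as a per-pixel filter. This is a hedged sketch with an invented image layout (a 2D list of RGB tuples); grayscale conversion stands in for any of the described filters.

```python
# Illustrative sketch: pixels inside the formed (circular) partial area are
# shown in black and white, while the area outside keeps its colour.
# Names, the image representation, and the filter choice are assumptions.

def to_grayscale(rgb):
    """Convert one (r, g, b) pixel to grey via ITU-R BT.601 luminance."""
    y = int(0.299 * rgb[0] + 0.587 * rgb[1] + 0.114 * rgb[2])
    return (y, y, y)

def display_with_partial_area(image, center, radius):
    """Return a copy of `image` in which pixels within `radius` of
    `center` are converted to black and white; the rest is untouched."""
    cx, cy = center
    out = []
    for y, row in enumerate(image):
        out_row = []
        for x, pixel in enumerate(row):
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
                out_row.append(to_grayscale(pixel))   # inside formed area
            else:
                out_row.append(pixel)                 # remaining area
            continue
        out.append(out_row)
    return out

picture = [[(255, 0, 0)] * 5 for _ in range(5)]       # all-red picture
shown = display_with_partial_area(picture, center=(2, 2), radius=1)
```

The same loop structure would carry a blur, contour, or transparency filter in place of `to_grayscale`, matching the alternatives the description lists.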
- While the present invention has been particularly shown and described with reference to certain embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the scope of the present invention as defined by the following claims.
Claims (10)
- A method for protecting an object displayed on a screen (190) of a portable terminal (100) from unwanted viewing by others, the method comprising: activating (S610, S1110) a security mode of the screen (190); detecting (S612, S1112) a position of a hovering input on the screen (190); forming an area (841, 1012, 1022, 1211, 1221, 1231, 1241, 1252) corresponding to the detected position; and displaying (S616, S1116) a part of an object corresponding to the formed area differently from another part of the object in an area outside of the formed area, wherein the security mode is automatically activated based on an ambient situation of the portable terminal (100).
- The method of claim 1, wherein displaying the part of the object corresponding to the formed area (1252) differently from another part of the object outside of the formed area comprises controlling an attribute of the formed area.
- The method of claim 1 or 2, further comprising, if the position is detected while the object is displayed, displaying the formed area (1012, 1211, 1231) more transparently than the area (1013, 1212, 1232) outside of the formed area.
- The method of claim 1 or 2, further comprising, if the position is detected while a picture is displayed, controlling a color of a part of the displayed picture within the formed area (1252) differently from a color of another part of the picture.
- The method of one of claims 1 to 4, further comprising controlling the screen (190) to display the formed area more brightly than the other area.
- The method of one of claims 1 to 5, wherein the step of displaying comprises displaying (S716, S916) a trace (811, 821, 831, 842, 1011, 1021) formed by movement of the hovering input on the screen (190), and controlling the screen (190) such that the displayed trace (811, 821, 831, 842, 1011, 1021) gradually disappears according to an input time order of the trace.
- A portable terminal (100) for protecting an object displayed on a screen (190) from unwanted viewing by others, the portable terminal (100) comprising: a screen (190) configured to display an object; a sensor module (170) configured to detect an ambient situation of the portable terminal (100); and a controller (110) configured to: activate a security mode of the screen (190), detect (S612, S1112) a position of a hovering input on the screen (190), form an area (841, 1012, 1022, 1211, 1221, 1231, 1241, 1252) corresponding to the detected position, and display a part of an object corresponding to the formed area differently from another part of the object in an area outside of the formed area, wherein the controller (110) is further configured to activate the security mode automatically based on the ambient situation of the portable terminal (100).
- The portable terminal (100) of claim 7, wherein the controller (110) is further configured to control an attribute of the formed area.
- The portable terminal (100) of claim 7 or 8, wherein the controller (110) is further configured to control the display such that a trace (811, 821, 831, 842, 1011, 1021) gradually disappears according to an input time order of the trace.
- A computer-readable medium having instructions for protecting an object displayed on a screen (190) of a portable terminal (100) from unwanted viewing by others, wherein the instructions, when executed by a processor, cause the processor to perform the steps of: activating (S610, S1110) a security mode of the screen (190); detecting (S612, S1112) a position of a hovering input on the screen (190) displaying at least one object; forming an area (841, 1012, 1022, 1211, 1221, 1231, 1241, 1252) corresponding to the detected position; and displaying (S616, S720, S1116) a part of an object corresponding to the formed area differently from another part of the object in an area outside of the formed area, wherein the security mode is automatically activated based on an ambient situation of the portable terminal (100).
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR20130049291 | 2013-05-02 | ||
KR1020140044006A KR102266191B1 (en) | 2013-05-02 | 2014-04-14 | Mobile terminal and method for controlling screen |
Publications (2)
Publication Number | Publication Date |
---|---|
EP2800025A1 EP2800025A1 (en) | 2014-11-05 |
EP2800025B1 true EP2800025B1 (en) | 2018-07-04 |
Family
ID=50846763
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP14166857.4A Not-in-force EP2800025B1 (en) | 2013-05-02 | 2014-05-02 | Portable terminal and method for protecting a displayed object |
Country Status (3)
Country | Link |
---|---|
US (1) | US10319345B2 (en) |
EP (1) | EP2800025B1 (en) |
CN (1) | CN104133632B (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105138931A (en) * | 2015-07-08 | 2015-12-09 | 北京京东尚科信息技术有限公司 | Display method and apparatus for sensitive data in interface |
US10216405B2 (en) * | 2015-10-24 | 2019-02-26 | Microsoft Technology Licensing, Llc | Presenting control interface based on multi-input command |
FR3046320A1 (en) * | 2015-12-23 | 2017-06-30 | Orange | METHOD OF SHARING A DIGITAL IMAGE BETWEEN A FIRST USER TERMINAL AND AT LEAST ONE SECOND USER TERMINAL OVER A COMMUNICATION NETWORK. |
KR102544716B1 (en) * | 2016-03-25 | 2023-06-16 | 삼성전자주식회사 | Method for Outputting Screen and the Electronic Device supporting the same |
KR102356345B1 (en) * | 2016-04-20 | 2022-01-28 | 삼성전자주식회사 | Electronic device and controlling method thereof |
WO2018029855A1 (en) * | 2016-08-12 | 2018-02-15 | 株式会社ワコム | Stylus and sensor controller |
CN106650515B (en) * | 2016-10-08 | 2020-09-04 | 广东小天才科技有限公司 | Screen page protection method and device and mobile device |
JP6375070B1 (en) * | 2017-03-30 | 2018-08-15 | 株式会社オプティム | Computer system, screen sharing method and program |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090225100A1 (en) * | 2008-03-10 | 2009-09-10 | Yu-Chieh Lee | Method and system for magnifying and displaying local image of touch display device by detecting approaching object |
Family Cites Families (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4868765A (en) * | 1986-01-02 | 1989-09-19 | Texas Instruments Incorporated | Porthole window system for computer displays |
JPH07175458A (en) | 1993-10-12 | 1995-07-14 | Internatl Business Mach Corp <Ibm> | Method and system for reduction of looking-on of data on screen |
JP3486459B2 (en) * | 1994-06-21 | 2004-01-13 | キヤノン株式会社 | Electronic information equipment and control method thereof |
AU3006399A (en) | 1998-03-16 | 1999-10-11 | Gateway 2000, Inc. | Electronic privacy screen and viewer |
US7391929B2 (en) * | 2000-02-11 | 2008-06-24 | Sony Corporation | Masking tool |
US7286141B2 (en) * | 2001-08-31 | 2007-10-23 | Fuji Xerox Co., Ltd. | Systems and methods for generating and controlling temporary digital ink |
US7120872B2 (en) * | 2002-03-25 | 2006-10-10 | Microsoft Corporation | Organizing, editing, and rendering digital ink |
US7100119B2 (en) * | 2002-11-01 | 2006-08-29 | Microsoft Corporation | Page bar control |
US7515135B2 (en) * | 2004-06-15 | 2009-04-07 | Research In Motion Limited | Virtual keypad for touchscreen display |
US7489306B2 (en) * | 2004-12-22 | 2009-02-10 | Microsoft Corporation | Touch screen accuracy |
KR100677426B1 (en) | 2005-01-14 | 2007-02-02 | 엘지전자 주식회사 | Short message display method for mobile communication device |
US7676767B2 (en) * | 2005-06-15 | 2010-03-09 | Microsoft Corporation | Peel back user interface to show hidden functions |
KR100765953B1 (en) | 2006-10-23 | 2007-10-12 | 에스케이 텔레콤주식회사 | A mobile communication terminal providing a privacy protection function and character message displaying method using there of |
KR101495164B1 (en) * | 2008-04-10 | 2015-02-24 | 엘지전자 주식회사 | Mobile terminal and method for processing screen thereof |
US8576181B2 (en) * | 2008-05-20 | 2013-11-05 | Lg Electronics Inc. | Mobile terminal using proximity touch and wallpaper controlling method thereof |
US8363019B2 (en) * | 2008-05-26 | 2013-01-29 | Lg Electronics Inc. | Mobile terminal using proximity sensor and method of controlling the mobile terminal |
JP5569271B2 (en) * | 2010-09-07 | 2014-08-13 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
KR20120054750A (en) | 2010-11-22 | 2012-05-31 | 삼성전자주식회사 | Method and apparatus for selective display |
US9257098B2 (en) * | 2011-12-23 | 2016-02-09 | Nokia Technologies Oy | Apparatus and methods for displaying second content in response to user inputs |
US20130194301A1 (en) * | 2012-01-30 | 2013-08-01 | Burn Note, Inc. | System and method for securely transmiting sensitive information |
EP2624116B1 (en) * | 2012-02-03 | 2017-09-06 | EchoStar Technologies L.L.C. | Display zoom controlled by proximity detection |
KR20130115742A (en) | 2012-04-13 | 2013-10-22 | 삼성전자주식회사 | Digital signature certificate method and apparatus therefor |
KR101921161B1 (en) * | 2012-05-15 | 2018-11-22 | 삼성전자 주식회사 | Control method for performing memo function and terminal thereof |
US9007402B2 (en) * | 2012-09-18 | 2015-04-14 | Facebook, Inc. | Image processing for introducing blurring effects to an image |
US9152211B2 (en) * | 2012-10-30 | 2015-10-06 | Google Technology Holdings LLC | Electronic device with enhanced notifications |
CN103092322A (en) | 2012-11-16 | 2013-05-08 | 阎跃鹏 | Area-controllable dual-display screen window, energy-saving display method and electronic equipment |
WO2014142951A1 (en) * | 2013-03-15 | 2014-09-18 | Intel Corporation | Display privacy with dynamic configuration |
US9959431B2 (en) * | 2013-09-16 | 2018-05-01 | Google Technology Holdings LLC | Method and apparatus for displaying potentially private information |
US20150082255A1 (en) * | 2013-09-16 | 2015-03-19 | Motorola Mobility Llc | Methods and apparatus for displaying notification information |
US9671946B2 (en) * | 2014-02-06 | 2017-06-06 | Rakuten Kobo, Inc. | Changing settings for multiple display attributes using the same gesture |
US9423901B2 (en) * | 2014-03-26 | 2016-08-23 | Intel Corporation | System and method to control screen capture |
US20150317482A1 (en) * | 2014-04-30 | 2015-11-05 | Mocana Corporation | Preventing visual observation of content on a mobile device by hiding content |
US9235711B1 (en) * | 2014-06-24 | 2016-01-12 | Voxience S.A.R.L. | Systems, methods and devices for providing visual privacy to messages |
US9230355B1 (en) * | 2014-08-21 | 2016-01-05 | Glu Mobile Inc. | Methods and systems for images with interactive filters |
US20160154769A1 (en) * | 2014-11-28 | 2016-06-02 | Kabushiki Kaisha Toshiba | Electronic device and method for handwriting |
KR20160122517A (en) * | 2015-04-14 | 2016-10-24 | 엘지전자 주식회사 | Mobile terminal |
US10216945B2 (en) * | 2015-09-15 | 2019-02-26 | Clipo, Inc. | Digital touch screen device and method of using the same |
2014
- 2014-04-23 US US14/259,740 patent/US10319345B2/en active Active
- 2014-05-02 EP EP14166857.4A patent/EP2800025B1/en not_active Not-in-force
- 2014-05-04 CN CN201410185091.0A patent/CN104133632B/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
CN104133632B (en) | 2019-12-13 |
EP2800025A1 (en) | 2014-11-05 |
US20140327634A1 (en) | 2014-11-06 |
CN104133632A (en) | 2014-11-05 |
US10319345B2 (en) | 2019-06-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2800025B1 (en) | Portable terminal and method for protecting a displayed object | |
EP2720126B1 (en) | Method and apparatus for generating task recommendation icon in a mobile device | |
US10387014B2 (en) | Mobile terminal for controlling icons displayed on touch screen and method therefor | |
ES2748044T3 (en) | Display apparatus and control procedure thereof | |
EP2808781B1 (en) | Method, storage medium, and electronic device for mirroring screen data | |
KR102063952B1 (en) | Multi display apparatus and multi display method | |
US10254915B2 (en) | Apparatus, method, and computer-readable recording medium for displaying shortcut icon window | |
RU2672134C2 (en) | Portable terminal and method for providing haptic effect for input unit | |
EP2781992A2 (en) | Portable terminal with pen for providing a haptic effect | |
US9262867B2 (en) | Mobile terminal and method of operation | |
EP2703979A2 (en) | Method of controlling a list scroll bar and an electronic device using the same | |
US20140176600A1 (en) | Text-enlargement display method | |
CN105393202B (en) | Method for providing the portable equipment of combined user interface component and controlling it | |
KR20140108010A (en) | Portable apparatus for providing haptic feedback with an input unit and method therefor | |
CN109948581B (en) | Image-text rendering method, device, equipment and readable storage medium | |
CA2835526A1 (en) | Mobile apparatus displaying end effect and control method thereof | |
CN110928464B (en) | User interface display method, device, equipment and medium | |
KR20140089976A (en) | Method for managing live box and apparatus for the same | |
KR20140137616A (en) | Mobile terminal and method for controlling multilateral conversation | |
CN112825040B (en) | User interface display method, device, equipment and storage medium | |
KR102266191B1 (en) | Mobile terminal and method for controlling screen | |
CN114546545B (en) | Image-text display method, device, terminal and storage medium | |
CN112732133B (en) | Message processing method and device, electronic equipment and storage medium | |
US10185457B2 (en) | Information processing apparatus and a method for controlling the information processing apparatus | |
KR102218507B1 (en) | Method for managing live box and apparatus for the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20140502 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
R17P | Request for examination filed (corrected) |
Effective date: 20150505 |
|
RBV | Designated contracting states (corrected) |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
17Q | First examination report despatched |
Effective date: 20160412 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
INTG | Intention to grant announced |
Effective date: 20171208 |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: KWON, DONG-WOOK Inventor name: KIM, SANG-HO Inventor name: AHN, HEE-BUM Inventor name: KIM, JI-HOON Inventor name: KIM, DO-HYEON Inventor name: HWANG, SEONG-TAEK Inventor name: CHANG, WON-SUK |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 1015282 Country of ref document: AT Kind code of ref document: T Effective date: 20180715 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602014027763 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20180704 |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG4D |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1015282 Country of ref document: AT Kind code of ref document: T Effective date: 20180704 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181005 Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181104 Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181004 Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181004 Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602014027763 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 |
|
26N | No opposition filed |
Effective date: 20190405 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20190424 Year of fee payment: 6 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R119 Ref document number: 602014027763 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190531 Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190531 |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20190531 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190502 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190502 Ref country code: DE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20191203 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190531 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181105 Ref country code: FR Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190531 |
|
GBPC | Gb: european patent ceased through non-payment of renewal fee |
Effective date: 20200502 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: GB Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20200502 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20140502 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180704 |