EP3012730B1 - Display securing method and apparatus - Google Patents
Display securing method and apparatus
- Publication number
- EP3012730B1 (application EP15172008.3A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- target object
- gesture
- area
- encrypted
- decryption
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L9/00—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
- H04L9/32—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials
- H04L9/3271—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials using challenge-response
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/36—User authentication by graphic or iconic representation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09C—CIPHERING OR DECIPHERING APPARATUS FOR CRYPTOGRAPHIC OR OTHER PURPOSES INVOLVING THE NEED FOR SECRECY
- G09C5/00—Ciphering apparatus or methods not provided for in the preceding groups, e.g. involving the concealment or deformation of graphic data such as designs, written or printed messages
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L2209/00—Additional information or applications relating to cryptographic mechanisms or cryptographic arrangements for secret or secure communication H04L9/00
- H04L2209/24—Key scheduling, i.e. generating round keys or sub-keys for block encryption
Description
- The following description relates to a display securing method and apparatus.
- The development of information technology (IT) has brought about an increase in the usage of terminal devices such as smartphones, tablet personal computers (PCs), and the like. As a recent trend, terminal devices have adopted increasingly large displays.
- In a case in which a terminal device is used for business or directly used to process business transactions, critical information may be exposed. For example, as display sizes increase, information displayed on the terminal device may be easily exposed to others. The exposed information may cause an invasion of the personal privacy of the user of the terminal device. Accordingly, research is continuously being conducted to enhance the security of terminal devices.
EP2927903A1, US2013/321452A1 and WO2014/119862A1 each disclose visually encrypting target objects on a display based on user gestures.
WO2014/119862A1 discloses displaying a target object on a remote display device.
- This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
- In one general aspect, there is provided a security apparatus including an encryptor configured to visually encrypt a target object, and a decryptor configured to decrypt an area corresponding to a decryption gesture in the encrypted target object, during a predetermined period of time.
- The encryptor may be configured to visually encrypt the target object based on an encryption gesture to the target object.
- The encryptor may be configured to visually encrypt an area corresponding to the encryption gesture in the target object.
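As a rough illustration of deriving the area to be encrypted from a gesture, the following sketch computes a bounding box around a touch location. The function name, the box representation, and the fixed radius (standing in for the "predetermined range" around the point of contact described in the detailed description) are all assumptions for illustration, not taken from the patent:

```python
# Hypothetical sketch: derive the area to encrypt from a touch gesture.
# The fixed radius plays the role of a "predetermined range" around the
# point of contact; the name and default value are illustrative only.

def area_from_touch(x, y, radius=40):
    """Return a (left, top, right, bottom) box centered on the touch point."""
    return (x - radius, y - radius, x + radius, y + radius)

# A touch at (100, 100) selects an 80x80 region around the finger.
box = area_from_touch(100, 100)
```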
- The encryptor may be configured to recognize, as the encryption gesture, any one or any combination of a touch gesture to an area in the target object, a sliding gesture to an area in the target object, and a drag gesture for setting a range of an area in the target object based on two touch gestures.
- The encryptor may be configured to recognize, as the encryption gesture, the drag gesture, the sliding gesture, or the touch gesture simultaneously input with a touch gesture to a predetermined area on a display.
- The encryptor may be configured to mix a noise object to the target object to visually encrypt the target object.
- The encryptor may be configured to adjust a mixing ratio of the noise object to the target object based on a speed of an encryption gesture to the target object.
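One way to realize a speed-dependent mixing ratio is a clamped linear mapping from gesture speed to noise weight (faster sliding yields stronger encryption). The speed bounds, the output range, and the linear form are assumptions; the patent specifies only the direction of the relationship:

```python
# Hedged sketch: map a sliding-gesture speed to a noise mixing ratio.
# The thresholds (in, e.g., pixels per second) and the linear mapping
# are illustrative assumptions, not values from the source.

def mixing_ratio_for_speed(speed, min_speed=100.0, max_speed=1000.0):
    """Map a gesture speed to a noise mixing ratio in [0.1, 0.9]."""
    clamped = max(min_speed, min(speed, max_speed))
    fraction = (clamped - min_speed) / (max_speed - min_speed)
    return 0.1 + 0.8 * fraction

# A slow slide mixes in little noise; a fast slide mixes in a lot.
weak = mixing_ratio_for_speed(150.0)
strong = mixing_ratio_for_speed(950.0)
```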
- The encryptor may be configured to mix the noise object to the target object such that a presence of the target object is recognizable in the encrypted target object.
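"Mixing a noise object to the target object" at a given ratio can be pictured as a per-pixel weighted blend, where a partial ratio leaves the target's presence recognizable. This is a minimal sketch under that interpretation; the function name and pixel representation are hypothetical:

```python
# Illustrative sketch: blend target and noise pixel values at a mixing
# ratio. A ratio below 1.0 leaves the target partially recognizable,
# as the description requires. Names are hypothetical.

def mix_pixels(target, noise, ratio):
    """Blend two equal-length pixel sequences.

    ratio is the weight of the noise object: 0.0 leaves the target
    untouched, 1.0 replaces it entirely with noise.
    """
    if len(target) != len(noise):
        raise ValueError("target and noise must be the same size")
    return [round((1 - ratio) * t + ratio * n)
            for t, n in zip(target, noise)]

# A low ratio keeps the target faintly visible; a high ratio hides it.
faint = mix_pixels([200, 100, 50], [0, 0, 0], 0.2)
strong = mix_pixels([200, 100, 50], [0, 0, 0], 0.8)
```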
- The encryptor may be configured to receive any one or any combination of an area to be visually encrypted in the target object, a noise object to be mixed to the target object, and a mixing ratio of the noise object to the target object.
- The encryptor may be configured to overlay the target object with a noise object to visually encrypt the target object.
- The encryptor may be configured to divide the target object into groups, and alternately display the groups to visually encrypt the target object.
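The division scheme above (and the example in FIGS. 3B and 3C, where odd and even character positions form two alternately displayed groups) can be sketched as follows. The function name and the choice of spaces as placeholders for hidden positions are illustrative assumptions:

```python
# Sketch of the division scheme: characters in odd positions form one
# group (frame), characters in even positions form the other, and the
# two frames are displayed alternately. Names are illustrative.

def divide_into_groups(text):
    """Split text into two display frames; hidden positions become spaces."""
    odd = "".join(c if i % 2 == 0 else " " for i, c in enumerate(text))
    even = "".join(c if i % 2 == 1 else " " for i, c in enumerate(text))
    return odd, even

# "Dog" splits into the frames "D g" and " o ".
frame_a, frame_b = divide_into_groups("Dog")
```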
- The encryptor may be further configured to visually encrypt the area corresponding to the decryption gesture in response to the predetermined period of time elapsing.
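The timed re-encryption behavior (an area revealed by a decryption gesture becomes encrypted again once the predetermined period elapses) can be sketched with explicit timestamps. The class, its method names, and the area-identifier scheme are hypothetical:

```python
# Illustrative sketch of the timed decryption window: an area revealed
# by a decryption gesture is re-encrypted once the configured period
# elapses. The class and method names are assumptions for illustration.

class TimedDecryptor:
    def __init__(self, period_s=30.0):
        self.period_s = period_s
        self._revealed = {}  # area id -> timestamp of the decryption gesture

    def decrypt(self, area_id, now):
        """Record that a decryption gesture revealed this area at time now."""
        self._revealed[area_id] = now

    def is_visible(self, area_id, now):
        """An area stays decrypted only while the period has not elapsed."""
        start = self._revealed.get(area_id)
        return start is not None and (now - start) < self.period_s

d = TimedDecryptor(period_s=30.0)
d.decrypt("line-3", now=0.0)
d.is_visible("line-3", now=10.0)   # still decrypted
d.is_visible("line-3", now=31.0)   # encrypted again
```

In a real apparatus the timestamps would come from a monotonic clock and the re-encryption from a display refresh, but passing `now` explicitly keeps the sketch deterministic.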
- The decryptor may be configured to receive either one or both of the predetermined period of time and a range of the area corresponding to the decryption gesture.
- The decryptor may be configured to set the area corresponding to the decryption gesture as an entire area of the encrypted target object.
- The decryptor may be configured to recognize, as the decryption gesture, any one or any combination of a gaze gesture into the area in the encrypted target object, a drag gesture for setting a range of the area in the encrypted target object based on two touch gestures, a sliding gesture to the area in the encrypted target object, and a touch gesture to the area in the encrypted target object.
- The decryptor may be configured to recognize, as the decryption gesture, the gaze gesture, the drag gesture, the sliding gesture, or the touch gesture simultaneously input with a touch gesture to a predetermined area on a display.
- The target object may include any one or any combination of an image, a text, and a video that are displayed on a display.
- The encryptor may be configured to transmit the encrypted target object to another device, and the decryptor may be configured to receive the encrypted target object from the other device.
- In another general aspect, there is provided a terminal device including an encryptor configured to visually encrypt a target object, and a transmitter configured to transmit the encrypted target object to another device. The encrypted target object is decrypted during a predetermined period of time, based on a decryption gesture.
- In still another general aspect, there is provided a terminal device including a receiver configured to receive an encrypted target object from another device, and a decryptor configured to decrypt an area corresponding to a decryption gesture in the encrypted target object, during a predetermined period of time.
- In yet another general aspect, there is provided a security method including visually encrypting an object displayed on a display based on an encryption gesture to the object, and decrypting the encrypted object based on a decryption gesture to the encrypted object.
- The decrypting may include decrypting an area corresponding to the decryption gesture in the encrypted object, during a predetermined period of time.
- The security method may further include visually encrypting the area corresponding to the decryption gesture in response to the predetermined period of time elapsing.
- Other features and aspects will be apparent from the following detailed description, the drawings, and the claims.
FIG. 1 is a block diagram illustrating an example of a security apparatus.
FIGS. 2A through 4C are diagrams illustrating examples of an encryption.
FIGS. 5A and 5B are diagrams illustrating examples of setting an encryption.
FIGS. 6A and 6B are diagrams illustrating examples of setting an encryption area.
FIGS. 7 through 8C are diagrams illustrating examples of setting an encryption level.
FIGS. 9A through 11 are diagrams illustrating examples of a decryption.
FIGS. 12A through 12D are diagrams illustrating examples of setting a decryption area.
FIGS. 13 and 14 are diagrams illustrating examples of transmitting and receiving an encrypted target object.
FIG. 15 is a block diagram illustrating an example of a terminal apparatus.
FIG. 16 is a block diagram illustrating another example of a terminal apparatus.
FIG. 17 is a flowchart illustrating an example of a security method.
FIG. 18 is a flowchart illustrating an example of a terminal apparatus control method.
FIG. 19 is a flowchart illustrating another example of a terminal apparatus control method.
- Throughout the drawings and the detailed description, unless otherwise described or provided, the same drawing reference numerals will be understood to refer to the same elements, features, and structures. The drawings may not be to scale, and the relative size, proportions, and depiction of elements in the drawings may be exaggerated for clarity, illustration, and convenience.
- The following detailed description is provided to assist the reader in gaining a comprehensive understanding of the methods, apparatuses, and/or systems described herein. However, various changes, modifications, and equivalents of the systems, apparatuses, and/or methods described herein will be apparent to one of ordinary skill in the art. The progression of processing steps and/or operations described is an example; however, the sequence of steps and/or operations is not limited to that set forth herein and may be changed as is known in the art, with the exception of steps and/or operations necessarily occurring in a certain order. Also, descriptions of functions and constructions that are well known to one of ordinary skill in the art may be omitted for increased clarity and conciseness.
- The features described herein may be embodied in different forms, and are not to be construed as being limited to the examples described herein. Rather, the examples described herein have been provided so that this disclosure will be thorough and complete, and will convey the full scope of the disclosure to one of ordinary skill in the art.
-
FIG. 1 is a block diagram illustrating an example of a security apparatus 100. Referring to FIG. 1, the security apparatus 100 includes an encryptor 110 and a decryptor 120.
- The encryptor 110 partially or fully visually encrypts a target object. In an example, the encryptor 110 may visually encrypt an entirety or a portion of the target object by mixing a noise object to the target object. In this example, an object may be any type of data displayed on a display. For example, the object may include at least one of an image, a text, and a video. The target object indicates a target to be encrypted by the encryptor 110. The noise object indicates an object mixed to the target object to encrypt the target object. The noise object may be an object identical to the target object, or an object different from the target object. The noise object may be selected by a user or determined in advance.
- The encryptor 110 partially or fully visually encrypts the target object based on an encryption gesture input by the user to at least a portion of the target object. In this example, the encryption gesture may include a touch gesture corresponding to at least one area of the target object, a sliding gesture performed on at least one area of the target object, and a drag gesture for setting a range of at least one area in the target object based on two touch gestures. For example, the encryptor 110 may recognize a gesture of the user touching a predetermined area of the target object, using a finger, as the touch gesture. Also, the encryptor 110 may recognize a gesture of the user moving two fingers while the two fingers are in contact with a predetermined area of the target object, as the drag gesture.
- Additionally, the encryption gesture may include the drag gesture, the sliding gesture, or the touch gesture simultaneously input with a touch gesture corresponding to a predetermined area on the display. For example, the encryptor 110 may recognize the drag gesture, the sliding gesture, or the touch gesture performed by the user on at least one area of the target object, using a finger while another finger is in contact with a predetermined area on the display, as the encryption gesture.
- In an example, the encryptor 110 determines an area to be encrypted in the target object based on an area corresponding to the encryption gesture. For example, the encryptor 110 may determine a portion of the target object in contact with the finger of the user on the display as the area to be encrypted. Additionally, the encryptor 110 may determine a predetermined range from the portion in contact with the finger of the user as the area to be encrypted. Also, the encryptor 110 may encrypt an entire area of the target object to which the encryption gesture is input.
- In another example, the encryptor 110 adjusts a mixing ratio of a noise object to the target object based on a speed of the encryption gesture. For example, when the encryptor 110 recognizes a sliding gesture performed at a relatively high speed, a high mixing ratio of the noise object may be set for an area corresponding to the sliding gesture in the target object. Conversely, when the encryptor 110 recognizes a sliding gesture performed at a relatively low speed, the encryptor 110 may set a low mixing ratio of the noise object for an area corresponding to the sliding gesture in the target object.
- Additionally, in an example, the encryptor 110 receives the mixing ratio of the noise object to the target object, the noise object, and/or the area to be encrypted in the target object, from the user through a predetermined interface. For example, the encryptor 110 may set an operation mode of the security apparatus 100 to an encryption mode by receiving a command from the user through an interface. As an example, in the encryption mode, the encryptor 110 may receive an input indicating whether the entire area of the target object or the area corresponding to the encryption gesture is to be set as the area to be encrypted, from the user through the interface. Also, the encryptor 110 may receive an input indicating whether the portion in contact with the finger of the user, or the range from the portion in contact with the finger of the user, is to be set as the area corresponding to the encryption gesture, from the user through the interface.
- Also, in an example, the encryptor 110 may receive a command from the user through the interface, and set a mode of the security apparatus 100 to a normal mode. Despite an input of, for example, the touch gesture, the sliding gesture, or the drag gesture, the encryptor 110 may not recognize the input as the encryption gesture in the normal mode.
- In another example, the encryptor 110 may overlay the entirety or the portion of the target object with the noise object to partially or fully visually encrypt the target object. Also, the encryptor 110 may divide the entirety or the portion of the target object into a plurality of groups, and encrypt the target object by alternately displaying objects included in the plurality of groups.
- Also, the encryptor 110 may mix the noise object to the target object such that a presence of the target object is recognizable in the encrypted target object.
- During a predetermined period of time, the decryptor 120 decrypts an area corresponding to a decryption gesture input by the user in the encrypted target object. The encryptor 110 visually encrypts the area corresponding to the decryption gesture when the predetermined period of time elapses. In this example, the predetermined period of time may be input from the user or set in advance. For example, when the predetermined period of time is set as 30 seconds, the decryptor 120 may decrypt the area corresponding to the decryption gesture in the encrypted target object for 30 seconds from a point in time at which the decryption gesture is input by the user. When a 30-second period elapses from the point in time, the encryptor 110 may visually encrypt the area corresponding to the decryption gesture. Through this, the decryptor 120 may allow a portion recognized by the user through a decryption to be encrypted again, thereby enhancing security.
- The decryption gesture may include at least one of the touch gesture corresponding to at least one area of the encrypted target object, the sliding gesture performed on at least one area of the encrypted target object, the drag gesture for setting a range of at least one area of the encrypted target object based on two touch gestures, and a gaze gesture performed by gazing into at least one area of the encrypted target object. For example, when the user gazes at a predetermined area of the encrypted target object, the decryptor 120 may recognize the area at which the user gazes in the target object, by using a camera. The decryptor 120 may recognize the gaze gesture of the user as the decryption gesture, and recognize the area at which the user gazes as the area corresponding to the decryption gesture. Through this, the decryptor 120 may decrypt only a portion of the target object that is desired by the user, thereby enhancing security.
- Additionally, the decryptor 120 may recognize the gaze gesture, the drag gesture, the sliding gesture, or the touch gesture simultaneously input with the touch gesture corresponding to a predetermined area on the display, as the decryption gesture. For example, the decryptor 120 may recognize the gaze gesture, the drag gesture, the sliding gesture, or the touch gesture performed while a finger of the user is in contact with a predetermined area on the display, as the decryption gesture.
- Also, the decryptor 120 may receive an input including a time for performing a decryption or a range of the area corresponding to the decryption gesture, from the user through a predetermined interface. For example, the decryptor 120 may provide an interface for setting the range of the area corresponding to the decryption gesture, and set the range of the area corresponding to the decryption gesture such that the range corresponds to the input received through the interface.
- In an example, the decryptor 120 may set the time for performing the decryption or the range of the area corresponding to the decryption gesture, based on the decryption gesture. For example, in a case in which the decryptor 120 recognizes a sliding gesture performed at a relatively high speed, the decryptor 120 may double the range of the area corresponding to the decryption gesture, and double the time for performing the decryption, when compared to a case in which the decryptor 120 recognizes a sliding gesture performed at a relatively low speed.
- In an example, based on the area corresponding to the decryption gesture, the decryptor 120 may determine an area to be decrypted in the target object. For example, the decryptor 120 may determine a portion of the target object in contact with the finger of the user on the display as the area to be decrypted. Also, the decryptor 120 may determine a predetermined range from the portion in contact with the finger of the user as the area to be decrypted. For example, in a case in which an object represented by a plurality of lines on a display is displayed through an encryption, the decryptor 120 may decrypt all lines corresponding to a portion in contact with the finger of the user when the finger comes into contact with a left edge portion or a right edge portion of the display. As an example, when the user touches the left edge portion or the right edge portion of the display using a finger and moves the finger down, the decryptor 120 may decrypt all lines corresponding to a portion at which the user moves the finger down, and then the encryptor 110 may encrypt all lines corresponding to the portion in response to the finger being lifted off the touched portion. In this example, the decryptor 120 may move the object up on the display based on the lines corresponding to the portion. Also, when the user touches the left edge portion or the right edge portion of the display using the finger and moves the finger up, the decryptor 120 may decrypt all lines corresponding to a portion at which the user moves the finger up, and then the encryptor 110 may encrypt all lines corresponding to the portion in response to the finger being lifted off the touched portion. In this example, the decryptor 120 may move the object down on the display based on the lines corresponding to the portion.
- Also, to decrypt an entire area of the encrypted target object, the decryptor 120 sets the area corresponding to the decryption gesture as the entire area. For example, when a gesture of touching the target object three times is set in advance as a gesture for decrypting the entire area of the target object, the decryptor 120 may decrypt the entire area of the target object in response to the gesture of touching the target object three times.
- The encryptor 110 transmits the encrypted target object to an external source through a communication interface. The communication interface may include a wireless internet interface, for example, a wireless local area network (WLAN), wireless fidelity (WiFi) Direct, Digital Living Network Alliance (DLNA), wireless broadband (WiBro), worldwide interoperability for microwave access (WiMAX), and high speed downlink packet access (HSDPA), and a local area communication interface, for example, Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB), ZigBee, and near field communication (NFC). The communication interface may also refer to any interface, for example, a wired interface, which is capable of communicating with an external source.
- Also, the decryptor 120 receives the encrypted target object from the external source through the communication interface.
FIGS. 2A through 4C are diagrams illustrating examples of an encryption. Referring toFIGS. 2A and 2B , a security apparatus encrypts atarget object 211 based on an overlay scheme. In an example, an encryption may be performed based on an area corresponding to an encryption gesture input by a user. For example, the security apparatus may encrypt thetarget object 211 by recognizing a touch gesture corresponding to a start area and an end area of thetarget object 211. - The security apparatus may set a noise object as an identical object to the
target object 211 in advance. Alternatively, the user may input the identical object to thetarget object 211 as the noise object to the security apparatus through an interface. Also, the security apparatus may set whether thetarget object 211 is to be encrypted based on the overlay scheme among various encryption schemes, in advance. Alternatively, the user may select the overlay scheme from the encryption schemes, and input the overlay scheme to the security apparatus through the interface. - The security apparatus generates an
encrypted target object 221 obtained by overlaying thetarget object 211 with the noise object. In this example, the security apparatus may overlay thetarget object 211 with the noise object such that a presence of thetarget object 211 is recognizable in theencrypted target object 221. - Referring to
FIGS. 3A through 3C , a security apparatus encrypts atarget object 311 based on a division scheme. InFIG. 3A , thetarget object 311 is a text, and anoise object 312 is a grayscale image. In this example, the security apparatus may set thenoise object 312 as the grayscale image, or the user may input the grayscale image to the security apparatus as thenoise object 312. Also, the security apparatus may set whether thetarget object 311 is to be encrypted, in advance. Alternatively, the user may select the division scheme, and input the division scheme to the security apparatus through an interface. - In
FIGS. 3B and 3C , the security apparatus mixes thenoise object 312 to thetarget object 311, and divides a result of the mixing into two groups. In this example, the security apparatus classifies letters including spaces in odd positions into a first group, and classifies letters including spaces in even positions into a second group based on the text of thetarget object 311. The security apparatus alternately displaysobjects objects - Referring to
FIG. 4A , a security apparatus overlays atarget object 411 based on an overlay scheme. The security apparatus sets an image having a relatively low brightness as a noise object, and generate anencrypted target object 412 by overlaying thetarget object 411 with the noise object. - Referring to
FIG. 4B , a target object 421 includes a text and an image. The security apparatus encrypts the text of the target object 421. As illustrated in FIG. 4B , the security apparatus generates an encrypted target object 422 by changing a text option, for example, a type, a size, and a boldness of a font, of a text "Dog". Also, the security apparatus may generate the encrypted target object 422 by removing letters "o" and "g" from the text "Dog". - Referring to
FIG. 4C , the security apparatus encrypts a target object 431 based on a plurality of schemes. In this example, the plurality of schemes may be set in advance, or input from a user through an interface. In FIG. 4C , the security apparatus performs a first encryption based on an overlay scheme, and then performs a second encryption, based on a removal scheme, on an encrypted target object 432 on which the first encryption is performed. - In a process of the first encryption, the security apparatus changes a text option of a text "Dog" included in the
target object 431, and generates the encrypted target object 432 by overlaying the target object 431 with a noise object "D g". In a process of the second encryption, the security apparatus removes the noise object "D g" from the encrypted target object 432, thereby generating an encrypted target object 433. In an example, the security apparatus may alternately display the encrypted target object 432 and the encrypted target object 433 on a display. -
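As an illustration, the alternating-display behavior above (and the division scheme of FIGS. 3A through 3C) can be sketched in Python. The function name and the convention of blanking the removed positions with spaces are assumptions for this sketch, not part of the patent:

```python
def divide_text(mixed: str) -> tuple[str, str]:
    """Division-scheme sketch: characters at odd positions (1st, 3rd, ...)
    form the first display frame, characters at even positions the second.
    Positions not in a frame are blanked so the two frames stay aligned;
    alternating the frames shows only half of the letters at any instant."""
    first = "".join(c if i % 2 == 0 else " " for i, c in enumerate(mixed))
    second = "".join(c if i % 2 == 1 else " " for i, c in enumerate(mixed))
    return first, second
```

For example, `divide_text("Dog")` yields the two frames `"D g"` and `" o "`; a display loop would flip between them until a decryption gesture arrives.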
FIGS. 5A and 5B are diagrams illustrating examples of setting an encryption. Referring to FIGS. 5A and 5B , a security apparatus sets an operation mode to an encryption mode. In this example, the encryption mode refers to an operation mode in which the security apparatus encrypts a target object. Also, a normal mode refers to an operation mode in which the security apparatus does not encrypt the target object despite an input of an encryption gesture. - In
FIG. 5A , the security apparatus sets the operation mode to the encryption mode based on an encryption gesture input by a user. For example, the user may sequentially touch or slide areas 511 through 515 of a target object 510. The security apparatus may recognize a touch gesture or a slide gesture input by the user as an encryption gesture for setting the operation mode to the encryption mode. The security apparatus may set the operation mode to the encryption mode based on the recognized encryption gesture. - In
FIG. 5B , the security apparatus receives a command from a user through an interface 520, and sets the operation mode to the encryption mode. For example, the security apparatus provides the interface 520 to the user. The interface 520 includes an interface 521 used to set the operation mode to the encryption mode. When the user selects the interface 521, the security apparatus sets the operation mode to the encryption mode. When the user revokes the selection of the interface 521, the security apparatus sets the operation mode to the normal mode. -
FIGS. 6A and 6B are diagrams illustrating examples of setting an encryption area. Referring to FIGS. 6A and 6B , a security apparatus sets the encryption area. The encryption area refers to an area to be encrypted in a target object. - In an example, the security apparatus determines the encryption area based on an area corresponding to an encryption gesture. In this example, the encryption gesture may include a touch gesture corresponding to at least one area of a target object, a sliding gesture performed on at least one area of the target object, and a drag gesture for setting a range of at least one area in the target object based on two touch gestures. Also, the encryption gesture may include the drag gesture, the sliding gesture, or the touch gesture simultaneously input with a touch gesture corresponding to a predetermined area on a display. The security apparatus may set an area corresponding to the touch gesture, the sliding gesture, and/or the drag gesture as the encryption area.
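The drag gesture that sets a range from two touch points can be sketched as a small helper; the function name and the (left, top, right, bottom) rectangle convention are illustrative assumptions:

```python
def drag_to_area(p1: tuple[int, int],
                 p2: tuple[int, int]) -> tuple[int, int, int, int]:
    """Normalize the two touch points of a drag gesture into a
    (left, top, right, bottom) rectangle to use as the encryption area,
    regardless of the order or direction in which the points were touched."""
    (x1, y1), (x2, y2) = p1, p2
    return (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))
```

For example, touching (120, 80) and then (40, 200) yields the area (40, 80, 120, 200); the same rectangle results from the opposite drag direction.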
- In
FIG. 6A , a user inputs a drag gesture 611 for setting a range of an area 612 in a target object, to the security apparatus. In this example, the security apparatus determines the area 612 set based on the drag gesture 611, as the encryption area. - In
FIG. 6B , the security apparatus simultaneously receives a touch gesture 621 and a sliding gesture 623, the touch gesture 621 corresponding to a predetermined area 622 on a display. In this example, the security apparatus determines an area 624 set based on the sliding gesture 623, as the encryption area. -
FIGS. 7 through 8C are diagrams illustrating examples of setting an encryption level. Referring to FIG. 7 , a target object 710 includes an image 711 and a text 712. A security apparatus adjusts a mixing ratio of a noise object to the target object 710 based on a speed of an encryption gesture for the target object 710. - In an example, the security apparatus receives a sliding
gesture 713 input by a user to the image 711 of the target object 710. Based on the sliding gesture 713 input by the user, the security apparatus mixes a noise image to the image 711. In this example, the security apparatus may adjust a mixing ratio of the noise image to the image 711 based on a speed of the sliding gesture 713. For example, when the speed of the sliding gesture 713 is 10 centimeters per second (cm/s), the security apparatus may set a relatively low mixing ratio of the noise image to the image 711 as represented by an encrypted target object 721. When the speed of the sliding gesture is 20 cm/s, the security apparatus may set a relatively high mixing ratio of the noise image to the image 711 as represented by an encrypted target object 731. Alternatively, in a case in which the speed of the sliding gesture 713 is low, the security apparatus may set a higher mixing ratio of the noise image to the image 711 as compared to a case in which the speed of the sliding gesture 713 is high. - Referring to
FIG. 8A , a target object 810 includes an image 811 and a text 812. A user inputs a mixing ratio of a noise object to the target object 810, to a security apparatus through an interface. - In an example, the security apparatus receives a sliding
gesture 813 input by the user to the text 812 of the target object 810. Based on the sliding gesture 813 input by the user, the security apparatus mixes a noise text to the text 812. - Referring to
FIG. 8B , the security apparatus provides an interface 821 to the user. The security apparatus receives the mixing ratio of the noise text to the text 812 from the user through the interface 821, and mixes the noise text to the text 812 based on the mixing ratio received from the user. - Referring to
FIG. 8C , when a high mixing ratio of the noise text is input from the user through the interface 821, the security apparatus mixes the noise text to the text 812 based on the high mixing ratio to generate an encrypted target object 831. -
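The mixing-ratio control of FIGS. 7 through 8C can be sketched as follows. The speed thresholds, function names, and the every-k-th-character mixing rule are illustrative assumptions, not the patent's prescribed method:

```python
def ratio_from_speed(speed_cm_s: float, lo: float = 5.0, hi: float = 25.0) -> float:
    """Map gesture speed to a mixing ratio in [0, 1] linearly: faster
    slides mix in more noise. The 5-25 cm/s range is an assumption."""
    clamped = min(max(speed_cm_s, lo), hi)
    return (clamped - lo) / (hi - lo)

def mix_text(text: str, noise: str, ratio: float) -> str:
    """Replace roughly a `ratio` fraction of the characters of `text` with
    noise characters, taking every k-th position: at low ratios the text
    stays mostly readable, at ratio 1.0 it is fully replaced by noise."""
    if ratio <= 0:
        return text
    step = max(1, round(1 / ratio))
    # Tile the noise string so it covers the whole text length.
    noise = (noise * (len(text) // max(1, len(noise)) + 1))[: len(text)]
    return "".join(noise[i] if i % step == 0 else c for i, c in enumerate(text))
```

With these assumptions, a 10 cm/s slide yields ratio 0.25 and a 20 cm/s slide yields 0.75, matching the low/high contrast described for FIG. 7; the ratio could equally come straight from the interface 821 of FIG. 8B.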
FIGS. 9A through 11 are diagrams illustrating examples of a decryption. Referring to FIGS. 9A and 9B , during a predetermined period of time, a security apparatus decrypts an area corresponding to a decryption gesture input by a user in an encrypted target object. Subsequently, the security apparatus visually encrypts the area corresponding to the decryption gesture again when the predetermined period of time elapses. - For example, the security apparatus receives, from the user, sliding
gestures from an area 913 to an area 923 of an encrypted target text 911. Through this, the security apparatus decrypts letters from "layer is" of the target text 911 to "encryp" of an encrypted target text 921. In this example, the security apparatus decrypts "layer is" of the target text 911 at a moment at which the user touches the area 913, and then encrypts "layer is" of the target text 911 again when the user's touch is released from the area 913. In an example, a period of time during which the decryption is maintained may be set in advance, or input by the user. - Referring to
FIGS. 10A and 10B , the security apparatus decrypts an entire area of an encrypted target object 1010 based on a decryption gesture. For example, the encrypted target object 1010 is obtained by mixing a noise object 1012 to a target object 1011. The security apparatus receives consecutive touch gestures corresponding to areas 1021 through 1024 that are input by a user. Based on the input touch gestures, the security apparatus removes the noise object 1012 from the encrypted target object 1010 as illustrated in FIG. 10B . In this example, the touch gestures corresponding to the areas 1021 through 1024 may be set in advance, or set by the user, to decrypt the entire area of the encrypted target object 1010. - Referring to
FIG. 11 , the security apparatus decrypts a target object by recognizing a line of sight of a user. For example, the security apparatus uses a camera to recognize an area 1111 of an encrypted target text 1110 at which the user gazes. The security apparatus decrypts only the area 1111 of the encrypted target text 1110. In this example, when the user changes a direction of the line of sight, the security apparatus may decrypt an area at which the user gazes in the changed direction, and then encrypt the area 1111 again. -
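A minimal sketch of the gaze-driven decryption region, assuming a square region of a chosen radius around the recognized gaze point, clamped to the display bounds (the radius and names are illustrative assumptions):

```python
def gaze_region(gaze: tuple[int, int], radius: int,
                width: int, height: int) -> tuple[int, int, int, int]:
    """Return the (left, top, right, bottom) region around the gaze point
    that is kept decrypted; everything outside it stays encrypted. The
    region is clamped so it never extends past the display edges."""
    x, y = gaze
    return (max(0, x - radius), max(0, y - radius),
            min(width, x + radius), min(height, y + radius))
```

When the eye tracker reports a new gaze point, the apparatus would decrypt the new region and re-encrypt the old one, as described for FIG. 11.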
FIGS. 12A through 12D are diagrams illustrating examples of setting a decryption area. Referring to FIGS. 12A through 12D , a security apparatus sets a decryption area of an encrypted target object. The decryption area refers to an area to be decrypted in the encrypted target object. - In an example, the security apparatus determines the decryption area based on an area corresponding to a decryption gesture. In this example, the decryption gesture may include a touch gesture corresponding to at least one area of the encrypted target object, a sliding gesture performed on at least one area of the encrypted target object, a drag gesture for setting a range of at least one area in the encrypted target object based on two touch gestures, or a gaze gesture performed by gazing into at least one area of the encrypted target object. Also, the decryption gesture may include the gaze gesture, the drag gesture, the sliding gesture, or the touch gesture simultaneously input with a touch gesture corresponding to a predetermined area.
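The "simultaneously input with a touch gesture corresponding to a predetermined area" condition amounts to a point-in-rectangle test on the concurrent touch; a minimal sketch, with illustrative names:

```python
def is_modifier_held(hold_point: tuple[int, int],
                     modifier_area: tuple[int, int, int, int]) -> bool:
    """True when a concurrent touch rests inside the predetermined
    (left, top, right, bottom) area, so that an accompanying slide, drag,
    or gaze gesture is treated as a decryption gesture rather than
    ordinary input."""
    x, y = hold_point
    left, top, right, bottom = modifier_area
    return left <= x <= right and top <= y <= bottom
```

In the FIG. 12A scenario, the apparatus would run this check on the touch gesture 1211 against the predetermined area 1212 before interpreting the sliding gesture 1213 as setting a decryption area.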
- Referring to
FIGS. 12A and 12B , the security apparatus simultaneously receives a touch gesture 1211 and a sliding gesture 1213, the touch gesture 1211 being input to a predetermined area 1212 on a display. In this example, the security apparatus determines an area 1214 set based on the sliding gesture 1213 as the decryption area. A user may adjust the area 1214 based on a sliding gesture 1221 such that the security apparatus determines an area 1222 to be the decryption area. - Referring to
FIGS. 12C and 12D , the user inputs a drag gesture 1231 for setting a range of an area 1232, to the security apparatus. In this example, the security apparatus determines the area 1232 set based on the drag gesture 1231, as the decryption area. The user adjusts the area 1232 based on a drag gesture 1241 such that the security apparatus determines an area 1242 as the decryption area. -
FIGS. 13 and 14 are diagrams illustrating examples of transmitting and receiving an encrypted target object. Referring to FIG. 13 , a target object 1310 includes an image 1311, a text 1312, and a barcode 1313. A terminal apparatus displays the target object 1310 on a display. In this example, the terminal apparatus receives a sliding gesture input to the barcode 1313 from a user. In response to the input, the terminal apparatus encrypts the target object 1310 by mixing a noise object 1321 to the barcode 1313 to generate an encrypted target object 1320. The terminal apparatus provides the user with an interface 1331 for transmitting the encrypted target object 1320 to an external terminal. Through the interface 1331, the terminal apparatus receives, from the user, information of the external terminal to which the encrypted target object 1320 is to be transmitted. By using the interface 1331, the terminal apparatus may transmit the encrypted target object 1320 to the external terminal based on the information received from the user. - Referring to
FIG. 14 , the terminal apparatus receives an encrypted target object 1410 from an external terminal through a communication interface. The encrypted target object 1410 includes an image 1411, a text 1412, and an encrypted barcode 1413. The terminal apparatus receives a sliding gesture input to the encrypted barcode 1413 from a user. In response to the receiving, the terminal apparatus generates a barcode 1421 by decrypting the encrypted barcode 1413 during a predetermined period of time. When the predetermined period of time elapses, the terminal apparatus encrypts the barcode 1421, thereby generating an encrypted barcode 1431. In an example, a period of time during which the terminal apparatus decrypts the encrypted barcode 1413 may be set by the terminal apparatus in advance, or set by the user. The period of time may also be set in advance by the external terminal from which the encrypted target object 1410 is received. -
FIG. 15 is a block diagram illustrating an example of a terminal apparatus 1500. Referring to FIG. 15 , the terminal apparatus 1500 includes an encryptor 1510 and a transmitter 1520. - The
encryptor 1510 may partially or fully visually encrypt a target object. In this example, the encrypted target object may be decrypted during a predetermined period of time based on a decryption gesture input by a user. - Also, the
encryptor 1510 may visually encrypt an entirety or a portion of the target object by mixing a noise object to the entirety or the portion of the target object. Also, the encryptor 1510 may visually encrypt the entirety or the portion of the target object based on an encryption gesture input by the user to at least a portion of the target object. - The
transmitter 1520 transmits the encrypted target object to an external source through a communication interface. - Since the descriptions provided with reference to
FIGS. 1 through 14 are also applicable here, repeated descriptions with respect to the terminal apparatus 1500 of FIG. 15 will be omitted for increased clarity and conciseness. -
FIG. 16 is a block diagram illustrating an example of a terminal apparatus 1600. Referring to FIG. 16 , the terminal apparatus 1600 includes a receiver 1610 and a decryptor 1620. - The
receiver 1610 receives an encrypted target object from an external source through a communication interface. - The
decryptor 1620 decrypts an area corresponding to a decryption gesture input by a user in the encrypted target object during a predetermined period of time. - Since the descriptions provided with reference to
FIGS. 1 through 14 are also applicable here, repeated descriptions with respect to the terminal apparatus 1600 of FIG. 16 will be omitted for increased clarity and conciseness. -
FIG. 17 is a flowchart illustrating an example of a security method. Referring to FIG. 17 , in operation 1710, a security apparatus partially or fully visually encrypts a target object. - In
operation 1720, the security apparatus decrypts an area corresponding to a decryption gesture input by a user in the encrypted target object during a predetermined period of time. - Since the descriptions provided with reference to
FIGS. 1 through 14 are also applicable here, repeated descriptions with respect to the security method of FIG. 17 will be omitted for increased clarity and conciseness. -
FIG. 18 is a flowchart illustrating an example of a terminal apparatus control method. Referring to FIG. 18 , in operation 1810, a terminal apparatus partially or fully visually encrypts a target object. In this example, the encrypted target object may be decrypted during a predetermined period of time based on a decryption gesture input by a user. - In
operation 1820, the terminal apparatus transmits the encrypted target object to an external source through a communication interface. - Since descriptions provided with reference to
FIGS. 1 through 14 are also applicable here, repeated descriptions with respect to the terminal apparatus control method of FIG. 18 will be omitted for increased clarity and conciseness. -
FIG. 19 is a flowchart illustrating another example of a terminal apparatus control method. Referring to FIG. 19 , in operation 1910, a terminal apparatus receives an encrypted target object from an external source through a communication interface. - In
operation 1920, the terminal apparatus decrypts an area corresponding to a decryption gesture input by a user in the encrypted target object during a predetermined period of time. - Since descriptions provided with reference to
FIGS. 1 through 14 are also applicable here, repeated descriptions with respect to the terminal apparatus control method of FIG. 19 will be omitted for increased clarity and conciseness. - The various elements and methods described above may be implemented using one or more hardware components, or a combination of one or more hardware components and one or more software components.
- A hardware component may be, for example, a physical device that physically performs one or more operations, but is not limited thereto. Examples of hardware components include microphones, amplifiers, low-pass filters, high-pass filters, band-pass filters, analog-to-digital converters, digital-to-analog converters, and processing devices.
- A software component may be implemented, for example, by a processing device controlled by software or instructions to perform one or more operations, but is not limited thereto. A computer, controller, or other control device may cause the processing device to run the software or execute the instructions. One software component may be implemented by one processing device, or two or more software components may be implemented by one processing device, or one software component may be implemented by two or more processing devices, or two or more software components may be implemented by two or more processing devices.
- A processing device may be implemented using one or more general-purpose or special-purpose computers, such as, for example, a processor, a controller and an arithmetic logic unit, a digital signal processor, a microcomputer, a field-programmable array, a programmable logic unit, a microprocessor, or any other device capable of running software or executing instructions. The processing device may run an operating system (OS), and may run one or more software applications that operate under the OS. The processing device may access, store, manipulate, process, and create data when running the software or executing the instructions. For simplicity, the singular term "processing device" may be used in the description, but one of ordinary skill in the art will appreciate that a processing device may include multiple processing elements and multiple types of processing elements. For example, a processing device may include one or more processors, or one or more processors and one or more controllers. In addition, different processing configurations are possible, such as parallel processors or multi-core processors.
- A processing device configured to implement a software component to perform an operation A may include a processor programmed to run software or execute instructions to control the processor to perform operation A. In addition, a processing device configured to implement a software component to perform an operation A, an operation B, and an operation C may have various configurations, such as, for example, a processor configured to implement a software component to perform operations A, B, and C; a first processor configured to implement a software component to perform operation A, and a second processor configured to implement a software component to perform operations B and C; a first processor configured to implement a software component to perform operations A and B, and a second processor configured to implement a software component to perform operation C; a first processor configured to implement a software component to perform operation A, a second processor configured to implement a software component to perform operation B, and a third processor configured to implement a software component to perform operation C; a first processor configured to implement a software component to perform operations A, B, and C, and a second processor configured to implement a software component to perform operations A, B, and C; or any other configuration of one or more processors each implementing one or more of operations A, B, and C. Although these examples refer to three operations A, B, and C, the number of operations that may be implemented is not limited to three, but may be any number of operations required to achieve a desired result or perform a desired task.
- Software or instructions for controlling a processing device to implement a software component may include a computer program, a piece of code, an instruction, or some combination thereof, for independently or collectively instructing or configuring the processing device to perform one or more desired operations. The software or instructions may include machine code that may be directly executed by the processing device, such as machine code produced by a compiler, and/or higher-level code that may be executed by the processing device using an interpreter. The software or instructions and any associated data, data files, and data structures may be embodied permanently or temporarily in any type of machine, component, physical or virtual equipment, computer storage medium or device, or a propagated signal wave capable of providing instructions or data to or being interpreted by the processing device. The software or instructions and any associated data, data files, and data structures also may be distributed over network-coupled computer systems so that the software or instructions and any associated data, data files, and data structures are stored and executed in a distributed fashion.
- For example, the software or instructions and any associated data, data files, and data structures may be recorded, stored, or fixed in one or more non-transitory computer-readable storage media. A non-transitory computer-readable storage medium may be any data storage device that is capable of storing the software or instructions and any associated data, data files, and data structures so that they can be read by a computer system or processing device. Examples of a non-transitory computer-readable storage medium include read-only memory (ROM), random-access memory (RAM), flash memory, CD-ROMs, CD-Rs, CD+Rs, CD-RWs, CD+RWs, DVD-ROMs, DVD-Rs, DVD+Rs, DVD-RWs, DVD+RWs, DVD-RAMs, BD-ROMs, BD-Rs, BD-R LTHs, BD-REs, magnetic tapes, floppy disks, magneto-optical data storage devices, optical data storage devices, hard disks, solid-state disks, or any other non-transitory computer-readable storage medium known to one of ordinary skill in the art.
- Functional programs, codes, and code segments for implementing the examples disclosed herein can be easily constructed by a programmer skilled in the art to which the examples pertain based on the drawings and their corresponding descriptions as provided herein.
- As a non-exhaustive illustration only, a terminal described herein may refer to mobile devices such as, for example, a cellular phone, a smart phone, a wearable smart device (such as, for example, a ring, a watch, a pair of glasses, a bracelet, an ankle bracelet, a belt, a necklace, an earring, a headband, a helmet, a device embedded in clothes, or the like), a personal computer (PC), a tablet personal computer (tablet), a phablet, a personal digital assistant (PDA), a digital camera, a portable game console, an MP3 player, a portable/personal multimedia player (PMP), a handheld e-book, an ultra mobile personal computer (UMPC), a portable laptop PC, a global positioning system (GPS) navigation device, and devices such as a high definition television (HDTV), an optical disc player, a DVD player, a Blu-ray player, a set-top box, or any other device capable of wireless communication or network communication consistent with that disclosed herein. In a non-exhaustive example, the wearable device may be self-mountable on the body of the user, such as, for example, the glasses or the bracelet. In another non-exhaustive example, the wearable device may be mounted on the body of the user through an attaching device, such as, for example, attaching a smart phone or a tablet to the arm of a user using an armband, or hanging the wearable device around the neck of a user using a lanyard.
- While this disclosure includes specific examples, it will be apparent to one of ordinary skill in the art that various changes in form and details may be made in these examples without departing from the scope of the claims. The examples described herein are to be considered in a descriptive sense only, and not for purposes of limitation. Descriptions of features or aspects in each example are to be considered as being applicable to similar features or aspects in other examples. Suitable results may be achieved if the described techniques are performed in a different order, and/or if components in a described system, architecture, device, or circuit are combined in a different manner and/or replaced or supplemented by other components or their equivalents. Therefore, the scope of the disclosure is defined not by the detailed description, but by the claims.
Claims (12)
- A security method comprising: visually encrypting, by an encryptor (110), a target object displayed on a display such that the target object is only partially recognizable or is no longer recognizable; and decrypting at least one part of the encrypted target object, by a decryptor (120), at an area corresponding to a decryption gesture made in the encrypted target object; wherein the encrypting comprises mixing a noise object to the target object to visually encrypt the target object, characterized in that the decrypting of the at least one part of the encrypted target object due to the decryption gesture is carried out for a predetermined period of time, and wherein the predetermined period of time is set based on the decryption gesture.
- The method of claim 1, wherein the step of encrypting comprises: visually encrypting the target object based on an encryption gesture to the target object, and visually encrypting an area corresponding to the encryption gesture in the target object.
- The method of claim 2, wherein the step of encrypting comprises: recognizing, as the encryption gesture, any one or any combination of a touch gesture to an area in the target object, a sliding gesture to an area in the target object, and a drag gesture for setting a range of an area in the target object based on two touch gestures, and recognizing, as the encryption gesture, the drag gesture, the sliding gesture, or the touch gesture simultaneously input with a touch gesture to a predetermined area on a display.
- The method of claim 1, wherein the step of mixing further comprises: adjusting a mixing ratio of the noise object to the target object based on a speed of an encryption gesture to the target object, or mixing the noise object to the target object such that a presence of the target object is recognizable in the encrypted target object.
- The method of claim 1, wherein the step of encrypting comprises: overlaying the target object with the noise object to visually encrypt the target object, or dividing the target object into groups and alternately displaying the groups to visually encrypt the target object.
- The method of claim 1, further comprising:
visually encrypting the area corresponding to the decryption gesture again after the predetermined period of time has elapsed. - The method of one of claims 1 to 6, wherein the step of decrypting comprises: receiving a range of the area corresponding to the decryption gesture, and setting the area corresponding to the decryption gesture as an entire area of the encrypted target object.
- The method of one of claims 1 to 6, wherein the step of decrypting comprises:
recognizing, as the decryption gesture, any one or any combination of a gaze gesture into the area in the encrypted target object, a drag gesture for setting a range of the area in the encrypted target object based on two touch gestures, a sliding gesture to the area in the encrypted target object, and a touch gesture to the area in the encrypted target object. - The method of claim 8, wherein the step of recognizing comprises:
recognizing, as the decryption gesture, the gaze gesture, the drag gesture, the sliding gesture, or the touch gesture simultaneously input with a touch gesture to a predetermined area on a display. - The method of one of claims 1 to 9, wherein the target object comprises any one or any combination of an image, a text, and a video that are displayed on a display.
- The method of claim 1, further comprising: transmitting, by the encryptor (110), the encrypted target object to another device; and receiving, by the decryptor (120), the encrypted target object from the other device.
- A terminal device (1600) comprising: a receiver (1610) configured to receive an encrypted target object from another device; and a decryptor (1620) configured to decrypt at least one part of the encrypted target object at an area corresponding to a decryption gesture in the encrypted target object; the terminal device characterized in that
the decrypting of the at least one part of the encrypted target object due to the decryption gesture is carried out for a predetermined period of time, and
wherein the predetermined period of time is set based on the decryption gesture.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020140141577A KR102257304B1 (en) | 2014-10-20 | 2014-10-20 | Method and apparatus for securing display |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3012730A1 EP3012730A1 (en) | 2016-04-27 |
EP3012730B1 true EP3012730B1 (en) | 2020-07-22 |
Family
ID=53496409
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP15172008.3A Active EP3012730B1 (en) | 2014-10-20 | 2015-06-15 | Display securing method and apparatus |
Country Status (5)
Country | Link |
---|---|
US (1) | US9871664B2 (en) |
EP (1) | EP3012730B1 (en) |
JP (1) | JP6590583B2 (en) |
KR (1) | KR102257304B1 (en) |
CN (1) | CN106203063A (en) |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102332468B1 (en) * | 2014-07-24 | 2021-11-30 | 삼성전자주식회사 | Method for controlling function and electronic device thereof |
JP5986334B1 (en) * | 2016-05-28 | 2016-09-06 | 後藤 誠 | Handling attention information browsing management program |
JP2018081407A (en) * | 2016-11-15 | 2018-05-24 | 株式会社 エヌティーアイ | User terminal, method and computer program |
KR20190021724A (en) | 2017-08-23 | 2019-03-06 | 삼성전자주식회사 | Security improved method and electronic device performing the same for displaying image |
CN107704777A (en) * | 2017-10-13 | 2018-02-16 | 上海爱优威软件开发有限公司 | Concealed display methods and system for terminal |
KR20190064807A (en) * | 2017-12-01 | 2019-06-11 | 삼성전자주식회사 | Electronic device and control method thereof |
US11450069B2 (en) | 2018-11-09 | 2022-09-20 | Citrix Systems, Inc. | Systems and methods for a SaaS lens to view obfuscated content |
US11201889B2 (en) | 2019-03-29 | 2021-12-14 | Citrix Systems, Inc. | Security device selection based on secure content detection |
US11544415B2 (en) | 2019-12-17 | 2023-01-03 | Citrix Systems, Inc. | Context-aware obfuscation and unobfuscation of sensitive content |
US11539709B2 (en) | 2019-12-23 | 2022-12-27 | Citrix Systems, Inc. | Restricted access to sensitive content |
US11582266B2 (en) | 2020-02-03 | 2023-02-14 | Citrix Systems, Inc. | Method and system for protecting privacy of users in session recordings |
US20230082679A1 (en) * | 2020-03-18 | 2023-03-16 | Sony Group Corporation | Data processing device, data processing method, data processing program, data extraction device, data extraction method, and data extraction program |
US11361113B2 (en) | 2020-03-26 | 2022-06-14 | Citrix Systems, Inc. | System for prevention of image capture of sensitive information and related techniques |
CN111813309B (en) * | 2020-07-10 | 2022-04-29 | 维沃移动通信(杭州)有限公司 | Display method, display device, electronic equipment and readable storage medium |
WO2022041058A1 (en) | 2020-08-27 | 2022-03-03 | Citrix Systems, Inc. | Privacy protection during video conferencing screen share |
WO2022041163A1 (en) | 2020-08-29 | 2022-03-03 | Citrix Systems, Inc. | Identity leak prevention |
CN112529586B (en) * | 2020-12-15 | 2023-07-28 | 深圳市快付通金融网络科技服务有限公司 | Transaction information management method, device, equipment and storage medium |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6529209B1 (en) | 2000-01-12 | 2003-03-04 | International Business Machines Corporation | Method for providing privately viewable data in a publically viewable display |
JP2001312657A (en) | 2000-04-28 | 2001-11-09 | Apex Interactive Inc | System, method and device for interactive promotion on web site, and computer program product |
WO2004066620A1 (en) | 2003-01-20 | 2004-08-05 | Nexvi Corporation | Device and method for outputting a private image using a public display |
JP2007140856A (en) * | 2005-11-17 | 2007-06-07 | Toyo Network Systems Co Ltd | Consultation fee settlement terminal device and user terminal device |
US20100259560A1 (en) * | 2006-07-31 | 2010-10-14 | Gabriel Jakobson | Enhancing privacy by affecting the screen of a computing device |
US7884805B2 (en) * | 2007-04-17 | 2011-02-08 | Sony Ericsson Mobile Communications Ab | Using touches to transfer information between devices |
KR101420419B1 (en) * | 2007-04-20 | 2014-07-30 | 엘지전자 주식회사 | Electronic Device And Method Of Editing Data Using the Same And Mobile Communication Terminal |
KR20080108722A (en) | 2007-06-11 | 2008-12-16 | (주) 엘지텔레콤 | Contents display method for privacy protection in mobile communication terminal |
EP2235713A4 (en) * | 2007-11-29 | 2012-04-25 | Oculis Labs Inc | Method and apparatus for display of secure visual content |
KR101488393B1 (en) | 2008-11-19 | 2015-01-30 | 엘지전자 주식회사 | Mobile terminal and operation method thereof |
JP2012129701A (en) | 2010-12-14 | 2012-07-05 | Nec Casio Mobile Communications Ltd | Portable device, information display device, privacy protection method and privacy protection program |
JP5063791B2 (en) * | 2011-03-28 | 2012-10-31 | 株式会社エヌ・ティ・ティ・ドコモ | Portable information processing apparatus and display control method for portable information processing apparatus |
JP2012208794A (en) * | 2011-03-30 | 2012-10-25 | Ntt Docomo Inc | Portable terminal and display control method |
US9417754B2 (en) * | 2011-08-05 | 2016-08-16 | P4tents1, LLC | User interface system, method, and computer program product |
US20130100243A1 (en) | 2011-10-20 | 2013-04-25 | Broadcom Corporation | Secure Stereoscopic Display |
JP5945417B2 (en) * | 2012-01-06 | 2016-07-05 | 京セラ株式会社 | Electronics |
US20130321452A1 (en) * | 2012-05-30 | 2013-12-05 | Honeywell International Inc. | System and method for protecting the privacy of objects rendered on a display |
CN102841749A (en) * | 2012-07-16 | 2012-12-26 | 宇龙计算机通信科技(深圳)有限公司 | Terminal and integrated operational zone control method |
KR101948893B1 (en) | 2012-07-19 | 2019-04-26 | 엘지디스플레이 주식회사 | Visibility Controllable Display |
US20140201527A1 (en) * | 2013-01-17 | 2014-07-17 | Zohar KRIVOROT | Systems and methods for secure and private delivery of content |
KR101429582B1 (en) * | 2013-01-31 | 2014-08-13 | (주)카카오 | Method and device for activating security function on chat area |
US20140267094A1 (en) * | 2013-03-13 | 2014-09-18 | Microsoft Corporation | Performing an action on a touch-enabled device based on a gesture |
US20150007351A1 (en) * | 2013-06-27 | 2015-01-01 | Maher Janajri | Mobile Messaging Enhanced with Concealable and Selectively Revealable Text, Image, and Video Messages |
US9779474B2 (en) * | 2014-04-04 | 2017-10-03 | Blackberry Limited | System and method for electronic device display privacy |
KR101545446B1 (en) | 2014-04-10 | 2015-08-18 | 장순길 | Secure Video Display Terminal |
US20150371611A1 (en) * | 2014-06-19 | 2015-12-24 | Contentguard Holdings, Inc. | Obscurely rendering content using masking techniques |
- 2014
  - 2014-10-20 KR KR1020140141577A patent/KR102257304B1/en active IP Right Grant
- 2015
  - 2015-04-08 US US14/681,162 patent/US9871664B2/en active Active
  - 2015-05-14 CN CN201510245896.4A patent/CN106203063A/en active Pending
  - 2015-06-15 EP EP15172008.3A patent/EP3012730B1/en active Active
  - 2015-08-07 JP JP2015157208A patent/JP6590583B2/en active Active
Non-Patent Citations (1)
Title |
---|
None * |
Also Published As
Publication number | Publication date |
---|---|
EP3012730A1 (en) | 2016-04-27 |
KR102257304B1 (en) | 2021-05-27 |
JP2016081516A (en) | 2016-05-16 |
US9871664B2 (en) | 2018-01-16 |
CN106203063A (en) | 2016-12-07 |
JP6590583B2 (en) | 2019-10-16 |
KR20160046121A (en) | 2016-04-28 |
US20160112209A1 (en) | 2016-04-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3012730B1 (en) | Display securing method and apparatus | |
US11894122B2 (en) | Exercise feedback provision apparatus and method | |
US10762233B2 (en) | Method and device for encrypting or decrypting content | |
US9733700B2 (en) | Ring-type mobile terminal | |
US9778749B2 (en) | Occluded gesture recognition | |
KR102204553B1 (en) | Watch type mobile terminal and control method for the mobile terminal | |
US10747328B2 (en) | Motion recognition apparatus and control method thereof | |
US20140267024A1 (en) | Computing interface system | |
US20170338973A1 (en) | Device and method for adaptively changing task-performing subjects | |
KR20160024690A (en) | Rotary apparatus and electronic device having the same | |
CN107210822B (en) | Method for body contact initiated communication, wireless communication device | |
EP3146669B1 (en) | Method and device for data encrypting | |
CN103279714A (en) | Mobile terminal as well as data encryption and decryption method | |
KR20160062922A (en) | Method for exchanging information with external device and electronic device thereof | |
US20180249056A1 (en) | Mobile terminal and method for controlling same | |
US20160306421A1 (en) | Finger-line based remote control | |
US20180260064A1 (en) | Wearable device and control method therefor | |
KR102208121B1 (en) | Mobile terminal and control method therefor | |
KR102314646B1 (en) | Method and device for encrypting or decrypting contents | |
US10361791B2 (en) | Information interaction methods and user equipment | |
JP2015225370A (en) | Authentication system, authentication method, and program | |
EP3286868A1 (en) | Apparatus and method to decrypt file segments in parallel | |
EP3179731A1 (en) | Method and device for arranging applications | |
KR20150124361A (en) | Apparatus and methodfor providing visual message service which cooperates with wearable device | |
KR20160076273A (en) | Mobile terminal and method for controlling the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
AK | Designated contracting states |
Kind code of ref document: A1
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
17P | Request for examination filed |
Effective date: 20160912 |
|
RBV | Designated contracting states (corrected) |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20180919 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
INTG | Intention to grant announced |
Effective date: 20200228 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB
Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH
Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: DE
Ref legal event code: R096
Ref document number: 602015056044
Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: AT
Ref legal event code: REF
Ref document number: 1294032
Country of ref document: AT
Kind code of ref document: T
Effective date: 20200815 |
|
REG | Reference to a national code |
Ref country code: IE
Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: LT
Ref legal event code: MG4D |
|
REG | Reference to a national code |
Ref country code: AT
Ref legal event code: MK05
Ref document number: 1294032
Country of ref document: AT
Kind code of ref document: T
Effective date: 20200722 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Ref country code: ES, Effective date: 20200722
Ref country code: HR, Effective date: 20200722
Ref country code: LT, Effective date: 20200722
Ref country code: PT, Effective date: 20201123
Ref country code: SE, Effective date: 20200722
Ref country code: BG, Effective date: 20201022
Ref country code: GR, Effective date: 20201023
Ref country code: FI, Effective date: 20200722
Ref country code: AT, Effective date: 20200722
Ref country code: NO, Effective date: 20201022 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Ref country code: RS, Effective date: 20200722
Ref country code: PL, Effective date: 20200722
Ref country code: LV, Effective date: 20200722
Ref country code: IS, Effective date: 20201122 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200722 |
|
REG | Reference to a national code |
Ref country code: DE
Ref legal event code: R097
Ref document number: 602015056044
Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT
Ref country code: EE, Effective date: 20200722
Ref country code: SM, Effective date: 20200722
Ref country code: RO, Effective date: 20200722
Ref country code: IT, Effective date: 20200722
Ref country code: CZ, Effective date: 20200722
Ref country code: DK, Effective date: 20200722 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200722 |
|
26N | No opposition filed |
Effective date: 20210423 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200722 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200722 |
|
REG | Reference to a national code |
Ref country code: NL
Ref legal event code: MP
Effective date: 20200722 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200722 |
|
REG | Reference to a national code |
Ref country code: CH
Ref legal event code: PL |
|
REG | Reference to a national code |
Ref country code: BE
Ref legal event code: MM
Effective date: 20210630 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210615 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES
Ref country code: LI, Effective date: 20210630
Ref country code: IE, Effective date: 20210615
Ref country code: CH, Effective date: 20210630 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210630 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20150615 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200722 |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230530 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: FR, Payment date: 20230522, Year of fee payment: 9
Ref country code: DE, Payment date: 20230522, Year of fee payment: 9 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20230523 Year of fee payment: 9 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20200722 |