KR101804369B1 - Device, method for driving device, and computer program for driving device - Google Patents

Device, method for driving device, and computer program for driving device

Info

Publication number
KR101804369B1
Authority
KR
South Korea
Prior art keywords
visual
visual symbols
display unit
blur
symbols
Prior art date
Application number
KR1020160021438A
Other languages
Korean (ko)
Other versions
KR20170099272A (en)
Inventor
이대영
윤현수
최원준
Original Assignee
LINE Corporation (라인 가부시키가이샤)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LINE Corporation (라인 가부시키가이샤)
Priority to KR1020160021438A
Priority to JP2017031819A
Publication of KR20170099272A
Application granted
Publication of KR101804369B1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • H04M1/72519
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2201/00Electronic components, circuits, software, systems or apparatus used in telephone systems
    • H04M2201/42Graphical user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/22Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to an embodiment of the present invention, there is provided a method of driving a device capable of displaying a visual symbol, the method comprising: displaying the visual symbol by a display unit of the device; controlling, by a control unit of the device, the display unit such that at least some of the visual symbols are blurred; detecting, by an input unit of the device, an input at a predetermined first position; and controlling, by the control unit, the display unit to release the blur applied to at least some of the visual symbols corresponding to the first position if the detected input corresponds to a predetermined first action.

Description

[0001] The present invention relates to a device, a method for driving the device, and a computer program for driving the device.

Embodiments of the present invention relate to a device, a device driving method, and a computer program.

As technology relating to electronic devices evolves, a device can provide its user with visual symbols received from an external device or generated in the device itself. The visual symbols provided to the user through the device may include a photograph, a picture, a moving picture, or text representing a conversation with another person.

On the other hand, on public transportation and in streets where keeping a distance from others is difficult, visual symbols displayed on the device may be exposed to others. Because visual symbols can convey information related to privacy, such as important personal information or private conversations, it is necessary to prevent visual symbols from being exposed to others.

The background technology described above is technical information that the inventors held for, or acquired in the process of, deriving the present invention, and is not necessarily known technology disclosed to the general public prior to the filing of the present invention.

Korean Published Patent Application No. KR 2000-0053697 A

Embodiments of the present invention provide a device, a device driving method, and a computer program capable of blurring visual symbols displayed on the device according to predetermined conditions and releasing the blur applied to the visual symbols according to other predetermined conditions.

Embodiments of the present invention also provide a device, a device driving method, and a computer program in which the computation amount and computation time for blur processing are reduced by reducing the resolution of a visual symbol before performing the blur processing.
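The reduced-resolution blur idea can be sketched as follows. This is a minimal illustration, not the patented implementation: the helper names (`downscale`, `box_blur`, `upscale`, `fast_blur`), the averaging downscale, and the 3×3 box kernel are all assumptions. The point is only that blurring an image downscaled by a factor `f` touches roughly 1/f² as many pixels as blurring at full resolution, and the nearest-neighbour upscale itself adds softness.

```python
def downscale(img, f):
    # Average each f x f block into one pixel (keeps 1/f^2 of the pixels).
    return [[sum(img[y * f + dy][x * f + dx] for dy in range(f) for dx in range(f)) / (f * f)
             for x in range(len(img[0]) // f)]
            for y in range(len(img) // f)]

def box_blur(img):
    # 3x3 box filter with edge clamping.
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[cy][cx]
                    for cy in range(max(0, y - 1), min(h, y + 2))
                    for cx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) / len(vals)
    return out

def upscale(img, f):
    # Nearest-neighbour upscale back to the original size.
    return [[img[y // f][x // f] for x in range(len(img[0]) * f)]
            for y in range(len(img) * f)]

def fast_blur(img, f=2):
    # Blur at reduced resolution: ~1/f^2 of the per-pixel blur work.
    return upscale(box_blur(downscale(img, f)), f)
```

In a real device the same structure would typically be delegated to the GPU or a platform blur API; the pipeline shape (downscale, blur, upscale) is what reduces the computation.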

According to an embodiment of the present invention, there is provided a method of driving a device capable of displaying a visual symbol, the method comprising: displaying the visual symbol by a display unit of the device; controlling, by a control unit of the device, the display unit such that at least some of the visual symbols are blurred; detecting, by an input unit of the device, an input at a predetermined first position; and controlling, by the control unit, the display unit to release the blur applied to at least some of the visual symbols corresponding to the first position if the detected input corresponds to a predetermined first action.

Another embodiment of the present invention provides a device comprising: a display unit for displaying visual symbols; a control unit for controlling the display unit such that at least some of the visual symbols are blurred; and an input unit for detecting an input at a predetermined first position, wherein, when the detected input corresponds to a predetermined first action, the control unit controls the display unit so that the blur applied to at least some of the visual symbols corresponding to the first position is released.

Other aspects, features, and advantages will become apparent from the following drawings, claims, and detailed description of the invention.

According to the present invention, a device, a device driving method, and a computer program capable of blurring visual symbols displayed on the device according to predetermined conditions and releasing the blur applied to the visual symbols according to other predetermined conditions can be provided.

Also, a device, a device driving method, and a computer program in which the computation amount and computation time for blur processing are reduced by reducing the resolution of a visual symbol before performing the blur processing can be provided.

FIG. 1 is a view schematically showing the configuration of a device driving system according to an embodiment of the present invention.
FIGS. 2 and 3 are block diagrams schematically illustrating the internal configuration of a device according to an embodiment of the present invention.
FIG. 4 is a view schematically showing the internal configuration of a device driving system according to an embodiment of the present invention.
FIG. 5 is a flowchart schematically showing a device driving method according to an embodiment of the present invention.
FIGS. 6 to 12 are views schematically showing examples of how a device is driven according to an embodiment of the present invention.

The present invention is capable of various modifications and embodiments; specific embodiments are illustrated in the drawings and described in detail in the detailed description. The effects and features of the present invention, and methods of achieving them, will become apparent with reference to the embodiments described below in detail together with the drawings. However, the present invention is not limited to the embodiments described below and may be implemented in various forms. In the following embodiments, terms such as 'first' and 'second' are used to distinguish one element from another, not in a limiting sense. The singular expressions include plural expressions unless the context clearly dictates otherwise. Terms such as 'include' and 'including' mean that a feature or element recited in the specification is present, and do not preclude the possibility that one or more other features or elements may be added. In the drawings, components may be exaggerated or reduced in size for convenience of explanation; for example, the size and thickness of each component shown in the drawings are arbitrary, and the present invention is not necessarily limited to what is shown.

Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals refer to like or corresponding components throughout the drawings, and duplicate descriptions thereof will be omitted.

FIG. 1 is a view schematically showing a configuration of a device driving system according to an embodiment of the present invention.

Referring to FIG. 1, a device driving system 10 according to an embodiment of the present invention may include a device 100, an external device 200, and a server 300.

The device driving system 10 may refer to a system for driving the device 100. That is, in the device driving system 10 of the present invention, the external device 200 and/or the server 300 can transmit and receive signals necessary for correct driving of the device 100, or store data. In other words, the device driving system 10 may mean a system in which data can be transmitted and received through wired/wireless communication among the device 100, the external device 200, and/or the server 300. In addition, a plurality of devices 100 may be included in one device driving system 10, and content may be transmitted and received through wired/wireless communication between the devices 100.

The device 100 may be an electronic device capable of transmitting and receiving data to and from the external device 200 and/or the server 300 included in the device driving system 10. The device 100 may receive a signal enabling display of a visual symbol from the external device 200 and/or the server 300, and may display the visual symbol.

In the present specification, a visual symbol may refer to an object, or a set of objects, that can be visually recognized, such as a letter, a figure, or a photograph, and that can convey information to another person in a visual form. For example, the visual symbols may include text messages, pictures, photographs, or sentences or emoticons generated by combining them, which may be displayed on the display screen of the device 100, but the present invention is not limited thereto.

The device 100 may be any of various types of electronic devices capable of displaying visual symbols. For example, the device 100 may be a smartphone, a laptop, a tablet PC, a smart TV, a cell phone, a personal digital assistant (PDA), a desktop, a media player, a portable terminal, a navigation terminal, a kiosk, an MP3 player, a digital camera, a wearable device, or another mobile or non-mobile computing device, but the present invention is not limited thereto. In addition, the device 100 may include various devices capable of displaying visual symbols, such as an electronic board or a touch table. The device 100 may also be an accessory with a video display function, such as a watch, eyeglasses, a head-worn electronic device, or a ring, but the present invention is not limited thereto.

The external device 200 can transmit a visual symbol to be displayed on the device 100 to the device 100. The external device 200 may be an electronic device physically distinct from the device 100. Alternatively, the device 100 and the external device 200 may be electronic devices classified according to who uses them. That is, from the viewpoint of a first user, the electronic device possessed by that user is the device 100, and the electronic device owned by a second user, another person, is the external device 200. Conversely, from the viewpoint of the second user, the electronic device possessed by that user is the device 100, and the electronic device owned by the first user is the external device 200.

The server 300 may transmit signals to, and receive signals from, the device 100 and/or the external device 200 via wired/wireless communication in the device driving system 10, and may store data included in the signals.

The server 300 may receive from the device 100 a signal including information on a visual symbol generated in the device 100, and store the received signal. In addition, the server 300 can transmit the signal received from the device 100 to the external device 200, and can transmit the signal received from the external device 200 to the device 100.

The server 300 may be a server that, in addition to providing the device driving service, provides a variety of services for the convenience of users, such as a general search service. That is, the server 300 may provide various services such as search, e-mail, blog, social network service, news, and shopping information in addition to the device driving service.

Alternatively, the server 300 may be a server that is connected to a server providing a portal service such as search, e-mail, news, or shopping, and that transmits a web page provided by the portal service to the device 100 requesting information from the portal service. Here, the server 300 and the portal service providing server may be physically separate servers, or may be the same server separated only conceptually.

The communication network 400 may provide a wired/wireless communication path among the device 100, the external device 200, and/or the server 300. The communication network 400 may include wired networks such as LANs (Local Area Networks), WANs (Wide Area Networks), MANs (Metropolitan Area Networks), and ISDNs (Integrated Service Digital Networks), as well as wireless networks such as wireless LAN, CDMA, and Bluetooth, but the scope of the present invention is not limited thereto.

FIGS. 2 and 3 are block diagrams schematically illustrating the internal configuration of a device according to an embodiment of the present invention.

Referring to FIG. 2, the device 100 included in the device driving system 10 according to an embodiment of the present invention may include an input unit 110, a display unit 120, and a control unit 130.

The input unit 110 may receive various signals from a user of the device 100. The input unit 110 may receive a voice input, a character input, an input for pressing a button, or a touch input from a user of the device 100, but the scope of the present invention is not limited thereto. For example, the input unit 110 may receive a message from a user of the device 100.

The input unit 110 can detect contact, i.e., a touch, at a specific position on the device 100. Alternatively, the input unit 110 may detect an input that presses a physical button on the device 100. Alternatively, the input unit 110 may detect input through fingerprint sensing, body temperature sensing, pressure sensing, sensing of a certain level of electrical flow, etc., at a specific location on the device 100. Alternatively, the input unit 110 may exist as a separate device that performs wired/wireless communication with the device 100.

The display unit 120 can display a visual symbol. The display unit 120 may be a display screen that provides information through visible light. The display unit 120 may be a liquid crystal display, a thin-film-transistor liquid crystal display, an organic light-emitting diode display, a flexible display, a three-dimensional (3D) display, or an electrophoretic display, but the scope of the present invention is not limited thereto. In addition, depending on the implementation, the device 100 may include two or more display units 120.

Meanwhile, the input unit 110 and the display unit 120 may together form one touch display. That is, the input unit 110 and the display unit 120 may form one touch display layer. In this case, the input unit 110 can detect an input at a specific position on the display unit 120.

The control unit 130 typically controls the overall operation of the device 100. For example, the control unit 130 may control the signal transmitting/receiving operation of the communication unit 140 to permit or prohibit the transmission/reception of signals by the communication unit 140. The control unit 130 may include any kind of device capable of processing data, such as a processor. Here, the term 'processor' may refer to a data processing apparatus embedded in hardware, for example one having circuitry physically structured to perform the functions represented by the code or instructions contained in a program. Examples of such a hardware-embedded data processing apparatus include a microprocessor, a central processing unit (CPU), a processor core, a multiprocessor, an application-specific integrated circuit (ASIC), and a field programmable gate array (FPGA), but the scope of the present invention is not limited thereto.

The control unit 130 may blur at least a part of the visual symbols displayed on the display unit 120. Blurred visual symbols may appear indistinct, as in a defocused image or an out-of-focus object. The degree of blur processing may be such that the information conveyed by the visual symbol cannot be recognized at all, or such that the information can be recognized when viewed attentively. In addition, the control unit 130 can release at least a part of the applied blur when a predetermined condition is satisfied. Specific conditions and examples for applying or releasing the blur by the control unit 130 will be described later.

Referring to FIG. 3, the device 100 included in the device driving system 10 according to an embodiment of the present invention may further include a communication unit 140 in addition to the input unit 110, the display unit 120, and the control unit 130.

The communication unit 140 may transmit and receive various electronic signals through wired/wireless communication with the external device 200 and/or the server 300. For example, the communication unit 140 may transmit a signal including information on a visual symbol input to the device 100 to the external device 200 or the server 300, and may receive such a signal from the external device 200 or the server 300.

The communication unit 140 may include a Bluetooth communication unit, a Bluetooth Low Energy communication unit, a near field communication unit, a WLAN communication unit, a Zigbee communication unit, an IrDA (infrared data association) communication unit, a WFD (Wi-Fi Direct) communication unit, a UWB (ultra wideband) communication unit, an Ant+ communication unit, and the like.

FIG. 4 is a view schematically showing an internal configuration of a device driving system according to an embodiment of the present invention.

Referring to FIG. 4, the device 100 included in the device driving system 10 according to an embodiment of the present invention may include an input unit 110, a display unit 120, a control unit 130, a communication unit 140, an output unit 150, a memory 160, and a program storage unit 170. Hereinafter, descriptions overlapping those given with reference to FIGS. 1 to 3 will be omitted.

The output unit 150 may output various types of signals in order to provide various information to the user of the device 100. The output unit 150 may include a display unit 120 for displaying a visual symbol. The output unit 150 may include an acoustic unit for providing information through an auditory signal.

The acoustic unit may output, in audible form, information to be conveyed in the content provided to the device 100. The acoustic unit can output sounds having different lengths, frequencies, intensities, pitches, or timbres depending on the type of notification. The acoustic unit may include at least one of a tuner, an equalizer, a headphone, an earphone, and various types of speakers, but the scope of the present invention is not limited thereto. That is, the acoustic unit may be any of various devices capable of generating sound through electrical or mechanical means.

The memory 160 may temporarily or permanently store data processed by the control unit 130. Here, the memory 160 may include magnetic storage media or flash storage media, but the scope of the present invention is not limited thereto.

The program storage unit 170 may be a component that carries the software essential or auxiliary for the device 100 to operate and/or the software necessary or auxiliary for performing various applications utilizing the device 100.

Referring again to FIG. 4, the server 300 according to an exemplary embodiment of the present invention may include a communication unit 310, a memory 320, a program storage unit 330, a control unit 340, and a database 350.

The communication unit 310 can transmit and receive electronic signals through wired/wireless communication with the device 100. The memory 320 may temporarily or permanently store data processed by the control unit 340. The program storage unit 330 may be a component that carries the software necessary or auxiliary for operating the server 300 and/or for performing various applications utilizing the server 300. The control unit 340 can generally control the overall operation of the server 300. For example, the control unit 340 may control the signal transmission/reception operation of the communication unit 310 to permit or prohibit the transmission/reception of signals by the communication unit 310. The database 350 may store data included in various signals transmitted and received through the server 300. The database 350 may store data transmitted between the device 100 and/or the external device 200, or data about the visual symbols to be displayed on the device 100.

FIG. 5 is a flowchart schematically showing a device driving method according to an embodiment of the present invention. Hereinafter, descriptions overlapping those given with reference to FIGS. 1 to 4 will be omitted.

Referring to FIG. 5, the device 100 according to an embodiment of the present invention may perform a step of displaying a visual symbol (step S401). In this case, the visual symbol may be one input through the input unit 110 of the device 100, or one generated from a signal received from the external device 200 or the server 300 through the communication unit 140 of the device 100. Step S401 may be performed by the display unit 120 of the device 100.

Next, the device 100 may perform a step of controlling the display unit 120 so that at least a part of the visual symbols displayed on the display unit 120 is blurred (step S402). At this time, the device 100 may cause all of the visual symbols displayed on the display unit 120 to be blurred, or only some of them. Step S402 may be performed by the control unit 130 of the device 100.

Next, the device 100 may perform a step of detecting an input at a first position (step S403). Here, the first position may be a position where the blurred visual symbols are displayed, or a position corresponding to the blurred visual symbols. Alternatively, the first position may be a predetermined position on the display unit 120, or a position where a particular button or sensor of the device 100 is present. Step S403 may be performed by the input unit 110 of the device 100.

Thereafter, the device 100 may perform a step of checking whether the detected input matches a first action (step S404). Here, the first action may be a touch operation such as a tap, a double tap, or a long tap on a predetermined position, or a drag operation following a tap. Alternatively, the first action may be an operation of pressing a predetermined physical button, or of holding the physical button pressed for a predetermined period or longer. Alternatively, the first action may be an operation of bringing a part of the body or a predetermined instrument into contact with a device for fingerprint sensing, body temperature sensing, pressure sensing, or sensing of a certain level of electrical flow. Alternatively, the first action may be an air gesture performed over a predetermined position without physical contact. Step S404 may be performed by the control unit 130 of the device 100.

If the detected input does not match the first action, the device 100 may return to the state before step S403 in order to detect an input at the first position. That is, the device 100 may remain in a standby state until it receives an input at the first position, or may switch to or maintain a state in which it can receive another input.

If the detected input matches the first action, the device 100 may perform a step of releasing part of the blur applied to the visual symbols (step S405). At this time, the device 100 may control the display unit 120 to release the blur applied to at least some of the visual symbols corresponding to the first position. Optionally, the device 100 may control the display unit 120 to release the blur of all visual symbols across the display unit 120 when the first action is detected at the predetermined position. Step S405 may be performed by the control unit 130 of the device 100.
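The S401-S405 flow above can be sketched as a small state holder. This is an illustrative sketch only: the class name, the symbol identifiers, and the gesture strings (`"tap"`, `"long_tap"`) are assumptions, not the patent's vocabulary.

```python
class BlurController:
    """Sketch of steps S401-S405: symbols start blurred and are
    unblurred only by the predetermined first action."""

    def __init__(self, symbols, first_action="long_tap"):
        # S401-S402: symbols are displayed and start out blurred.
        self.blurred = {s: True for s in symbols}
        self.first_action = first_action

    def on_input(self, position, action):
        # S403: an input arrived at `position`.
        # S404: compare it with the predetermined first action.
        if action != self.first_action:
            return False  # "no" branch: keep waiting for input
        # S405: release the blur on the symbol at the first position.
        if position in self.blurred:
            self.blurred[position] = False
        return True
```

For example, a plain tap leaves everything blurred, while the configured long tap unblurs only the touched symbol.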

Alternatively, after step S405, the control unit 130 of the device 100 may further perform a step of measuring the time elapsed since the blur applied to at least some of the visual symbols was released. In this case, the control unit 130 may control the display unit 120 so that at least some of the visual symbols are blurred again when the elapsed time reaches a first time. That is, when the user performs the first action at the first position, the device 100 releases some of the blur and applies the blur again after a predetermined time elapses.

Alternatively, after step S405, the control unit 130 of the device 100 may further perform a step of measuring the time elapsed from either the moment the input at the first position starts or the moment it ends. In this case, the control unit 130 may control the display unit 120 so that at least some of the visual symbols are blurred again when the elapsed time reaches a second time. That is, when the user keeps providing the input at the first position (for example, touching a position where a specific visual symbol is displayed and maintaining that touch), the blur can be applied again once a certain period has elapsed from the moment the user removes the finger. In this case, while the user of the device 100 holds the touch, the blur remains released, and when the touch is released, the blur can be applied again after a predetermined time elapses. In this way, the device 100 can vary the period during which the blur is released according to the convenience of the user.
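The re-blur timing can be expressed as a single predicate evaluated on each display update. This is a sketch under stated assumptions: the function and parameter names are illustrative, the clock values are plain seconds injected by the caller, and `second_time` stands in for the patent's "second time".

```python
def should_reblur(now, touch_ended_at, second_time):
    """Return True when the blur should be re-applied.

    While the touch is still held (touch_ended_at is None) the blur
    stays released; once the finger is lifted, the blur returns after
    `second_time` seconds have elapsed.
    """
    if touch_ended_at is None:  # touch still held: keep blur released
        return False
    return now - touch_ended_at >= second_time
```

The same predicate covers the first-time variant by passing the moment the blur was released instead of the moment the touch ended.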

Optionally, when visual symbols are displayed in step S401, the control unit 130 of the device 100 may control the display unit 120 such that at least some of the visual symbols currently being input to the device 100 are blurred as soon as they are input. Specifically, the control unit 130 may control the display unit 120 so that text the user of the device 100 is still composing (for example, text written so far in a message being prepared for an external user) is also blurred. In this way, the device 100 can protect the privacy of visual symbols currently being input by the user as well.

Optionally, the control unit 130 of the device 100 may control the display unit 120 so that, among the visual symbols being input to the device 100, at least one of the following is not blurred: visual symbols for which the time elapsed since input is less than a third time, and visual symbols within a first distance, in at least one of the up, down, left, and right directions, from the position where the input cursor is disposed. In general, when a user of the device 100 inputs text, the user is most likely to directly check the content just entered in order to correct typos or other mistakes. Thus, if blur were applied immediately when the user enters a visual symbol, the user might be unable to verify that the content was entered correctly. Accordingly, the device 100 can control the display unit 120 so that visual symbols are not blurred until the predetermined third time has elapsed. Also, the user of the device 100 may move the cursor into the middle of the visual symbols already entered in order to modify the content. Accordingly, the device 100 can control the display unit 120 so that visual symbols near the position of the input cursor are not blurred. For example, when the user moves the cursor between the words 'today' and 'lunch' while inputting the sentence 'what to eat for lunch today', the device 100 can ensure that blur is not applied to the words 'today' and 'lunch' while blur is applied to 'what to eat?'. In this way, the device 100 can give the user an opportunity to easily edit the visual symbols being input.
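The two exemptions described above (recently entered words, words near the cursor) can be combined in one selection function. This is a minimal sketch: representing words as `(text, entered_at)` pairs, measuring cursor distance in word positions, and the default values for `third_time` and `first_distance` are all assumptions made for illustration.

```python
def words_to_blur(words, cursor_index, now, third_time=2.0, first_distance=1):
    """words: list of (text, entered_at_seconds) in display order.

    A word stays unblurred if it was entered less than `third_time`
    seconds ago, or if it sits within `first_distance` positions of the
    cursor. Returns the indices of the words to blur.
    """
    blur = []
    for i, (_, entered_at) in enumerate(words):
        recent = now - entered_at < third_time
        near_cursor = abs(i - cursor_index) <= first_distance
        if not recent and not near_cursor:
            blur.append(i)
    return blur
```

With the cursor placed between 'today' and 'lunch', only 'what to eat?' is selected for blurring, matching the example in the text.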

Optionally, the communication unit 140 of the device 100 may further perform a step of receiving a data signal from the external device 200 before the visual symbols are displayed in step S401. In this case, in step S401 the device 100 can obtain visual symbols from the received data and display them on the display unit 120. That is, the device 100 can display on the display unit 120 not only the visual symbols created by the user of the device 100, but also visual symbols created by the user of the external device 200 (i.e., the external user), obtained from the received data signal.

Optionally, the control unit 130 of the device 100 may control the display unit 120 such that a visual symbol obtained from the data signal received through the communication unit 140 is blurred once a third time has elapsed after it was displayed on the display unit 120. That is, when the user of the device 100 converses with the external user of the external device 200 through a chat service or the like, the device 100 may first display the visual symbol created by the external user without blur so that the user of the device 100 can read its content. Afterwards, once the predetermined time has elapsed, the device 100 can protect privacy by blurring the visual symbol created by the external user.

Alternatively, the control unit 130 of the device 100 may perform a step of controlling the display unit 120 such that, among the visual symbols obtained from the data signal received through the communication unit 140, a predetermined number of visual symbols, in order of most recent display time on the display unit 120, are not blurred. That is, when the user of the device 100 converses with the external user of the external device 200 through a chat service or the like, the device 100 leaves the visual symbols most recently created by the external user unblurred so that the user of the device 100 can read their content. The device 100 may then blur the existing visual symbols when new visual symbols are displayed on the display unit 120.
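The keep-the-newest rule can be expressed as a simple function over the message list. A minimal sketch, assuming messages are ordered oldest to newest; the "predetermined number" `KEEP_CLEAR` and the function name are illustrative assumptions.

```python
KEEP_CLEAR = 3  # hypothetical "predetermined number" of newest messages left unblurred

def blur_states(messages, keep_clear=KEEP_CLEAR):
    """Given messages ordered oldest-to-newest, return a parallel list of
    booleans where True means the message at that index should be blurred."""
    n = len(messages)
    # Everything older than the last keep_clear messages gets blurred.
    return [i < n - keep_clear for i in range(n)]
```

When a new message arrives, recomputing this list automatically pushes the oldest previously-clear message into the blurred set.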

Optionally, the control unit 130 of the device 100 may perform a step of controlling the display unit 120 such that at least some of the blur applied to the display unit 120 is released when scrolling occurs on the display unit 120. That is, when there are too many visual symbols to fit on the screen, the user of the device 100 may scroll in order to check visual symbols received or input in the past. In this case, since the blur must be removed for the user of the device 100 to find a past visual symbol, the control unit 130 of the device 100 can control the display unit 120 so that the blur on the display unit 120 is released. The control unit 130 of the device 100 may optionally control the display unit 120 so that at least a part of the blur applied to the display unit 120 is released only when scrolling occurs at or above a predetermined first speed. That is, since slow scrolling may not be an attempt by the user of the device 100 to find a specific visual symbol, the device 100 may control the display unit 120 so that at least a part of the blur applied to the display unit 120 is released only when scrolling occurs at or above the predetermined first speed.
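The first-speed threshold for scroll-triggered blur release might look like the following; the threshold value, units, and names are assumptions, not from the disclosure.

```python
FIRST_SPEED = 500.0  # hypothetical threshold, pixels per second

def on_scroll(distance_px, duration_s, blurred):
    """Return the new blur state after a scroll gesture: release the blur
    only when the scroll is faster than the first speed, since slow
    scrolling is probably not a search for an old message."""
    if duration_s <= 0:
        return blurred  # degenerate gesture: leave the blur state unchanged
    speed = distance_px / duration_s
    return False if speed >= FIRST_SPEED else blurred
```

A fast flick (1200 px in one second) releases the blur, while a slow drag leaves it applied.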

Optionally, the control unit 130 of the device 100 may also perform a step of controlling a blur mode to either an on state or an off state. When the blur mode is on, the device 100 can apply the blur processing of visual symbols described above. On the other hand, when the blur mode is off, the device 100 may omit the blur processing of at least some of the visual symbols. Through this, the device 100 can let the user of the device 100 choose whether to apply blur to visual symbols, for example when they need to be kept private.

Optionally, the control unit 130 of the device 100 may perform a step of recognizing the state of the device 100 itself or the environment around the device 100 and controlling the blur mode to the on state or the off state accordingly. Specifically, the control unit 130 of the device 100 may receive the on/off state of the blur mode directly from the user of the device 100. However, even when the on/off state of the blur mode is not set directly, the device 100 can adjust the blur mode according to the surrounding environment. For example, the device 100 may include a sensor for detecting shaking. In this case, the device 100 judges how much the device 100 is shaking; if the device 100 shakes a lot, it determines that its user is in public and changes the blur mode to the on state, and if the device 100 shakes little, it determines that its user is in a private space and changes the blur mode to the off state. Conversely, the device 100 may change the blur mode to the off state when the device 100 shakes a lot, and to the on state when it shakes little. In another example, the device 100 may include a sensor for detecting surrounding sound. In this case, the device 100 judges how loud the surroundings of the device 100 are; if the noise around the device 100 is at or above a certain level, the device 100 determines that its user is in public and changes the blur mode to the on state, and if the surroundings of the device 100 are quiet, it determines that its user is in a private space and changes the blur mode to the off state. Conversely, the device 100 may change the blur mode to the off state when its surroundings are noisy, and to the on state when they are quiet.
In another example, the device 100 may determine the on/off state of the blur mode based on whether an electronic signal of a first intensity or greater is being received from the external device 200. Here, the electronic signal of the first intensity or greater may be a short-range communication signal such as a Bluetooth signal or a near field communication (NFC) signal. That is, when the device 100 receives a short-range communication signal of the first intensity or greater from the external device 200, the device 100 determines that other electronic devices are nearby and therefore that the user of the device 100 is in public. Thus, in this case, the device 100 may change the blur mode to the on state. If the device 100 does not receive a short-range communication signal of the first intensity or greater, the device 100 may determine that the user of the device 100 is in a private space and change the blur mode to the off state. Conversely, the device 100 may change the blur mode to the off state when an electronic signal of the first intensity or greater is received, and to the on state when no such signal is received.
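The environment-based on/off decisions described above (shaking, ambient noise, nearby short-range signals) can be condensed into a single heuristic. All threshold values and parameter names here are illustrative assumptions, and, as the disclosure notes, the mapping from environment to mode may equally be inverted.

```python
def decide_blur_mode(shake_level, noise_level, nearby_signal_dbm,
                     shake_threshold=2.0, noise_threshold=60.0,
                     first_intensity=-70.0):
    """Heuristic sketch: treat strong shaking, loud surroundings, or a
    strong nearby short-range radio signal (e.g. Bluetooth RSSI in dBm)
    as signs of a public place and turn the blur mode on; otherwise off."""
    in_public = (shake_level >= shake_threshold
                 or noise_level >= noise_threshold
                 or nearby_signal_dbm >= first_intensity)
    return "on" if in_public else "off"
```

A device would feed this with accelerometer variance, microphone level, and radio scan results, re-evaluating periodically.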

Alternatively, the device 100 may apply blur, release blur, or switch the blur mode on or off only when at least one of fingerprint recognition of the user, eye recognition of the user, and password entry is performed through the input unit 110 or the communication unit 140 of the device 100. That is, since protecting the privacy of the user of the device 100 is one of the reasons for blurring, the device 100 may require various authentication processes from the user before changing the blur state.

Alternatively, the device 100 can recognize the shape of the eyes of the user or of a nearby person by using an image pickup element such as a camera included in the device 100, and determine where their gaze is directed. In this case, the device 100 can release the blur when the gaze of the user of the device 100 rests on the display unit 120 of the device 100, and apply the blur effect when the gaze of a person other than the user of the device 100 rests on the display unit 120 of the device 100.

Optionally, the device 100 may selectively apply, to the visual symbols, a first blur with a relatively strong blur degree and a second blur with a relatively weak blur degree. Concretely, a visual symbol without blur processing can easily be recognized by a viewer. A weakly blurred visual symbol may be recognized when the viewer concentrates on it, but not otherwise. A strongly blurred visual symbol may not be recognized even when the viewer concentrates on it. The device 100 may control the display unit 120 to apply the first blur to visual symbols that must be hidden and the second blur to visual symbols that need only light concealment. Further, the device 100 may determine whether the blur applied to the visual symbols is the first blur or the second blur depending on input from the user, the environment around the device 100, or the state of the device 100 itself.
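Selecting between the strong first blur and the weak second blur can be sketched as a simple lookup. The privacy categories and radius values below are assumptions for illustration; the disclosure only distinguishes a relatively strong and a relatively weak blur degree.

```python
FIRST_BLUR_RADIUS = 8   # strong: unreadable even when studied closely
SECOND_BLUR_RADIUS = 2  # weak: readable only with concentration

def blur_radius_for(symbol_privacy):
    """Map a per-symbol privacy category (hypothetical names) to a blur
    radius: 'secret' gets the strong first blur, 'sensitive' the weak
    second blur, anything else no blur at all."""
    if symbol_privacy == "secret":
        return FIRST_BLUR_RADIUS
    if symbol_privacy == "sensitive":
        return SECOND_BLUR_RADIUS
    return 0
```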

When applying or releasing blur on visual symbols, the device 100 may do so using at least one of various image processing techniques. In this case, the device 100 may apply or release the blur by applying an image processing mask or an image processing filter corresponding to at least one of the various image processing techniques.

FIGS. 6 to 12 are views schematically showing examples of how a device is driven according to an embodiment of the present invention. Hereinafter, the examples of blur application and blur release described above will be described with reference to FIGS. 6 to 12.

Referring to FIG. 6, the display unit 120 of the device 100 may display visual symbols 601 indicating the content of the conversation between the user of the device 100 and the external user of the external device 200. In addition, a UI 602 of an application providing a visual conversation function can also be displayed on the display unit 120 of the device 100.

At this time, blur can be applied to only a part of the area of the display unit 120 of the device 100. Referring to FIG. 7, blur can be applied only to the visual symbols 701 indicating the content of the conversation between the user of the device 100 and the external user of the external device 200, while the UI 702 of the application providing the visual conversation function is not blurred. That is, since the visual symbols 701 representing the conversation content must be kept private, the device 100 performs blur processing on them, whereas the UI 702 portion of the application providing the visual conversation function is unrelated to the user's privacy, so the device 100 does not perform blur processing on it.

Referring to FIG. 8, the device 100 may receive an input selecting a particular visual symbol among the visual symbols 801 representing the content of the conversation between the user of the device 100 and the external user of the external device 200. For example, when the user touches the portion displaying the visual symbol 'EFG' among the visual symbols 801, the device 100 can release the blur applied to the visual symbol 'EFG'. At this time, the device 100 can keep the blur applied to the remaining area other than the visual symbol 'EFG' without releasing it. In this case as well, the device 100 can keep the UI 802 portion of the application providing the visual conversation function unblurred.

Referring to FIG. 9, the device 100 may display a UI 903 for toggling blur application on the display unit 120. When the user of the device 100 touches the UI 903, the device 100 can apply blur to, or release the blur applied to, the entire set of visual symbols 901 representing the content of the conversation between the user of the device 100 and the external user of the external device 200. Even in this case, the device 100 can keep the UI 902 portion of the application providing the visual conversation function unblurred.

Referring to FIG. 10, the device 100 may decide whether to blur the visual symbols 1001 currently being input to the device 100 separately from the other visual symbols on the display unit 120. That is, the device 100 may leave the visual symbols 1001 in the middle of being input unblurred even when the visual symbols in the other areas of the display unit 120 are all blurred.

Referring to FIG. 11, the device 100 may display on the display unit 120 a layer for displaying a visual symbol and a layer for indicating the space in which a visual symbol can be displayed. For example, as shown in FIG. 11(a), the device 100 can display a visual symbol reading 'conversation contents' and, as a shape surrounding that visual symbol, a layer indicating the space in which the visual symbol can be displayed. At this time, the device 100 can perform blur processing on only some of the layers. For example, as shown in FIG. 11(b), the device 100 can perform blur processing on the layer in which the visual symbols 'conversation contents' are displayed, so that only those visual symbols are changed. At this time, the device 100 may omit blur processing for the layer indicating the space in which the visual symbol can be displayed. Accordingly, by applying blur only to the necessary area, the device 100 can reduce the hardware resources consumed by blur processing while maintaining a visually clean effect.

On the other hand, referring to FIG. 12, when the device 100 performs blur processing, it may perform the blur processing on a visual symbol whose resolution is lower than that of the original visual symbol. Specifically, the device 100 can change a visual symbol of the size shown in FIG. 12(a) into a visual symbol of the size shown in FIG. 12(b). That is, the device 100 may generate a first temporary visual symbol by reducing the resolution of the visual symbol to which the blur is to be applied. Thereafter, the device 100 may blur the first temporary visual symbol, as shown in FIG. 12(c), to generate a second temporary visual symbol. Thereafter, the device 100 may enlarge the second temporary visual symbol, as shown in FIG. 12(d), to generate the visual symbol for which the blur processing is complete. In general, applying blur to an image may require applying a relatively large image processing mask or a complex image processing filter. As a result, when the visual symbol to which blur is to be applied has a high resolution, the hardware resources consumed by blur processing can be very large. Accordingly, by converting the resolution of the visual symbol as shown in FIG. 12, the device 100 can reduce the amount of computation incurred in applying the blur.
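The downscale-blur-upscale pipeline of FIG. 12 can be sketched in pure Python on a grayscale image represented as a list of rows. A real device would use GPU-accelerated image APIs; the function names and the 3x3 box blur standing in for the image processing filter are assumptions for illustration.

```python
def downscale(img, factor):
    """Average factor x factor blocks: the 'first temporary visual symbol'."""
    h, w = len(img), len(img[0])
    return [[sum(img[y * factor + dy][x * factor + dx]
                 for dy in range(factor) for dx in range(factor)) // factor ** 2
             for x in range(w // factor)]
            for y in range(h // factor)]

def box_blur(img):
    """3x3 box blur on the reduced image: the 'second temporary visual symbol'."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[cy][cx]
                    for cy in range(max(0, y - 1), min(h, y + 2))
                    for cx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) // len(vals)
    return out

def upscale(img, factor):
    """Nearest-neighbour enlargement back to the original size."""
    return [[v for v in row for _ in range(factor)]
            for row in img for _ in range(factor)]

def cheap_blur(img, factor=2):
    """Blur at reduced resolution, so the filter touches 1/factor^2 pixels."""
    return upscale(box_blur(downscale(img, factor)), factor)
```

The blur kernel runs on an image with `factor**2` times fewer pixels, which is the computation saving the passage describes; the nearest-neighbour upscale restores the original size.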

Through the above-described embodiments of the present invention, it is possible to provide a device, a device driving method, and a computer program capable of blurring visual symbols displayed on the device according to predetermined conditions and releasing the blur applied to the visual symbols according to other predetermined conditions. In addition, through the above-described embodiments of the present invention, it is possible to provide a device, a device driving method, and a computer program that, when blurring visual symbols displayed on the device, reduce the resolution of the visual symbol before performing the blur processing, thereby reducing the amount of computation and the computation time required for blur processing.

The embodiments of the present invention described above can be embodied in the form of a computer program that can be executed through various components on a computer, and such a computer program can be recorded on a computer-readable medium. The medium may be a magnetic medium such as a hard disk, a floppy disk, or a magnetic tape; an optical recording medium such as a CD-ROM or a DVD; a magneto-optical medium such as a floptical disk; or a hardware device specifically configured to store and execute program instructions, such as a ROM, a RAM, or a flash memory.

Meanwhile, the computer program may be specifically designed and configured for the present invention, or may be known and available to those skilled in the computer software field. Examples of computer programs include not only machine language code such as that produced by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.

The specific operations described in the present invention are examples and do not limit the scope of the invention in any way. For brevity of description, descriptions of conventional electronic configurations, control systems, software, and other functional aspects of such systems may be omitted. Also, the connections or connecting members between the components shown in the figures illustrate functional connections and/or physical or circuit connections, and in an actual device they may be replaced or supplemented by a variety of functional, physical, or circuit connections. Also, unless specifically described with terms such as 'essential' or 'important', a component may not be necessary for the application of the present invention.

Accordingly, the spirit of the present invention should not be construed as being limited to the above-described embodiments, and not only the claims of the present invention but also all ranges equivalent to or equivalently modified from the claims fall within the scope of the spirit of the present invention.

10: Device drive system
100: device
110: input unit
120: display unit
130: control unit
140: communication unit
200: External device
300: server
400: communication network

Claims (20)

A method of driving a device capable of displaying visual symbols, the method comprising:
Displaying the visual symbols by a display unit of the device;
Controlling, by a control unit of the device, the display unit such that at least some of the visual symbols are blurred;
Detecting, by an input unit of the device, an input to a predetermined first position; and
Controlling, by the control unit, the display unit such that, when the detected input corresponds to a predetermined first action, the blur applied to at least some visual symbols corresponding to the first position among the visual symbols is released,
Wherein the controlling of the display unit such that the visual symbols are blurred comprises:
Reducing the resolution of the at least some visual symbols to generate first temporary visual symbols; and
Blurring the first temporary visual symbols to generate and display second temporary visual symbols.
The method according to claim 1,
Further comprising, after the controlling of the display unit such that the blur is released:
Measuring, by the control unit, the time elapsed after the blur applied to the at least some visual symbols is released, and controlling the display unit such that the at least some visual symbols are blurred when the elapsed time reaches a first time.
The method according to claim 1,
Further comprising, after the controlling of the display unit such that the blur is released:
Measuring, by the control unit, the time elapsed from either the moment the detected input starts or the moment the detected input ends, and controlling the display unit such that the at least some visual symbols are blurred when the elapsed time reaches a second time.
The method according to claim 1,
Wherein the first position is any one of a position where the at least some visual symbols are displayed, a position corresponding to the at least some visual symbols, a predetermined position on the display unit, and a predetermined position on the device.
The method according to claim 1,
Wherein the displaying of the visual symbols comprises:
Controlling, by the control unit, the display unit such that at least some of the visual symbols belonging to an input in progress on the device are blurred.
6. The method of claim 5,
Wherein the controlling of the display unit such that the visual symbols belonging to an input in progress are blurred comprises excluding from the blur processing at least one of: visual symbols for which the time elapsed since input is less than a third time, and visual symbols within a first distance, in at least one of the up, down, left, and right directions, of the position where the input cursor is placed.
The method according to claim 1,
Further comprising, before the displaying of the visual symbols:
Receiving, by a communication unit of the device, a data signal from an external device,
Wherein the displaying of the visual symbols comprises obtaining the visual symbols from the data signal and displaying them.
8. The method of claim 7,
Wherein the controlling of the display unit such that the visual symbols are blurred comprises:
Controlling the display unit such that a visual symbol obtained from the data signal is blurred once a third time has elapsed after it was displayed on the display unit.
The method according to claim 1,
Further comprising controlling, by the control unit, the display unit such that at least a part of the blur applied to the display unit is released in at least one of a case where scrolling occurs on the display unit and a case where scrolling at or above a first speed occurs on the display unit.
The method according to claim 1,
Wherein the generating and displaying of the second temporary visual symbols comprises:
Enlarging the second temporary visual symbols to the same size as the at least some visual symbols and displaying them.
The method according to claim 1,
Further comprising controlling, by the control unit, a blur mode to one of an on state and an off state,
Wherein the controlling of the display unit such that the visual symbols are blurred comprises omitting the blur processing of at least some of the visual symbols when the blur mode is in the off state.
12. The method of claim 11,
Wherein the controlling comprises controlling the on state and the off state of the blur mode based on at least one of whether the device is shaking, whether noise is occurring around the device, and whether a predetermined input is received through the input unit.
A computer program stored on a computer-readable medium for executing the method of any one of claims 1 to 12 using a computer.

A display unit for displaying visual symbols;
A control unit for controlling the display unit such that at least some of the visual symbols are blurred; And
And an input unit for detecting an input to a predetermined first position,
Wherein the control unit controls the display unit such that the blur applied to at least some visual symbols corresponding to the first position among the visual symbols is released when the detected input corresponds to a predetermined first action,
Wherein the control unit generates first temporary visual symbols by reducing the resolution of the at least some visual symbols and generates and displays second temporary visual symbols by blurring the first temporary visual symbols.
15. The device of claim 14,
Wherein the control unit measures the time elapsed after the blur applied to the at least some visual symbols is released, and controls the display unit such that the at least some visual symbols are blurred when the elapsed time reaches a first time.
15. The device of claim 14,
Wherein the control unit measures the time elapsed from either the moment the detected input starts or the moment the detected input ends, and controls the display unit such that the at least some visual symbols are blurred when the elapsed time reaches a second time.
15. The device of claim 14,
Wherein the control unit controls the display unit such that at least some of the visual symbols belonging to an input in progress on the device are blurred.
15. The device of claim 14,
Further comprising a communication unit for receiving a data signal from an external device,
Wherein the display unit obtains the visual symbols from the data signal and displays them.
15. The device of claim 14,
Wherein the control unit enlarges the second temporary visual symbols to the same size as the at least some visual symbols and displays them.
15. The device of claim 14,
Wherein the control unit controls a blur mode to one of an on state and an off state, and omits the blur processing of at least some of the visual symbols when the blur mode is in the off state.
KR1020160021438A 2016-02-23 2016-02-23 Device, method for driving device, and computer program for driving device KR101804369B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020160021438A KR101804369B1 (en) 2016-02-23 2016-02-23 Device, method for driving device, and computer program for driving device
JP2017031819A JP6961356B2 (en) 2016-02-23 2017-02-23 Equipment, device drive method and computer program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160021438A KR101804369B1 (en) 2016-02-23 2016-02-23 Device, method for driving device, and computer program for driving device

Publications (2)

Publication Number Publication Date
KR20170099272A KR20170099272A (en) 2017-08-31
KR101804369B1 true KR101804369B1 (en) 2017-12-04

Family

ID=59740721

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160021438A KR101804369B1 (en) 2016-02-23 2016-02-23 Device, method for driving device, and computer program for driving device

Country Status (2)

Country Link
JP (1) JP6961356B2 (en)
KR (1) KR101804369B1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09282449A (en) * 1996-04-12 1997-10-31 Matsushita Electric Ind Co Ltd Image processor
JP5433935B2 (en) * 2007-07-24 2014-03-05 日本電気株式会社 Screen display control method, screen display control method, electronic device, and program
JP2012208794A (en) * 2011-03-30 2012-10-25 Ntt Docomo Inc Portable terminal and display control method
JP2013020522A (en) * 2011-07-13 2013-01-31 Kyocera Corp Display divice, display method, and program
KR101429582B1 (en) * 2013-01-31 2014-08-13 (주)카카오 Method and device for activating security function on chat area
US20150213274A1 (en) * 2014-01-29 2015-07-30 Samsung Electronics Co., Ltd. Device and method of shielding region of display screen

Also Published As

Publication number Publication date
JP6961356B2 (en) 2021-11-05
JP2017151439A (en) 2017-08-31
KR20170099272A (en) 2017-08-31

Similar Documents

Publication Publication Date Title
JP7209782B2 (en) Device control using gaze information
KR102534596B1 (en) User Interfaces for Simulated Depth Effects
KR102447503B1 (en) Message Service Providing Device and Method Providing Content thereof
US10056082B2 (en) Mobile terminal and method of controlling therefor
KR101657231B1 (en) Hiding method, device, program and recording medium for privacy information
EP2613224B1 (en) Mobile terminal and control method therof
JP2022163060A (en) Notification processing method, electronic device, and program
ES2643176T3 (en) Method and apparatus for providing independent view activity reports that respond to a tactile gesture
US9959086B2 (en) Electronic device and control method thereof
US9891706B2 (en) Mobile terminal and control method therof
US9625996B2 (en) Electronic device and control method thereof
US20150187357A1 (en) Natural input based virtual ui system for mobile devices
KR20150003591A (en) Smart glass
US20150242118A1 (en) Method and device for inputting
CN106067833B (en) Mobile terminal and control method thereof
US20150010236A1 (en) Automatic image refocusing method
US9565289B2 (en) Mobile terminal and method of controlling the same
WO2014185885A1 (en) Line of sight initiated handshake
KR20150044830A (en) Mobile apparatus and wearable apparatus for displaying information, and methods thereof
KR101758863B1 (en) Chatting service providing method and chatting service providing system
US20140368432A1 (en) Wearable smart glasses as well as device and method for controlling the same
CN111597797A (en) Method, device, equipment and medium for editing social circle message
CN111554314A (en) Noise detection method, device, terminal and storage medium
KR101804369B1 (en) Device, method for driving device, and computer program for driving device
WO2022222688A1 (en) Window control method and device

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant