KR20170105285A - Device, method, and computer readable medium for displaying a digital calendar - Google Patents
Device, method, and computer readable medium for displaying a digital calendar
- Publication number
- KR20170105285A (application number KR1020160028301A)
- Authority
- KR
- South Korea
- Prior art keywords
- input
- electronic device
- gesture
- electronic calendar
- user
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/002—Specific input/output arrangements not covered by G06F3/01 - G06F3/16
- G06F3/005—Input arrangements through a video camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Abstract
A device for displaying an electronic calendar is provided. Specifically, the device comprises a display unit for displaying an electronic calendar, an input unit for detecting a user's gesture with respect to the displayed electronic calendar, and a controller for determining input coordinates in the displayed electronic calendar based on the detected gesture and performing an operation corresponding to the detected gesture at the determined input coordinates.
Description
The disclosed embodiments relate to a device, method, and computer readable medium for displaying an electronic calendar, and more particularly to a device, method, and computer readable medium for detecting a user's gesture with respect to a displayed electronic calendar and performing a corresponding operation.
As electronic devices evolve, a variety of conventional products have been replaced by electronic counterparts. For example, paper letters can be replaced by electronic letters, film photographs can be replaced with digital pictures, and board games such as cards and chess can be implemented digitally on computing devices such as PCs, smart phones, and tablets.
In response to this demand for digitalization, technologies are being developed that can digitize calendars.
Electronic calendars typically run as applications or software programs on smartphones or PCs. However, electronic calendars that run on smartphones or PCs have not yet replaced the calendars used in homes and offices.
First, electronic calendars that run on smartphones or PCs are inevitably limited by the display size, which imposes a cognitive burden on users, and they consume considerable power to keep running constantly.
On the other hand, with the development of technology, various electronic devices have become available to users, and research is being conducted to enable users to use various electronic devices more efficiently.
Recently, Internet of Things (IoT) technology has attracted attention as a way for users to use various electronic devices more efficiently. The Internet of Things allows a user to control various electronic devices through a single electronic device, for example, a smart phone.
The disclosed embodiments provide an electronic calendar device with an interface that can enhance mutual interaction with a user.
The disclosed embodiments also provide an electronic calendar device that detects a user's gesture and performs a corresponding action.
The disclosed embodiments provide an electronic calendar device that is capable of efficiently communicating with other electronic devices.
According to a first aspect of the present disclosure, there is provided a device for displaying an electronic calendar, comprising: a display unit for displaying an electronic calendar; an input unit for detecting a user's gesture with respect to the displayed electronic calendar; and a control unit for determining input coordinates in the displayed electronic calendar based on the detected gesture and performing an operation corresponding to the detected gesture at the determined input coordinates.
In addition, a second aspect of the present disclosure provides a method comprising: displaying an electronic calendar; detecting a user's gesture with respect to the displayed electronic calendar; determining input coordinates in the displayed electronic calendar based on the detected gesture; and performing an action corresponding to the detected gesture at the determined input coordinates.
In addition, the third aspect of the present disclosure can provide a computer-readable medium having recorded thereon a program for causing a computer to execute the method of the second aspect.
FIG. 1 is a schematic diagram of a device according to one embodiment.
FIG. 2 is a block diagram of a device according to one embodiment.
FIG. 3 is an exemplary diagram of an interface for displaying an electronic calendar, in accordance with one embodiment.
FIG. 4A is a perspective view of a device according to one embodiment.
FIG. 4B is a diagram for explaining a method for detecting a gesture of a user, according to an embodiment.
FIG. 5A is a perspective view of a device according to one embodiment.
FIG. 5B is a diagram for explaining a method for detecting a gesture of a user, according to an embodiment.
FIGS. 6A to 6D are diagrams illustrating a method for performing a corresponding operation in an electronic calendar based on a user's circle input, according to an embodiment.
FIGS. 7A to 7D are diagrams illustrating a method for performing a corresponding operation in an electronic calendar based on a user's ellipse input, according to an embodiment.
FIGS. 8A and 8B are diagrams for explaining a method for performing a corresponding operation in an electronic calendar based on an arrow input of a user, according to an embodiment.
FIG. 9 is a diagram for explaining a method for providing a notification based on a star input of a user, according to an embodiment.
FIG. 10 is a diagram illustrating a method for providing a notification based on a user's star inputs, according to one embodiment.
FIG. 11 is a block diagram of a device according to one embodiment.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily carry out the present invention. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. In order to clearly illustrate the present invention, parts not related to the description are omitted, and like parts are denoted by similar reference numerals throughout the specification.
The terminology used herein encompasses general terms currently in wide use, selected in light of the functionality of the various embodiments; their meaning may vary depending upon the intent of the skilled artisan, the emergence of new technology, and the like. In certain cases, a term may be arbitrarily selected by the applicant, in which case its meaning will be described in detail in the description of the corresponding embodiment. Accordingly, the terms used in this specification should be defined based on their meaning and the contents of the entire specification, not simply on their names.
The singular forms include plural forms unless the context clearly indicates otherwise. Terms such as "comprises" and "having" in the specification specify the presence of stated features, integers, steps, operations, elements, parts, or combinations thereof, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, parts, or combinations thereof. In particular, the numbers set forth in the specification are illustrative, and the invention should not be limited by them.
Also, the terms "part," "module," etc. in the specification mean units for processing at least one function or operation, which may be implemented in hardware, software, or a combination of hardware and software.
Although "first," "second," and the like are used throughout the specification to describe various components, these components are not limited by these terms. These terms are used only to distinguish one component from another. The "first component" mentioned below may be a "second component" within the technical concept of the embodiment.
The terms used in this specification will be briefly described and the present invention will be described in detail.
Throughout the specification, the electronic device can be an electronic calendar device, a smart phone, a tablet, a mobile phone, a personal digital assistant (PDA), a media player, a portable multimedia player (PMP), an electronic book terminal, a digital broadcast terminal, a laptop, a micro server, a global positioning system (GPS) device, a navigation device, a kiosk, an MP3 player, a smart TV, a digital camera, or another mobile or non-mobile computing device, but is not limited thereto.
FIG. 1 is a schematic diagram of a device according to one embodiment.
An electronic calendar is displayed in
The
For example, if a user applies a gesture input for any date of
According to the
According to the
FIG. 2 is a block diagram of a device according to one embodiment.
Referring to FIG. 2, the
The
The
In one embodiment, the
In one embodiment, the
In one embodiment, the
The
The
The
The
FIG. 3 is an exemplary diagram of an interface for displaying an electronic calendar, in accordance with one embodiment.
Referring to FIG. 3, dates and images can be alternately displayed by the
In one embodiment, the image may be displayed in various forms. For example, an image can be a video itself, or it can include a video. Further, the image may include text.
In one embodiment, the image may be an image related to advertising, public relations, marketing, public announcement, promotion, propaganda, etc., and may be an image provided by the person or organization. For example, when an electronic device according to an embodiment is installed in a school, an image may include content for a school or a class, and an electronic device may also be used as a bulletin board. At this time, the dates of the electronic device may be input with the schedule of the school or class (for example, midterm exam, athletic meet, graduation ceremony, etc.) and displayed alternately with the image.
For example, when the electronic device according to one embodiment is installed in-house, the image may include content for the company, and the electronic device may also be used as an in-house bulletin board. At this time, the company's schedule (e.g., deadline for delivery, anniversary of foundation, anniversary, etc.) may be input into the dates of the electronic device, and displayed together with the image or alternately.
For example, when an electronic device according to one embodiment is installed in a public place, the image may include content related to the public place, and the electronic device may also be used as a guide. At this time, the dates of the electronic device may be input with a schedule (e.g., an event date, a performance date, a holiday date, etc.) associated with a public place and displayed alternately with the image.
The electronic device according to one embodiment can be installed in various places for advertisement, publicity, marketing, announcement, promotion, propaganda, and the like, so a substantial advertising effect can be expected.
The electronic device according to an embodiment may be accessed by a plurality of users, so that information can be easily shared.
The electronic device according to an embodiment can display images for various purposes and can update its dates, so it has no limit on utilization or duration of use compared with an existing paper calendar.
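The alternating display of dates and images described above can be sketched as a simple time-based view selector. The function name and durations below are illustrative assumptions, not values from this disclosure:

```python
def pick_view(elapsed_seconds, calendar_secs=10, image_secs=5):
    """Choose which view to show, alternating calendar and image.

    Hypothetical helper: the calendar is shown for `calendar_secs`,
    then an image for `image_secs`, repeating indefinitely.
    """
    cycle = calendar_secs + image_secs
    phase = elapsed_seconds % cycle
    return "calendar" if phase < calendar_secs else "image"
```

A display loop would call `pick_view` with the time since start and render the corresponding view; a real device could instead switch on fixed wall-clock slots or on detected user presence.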
The dates and images of the electronic calendar may be displayed together as shown in FIG.
FIG. 4A is a perspective view of a device according to one embodiment. For convenience of explanation, FIG. 4B is also referred to together.
FIG. 4B is a diagram for explaining a method for detecting a gesture of a user, according to an embodiment.
To determine the input coordinates of a gesture input applied by the user, the input unit of the electronic device may detect the distance between the user's gesture input and the input unit. The input unit may be, but is not limited to, a camera, a depth camera, an infrared sensor, an ultrasonic sensor, or a laser sensor.
Referring to FIGS. 4A and 4B, an electronic device 1000c1 according to an embodiment may include an input unit composed of two cameras 1102c1 and 1104c1, and a display unit 1210c1.
To improve the accuracy of the input coordinates, the electronic device 1000c1 may include three or more cameras; alternatively, it may include only one camera.
The camera of the electronic device 1000c1 may be located on at least one of the upper end, the lower end, the left end, and the right end of the electronic device 1000c1. For example, as shown in FIGS. 4A and 4B, the two cameras 1102c1 and 1104c1 of the electronic device 1000c1 may be located at the upper and left ends.
Meanwhile, as shown in FIG. 4A, the display unit 1210c1 of the electronic device 1000c1 may be located on the rear surface of the electronic device 1000c1. That is, the input unit of the electronic device 1000c1, for example, the two cameras 1102c1 and 1104c1, may be directed toward the space above the display unit 1210c1.
Referring to FIG. 4B, the first camera 1102c1 located at the upper end of the electronic device 1000c1 may have a predetermined first angle of view FOV1, and the second camera 1104c1 located at the left end of the electronic device 1000c1 may have a predetermined second angle of view FOV2.
The first camera 1102c1 may detect the distance between the user's gesture input and the first camera 1102c1 to determine the ordinate-axis input coordinate of the user's gesture input to the electronic device 1000c1. For example, when a user draws a predetermined gesture with his or her hand in the space above the display unit 1210c1 of the electronic device 1000c1, or on the display unit 1210c1 itself, the first camera 1102c1
Meanwhile, the second camera 1104c1 detects the distance between the user's gesture and the second camera 1104c1, and the electronic device 1000c1 can determine the abscissa-axis input coordinate of the gesture based on the detected distance.
Based on the determined ordinate input coordinates and the abscissa input coordinates, the final input coordinates of the gesture input may be determined.
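Under the simplifying assumption that each camera's reading directly gives the perpendicular distance from its own edge (a real system would also need the angle within the camera's field of view), combining the two readings into final input coordinates is straightforward. The function below is a sketch under that assumption, not the patented implementation:

```python
def input_coordinates(d_top, d_left):
    """Combine two distance readings into (x, y) input coordinates.

    Assumed model: the top-edge camera's distance fixes the vertical
    (ordinate) position and the left-edge camera's distance fixes the
    horizontal (abscissa) position, both measured from their edges.
    """
    x = d_left  # distance from the left edge -> abscissa
    y = d_top   # distance from the top edge -> ordinate
    return (x, y)
```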
FIG. 5A is a perspective view of a device according to one embodiment. For convenience of explanation, FIG. 5B is also referred to together.
FIG. 5B is a diagram for explaining a method for detecting a gesture of a user, according to an embodiment.
The camera of the electronic device 1000c2 may be located on at least one of the top, bottom, left end, and right end of the electronic device 1000c2. For example, as shown in FIGS. 5A and 5B, the two cameras 1102c2 and 1104c2 of the electronic device 1000c2 may be located at the upper-left and upper-right corners of the electronic device 1000c2.
Referring to FIG. 5B, the third camera 1102c2 positioned at the upper-left corner of the electronic device 1000c2 may have a predetermined third angle of view FOV3, and the fourth camera 1104c2 positioned at the upper-right corner may have a predetermined fourth angle of view FOV4.
To determine the first diagonal-axis input coordinate of the user's gesture input applied to the electronic device 1000c2, where the first diagonal axis extends in the upper-left to lower-right direction, the third camera 1102c2 can detect the distance between the gesture input and the third camera 1102c2. Referring to FIG. 5B, the shorter the detected distance, the larger the calculated first diagonal-axis input coordinate value; the longer the detected distance, the smaller the calculated value.
Meanwhile, the fourth camera 1104c2 detects the distance between the user's gesture input and the fourth camera 1104c2, and the electronic device 1000c2 can determine the second diagonal-axis input coordinate, along the axis extending in the upper-right to lower-left direction, based on the detected distance.
Based on the determined first and second diagonal-axis input coordinates, the final input coordinates of the gesture input may be determined.
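If the two diagonal axes are assumed to lie at 45 degrees to the display edges, the diagonal-axis pair can be mapped back to ordinary horizontal and vertical coordinates with a rotation. This is an illustrative sketch under that assumption, not the patented method:

```python
import math

def diagonal_to_cartesian(u, v):
    """Map 45-degree diagonal-axis coordinates (u, v) to (x, y).

    Assumes u runs along the upper-left -> lower-right diagonal and
    v along the upper-right -> lower-left diagonal; a 45-degree
    rotation recovers the horizontal and vertical axes.
    """
    x = (u - v) / math.sqrt(2)
    y = (u + v) / math.sqrt(2)
    return (x, y)
```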
FIGS. 6A to 6D are diagrams illustrating a method for performing a corresponding operation in an electronic calendar based on a user's circle input, according to an embodiment.
6A, an electronic calendar is displayed on the
The width of each schedule (A, B, and C) may correspond to the dates on which the schedule was entered. For example, as shown in FIG. 6A, the "A" and "C" schedules may correspond to day 2 and day 19, respectively, and the "B" schedule may correspond to days 10-11.
The height of each schedule (A, B, and C) may correspond to the duration of the schedule. For example, as shown in FIG. 6A, the duration of the "B" schedule may be shorter than the duration of the "A" schedule, and the duration of the "C"
Referring to FIG. 6A, a user's first input I1 for the electronic calendar displayed by the
If there are two or more dates in the space formed by the circle, the
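A minimal sketch of the circle-gesture selection: test which date cells of the displayed calendar fall inside the drawn circle. The `date_positions` mapping is a hypothetical layout introduced only for illustration:

```python
def dates_inside_circle(center, radius, date_positions):
    """Return, sorted, the dates whose cell centers lie inside the circle.

    `center` is the (x, y) center of the circle gesture, `radius` its
    radius, and `date_positions` maps each date to its cell center.
    """
    cx, cy = center
    return sorted(
        d for d, (x, y) in date_positions.items()
        if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2
    )
```

If the result contains two or more dates, the schedules entered on those dates could then be presented together.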
Referring to FIG. 6B, a user's handwriting input ABC for the electronic calendar displayed by the
The
Since the user can more easily recognize his or her own handwriting, and since handwriting can be erroneously converted to text, the
FIGS. 7A to 7D are diagrams illustrating a method for performing a corresponding operation in an electronic calendar based on a user's ellipse input, according to an embodiment.
Referring to FIG. 7A, a user's
When there are two or more dates in the space formed by the ellipse, the
7B, a user's handwriting input DEF for the electronic calendar displayed by the
The
FIGS. 8A and 8B are diagrams for explaining a method for performing a corresponding operation in an electronic calendar based on an arrow input of a user, according to an embodiment.
Referring to FIG. 8A, the user's third input I3 to the electronic calendar displayed by the
When the user's input to the electronic calendar corresponds to an arrow, the
The
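The arrow gesture's effect, moving a schedule from the date at the arrow's tail to the date at its head, can be sketched with the calendar modeled as a hypothetical dict from dates to lists of schedule ids:

```python
def move_schedule(calendar, schedule_id, src_date, dst_date):
    """Move `schedule_id` from `src_date` to `dst_date` in `calendar`.

    Minimal sketch of the arrow operation in FIGS. 8A and 8B; the
    calendar representation is an assumption for illustration.
    """
    calendar[src_date].remove(schedule_id)
    calendar.setdefault(dst_date, []).append(schedule_id)
    return calendar
```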
FIG. 9 is a diagram for explaining a method for providing a notification based on a star input of a user, according to an embodiment.
The electronic device may perform an action corresponding to the gesture. The action corresponding to the gesture can be predetermined. For example, if the gesture is a star as shown in FIG. 9, the electronic device can adjust the priority of the schedule corresponding to the star input: it can raise that schedule's priority and provide the schedule as a notification to the user. The notification can be pushed to the user's
For example, if there is a schedule (ABC) on
FIG. 10 is a diagram illustrating a method for providing a notification based on a user's star inputs, according to one embodiment.
The electronic device can determine the priority in the notification based on the number of stars. For example, as shown in FIG. 10, when two stars are drawn on day 6, which has schedule ABC, and one star is drawn on
The electronic device may generate a notification based on the determined priority. For example, the schedule (ABC) having the highest priority may be located at the top of the notification, followed by the schedule (C) having the next priority, and then the schedules (A and B) having the next priority, in order from the top.
Of course, the priority can be determined in various ways other than the above-described method, and the operation corresponding to the gesture can also be variously implemented.
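As one of the "various ways" mentioned above, the star-count ordering of FIG. 10 can be sketched as a descending sort on the number of stars drawn on each schedule; the data layout is an assumption:

```python
def notification_order(star_counts):
    """Order schedule names for the notification, most stars first.

    `star_counts` maps each schedule name to the number of stars the
    user drew on it; ties keep their original relative order.
    """
    return [name for name, _ in
            sorted(star_counts.items(), key=lambda kv: -kv[1])]
```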
FIG. 11 is a block diagram of a device according to one embodiment.
Not all of the components shown in FIG. 2 are essential components of the
Hereinafter, each component of the
The user
The touch pad can be configured to detect not only a real-touch but also a proximity-touch. For convenience of description, both a real-touch and a proximity-touch detected by the touch pad may be referred to as a "touch."
In the present specification, the term "real-touch" refers to an input generated when a pointer physically touches the screen, and "proximity-touch" refers to an input generated when a pointer approaches within a predetermined distance of the screen without physically touching it.
As used herein, the term "pointer" refers to a tool for directly touching or proximity-touching a specific portion of a displayed screen. Examples include stylus pens and fingers.
The
The
Meanwhile, when the
The
The
The
The
The
The
The short-range wireless communication unit 151 may include a Bluetooth communication unit, a Bluetooth Low Energy (BLE) communication unit, a near field communication (NFC) unit, a WLAN (Wi-Fi) communication unit, a Zigbee communication unit, an IrDA (Infrared Data Association) communication unit, a WFD (Wi-Fi Direct) communication unit, a UWB (ultra wideband) communication unit, an Ant+ communication unit, and the like.
The
The
The A / V (Audio / Video)
The image frame processed by the
The microphone 1620 receives an external acoustic signal and processes it into electrical voice data. For example, the microphone 1620 may receive acoustic signals from an external device or a speaker. The microphone 1620 may use various noise-reduction algorithms to remove noise generated while receiving an external sound signal. The microphone 1620 may receive a user's voice response input corresponding to the problem content provided by the
The
Programs stored in the
The
Various sensors may be provided inside or around the touch pad to detect a user's touch. An example of such sensors is a tactile sensor. A tactile sensor detects contact by a specific object to a degree that a person can feel, or more. The tactile sensor can detect various information such as the roughness of the contact surface, the rigidity of the contact object, and the temperature of the contact point.
In addition, proximity sensors are examples of such various sensors. The proximity sensor refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface or a nearby object without mechanical contact using the force of an electromagnetic field or infrared rays. Examples of proximity sensors include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror reflection type photoelectric sensor, a high frequency oscillation type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.
An example of such various sensors is a force touch sensor. Different functions can be performed in the
The
In addition, some or all of the respective configurations of the
In addition, some of the respective configurations of the
The
All documents cited herein, including publications, patent applications, and patents, are incorporated herein by reference to the same extent as if each cited document were individually and specifically indicated to be incorporated by reference and were set forth in its entirety herein.
To facilitate understanding of the present invention, reference has been made to the preferred embodiments shown in the drawings, and specific terminology has been used to describe them. However, the present invention is not limited to the specific terminology, and may include all elements commonly conceivable by those skilled in the art.
The present invention may be represented by functional block configurations and various processing steps. These functional blocks may be implemented in a wide variety of hardware and/or software configurations that perform particular functions. For example, the present invention may adopt integrated circuit configurations, such as memory, processing, logic, and look-up tables, that can perform various functions under the control of one or more microprocessors or other control devices. Similarly to how the components of the present invention may be implemented with software programming or software components, the present invention may be implemented in a programming or scripting language such as C, C++, Java, or assembler, including various algorithms implemented with data structures, processes, routines, or other combinations of programming constructs. Functional aspects may be implemented with algorithms running on one or more processors. Further, the present invention can employ conventional techniques for electronic environment setting, signal processing, and/or data processing. Terms such as "mechanism," "element," "means," and "configuration" may be used broadly and are not limited to mechanical and physical configurations; they may include a series of software routines in conjunction with a processor or the like.
The specific implementations described in the present invention are examples and do not limit the scope of the invention in any way. For brevity, descriptions of conventional electronic configurations, control systems, software, and other functional aspects of such systems may be omitted. The connections or connecting members of the lines between the components shown in the figures illustrate functional and/or physical or circuit connections; in an actual device, they may be replaced or supplemented by a variety of functional, physical, or circuit connections. Unless an element is specifically described as "essential," "important," or the like, it may not be a necessary component for the application of the present invention.
The use of the term "the" and similar referents in the specification of the present invention (particularly in the claims) may cover both the singular and the plural. When a range is described, the invention includes each individual value falling within that range (unless stated otherwise), as if each individual value were set forth in the detailed description. Unless the order of the steps constituting a method according to the invention is explicitly stated or described to the contrary, the steps may be performed in any suitable order; the present invention is not necessarily limited to the order in which the steps are described. The use of all examples or exemplary language (e.g., "etc.") is simply to describe the present invention in detail, and the scope is not limited by these examples or language unless claimed. It will also be appreciated by those skilled in the art that various modifications, combinations, and alterations may be made according to design criteria and factors within the scope of the appended claims or their equivalents.
Claims (10)
A display unit for displaying an electronic calendar;
An input unit for detecting a user's gesture with respect to the displayed electronic calendar; And
A control unit configured to determine input coordinates in the displayed electronic calendar based on the detected gesture and to perform an action corresponding to the detected gesture at the determined input coordinates.
Wherein the input is located on at least one of an upper end, a lower end, a left end, and a right end of the device.
Wherein the input coordinate is determined based on a distance between the gesture and the input,
Wherein the input comprises at least one camera for measuring the distance.
Wherein a first camera among the at least one camera is located at a first end of the device, a second camera is located at a second end of the device,
Wherein the first end and the second end are orthogonal to each other,
Wherein the first camera is used to measure coordinates on a first axis of the input coordinate,
Wherein the second camera is used to measure coordinates on a second axis of the input coordinate,
Wherein the first axis and the second axis are orthogonal to each other.
Wherein the input unit detects at least one of a moving direction and a moving speed of the gesture.
Wherein the input coordinate is determined within a space formed by the gesture within the displayed electronic calendar.
Wherein the input coordinate is determined to be a point in the displayed electronic calendar indicated by the gesture.
Wherein the gesture includes a symbol for determining a priority of the determined input coordinate in the displayed electronic calendar,
Wherein the control unit further generates, based on the determined priority, a notification related to the determined input coordinates.
Wherein the input unit further detects a user's handwriting input to the displayed electronic calendar,
Wherein the detected handwriting input is written to the determined input coordinates.
Wherein the display further displays at least one image based on a predetermined condition.
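The coordinate scheme in the claims above (two edge-mounted cameras, each measuring the gesture's position along one of two orthogonal axes) can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: the pixel readings, the cell dimensions, and the mapping of a coordinate to a calendar cell are all assumptions.

```python
# Illustrative sketch of the claimed two-camera coordinate scheme:
# a camera on the left end reports the gesture's distance along the
# horizontal (x) axis, a camera on the orthogonal top end reports the
# distance along the vertical (y) axis, and the combined coordinate is
# mapped to a cell of the displayed month calendar.
# Cell size and camera readings are assumed values, not from the patent.

CELL_W, CELL_H = 100, 80  # assumed pixel size of one calendar cell

def input_coordinate(dist_from_left_cam, dist_from_top_cam):
    """Combine the two orthogonal single-axis measurements into (x, y)."""
    return (dist_from_left_cam, dist_from_top_cam)

def calendar_cell(coord):
    """Map an input coordinate to (week_row, weekday_column)."""
    x, y = coord
    return (y // CELL_H, x // CELL_W)

coord = input_coordinate(320, 170)   # camera readings in pixels
print(calendar_cell(coord))          # → (2, 3): third week row, fourth day column
```

Because each camera only needs to resolve position along a single axis, the two measurements are independent, which is why the claims require the two ends (and hence the two axes) to be orthogonal.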
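The priority mechanism in claim 8 (a gesture carries a symbol that sets the priority of the marked calendar position, and a notification is generated accordingly) can be sketched like this. The symbol set, the priority levels, and the reminder lead times are hypothetical; the patent does not specify them.

```python
# Illustrative sketch of claim 8's priority symbols (assumed values):
# a recognized gesture may include a symbol; the symbol determines a
# priority, and a higher priority triggers an earlier reminder.

PRIORITY_SYMBOLS = {"!!": 1, "!": 2, "*": 3}   # assumed mapping, 1 = highest
REMINDER_MINUTES = {1: 60, 2: 30, 3: 10}       # assumed lead times per priority

def notification_for(symbol):
    """Build a reminder notification from the gesture's priority symbol."""
    priority = PRIORITY_SYMBOLS.get(symbol, 3)  # unknown symbols → lowest
    return f"remind {REMINDER_MINUTES[priority]} min before (priority {priority})"

print(notification_for("!!"))  # → remind 60 min before (priority 1)
```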
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160028301A KR101883353B1 (en) | 2016-03-09 | 2016-03-09 | Device, method, and computer readable medium for displaying a digital calendar |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160028301A KR101883353B1 (en) | 2016-03-09 | 2016-03-09 | Device, method, and computer readable medium for displaying a digital calendar |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170105285A (en) | 2017-09-19 |
KR101883353B1 (en) | 2018-07-31 |
Family
ID=60033507
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020160028301A KR101883353B1 (en) | 2016-03-09 | 2016-03-09 | Device, method, and computer readable medium for displaying a digital calendar |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101883353B1 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20110094740A (en) | 2010-02-17 | 2011-08-24 | 엘지전자 주식회사 | Image display device enable of displaying 3d object in a shape of analog watch and operation controlling method for the same |
KR20130076992A (en) | 2011-12-29 | 2013-07-09 | 전자부품연구원 | System and method for generating mask image applied in each threshold in region |
2016-03-09: KR application KR1020160028301A granted as patent KR101883353B1 (active, IP Right Grant)
Non-Patent Citations (1)
Title |
---|
plandays, "프랭클린플래너 노트 10.1 어플_사용법1" (Franklin Planner Note 10.1 app, how-to part 1), Youtube.com, [online] 2012-10-11, [URL:https://www.youtube.com/watch?v=IPtVYw_Gw-Q] |
Also Published As
Publication number | Publication date |
---|---|
KR101883353B1 (en) | 2018-07-31 |
Similar Documents
Publication | Title |
---|---|
US11886252B2 (en) | Foldable device and method of controlling the same |
CN107408045B (en) | Method of controlling apparatus having a plurality of operating systems installed therein and the apparatus |
CN105278902B (en) | Mobile terminal and control method thereof |
US10268434B2 (en) | Mobile terminal and control method therefor |
US9887949B2 (en) | Displaying interactive notifications on touch sensitive devices |
US20150095819A1 (en) | Method for displaying previews in a widget |
US9632578B2 (en) | Method and device for switching tasks |
KR102060153B1 (en) | A cover, an electronic device using the same and operating method thereof |
RU2667047C2 (en) | Method for providing tactile effect in a portable terminal, machine-readable media and portable terminal |
EP3109785A1 (en) | Portable apparatus and method for changing screen of the same |
US20140333551A1 (en) | Portable apparatus and method of displaying object in the same |
US11513676B2 (en) | Method and system for controlling device |
US20160147429A1 (en) | Device for resizing window, and method of controlling the device to resize window |
US20170199631A1 (en) | Devices, Methods, and Graphical User Interfaces for Enabling Display Management of Participant Devices |
EP2741207A1 (en) | Method and system for providing information based on context, and computer-readable recording medium thereof |
US11209930B2 (en) | Method of controlling device using various input types and device for performing the method |
KR20140119608A (en) | Method and device for providing a private page |
US20200326786A1 (en) | Device and method of controlling device |
CN105635434B (en) | Mobile terminal and control method thereof |
KR20150032068A (en) | Method and device for executing a plurality of applications |
KR101883353B1 (en) | Device, method, and computer readable medium for displaying a digital calendar |
KR20150026615A (en) | Method for providing schedule management and mobile device thereof |
KR102306535B1 (en) | Method for controlling device and the device |
KR20160027856A (en) | Method and portable terminal having bended display unit and cover for executing application |
US11360652B2 (en) | Apparatus and method for providing for receipt of indirect touch input to a touch screen display |
Legal Events
Code | Title |
---|---|
E902 | Notification of reason for refusal |
E701 | Decision to grant or registration of patent right |
GRNT | Written decision to grant |