KR20170105285A - Device, method, and computer readable medium for displaying a digital calendar - Google Patents

Device, method, and computer readable medium for displaying a digital calendar Download PDF

Info

Publication number
KR20170105285A
Authority
KR
South Korea
Prior art keywords
input
electronic device
gesture
electronic calendar
user
Prior art date
Application number
KR1020160028301A
Other languages
Korean (ko)
Other versions
KR101883353B1 (en)
Inventor
이상근
김종윤
Original Assignee
이상근
경동대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 이상근, 경동대학교 산학협력단 filed Critical 이상근
Priority to KR1020160028301A priority Critical patent/KR101883353B1/en
Publication of KR20170105285A publication Critical patent/KR20170105285A/en
Application granted granted Critical
Publication of KR101883353B1 publication Critical patent/KR101883353B1/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005Input arrangements through a video camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Abstract

A device for displaying an electronic calendar is provided. Specifically, the device comprises a display unit for displaying an electronic calendar, an input unit for detecting a user's gesture on the displayed electronic calendar, and a controller for determining input coordinates in the displayed electronic calendar based on the detected gesture and performing an operation corresponding to the detected gesture at the determined input coordinates.

Description

DEVICE, METHOD, AND COMPUTER READABLE MEDIUM FOR DISPLAYING A DIGITAL CALENDAR

The disclosed embodiments relate to a device, method, and computer-readable medium for displaying an electronic calendar, and more particularly to a device, method, and computer-readable medium for detecting a user's gesture on a displayed electronic calendar and performing a corresponding operation.

As electronic devices evolve, a variety of conventional products have been replaced by electronic ones. For example, paper letters can be replaced by electronic letters, film photographs can be replaced with digital pictures, and board games such as cards, chess, and the like can be digitally implemented on computing devices such as PCs, smart phones, and tablets.

In response to this demand for digitalization, technologies are being developed that can digitize calendars.

Electronic calendars are typically run as applications or software programs on smartphones or PCs. However, electronic calendars that run on smartphones or PCs have not yet replaced the calendars used in homes and offices.

Electronic calendars that run on smartphones or PCs are inevitably limited by the display size, which imposes a cognitive burden on users, and they consume considerable power when kept running constantly.

On the other hand, with the development of technology, various electronic devices have become available to users, and research is being conducted to enable users to use various electronic devices more efficiently.

Recently, Internet of Things (IoT) technology has attracted attention as a way for users to use various electronic devices more efficiently. The Internet of Things allows a user to control various electronic devices through a single electronic device, for example, a smart phone.

The disclosed embodiments seek to provide an electronic calendar device with an interface that enhances interaction with the user.

The disclosed embodiments seek to provide an electronic calendar device for detecting a user's gesture and performing a corresponding action.

The disclosed embodiments provide an electronic calendar device that is capable of efficiently communicating with other electronic devices.

According to a first aspect of the present disclosure, there is provided a device for displaying an electronic calendar, comprising: a display unit for displaying an electronic calendar; an input unit for detecting a user's gesture on the displayed electronic calendar; and a control unit for determining input coordinates in the displayed electronic calendar based on the detected gesture and performing an operation corresponding to the detected gesture at the determined input coordinates.

In addition, a second aspect of the present disclosure provides a method comprising: displaying an electronic calendar; detecting a user's gesture on the displayed electronic calendar; determining input coordinates in the displayed electronic calendar based on the detected gesture; and performing an operation corresponding to the detected gesture at the determined input coordinates.

In addition, a third aspect of the present disclosure provides a computer-readable medium having recorded thereon a program for causing a computer to execute the method of the second aspect.

FIG. 1 is a schematic diagram of a device according to one embodiment.
FIG. 2 is a block diagram of a device according to one embodiment.
FIG. 3 is an exemplary diagram of an interface for displaying an electronic calendar, according to one embodiment.
FIG. 4A is a perspective view of a device according to one embodiment.
FIG. 4B is a diagram for explaining a method for detecting a gesture of a user, according to an embodiment.
FIG. 5A is a perspective view of a device according to one embodiment.
FIG. 5B is a diagram for explaining a method for detecting a gesture of a user, according to an embodiment.
FIGS. 6A to 6D are diagrams illustrating a method for performing a corresponding operation in an electronic calendar based on a user's circle input, according to an embodiment.
FIGS. 7A to 7D are diagrams illustrating a method for performing a corresponding operation in an electronic calendar based on a user's ellipse input, according to an embodiment.
FIGS. 8A and 8B are diagrams for explaining a method for performing a corresponding operation in an electronic calendar based on a user's arrow input, according to an embodiment.
FIG. 9 is a diagram for explaining a method for providing a notification based on a user's star input, according to an embodiment.
FIG. 10 is a diagram illustrating a method for providing a notification based on a user's star inputs, according to one embodiment.
FIG. 11 is a block diagram of a device according to one embodiment.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily carry out the present invention. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. In order to clearly illustrate the present invention, parts not related to the description are omitted, and like parts are denoted by similar reference numerals throughout the specification.

The terms used herein are, as far as possible, general terms currently in wide use selected in consideration of the functions of the various embodiments; however, their meanings may vary according to the intention of those skilled in the art, precedent, the emergence of new technology, and the like. In certain cases, a term may be arbitrarily selected by the applicant, in which case its meaning will be described in detail in the description of the corresponding embodiment. Accordingly, the terms used in the present specification should be defined based on their meaning and the contents of the entire specification, not simply on the names of the terms.

The singular forms include plural forms unless the context clearly indicates otherwise. Terms such as "comprises" and "having" in the specification specify the presence of stated features, integers, steps, operations, elements, parts, or combinations thereof, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, parts, or combinations thereof. In particular, the numbers set forth in the specification are illustrative, and the invention should not be limited by them.

Also, the terms "part," "module," and the like in the specification mean units for processing at least one function or operation, which may be implemented in hardware, software, or a combination of hardware and software.

Although "first "," second "and the like are used throughout the specification to describe various components, it goes without saying that these components are not limited by these terms. These terms are used only to distinguish one component from another. It goes without saying that the "first component" mentioned below may be a "second component" within the technical concept of the embodiment.

The terms used in this specification will be briefly described and the present invention will be described in detail.

Throughout the specification, the electronic device may be, but is not limited to, an electronic calendar device, a smart phone, a tablet, a mobile phone, a personal digital assistant (PDA), a media player, a portable multimedia player (PMP), an electronic book terminal, a digital broadcasting terminal, a laptop, a micro server, a global positioning system (GPS) device, a navigation device, a kiosk, an MP3 player, a smart TV, a digital camera, or another mobile or non-mobile computing device.

FIG. 1 is a schematic diagram of a device according to one embodiment.

An electronic calendar is displayed in the electronic device 1000a according to one embodiment. Referring to FIG. 1, the electronic calendar may include an image 1210a and dates 1100a.

The electronic device 1000a may determine input coordinates in the electronic calendar based on the user's gesture input to the electronic calendar and perform actions corresponding to the detected gesture input on the determined input coordinates.

For example, if a user applies a gesture input to any date among the dates 1100a in the electronic calendar, the electronic device 1000a may detect the gesture input applied by the user, determine the date intended by the user, and perform an operation corresponding to the detected gesture input for the determined date.

According to the electronic device 1000a of this embodiment, a gesture input intuitively applied to the electronic calendar by the user is recognized, and the operation intended by the user can be performed. Examples of operations performed in response to detected gesture inputs are described below with reference to FIGS. 6A to 10.

In addition, since the image 1210a can be displayed in the electronic device 1000a according to the embodiment, better aesthetics can be provided to the user compared with an existing paper calendar.

FIG. 2 is a block diagram of a device according to one embodiment.

Referring to FIG. 2, the electronic device 1000 may include an input unit 1100, a display unit 1210, and a control unit 1300.

The display unit 1210 of the electronic device 1000 visualizes and displays various information processed by the electronic device 1000. For example, the display unit 1210 may display an electronic calendar.

The input unit 1100 of the electronic device 1000 may receive various inputs applied to the electronic device 1000. For example, the input unit 1100 may detect the user's gesture input to the electronic calendar. The input unit 1100 may be a camera, which is described later with reference to FIGS. 4A to 5B.

In one embodiment, the display portion 1210 of the electronic device 1000a may further display objects. For example, the object may be an object representing a current time, a device state of the electronic device 1000a, a communication state of the electronic device 1000a, a connection state of the electronic device 1000a with another electronic device, and the like.

In one embodiment, the display portion 1210 of the electronic device 1000a can display the electronic calendar and the object together, alternately display the electronic calendar and the object, or overlay the object on the electronic calendar.

In one embodiment, the display portion 1210 of the electronic device 1000a may display a graphical user interface (GUI) for displaying icons of applications. The graphical user interface may be, for example, a desktop, a tile, a launchpad, a widget board, a home screen, and the like. An electronic calendar icon displayed on this graphical user interface may be selected to execute an electronic calendar application.

The control unit 1300 of the electronic device 1000 controls the components of the electronic device 1000. For example, the control unit 1300 may control the display unit 1210 to display various information, and may control the input unit 1100 and determine the input coordinates of an input received through the input unit 1100. Further, the control unit 1300 may perform an operation corresponding to the received gesture input on the determined input coordinates.

The display portion 1210 of the electronic device 1000 may be a touch screen display. The touch screen display may include a touch panel and a liquid crystal panel, and the liquid crystal panel may be implemented with electronic ink (E-ink). Since a panel implemented with electronic ink requires less power than panels implemented in other manners, the electronic device according to an embodiment can be driven with less power. The touch screen display can be implemented in various ways, for example, with E-ink, LCD, OLED, and the like.

The display portion 1210 of the electronic device 1000 may detect the user's input on the dates included in the electronic calendar. The user's input may be a gesture input. Examples of gesture inputs are described below with reference to FIGS. 6A to 10.

The input unit 1100 of the electronic device 1000 may be a touch screen display.

The input unit 1100 and the display unit 1210 of the electronic device 1000 may together constitute a touch screen display. That is, the touch screen display may include a touch panel and a liquid crystal panel, where the input unit 1100 may be the touch panel and the display unit 1210 may be the liquid crystal panel.

FIG. 3 is an exemplary diagram of an interface for displaying an electronic calendar, according to one embodiment.

Referring to FIG. 3, dates and images can be alternately displayed by the display portion 1210b of the electronic device 1000b. Here, the image may be an image determined randomly by the electronic device 1000b or an image determined by the user, and may be any of various images without limitation.

In one embodiment, the image may be displayed in various forms. For example, an image can be a video itself, or it can include a video. Further, the image may include text.

In one embodiment, the image may be related to advertising, public relations, marketing, public announcement, promotion, propaganda, or the like, and may be an image provided by the person or organization concerned. For example, when an electronic device according to an embodiment is installed in a school, the image may include content for the school or a class, and the electronic device may also be used as a bulletin board. In this case, schedules of the school or class (for example, midterm exams, athletic meets, graduation ceremonies, etc.) may be entered on the dates of the electronic device and displayed alternately with the image.

For example, when the electronic device according to one embodiment is installed in a company, the image may include content for the company, and the electronic device may also be used as an in-house bulletin board. In this case, the company's schedules (e.g., delivery deadlines, the founding anniversary, other anniversaries, etc.) may be entered on the dates of the electronic device and displayed together with the image or alternately with it.

For example, when an electronic device according to one embodiment is installed in a public place, the image may include content related to the public place, and the electronic device may also be used as a guide. In this case, schedules associated with the public place (e.g., event dates, performance dates, holidays, etc.) may be entered on the dates of the electronic device and displayed alternately with the image.

The electronic device according to one embodiment can be installed in various places for the purpose of advertisement, publicity, marketing, announcement, promotion, propaganda, etc., so that an enormous advertising effect can be expected.

The electronic device according to an embodiment may be accessed by a plurality of users, so that information can be easily shared.

The electronic device according to an embodiment can display an image corresponding to various purposes and can update the date, so that there is no limitation in its utilization and duration of use compared with the existing paper calendar.

The dates and images of the electronic calendar may be displayed together as shown in FIG.

FIG. 4A is a perspective view of a device according to one embodiment. For convenience of explanation, FIG. 4B is also referred to together.

FIG. 4B is a diagram for explaining a method for detecting a gesture of a user, according to an embodiment.

The input unit of the electronic device may detect the distance between the user's gesture input and the input unit in order to determine the input coordinates of the gesture input applied by the user. The input unit may be, but is not limited to, a camera, a depth camera, an infrared sensor, an ultrasonic sensor, a laser sensor, or the like.

Referring to FIGS. 4A and 4B, an electronic device 1000c1 according to an embodiment may include an input unit composed of two cameras 1102c1 and 1104c1, and a display unit 1210c1.

The electronic device 1000c1 may include three or more cameras to improve the accuracy of the input coordinates, or may include only one camera.

The camera of the electronic device 1000c1 may be located on at least one of the upper end, the lower end, the left end, and the right end of the electronic device 1000c1. For example, as shown in Figs. 4A and 4B, two cameras 1102c1 and 1104c1 of the electronic device 1000c1 may be located at the upper and left ends.

Meanwhile, as shown in FIG. 4A, the display portion 1210c1 of the electronic device 1000c1 may be located on the rear surface of the electronic device 1000c1. That is, the input unit of the electronic device 1000c1, for example, the two cameras 1102c1 and 1104c1, may face the space above the display unit 1210c1.

Referring to FIG. 4B, the first camera 1102c1 located at the upper end of the electronic device 1000c1 may have a predetermined first angle of view FOV1, and the second camera 1104c1 located at the left end of the electronic device 1000c1 may have a predetermined second angle of view FOV2.

The first camera 1102c1 may detect the distance between the user's gesture input and the first camera 1102c1 to determine the vertical-axis input coordinate of the user's gesture input to the electronic device 1000c1. For example, when a user draws a predetermined gesture with his or her hand in the space above the display portion 1210c1 of the electronic device 1000c1, or on the display portion 1210c1 itself, the first camera 1102c1 detects the distance between the gesture and the first camera 1102c1, and the electronic device 1000c1 can determine the vertical-axis input coordinate of the gesture based on the detected distance. Referring to FIG. 4B, the shorter the detected distance, the larger the calculated vertical-axis input coordinate value; the longer the detected distance, the smaller the calculated value.

Meanwhile, the second camera 1104c1 detects the distance between the user's gesture and the second camera 1104c1, and the electronic device 1000c1 can determine the horizontal-axis input coordinate of the gesture based on the detected distance.

Based on the determined vertical-axis input coordinate and horizontal-axis input coordinate, the final input coordinates of the gesture input may be determined.
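
As a rough illustration of the mapping described above, the following sketch (in Python, with hypothetical names and display dimensions) converts the two measured distances into final input coordinates. It assumes each camera reports a straight-line distance measured along its own axis, which simplifies the angle-of-view geometry of FIG. 4B.

    # Minimal sketch of the two-camera coordinate mapping; the display size and
    # the one-axis-per-camera assumption are hypothetical.
    DISPLAY_WIDTH = 800.0   # horizontal extent, same units as measured distances
    DISPLAY_HEIGHT = 600.0  # vertical extent

    def input_coordinates(dist_top_cam: float, dist_left_cam: float):
        """Map camera distances to (x, y); a shorter distance yields a larger
        coordinate value, as described for FIG. 4B."""
        y = DISPLAY_HEIGHT - dist_top_cam
        x = DISPLAY_WIDTH - dist_left_cam
        # Clamp so noisy measurements still land on the calendar.
        return (max(0.0, min(x, DISPLAY_WIDTH)),
                max(0.0, min(y, DISPLAY_HEIGHT)))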

FIG. 5A is a perspective view of a device according to one embodiment. For convenience of explanation, FIG. 5B is also referred to together.

FIG. 5B is a diagram for explaining a method for detecting a gesture of a user, according to an embodiment.

The camera of the electronic device 1000c2 may be located on at least one of the upper end, the lower end, the left end, and the right end of the electronic device 1000c2. For example, as shown in FIGS. 5A and 5B, the two cameras 1102c2 and 1104c2 of the electronic device 1000c2 may both be located at the upper end of the electronic device 1000c2.

Referring to FIG. 5B, the third camera 1102c2 positioned at the upper end of the electronic device 1000c2 may have a predetermined third angle of view FOV3, and the fourth camera 1104c2 positioned at the upper end of the electronic device 1000c2 may have a predetermined fourth angle of view FOV4.

In order to determine the first diagonal-axis input coordinate of the user's gesture input applied to the electronic device 1000c2, where the first diagonal axis extends in the upper-left to lower-right direction, the third camera 1102c2 may detect the distance between the gesture input and the third camera 1102c2. Referring to FIG. 5B, the shorter the detected distance, the larger the calculated first diagonal-axis input coordinate value; the longer the detected distance, the smaller the calculated value.

Meanwhile, the fourth camera 1104c2 detects the distance between the user's gesture input and the fourth camera 1104c2, and the electronic device 1000c2 can determine the second diagonal-axis input coordinate, where the second diagonal axis extends in the upper-right to lower-left direction, based on the detected distance.

Based on the determined first diagonal-axis input coordinate and second diagonal-axis input coordinate, the final input coordinates of the gesture input may be determined.
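
A worked sketch of combining the two diagonal-axis coordinates into ordinary horizontal/vertical coordinates, assuming the diagonals are the screen axes rotated by 45 degrees (the patent does not fix the exact geometry, so the rotation is an assumption):

    import math

    def screen_from_diagonals(u: float, v: float):
        """Convert diagonal-axis coordinates to screen (x, y).

        u: coordinate along the upper-left to lower-right diagonal (FIG. 5B)
        v: coordinate along the upper-right to lower-left diagonal
        Treating the diagonals as screen axes rotated by 45 degrees, the
        inverse mapping is the opposite rotation.
        """
        s = math.sqrt(2.0) / 2.0  # cos 45 deg = sin 45 deg
        return (u - v) * s, (u + v) * s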

FIGS. 6A to 6D are diagrams illustrating a method for performing a corresponding operation in an electronic calendar based on a user's circle input, according to an embodiment.

Referring to FIG. 6A, an electronic calendar is displayed on the display portion 1210d of the electronic device 1000d, and schedules (A, B, and C) may be entered on predetermined dates (the 2nd, the 10th-11th, and the 19th) of the electronic calendar.

The width of each schedule (A, B, and C) may correspond to the dates on which the schedule is entered. For example, as shown in FIG. 6A, the "A" and "C" schedules may correspond to the 2nd and the 19th, respectively, and the "B" schedule may span the 10th and the 11th.

The height of each schedule (A, B, and C) may correspond to the duration of the schedule. For example, as shown in FIG. 6A, the duration of the "B" schedule may be shorter than the duration of the "A" schedule, and the duration of the "C" schedule may likewise be indicated by its height.

Referring to FIG. 6A, a user's first input I1 may be applied to the electronic calendar displayed by the display portion 1210d. The first input I1 may be a circle, and the circle may be drawn over the 6th and the 7th.

The electronic device 1000d may determine the date based on the space formed by the input applied to the electronic calendar. For example, when the first input I1 corresponding to a circle is applied as shown in FIG. 6A, the electronic device 1000d can determine the date within the space formed by the circle.

If two or more dates fall within the space formed by the circle, the electronic device 1000d may determine the date occupying the largest portion of that space. For example, if the first input I1 is drawn over the 6th and the 7th, as shown in FIG. 6A, the electronic device 1000d may associate the first input I1 with the 6th.
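
The most-space rule can be sketched as follows, assuming the gesture region and each date cell are available as axis-aligned rectangles (a hypothetical simplification; the region formed by a drawn circle is actually an arbitrary shape):

    # Rectangles are (left, top, right, bottom); date_cells maps a date number
    # to the rectangle of its cell in the displayed calendar.
    def overlap_area(a, b):
        """Area of intersection of two axis-aligned rectangles."""
        w = min(a[2], b[2]) - max(a[0], b[0])
        h = min(a[3], b[3]) - max(a[1], b[1])
        return max(0.0, w) * max(0.0, h)

    def date_for_gesture(gesture_rect, date_cells):
        """Return the date whose cell occupies the most of the gesture region."""
        return max(date_cells,
                   key=lambda d: overlap_area(gesture_rect, date_cells[d]))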

Referring to FIG. 6B, a user's handwriting input ABC may be applied to the electronic calendar displayed by the display portion 1210d. The electronic device 1000d may associate the handwriting input ABC with a date adjacent to the location where the handwriting input ABC is applied, or with another gesture input adjacent to that location. For example, referring to FIG. 6B, the electronic device 1000d may associate the handwriting input ABC with the first input I1. If the associated gesture input is itself associated with a predetermined date, the handwriting input ABC may be associated with that date. For example, referring to FIG. 6B, the electronic device 1000d may associate the handwriting input ABC with the 6th, which is associated with the first input I1.
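
The adjacency rule can be sketched as a nearest-neighbor lookup, under the assumption (hypothetical data shapes) that each earlier gesture input is stored with its position and associated date:

    import math

    def nearest_gesture(handwriting_pos, gestures):
        """gestures: list of ((x, y), associated_date); return the closest."""
        return min(gestures,
                   key=lambda g: math.hypot(g[0][0] - handwriting_pos[0],
                                            g[0][1] - handwriting_pos[1]))

    gestures = [((120.0, 80.0), 6)]  # first input I1, associated with the 6th
    _, date = nearest_gesture((130.0, 90.0), gestures)  # "ABC" inherits the 6th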

The electronic device 1000d can associate the handwriting input ABC with the determined date and register the handwriting input ABC as a schedule on that date. For example, referring to FIG. 6C, the handwriting input ABC can be registered as a schedule on the 6th. At this time, the handwriting input ABC can be converted into text and registered as a schedule. Optical character recognition (OCR) may be used to convert the handwriting input ABC into text, but the conversion is not limited thereto.
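
A minimal sketch of the OCR step, using pytesseract as one possible off-the-shelf backend (an assumption; the patent does not name a library, and recognizing free handwriting generally needs an engine trained for it):

    from PIL import Image
    import pytesseract  # requires the Tesseract OCR engine to be installed

    def register_handwriting(image_path: str, date: int, calendar: dict) -> str:
        """Convert a captured handwriting image to text and register it as a
        schedule on the given date (calendar: hypothetical date -> list map)."""
        text = pytesseract.image_to_string(Image.open(image_path)).strip()
        calendar.setdefault(date, []).append(text)
        return text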

Since a user can more easily recognize his or her own handwriting, and since handwriting may be erroneously converted into text, the electronic device 1000d may display the handwriting input ABC itself, as shown in FIG. 6D. The handwriting input ABC can be displayed within the corresponding date.

FIGS. 7A to 7D are diagrams illustrating a method for performing a corresponding operation in an electronic calendar based on a user's ellipse input, according to an embodiment.

Referring to FIG. 7A, a user's second input I2 may be applied to the electronic calendar displayed by the display portion 1210e. The second input I2 may be an ellipse extending in the left-right direction, and the ellipse may be drawn over the 6th, 7th, 12th, 13th, and 14th.

Electronic device 1000e may determine the date based on the space formed by the input applied to the electronic calendar. For example, when the second input I2 corresponding to the ellipse is applied as shown in Fig. 7A, the electronic device 1000e can determine the date within the space formed by the ellipse.

When two or more dates fall within the space formed by the ellipse, the electronic device 1000e may determine the dates that each occupy at least a predetermined ratio of that space. The ratio can be predetermined; for example, it may be 30% of the total space. As shown in FIG. 7A, when the second input I2 is drawn over the 6th, 7th, 12th, 13th, and 14th, the electronic device 1000e may determine that the 13th and the 14th each occupy 30% or more of the space formed by the ellipse, and may associate the second input I2 with the 13th and the 14th.
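
Extending the overlap sketch above, the predetermined-ratio rule becomes a threshold over the same overlap measure (the 30% figure and the rectangle representation remain assumptions):

    def dates_for_gesture(gesture_rect, date_cells, min_ratio=0.30):
        """Return every date whose cell covers at least min_ratio of the region."""
        region_area = overlap_area(gesture_rect, gesture_rect)  # region's own area
        return [d for d, cell in date_cells.items()
                if overlap_area(gesture_rect, cell) / region_area >= min_ratio]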

Referring to FIG. 7B, a user's handwriting input DEF can be applied to the electronic calendar displayed by the display portion 1210e, and the electronic device 1000e may associate the handwriting input DEF with the 13th and the 14th, which are associated with the second input I2.

The electronic device 1000e can associate the handwriting input DEF with the determined dates and register the handwriting input DEF as a schedule on those dates. For example, referring to FIG. 7C, the handwriting input DEF can be registered as a schedule on the 13th and the 14th. Since a user can more easily recognize his or her own handwriting, and since handwriting may be erroneously converted into text, the electronic device 1000e may display the handwriting input DEF itself, as shown in FIG. 7D. When the handwriting input DEF corresponds to a plurality of dates, the handwriting input DEF can be positioned and displayed in the middle of the plurality of dates.

FIGS. 8A and 8B are diagrams for explaining a method for performing a corresponding operation in an electronic calendar based on a user's arrow input, according to an embodiment.

Referring to FIG. 8A, a user's third input I3 may be applied to the electronic calendar displayed by the display portion 1210f. The third input I3 may correspond to positions in the electronic calendar. For example, if the third input I3 is a line, the ends of the line may correspond to positions in the electronic calendar. A position in the electronic calendar may be a date of the electronic calendar, and if the third input I3 is a line, the electronic device 1000f may associate the third input I3 with the dates at which the ends of the line are located. For example, referring to FIG. 8A, the third input I3 may be associated with the 6th and the 20th, where its ends are located.

When the user's input to the electronic calendar corresponds to an arrow, the electronic device 1000f can distinguish the start point and the end point of the arrow based on its arrowhead. Referring to FIG. 8A, the electronic device 1000f may determine the 6th as the start point of the third input I3, which is an arrow, and the 20th as its end point.

The electronic device 1000f can perform an operation corresponding to the gesture, and the operation corresponding to the gesture can be predetermined. For example, if the gesture is an arrow, the electronic device 1000f may move the schedule of the date corresponding to the start point of the arrow to the date corresponding to the end point of the arrow. Referring to FIGS. 8A and 8B, the electronic device 1000f may move the schedule ABC of the 6th to the 20th in response to an arrow gesture input beginning on the 6th and ending on the 20th.
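
The move operation itself reduces to relocating schedule entries between two dates. A sketch, with the calendar modeled as a hypothetical mapping from date to schedule list:

    def apply_arrow(calendar: dict, start_date: int, end_date: int) -> None:
        """Move every schedule from the arrow's start date to its end date."""
        moved = calendar.pop(start_date, [])
        calendar.setdefault(end_date, []).extend(moved)

    calendar = {6: ["ABC"]}
    apply_arrow(calendar, start_date=6, end_date=20)  # as in FIGS. 8A and 8B
    assert calendar == {20: ["ABC"]}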

FIG. 9 is a diagram for explaining a method for providing a notification based on a user's star input, according to an embodiment.

The electronic device may perform an operation corresponding to the gesture, and the operation can be predetermined. For example, if the gesture is a star, as shown in FIG. 9, the electronic device can adjust the priority of the schedule corresponding to the star input. For example, the electronic device can raise the priority of the schedule corresponding to the star input and provide that schedule to the user as a notification. The notification can be pushed to the user's mobile device 2000.

For example, if there is a schedule (ABC) on the 6th as shown in FIG. 9, and a star input is detected on the 6th, the electronic device may push a message M1 notifying the user of the schedule (ABC) to the user's mobile device 2000.

FIG. 10 is a diagram illustrating a method for providing a notification based on a user's star inputs, according to one embodiment.

The electronic device can determine priorities in the notification based on the number of stars. For example, as shown in FIG. 10, when two stars are drawn on the 6th, which has the schedule ABC, and one star is drawn on the 19th, which has the schedule C, the priorities may be determined in the order of the schedule with two stars (ABC), the schedule of the 19th with one star (C), the schedule of the 2nd without a star (A), and the schedule of the 10th-11th without a star (B).

The electronic device may generate the notification based on the determined priorities. For example, the schedule (ABC) having the highest priority may be located at the top of the notification, followed in order by the schedule (C) having the next priority and then the schedules (A and B).
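
The ordering in FIG. 10 amounts to sorting schedules by star count. A sketch with assumed data shapes (dates mapped to schedule names and star counts):

    def notification_order(schedules: dict, stars: dict):
        """List schedule names for the notification, most stars first; ties keep
        the original order because Python's sort is stable."""
        return [schedules[d] for d in
                sorted(schedules, key=lambda d: stars.get(d, 0), reverse=True)]

    schedules = {6: "ABC", 19: "C", 2: "A", 10: "B"}  # 10 stands in for 10th-11th
    stars = {6: 2, 19: 1}
    print(notification_order(schedules, stars))       # ['ABC', 'C', 'A', 'B']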

Of course, the priority can be determined in various ways other than the method described above, and the operation corresponding to the gesture can also be implemented in various ways.

FIG. 11 is a block diagram of a device according to one embodiment.

Not all of the components shown in FIG. 2 are essential components of the electronic device 1000. The electronic device 1000 may be implemented with more components than those shown in FIG. 2. For example, as shown in FIG. 11, the electronic device 1000 according to one embodiment may further include an output unit 1200, a sensing unit 1400, a communication unit 1500, an A/V input unit 1600, and a storage unit 1700 in addition to the user input receiving unit 1100 and the control unit 1300.

Hereinafter, each component of the electronic device 1000 will be described in detail.

The user input receiving unit 1100 refers to a means by which the user inputs data to control the electronic device 1000. For example, the user input receiving unit 1100 may include, but is not limited to, a key pad, a dome switch, a touch pad (contact capacitance type, pressure resistive type, infrared detection type, surface ultrasonic conduction type, integral tension measurement type, piezo effect type, etc.), a jog wheel, a jog switch, and the like. In particular, when the touch pad forms a layer structure with the display unit 1210 and is configured as a touch screen or touch screen display, the user input receiving unit 1100 may be used as an output device in addition to an input device.

The touch pad can be configured to detect not only a real touch but also a proximity touch. For convenience of description, both a real touch and a proximity touch detected through the touch pad are hereinafter referred to as a "touch."

In the present specification, the term "real touch" refers to an input generated when a pointer physically touches the screen, and the term "proximity touch" refers to an input generated when the pointer approaches within a predetermined distance of the screen without physically touching it.

As used herein, the term "pointer" refers to a tool for real-touching or proximity-touching a specific portion of a displayed screen. Examples include a stylus pen and a finger.

The output unit 1200 may output an audio signal, a video signal, or a vibration signal, and may include the display unit 1210, a sound output unit 1220, and a vibration motor 1230.

The display unit 1210 displays information processed in the electronic device 1000. For example, the display unit 1210 may display an execution screen of an operating system running on the electronic device 1000 and an execution screen of an application running on the operating system.

Meanwhile, when the display unit 1210 and the touch pad form a layer structure constituting a touch screen, the display unit 1210 can be used as an input device in addition to an output device. The display unit 1210 may include at least one of a liquid crystal display, a thin-film-transistor liquid crystal display, an organic light-emitting diode display, a flexible display, a three-dimensional (3D) display, and an electrophoretic display. Depending on the embodiment, the electronic device 1000 may include two or more display units 1210, which may be arranged to face each other using a hinge.

The sound output unit 1220 outputs audio data received from the communication unit 1500 or stored in the storage unit 1700. Further, the sound output unit 1220 outputs sound signals related to functions performed in the electronic device 1000. The sound output unit 1220 may include a speaker, a buzzer, and the like.

The vibration motor 1230 can output a vibration signal. For example, the vibration motor 1230 can output a vibration signal corresponding to an output of audio data or video data. In addition, the vibration motor 1230 may output a vibration signal when a touch is input to the touch screen.

The control unit 1300 can perform the functions of the electronic device 1000 by controlling its overall operation. For example, the control unit 1300 may control the user input receiving unit 1100, the output unit 1200, the sensing unit 1400, the communication unit 1500, the A/V input unit 1600, and the like by executing programs stored in the storage unit 1700.

The sensing unit 1400 may sense a state of the electronic device 1000 or a state around the electronic device 1000 and may transmit the sensed information to the control unit 1300.

The sensing unit 1400 may include, but is not limited to, at least one of a magnetic sensor 1410, an acceleration sensor 1420, a temperature/humidity sensor 1430, an infrared sensor 1440, a gyroscope sensor 1450, a position sensor (e.g., GPS) 1460, an air pressure sensor 1470, a proximity sensor 1480, and an RGB (illuminance) sensor 1490. The function of each sensor can be intuitively deduced from its name by those skilled in the art, so a detailed description is omitted.

The communication unit 1500 may include one or more components that allow communication between the electronic device 1000 and an external device. For example, the communication unit 1500 may include a short-range communication unit 1510, a mobile communication unit 1520, and a broadcast receiving unit 1530.

The short-range communication unit 1510 may include a Bluetooth communication unit, a Bluetooth Low Energy (BLE) communication unit, a near field communication (NFC) unit, a WLAN (Wi-Fi) communication unit, a Zigbee communication unit, an infrared data association (IrDA) communication unit, a Wi-Fi Direct (WFD) communication unit, an ultra-wideband (UWB) communication unit, an Ant+ communication unit, and the like.

The mobile communication unit 1520 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. Here, the radio signals may include various types of data according to transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.

The broadcast receiver 1530 receives broadcast signals and/or broadcast-related information from the outside through a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel. Depending on the embodiment, the electronic device 1000 may not include the broadcast receiver 1530.

The A/V (audio/video) input unit 1600 is for inputting an audio signal or a video signal, and may include a camera 1610, a microphone 1620, and the like. The camera 1610 can obtain image frames such as still images or moving images through an image sensor in a video call mode or a photographing mode. The image captured through the image sensor can be processed through the control unit 1300 or a separate image processing unit.

The image frame processed by the camera 1610 may be stored in the storage unit 1700 or may be transmitted to the outside through the communication unit 1500. More than one camera 1610 may be provided according to the configuration of the terminal.

The microphone 1620 receives an external sound signal and processes it into electrical voice data. For example, the microphone 1620 may receive a sound signal from an external device or a speaker. The microphone 1620 may use various noise reduction algorithms to remove noise generated while receiving the external sound signal. The microphone 1620 may also receive a user's voice input corresponding to content provided by the control unit 1300.

The storage unit 1700 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (e.g., SD or XD memory), RAM (random access memory), SRAM (static random access memory), ROM (read-only memory), EEPROM (electrically erasable programmable read-only memory), PROM (programmable read-only memory), a magnetic disk, and an optical disc. The storage unit may also be referred to as a memory.

Programs stored in the storage unit 1700 can be classified into a plurality of modules according to their functions, for example, a UI module 1710, a touch screen module 1720, and a notification module 1730.

The UI module 1710 can provide a specialized UI, GUI, or the like that interworks with the electronic device 1000 for each application. The touch screen module 1720 senses the user's touch gesture on the touch screen and can transmit information about the touch gesture to the control unit 1300. The touch screen module 1720 according to one embodiment can recognize and analyze a touch code. The touch screen module 1720 may be configured as separate hardware including a controller.

Various sensors may be provided inside or near the touch pad to detect a user's touch. One example of such a sensor is a tactile sensor, which detects the contact of a specific object to a degree that a person can feel or more. The tactile sensor can detect various information such as the roughness of the contact surface, the rigidity of the contact object, and the temperature of the contact point.

In addition, proximity sensors are examples of such various sensors. The proximity sensor refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface or a nearby object without mechanical contact using the force of an electromagnetic field or infrared rays. Examples of proximity sensors include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror reflection type photoelectric sensor, a high frequency oscillation type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.

An example of such various sensors is a force touch sensor. Different functions can be performed in the electronic device 1000 according to the magnitude of the pressure of touching the touch pad, so that gesture input that can be applied through the touch pad can be further diversified.

The notification module 1730 may generate a signal for notifying the occurrence of an event in the electronic device 1000. Examples of events occurring in the electronic device 1000 include signal reception, message reception, key signal input, and schedule notification. The notification module 1730 may output a notification signal in the form of a video signal through the display unit 1210, in the form of an audio signal through the sound output unit 1220, or in the form of a vibration signal through the vibration motor 1230.

In addition, some or all of the components of the electronic device 1000 shown in FIGS. 2 and 11 may be implemented by at least one hardware processor. For example, some or all of the components of the electronic device 1000 may be implemented through a separate processor other than the main processor of the electronic device 1000.

In addition, some of the components of the electronic device 1000 shown in FIGS. 2 and 11 may be implemented by at least one software program. For example, some functions of the electronic device 1000 may be implemented by an operating system program and others by an application program. Accordingly, the functions of the electronic device 1000 may be implemented by at least one piece of hardware and at least one piece of software, and the functions implemented by software may be executed by the operating system and the applications running on the electronic device 1000.

The electronic device 1000 according to one embodiment may include a processor, a memory for storing and executing program data, a permanent storage such as a disk drive, a communication port for communicating with external devices, and user interface devices such as a touch panel, keys, and buttons. Methods implemented as software modules or algorithms may be stored on a computer-readable recording medium as computer-readable code or program instructions executable on the processor. Examples of computer-readable recording media include magnetic storage media (e.g., ROM, RAM, floppy disks, hard disks) and optical reading media (e.g., CD-ROMs, DVDs (Digital Versatile Discs)). The computer-readable recording medium may be distributed over networked computer systems so that computer-readable code can be stored and executed in a distributed manner. The medium is readable by a computer, stored in a memory, and executable on a processor.

All documents cited herein, including publications, patent applications, and patents, are incorporated herein by reference to the same extent as if each cited document were individually and specifically indicated to be incorporated by reference and were set forth in its entirety herein.

To facilitate understanding of the present invention, reference has been made to the preferred embodiments shown in the drawings, and specific terminology has been used to describe them. However, the present invention is not limited by this specific terminology, and may include all elements commonly conceivable by those skilled in the art.

The present invention may be represented by functional block configurations and various processing steps. These functional blocks may be implemented by any number of hardware and/or software configurations that perform particular functions. For example, the present invention may employ integrated circuit configurations, such as memory, processing, logic, and look-up tables, that can perform various functions under the control of one or more microprocessors or other control devices. As with components that may be implemented with software programming or software elements, the present invention may be implemented in a programming or scripting language such as C, C++, Java, or assembler, including various algorithms implemented with data structures, processes, routines, or other combinations of programming constructs. Functional aspects may be implemented with algorithms running on one or more processors. Further, the present invention may employ conventional techniques for electronic environment setting, signal processing, and/or data processing. Terms such as "mechanism," "element," "means," and "configuration" may be used broadly and are not limited to mechanical and physical configurations; they may include the meaning of a series of software routines in conjunction with a processor or the like.

The specific implementations described herein are examples and do not limit the scope of the invention in any way. For brevity, descriptions of conventional electronic configurations, control systems, software, and other functional aspects of such systems may be omitted. The connections or connecting members between the components shown in the figures illustrate functional and/or physical or circuit connections by way of example, and in an actual device may be represented by various replaceable or additional functional, physical, or circuit connections. Unless specifically stated to be "essential," "important," or the like, a component may not be a necessary component for the application of the present invention.

The use of the term "the" and similar referents in the specification (particularly in the claims) may cover both the singular and the plural. When a range is described herein, the invention encompasses each individual value within that range (unless stated otherwise), as if each individual value were set forth in the detailed description. The steps constituting a method according to the invention may be performed in any suitable order unless an order is explicitly stated or the contrary is indicated; the invention is not necessarily limited by the order in which the steps are described. The use of any and all examples or exemplary language (e.g., "etc.") is intended merely to describe the invention in detail, and the scope of the invention is not limited by these examples or exemplary language unless otherwise claimed. It will also be appreciated by those skilled in the art that various modifications, combinations, and alterations may be made depending on design conditions and factors within the scope of the appended claims or their equivalents.

Claims (10)

A device for displaying an electronic calendar, the device comprising:
A display unit for displaying an electronic calendar;
An input unit for detecting a user's gesture with respect to the displayed electronic calendar; and
A control unit for determining an input coordinate in the displayed electronic calendar based on the detected gesture and performing an operation corresponding to the detected gesture on the determined input coordinate.
The device according to claim 1,
Wherein the input unit is located on at least one of an upper end, a lower end, a left end, and a right end of the device.
The device according to claim 1,
Wherein the input coordinate is determined based on a distance between the gesture and the input unit,
Wherein the input unit comprises at least one camera for measuring the distance.
The device according to claim 3,
Wherein a first camera among the at least one camera is located at a first end of the device, and a second camera is located at a second end of the device,
Wherein the first end and the second end are orthogonal to each other,
Wherein the first camera is used to measure coordinates on a first axis of the input coordinate,
Wherein the second camera is used to measure coordinates on a second axis of the input coordinate,
Wherein the first axis and the second axis are orthogonal to each other.
The device according to claim 1,
Wherein the input unit detects at least one of a moving direction and a moving speed of the gesture.
The device according to claim 1,
Wherein the input coordinate is determined within a space formed by the gesture within the displayed electronic calendar.
The device according to claim 1,
Wherein the input coordinate is determined to be a point in the displayed electronic calendar indicated by the gesture.
The device according to claim 1,
Wherein the gesture includes a symbol for determining a priority of the determined input coordinate in the displayed electronic calendar,
Wherein the control unit further generates, based on the determined priority, a notification related to the determined input coordinate.
The device according to claim 1,
Wherein the input unit further detects a user's handwriting input to the displayed electronic calendar,
Wherein the detected handwriting input is written to the determined input coordinates.
The device according to claim 1,
Wherein the display unit further displays at least one image based on a predetermined condition.
KR1020160028301A 2016-03-09 2016-03-09 Device, method, and computer readable medium for displaying a digital calendar KR101883353B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160028301A KR101883353B1 (en) 2016-03-09 2016-03-09 Device, method, and computer readable medium for displaying a digital calendar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160028301A KR101883353B1 (en) 2016-03-09 2016-03-09 Device, method, and computer readable medium for displaying a digital calendar

Publications (2)

Publication Number Publication Date
KR20170105285A true KR20170105285A (en) 2017-09-19
KR101883353B1 KR101883353B1 (en) 2018-07-31

Family

ID=60033507

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160028301A KR101883353B1 (en) 2016-03-09 2016-03-09 Device, method, and computer readable medium for displaying a digital calendar

Country Status (1)

Country Link
KR (1) KR101883353B1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110094740A (en) 2010-02-17 2011-08-24 엘지전자 주식회사 Image display device enable of displaying 3d object in a shape of analog watch and operation controlling method for the same
KR20130076992A (en) 2011-12-29 2013-07-09 전자부품연구원 System and method for generating mask image applied in each threshold in region

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110094740A (en) 2010-02-17 2011-08-24 엘지전자 주식회사 Image display device enable of displaying 3d object in a shape of analog watch and operation controlling method for the same
KR20130076992A (en) 2011-12-29 2013-07-09 전자부품연구원 System and method for generating mask image applied in each threshold in region

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
plandays, "프랭클린플래너 노트 10.1 어플_사용법1," Youtube.com, [online] 2012.10.11., [URL:https://www.youtube.com/watch?v=IPtVYw_Gw-Q]

Also Published As

Publication number Publication date
KR101883353B1 (en) 2018-07-31

Similar Documents

Publication Publication Date Title
US11886252B2 (en) Foldable device and method of controlling the same
CN107408045B (en) Method of controlling apparatus having a plurality of operating systems installed therein and the apparatus
CN105278902B (en) Mobile terminal and control method thereof
US10268434B2 (en) Mobile terminal and control method therefor
US9887949B2 (en) Displaying interactive notifications on touch sensitive devices
US20150095819A1 (en) Method for displaying previews in a widget
US9632578B2 (en) Method and device for switching tasks
KR102060153B1 (en) A cover,an electronic device using the same and operating method thereof
RU2667047C2 (en) Method for providing tactical effect in a portable terminal, machine-readable media and portable terminal
EP3109785A1 (en) Portable apparatus and method for changing screen of the same
US20140333551A1 (en) Portable apparatus and method of displaying object in the same
US11513676B2 (en) Method and system for controlling device
US20160147429A1 (en) Device for resizing window, and method of controlling the device to resize window
US20170199631A1 (en) Devices, Methods, and Graphical User Interfaces for Enabling Display Management of Participant Devices
EP2741207A1 (en) Method and system for providing information based on context, and computer-readable recording medium thereof
US11209930B2 (en) Method of controlling device using various input types and device for performing the method
KR20140119608A (en) Method and device for providing a private page
US20200326786A1 (en) Device and method of controlling device
CN105635434B (en) Mobile terminal and control method thereof
KR20150032068A (en) Method and device for executing a plurality of applications
KR101883353B1 (en) Device, method, and computer readable medium for displaying a digital calendar
KR20150026615A (en) Method for providing schedule management and mobile device thereof
KR102306535B1 (en) Method for controlling device and the device
KR20160027856A (en) Method and portable terminal having bended display unit and cover for executing application
US11360652B2 (en) Apparatus and method for providing for receipt of indirect touch input to a touch screen display

Legal Events

Date Code Title Description
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant