KR20170009688A - Electronic device and Method for controlling the electronic device thereof - Google Patents
Electronic device and Method for controlling the electronic device thereof
- Publication number
- KR20170009688A (application No. KR1020150168552A)
- Authority
- KR
- South Korea
- Prior art keywords
- display area
- touch
- main display
- user
- electronic device
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present disclosure provides an electronic device and a control method thereof. The electronic device includes a touch display including a main display area and a plurality of auxiliary display areas located on the sides of the main display area, and a processor that controls the touch display to provide a UI for inputting characters in the plurality of auxiliary display areas and, when a touch interaction for the UI is input, controls the touch display to provide the input character to the main display area of the touch display.
Description
BACKGROUND OF THE INVENTION
Recently, various electronic devices have been provided. In particular, wearable electronic devices worn by a user, such as smart watches, are now available. Such a wearable electronic device is highly portable, but its touch display is small.
This small touch display causes several problems when inputting characters. In particular, because the touch display is small, the keyboard UI for inputting characters is also small. As a result, character input is slow, and the likelihood of touching a wrong character key increases, so the user cannot input characters smoothly. Also, when the keyboard UI is displayed on a small touch screen, it covers most of the screen.
SUMMARY OF THE INVENTION The present invention has been made to solve the above-mentioned problems, and an object of the present disclosure is to provide an electronic device that provides a UI for inputting characters in a plurality of auxiliary display areas of a touch display, so that a user can input characters more conveniently and intuitively, and a control method thereof.
According to an aspect of the present invention, an electronic device includes: a touch display including a main display area and a plurality of auxiliary display areas located on sides of the main display area; and a processor that controls the touch display to provide a UI for inputting characters in the plurality of auxiliary display areas of the touch display and, when a touch interaction for the UI is input, controls the touch display to provide the input character according to the touch interaction to the main display area of the touch display.
The touch display may include a first curved auxiliary display area that extends integrally from the main display area, is bent toward a first side of the main display area, and is smaller than the main display area; and a second curved auxiliary display area that extends integrally from the main display area, is bent toward a second side of the main display area, and is smaller than the main display area.
In addition, the UI may include a plurality of UI elements, and each of the plurality of UI elements may correspond to at least one character.
The processor may control the touch display to provide a UI element corresponding to a vowel among the plurality of UI elements to the first curved auxiliary display area, and to provide a UI element corresponding to a consonant among the plurality of UI elements to the second curved auxiliary display area.
In addition, when a user touch that touches one of the plurality of UI elements and then swipes in the direction of the main display area is detected, the processor may control the touch display to provide the character corresponding to the touched UI element to the main display area.
The processor may delete a character provided in the main display area when a user touch sweeping the main display area in a predetermined direction is detected.
In addition, when a user touch simultaneously swiping the first curved auxiliary display area and the second curved auxiliary display area is detected, the processor may execute a specific application according to the user touch.
When a user touch simultaneously swiping the first curved auxiliary display area and the second curved auxiliary display area in a left-right direction is detected, the processor may execute a camera application according to the user touch.
The processor may also detect an orientation of the electronic device during execution of the camera application, and control the touch display to display a live view image on at least a part of the main display area and the plurality of auxiliary display areas based on the detected orientation of the electronic device.
According to another aspect of the present invention, a method of controlling an electronic device including a touch display, the touch display including a main display area and a plurality of auxiliary display areas located on sides of the main display area, includes: providing a UI for inputting characters in the plurality of auxiliary display areas of the touch display; and, when a touch interaction for the UI is input, providing the input character according to the touch interaction to the main display area of the touch display.
The touch display may include a first curved auxiliary display area that extends integrally from the main display area, is bent toward a first side of the main display area, and is smaller than the main display area; and a second curved auxiliary display area that extends integrally from the main display area, is bent toward a second side of the main display area, and is smaller than the main display area.
In addition, the UI may include a plurality of UI elements, and each of the plurality of UI elements may correspond to at least one character.
In the providing of the UI, a UI element corresponding to a vowel among the plurality of UI elements may be provided in the first curved auxiliary display area, and a UI element corresponding to a consonant among the plurality of UI elements may be provided in the second curved auxiliary display area.
In addition, the providing to the main display area may include, when a user touch that touches one of the plurality of UI elements and then swipes in the direction of the main display area is detected, providing the character corresponding to the touched UI element to the main display area.
The method may further include deleting a character provided in the main display area when a user touch sweeping the main display area in a preset direction is detected.
When a user touch simultaneously swiping the first curved auxiliary display area and the second curved auxiliary display area is detected, a specific application may be executed according to the user touch.
When a user touch simultaneously swiping the first curved auxiliary display area and the second curved auxiliary display area in a left-right direction is detected, a camera application may be executed according to the user touch.
The method may further include detecting an orientation of the electronic device while the camera application is running; and displaying a live view image on at least a part of the main display area and the plurality of auxiliary display areas based on the detected orientation of the electronic device.
According to various embodiments of the present invention as described above, a user can more conveniently input characters and perform various functions using the auxiliary display area of the electronic device.
1 is a block diagram schematically illustrating the configuration of an electronic device, in accordance with one embodiment of the present disclosure;
Figures 2A-2C illustrate a type of touch display, in accordance with various embodiments of the present disclosure;
Figure 3 is a block diagram detailing the configuration of an electronic device, in accordance with one embodiment of the present disclosure;
4 is a diagram illustrating an example of a software configuration of an electronic device, according to one embodiment of the present disclosure;
Figures 5A-10 illustrate embodiments for performing various functions using an auxiliary display area of an electronic device, in accordance with various embodiments of the present disclosure,
11 is a flowchart illustrating a method of controlling an electronic device according to an embodiment of the present invention.
These embodiments are capable of various modifications, and specific embodiments are illustrated in the drawings and described in detail below. It is to be understood, however, that the disclosure is not limited to the specific embodiments, but includes all modifications, equivalents, and alternatives falling within the spirit and scope of the disclosure.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
The terms first, second, and the like may be used to describe various elements, but the elements should not be limited by these terms; the terms are used only to distinguish one component from another.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the scope of the claims. Singular expressions include plural expressions unless the context clearly dictates otherwise. In this application, terms such as "comprise" and "comprising" are used to specify the presence of a stated feature, number, step, operation, element, component, or combination thereof, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
In the embodiments, a 'module' or 'unit' performs at least one function or operation, and may be implemented as hardware, software, or a combination of hardware and software. In addition, a plurality of 'modules' or a plurality of 'units' may be integrated into at least one module and implemented by at least one processor, except for 'modules' or 'units' that need to be implemented by specific hardware.
Also, in the embodiments, a UI (user interface) may include at least one of a configuration for receiving a user interaction and a configuration for presenting notification information. The UI may include UI elements, and a UI element may be an element that can interact with a user and provide feedback based on user input, such as visual, auditory, or olfactory feedback, as well as provide notification information. A UI element may be expressed in the form of at least one of an image, text, and a moving image, and an area that does not display such information but can provide feedback according to user input may also be referred to as a UI element.
Hereinafter, the present invention will be described in detail with reference to the accompanying drawings.
1 is a block diagram illustrating an example of a basic configuration of an electronic device, according to an embodiment of the present disclosure.
The touch display 110 outputs image data and receives a user's touch. In particular, the touch display 110 may include a main display area and a plurality of auxiliary display areas located on the sides of the main display area.
In particular, the plurality of auxiliary display areas may be implemented as curved display areas extending from the main display area.
At this time, the area of each of the plurality of sub display areas may be smaller than that of the main display area. Further, the plurality of curved display areas can form surfaces different from that of the main display area. For example, if the main display area is disposed on the front surface of the electronic device, the plurality of auxiliary display areas may be disposed on side surfaces of the electronic device.
2A to 2C are views showing an example of an external configuration of an electronic device, according to various embodiments of the present disclosure.
2A, the
In particular, the
The
Specifically, when a situation for inputting characters is detected, the processor may control the touch display 110 to provide a UI for inputting characters in the plurality of auxiliary display areas.
In particular, the
In particular, if a user touches one of a plurality of UI elements and swipes in the direction of the main display area, the processor may control the touch display to provide the character corresponding to the touched UI element to the main display area.
When a user touch sweeping the main display area in a preset direction is detected, the processor may delete a character provided in the main display area.
If a user touch for simultaneously sweeping the first curved surface auxiliary display area 113-1 and the second curved auxiliary display area 113-2 is detected, the processor may execute a specific application according to the user touch.
The
3 is a block diagram showing an example of a detailed configuration of the electronic device, according to an embodiment of the present disclosure.
The
The
The
The
The
The
When the playback program for the multimedia contents is executed, the
The
The
The
When the
In the
The
At this time, the touch sensor may include at least one of a touch panel and a pen recognition panel. The touch panel senses a user's finger gesture input and outputs a touch event value corresponding to the sensed touch signal. The touch panel may be mounted under both the main display area and the sub display areas of the touch display.
The pen recognition panel senses a user's pen gesture input made with a touch pen (e.g., a stylus pen or a digitizer pen) and may output a pen proximity event value or a pen touch event value. The pen recognition panel may be mounted below at least one of the main display area and the sub display areas of the touch display.
Meanwhile, although the
The
The
The
In addition, although not shown in FIG. 3, the electronic device may further include a USB port through which a USB connector can be connected, various external input ports, and the like.
Meanwhile, as described above, the
An operating system (OS) 410 functions to control and manage the overall operation of the hardware. That is, the OS 410 is a layer that performs basic functions such as hardware management, memory, and security.
The
The
The X11 module 430-1 is a module for receiving various event signals from various hardware provided in the electronic device.
The APP manager 430-2 is a module for managing the execution states of various applications.
The connection manager 430-3 is a module for supporting a wired or wireless network connection. The connection manager 430-3 may include various detailed modules such as a DNET module, a UPnP module, and the like.
The security module 430-4 is a module that supports certification for hardware, permission permission, and secure storage.
The system manager 430-5 monitors the status of each component in the electronic device.
The multimedia framework 430-6 is a module for playing back multimedia content stored in the electronic device or provided from an external source.
The main UI framework 430-7 is a module for providing various UIs to be displayed in the main display area of the touch display, and the sub UI framework 430-9 is a module for providing various UIs to be displayed in the auxiliary display areas.
The window manager 430-8 can detect touch events or other input events using the user's body or pen. When this event is detected, the window manager 430-8 delivers an event signal to the main UI framework 430-7 or the sub UI framework 430-9 to perform an operation corresponding to the event.
In addition, when the user touches and drags the screen, a writing module for drawing a line along the drag trajectory, or an angle calculation module for calculating a pitch angle, a roll angle, a yaw angle, and the like based on the sensor values sensed by the sensor, may be included.
The
The software structure shown in Fig. 4 is merely an example and is not necessarily limited thereto. Therefore, parts may be omitted, modified, or added as necessary.
Meanwhile, as described above, the
Hereinafter, various embodiments of the present invention will be described with reference to the drawings.
If a situation requiring character input is detected, the processor may control the touch display to provide a UI for inputting characters in the plurality of auxiliary display areas.
In the above-described embodiment, the
On the other hand, the UI for inputting characters includes a plurality of UI elements, as shown in FIG. 5B, and each UI element may correspond to at least one character, symbol, or function. For example, the first UI element may correspond to ".", ",", "?", and "!"; the second UI element to "a", "b", and "c"; the third to "d", "e", and "f"; the fourth to "g", "h", and "i"; the fifth to "j", "k", and "l"; the sixth to "m", "n", and "o"; the seventh to "p", "q", "r", and "s"; the eighth to "t", "u", and "v"; the ninth to "w", "x", "y", and "z"; and the tenth UI element may correspond to an "enter" function. The character arrangement shown in FIG. 5B is only an embodiment, and other character arrangements can also be applied to the present invention.
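The element-to-character mapping just described resembles a phone-style multi-tap keypad. As a rough, non-authoritative sketch (the table name and helper function are ours, not the patent's), the layout can be modeled as:

```python
# Hypothetical model of the UI-element-to-character table described above.
# The tenth element maps to an "enter" function rather than characters.
KEYPAD_LAYOUT = [
    [".", ",", "?", "!"],   # 1st UI element: punctuation
    ["a", "b", "c"],        # 2nd
    ["d", "e", "f"],        # 3rd
    ["g", "h", "i"],        # 4th
    ["j", "k", "l"],        # 5th
    ["m", "n", "o"],        # 6th
    ["p", "q", "r", "s"],   # 7th
    ["t", "u", "v"],        # 8th
    ["w", "x", "y", "z"],   # 9th
    ["<enter>"],            # 10th: enter function
]

def characters_for(element_index: int) -> list[str]:
    """Return the characters associated with a UI element (1-based index)."""
    return KEYPAD_LAYOUT[element_index - 1]
```

Other arrangements would simply substitute a different table; the lookup itself does not change.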
The
In particular, when one character is associated with one UI element, the processor may provide that character to the main display area in response to a touch interaction on the UI element.
However, when a plurality of characters are associated with one UI element, the processor may determine which of the characters to provide according to the number of times the UI element is touched.
Specifically, as shown in FIG. 5B, while the UI for inputting characters is displayed in the plurality of auxiliary display areas 520-1 and 520-2, the processor may sense a user touch on one of the plurality of UI elements.
When a user touch that touches the fourth UI element once and then swipes in the direction of the main display area is detected, the processor may control the touch display to provide "g", the first character corresponding to the fourth UI element, to the main display area.
However, when a user touch that touches the fourth UI element twice and then swipes in the direction of the main display area is detected, the processor may control the touch display to provide "h", the second character corresponding to the fourth UI element, to the main display area.
When a user touch that touches the fourth UI element three times and then swipes in the direction of the main display area is detected, the processor may control the touch display to provide "i", the third character corresponding to the fourth UI element, to the main display area.
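The touch-count behavior above (one touch selects the first character of the element, two touches the second, three the third, with the swipe toward the main display area committing the selection) can be sketched as follows. The wrap-around rule for counts beyond the character count is our assumption; the patent only describes up to three touches:

```python
def select_character(characters: list[str], touch_count: int) -> str:
    """Pick a character from a multi-character UI element by touch count.

    One touch selects the first character, two the second, and so on.
    The count wraps around if it exceeds the number of characters
    (an assumption of this sketch, not a rule stated in the text).
    """
    if touch_count < 1:
        raise ValueError("at least one touch is required")
    return characters[(touch_count - 1) % len(characters)]
```

For example, with the fourth element's characters `["g", "h", "i"]`, two touches followed by a swipe would commit "h".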
As described above, the user can input characters with one hand through touch and swipe inputs, and can perform character input without the keyboard UI covering the main display area.
In addition, the
As shown in FIG. 6A, when a character has been input to the main display area and a user touch sweeping the main display area in a predetermined direction is detected, the processor may delete the character provided in the main display area.
As shown in FIG. 6B, when a user touch touching one area of the main display area is detected, the processor may perform a space bar function.
Meanwhile, although the deletion function and the space bar function have been described in the above embodiments, various other functions, such as a copy function and a cut function, can also be performed according to a user input in the main display area.
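A minimal dispatcher for the main-display-area editing gestures described above might look like the following sketch. The gesture names are hypothetical, since the patent describes the functions (delete, space, copy, cut) but not an API:

```python
def apply_main_area_gesture(text: str, gesture: str) -> str:
    """Apply an editing gesture detected on the main display area.

    Gesture names are placeholders for this sketch:
      - "sweep_delete": sweep in the preset direction deletes the last character
      - "tap_space": a touch on the designated area inserts a space
    Unrecognized gestures leave the text unchanged.
    """
    if gesture == "sweep_delete":
        return text[:-1]
    if gesture == "tap_space":
        return text + " "
    return text
```

A copy or cut function would extend the same dispatch with clipboard state rather than a pure string transformation.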
In the above-described embodiments, the UI for inputting characters is for English; however, the UI may be for another language. For example, as shown in FIGS. 7A and 7B, a UI for inputting Korean can be provided.
In particular, FIG. 7A displays a UI for Korean input using one input method, in which UI elements for inputting consonants are arranged in the first sub display area 720-1 and UI elements for inputting vowels are arranged in the second sub display area 720-2. FIG. 7B displays a UI for Korean input using another input method.
In addition, a UI for inserting various texts such as numerals and special symbols other than characters can be displayed in a plurality of sub display areas.
Also, if a user touch that touches one point of each of the plurality of auxiliary display areas and then swipes in the same direction is detected, the processor may execute a specific application according to the user touch.
At this time, the processor may execute different applications according to the area in which the swipe is detected and the direction of the swipe.
For example, if a user touches one point of each of the plurality of auxiliary display areas 820-1 and 820-2 while the clock screen is displayed in the main display area and then swipes in the left and right directions, the processor may execute the camera application according to the user touch.
As another example, if a user touch that touches one point of the first sub display area 820-1 of the plurality of sub display areas 820-1 and 820-2 and then swipes is detected while the clock screen is displayed in the main display area, the processor may execute another preset application according to the user touch.
As yet another example, if a user touch that touches one point of the second sub display area 820-2 of the plurality of sub display areas 820-1 and 820-2 and then swipes is detected while the clock screen is displayed in the main display area, the processor may execute yet another preset application according to the user touch.
As described above, by executing the specific application through the swipe input sensed in the plurality of auxiliary display areas, the user can execute the application more quickly as using the shortcut key.
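The shortcut behavior above, in which the set of auxiliary areas swiped determines the application launched, reduces to a small lookup table. Only the both-areas case (the camera application) is named in the text, so the single-area entries below are illustrative placeholders:

```python
# Map (first_area_swiped, second_area_swiped) -> application to launch.
# Only the both-areas case is named in the text (camera); the single-area
# entries are placeholders for whatever applications are preset.
SWIPE_SHORTCUTS = {
    (True, True): "camera",
    (True, False): "app_for_first_area",   # placeholder
    (False, True): "app_for_second_area",  # placeholder
}

def shortcut_app(first_area: bool, second_area: bool):
    """Return the application for a swipe over the given auxiliary areas,
    or None if the combination has no shortcut assigned."""
    return SWIPE_SHORTCUTS.get((first_area, second_area))
```

Making the table user-editable corresponds to the remark below that the assigned applications may be set by the user rather than fixed at manufacture.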
In the above-described embodiments, the user touch for quickly executing an application is a swipe input. However, the present invention is not limited thereto, and other types of user input, such as a long-press input, may also be applied.
Also, the application executed according to a user touch may be set at the time of manufacture, but this is only an example, and it may also be set by the user.
Further, while the camera application is being executed, the processor may detect the orientation of the electronic device and control the touch display to display a live view image on at least a part of the main display area and the plurality of auxiliary display areas based on the detected orientation of the electronic device.
The
The
Specifically, when the camera application is first executed, the processor may control the touch display to display a live view image in the main display area.
When the user wearing the electronic device changes the orientation of the electronic device, the processor may move the live view image to at least a part of the plurality of auxiliary display areas based on the detected orientation.
Meanwhile, in the above-described embodiment, the live view image is moved according to the orientation of the electronic device; however, this is only an embodiment, and the present invention is not limited thereto.
As described above, by moving and displaying the live view image according to the direction of the electronic device, the user can more easily check the live view image and photograph the image.
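Selecting where to render the live view from a detected device orientation, as described above, can be sketched as a simple threshold rule. The roll-angle convention and the 45-degree thresholds are our assumptions, not values from the patent:

```python
def live_view_region(roll_degrees: float) -> str:
    """Pick a display region for the live view from the device roll angle.

    Assumed convention for this sketch: a near-flat wrist keeps the live
    view in the main area; tilting past +/-45 degrees moves it to the
    curved auxiliary area facing the user.
    """
    if roll_degrees > 45:
        return "first_auxiliary_area"
    if roll_degrees < -45:
        return "second_auxiliary_area"
    return "main_area"
```

In practice the angle would come from the device's motion sensors (e.g., the pitch/roll/yaw values mentioned in the software-module description above).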
When a plurality of live view images are to be photographed, the processor may divide the main display area 1010 of the electronic device into a plurality of areas and display a live view image in each of the divided areas.
As described above, by photographing a plurality of live view images by dividing the main display area 1010 of the electronic device into a plurality of areas, photographing can be performed in more various environments.
11 is a flowchart for explaining a control method of the electronic device, according to an embodiment of the present invention.
First, the electronic device provides a UI for inputting characters in the plurality of auxiliary display areas of the touch display (S1110).
Then, the electronic device determines whether a touch interaction for the UI is input (S1120).
When a touch interaction is input through the UI (S1120-Y), the electronic device provides the input character according to the touch interaction to the main display area of the touch display (S1130).
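The flowchart steps S1110 to S1130 can be summarized as a small pure-function sketch; representing interactions as already-committed characters is a simplification of ours:

```python
def control_method(interactions: list[str]) -> list[str]:
    """Sketch of the control flow of FIG. 11 (S1110-S1130).

    `interactions` is a sequence of characters committed by touch
    interactions on the UI (a hypothetical simplification); the return
    value lists the actions the device would take, in order.
    """
    # S1110: provide the character-input UI in the auxiliary display areas
    actions = ["S1110: show character-input UI in auxiliary display areas"]
    for ch in interactions:
        # S1120: a touch interaction on the UI is input (S1120-Y)
        # S1130: provide the input character to the main display area
        actions.append(f"S1130: show {ch!r} in main display area")
    return actions
```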
According to various embodiments of the present invention as described above, a user can more conveniently input characters using the auxiliary display areas of the electronic device.
Meanwhile, the control method of the electronic device according to the above-described various embodiments may be implemented as a program and provided to a display device or an input device. In particular, a program including the control method of a display device may be stored in a non-transitory computer readable medium.
A non-transitory readable medium is not a medium that stores data for a short period of time, such as a register, cache, or memory, but a medium that semi-permanently stores data and is readable by a device. Specifically, the various applications or programs described above may be stored in and provided on a non-transitory readable medium such as a CD, a DVD, a hard disk, a Blu-ray disc, a USB memory, a memory card, or a ROM.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments; it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention.
110, 300:
210: memory 220: GPS chip
230: communication unit 240: video processor
250: Audio Processor 260: Button
270: microphone 280: camera
290: speaker 310: sensor
Claims (18)
A touch display including a main display area and a plurality of sub display areas located at sides of the main display area; And
a processor configured to control the touch display to provide a UI for inputting characters in the plurality of auxiliary display areas of the touch display and, when a touch interaction for the UI is input, to control the touch display to provide the input character according to the touch interaction to the main display area of the touch display.
The touch display includes:
a first curved auxiliary display area extending integrally from the main display area, bent toward a first side of the main display area, and smaller than the main display area; and a second curved auxiliary display area extending integrally from the main display area, bent toward a second side of the main display area, and smaller than the main display area.
The UI includes a plurality of UI elements,
Wherein each of the plurality of UI elements corresponds to at least one character.
The processor comprising:
Providing a UI element corresponding to a vowel among the plurality of UI elements to the first curved surface sub display area,
Controls the touch display to provide a UI element corresponding to a consonant among the plurality of UI elements to the second curved surface auxiliary display area.
The processor comprising:
when a user touch that touches one of the plurality of UI elements and then swipes in the direction of the main display area is detected, controls the touch display to provide the character corresponding to the touched UI element to the main display area.
The processor comprising:
deletes a character provided in the main display area when a user touch sweeping the main display area in a predetermined direction is detected.
The processor comprising:
executes a specific application according to the user touch when a user touch simultaneously swiping the first curved auxiliary display area and the second curved auxiliary display area is detected.
The processor comprising:
executes a camera application according to the user touch when a user touch simultaneously swiping the first curved auxiliary display area and the second curved auxiliary display area in a left-right direction is detected.
The processor comprising:
detects an orientation of the electronic device during execution of the camera application, and controls the touch display to display a live view image on at least a portion of the main display area and the plurality of auxiliary display areas based on the detected orientation of the electronic device.
Providing a UI for inputting characters in the plurality of auxiliary display areas of the touch display; And
And providing the input character according to the touch interaction to the main display area of the touch display when the touch interaction for the UI is input.
The touch display includes:
a first curved auxiliary display area extending integrally from the main display area, bent toward a first side of the main display area, and smaller than the main display area; and a second curved auxiliary display area extending integrally from the main display area, bent toward a second side of the main display area, and smaller than the main display area.
The UI includes a plurality of UI elements,
Wherein each of the plurality of UI elements corresponds to at least one character.
The providing of the UI may include:
Providing a UI element corresponding to a vowel among the plurality of UI elements to the first curved surface sub display area,
Wherein a UI element corresponding to a consonant among the plurality of UI elements is provided in the second curved surface auxiliary display area.
Wherein the step of providing the main display area includes:
when a user touch that touches one of the plurality of UI elements and then swipes in the direction of the main display area is detected, the character corresponding to the touched UI element is provided to the main display area.
And deleting a character provided in the main display area when a user touch to sweep the main display area in a preset direction is detected.
And executing a specific application according to the user's touch when a user touch for simultaneously swiping the first curved surface auxiliary display area and the second curved surface auxiliary display area is detected.
Wherein the performing comprises:
wherein the camera application is executed according to the user touch when a user touch simultaneously swiping the first curved auxiliary display area and the second curved auxiliary display area in a left-right direction is detected.
Detecting a direction of the electronic device while the camera application is running;
And displaying the live view image on at least a part of the main display area and the plurality of auxiliary display areas based on the detected direction of the electronic device.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/KR2016/007536 WO2017014475A1 (en) | 2015-07-17 | 2016-07-12 | Electronic device and control method therefor |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562193806P | 2015-07-17 | 2015-07-17 | |
US62/193,806 | 2015-07-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20170009688A true KR20170009688A (en) | 2017-01-25 |
Family
ID=57991676
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150168552A KR20170009688A (en) | 2015-07-17 | 2015-11-30 | Electronic device and Method for controlling the electronic device thereof |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20170009688A (en) |
-
2015
- 2015-11-30 KR KR1020150168552A patent/KR20170009688A/en unknown