KR20170009688A - Electronic device and Method for controlling the electronic device thereof - Google Patents

Electronic device and Method for controlling the electronic device thereof

Info

Publication number
KR20170009688A
Authority
KR
South Korea
Prior art keywords
display area
touch
main display
user
electronic device
Application number
KR1020150168552A
Other languages
Korean (ko)
Inventor
박정록
조시연
김대현
노완호
박진형
Original Assignee
Samsung Electronics Co., Ltd. (삼성전자주식회사)
Application filed by Samsung Electronics Co., Ltd. (삼성전자주식회사)
Priority to PCT/KR2016/007536 (WO2017014475A1)
Publication of KR20170009688A

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention provides an electronic device and a control method thereof. The electronic device includes a touch display including a main display area and a plurality of auxiliary display areas located on the sides of the main display area, and a processor that controls the touch display to provide a UI for inputting characters in the plurality of auxiliary display areas and, when a touch interaction on the UI is input, controls the touch display to provide the input character to the main display area of the touch display.

Description

BACKGROUND OF THE INVENTION

1. Field of the Invention

[0001] The present invention relates to an electronic device and a control method thereof, and more particularly, to an electronic device having a touch display including a main display area and a plurality of auxiliary display areas located on the sides of the main display area, and a control method thereof.

Recently, various electronic devices have been provided. In particular, wearable electronic devices worn by a user, such as a smart watch, are now available. Such a wearable electronic device is highly portable, but its touch display is small.

Inputting characters on a wearable electronic device with such a small touch display raises several problems. In particular, because the touch display is small, the keyboard UI for inputting characters is also small. As a result, character input is slow, and the likelihood of touching a wrong character key increases, so the user cannot input characters smoothly. Also, when the keyboard UI is displayed on a small touch screen, it covers most of the screen.

SUMMARY OF THE INVENTION The present invention has been made to solve the above-mentioned problems, and an object of the present disclosure is to provide an electronic device that provides a UI for inputting characters in a plurality of auxiliary display areas of a touch display so that a user can input characters more conveniently and intuitively, and a control method thereof.

According to an aspect of the present invention, an electronic device includes a touch display including a main display area and a plurality of auxiliary display areas located on the sides of the main display area; and a processor that controls the touch display to provide a UI for inputting characters in the plurality of auxiliary display areas of the touch display and, when a touch interaction on the UI is input, controls the touch display to provide the input character according to the touch interaction to the main display area.

The touch display may include a first curved auxiliary display area that extends integrally from the main display area, is bent toward a first side of the main display area, and is smaller than the main display area, and a second curved auxiliary display area that extends integrally from the main display area, is bent toward a second side of the main display area, and is smaller than the main display area.

In addition, the UI may include a plurality of UI elements, and each of the plurality of UI elements may correspond to at least one character.

The processor may control the touch display to provide a UI element corresponding to a vowel among the plurality of UI elements to the first curved auxiliary display area, and a UI element corresponding to a consonant among the plurality of UI elements to the second curved auxiliary display area.

In addition, when a user touch that touches one of the plurality of UI elements and then swipes in the direction of the main display area is detected, the processor may control the touch display to provide the character corresponding to the touched UI element to the main display area.

The processor may delete a character provided in the main display area when a user touch that swipes the main display area in a predetermined direction is detected.

In addition, when a user touch that simultaneously swipes the first curved auxiliary display area and the second curved auxiliary display area is detected, the processor may execute a specific application according to the user touch.

When a user touch that simultaneously swipes the first curved auxiliary display area and the second curved auxiliary display area in the left or right direction is detected, the processor may execute a camera application according to the user touch.

The processor may also detect the orientation of the electronic device while the camera application is running, and control the touch display to display a live view image on at least a part of the main display area and the plurality of auxiliary display areas based on the detected orientation of the electronic device.

According to another aspect of the present invention, there is provided a method of controlling an electronic device including a touch display that includes a main display area and a plurality of auxiliary display areas located on the sides of the main display area, the method including: providing a UI for inputting characters in the plurality of auxiliary display areas of the touch display; and, when a touch interaction on the UI is input, providing the input character according to the touch interaction to the main display area of the touch display.

The touch display may include a first curved auxiliary display area that extends integrally from the main display area, is bent toward a first side of the main display area, and is smaller than the main display area, and a second curved auxiliary display area that extends integrally from the main display area, is bent toward a second side of the main display area, and is smaller than the main display area.

In addition, the UI may include a plurality of UI elements, and each of the plurality of UI elements may correspond to at least one character.

The providing of the UI may include providing a UI element corresponding to a vowel among the plurality of UI elements to the first curved auxiliary display area, and providing a UI element corresponding to a consonant among the plurality of UI elements to the second curved auxiliary display area.

In addition, the providing to the main display area may include, when a user touch that touches one of the plurality of UI elements and then swipes in the direction of the main display area is detected, providing the character corresponding to the touched UI element to the main display area.

The method may further include deleting a character provided in the main display area when a user touch that swipes the main display area in a preset direction is detected.

The method may further include, when a user touch that simultaneously swipes the first curved auxiliary display area and the second curved auxiliary display area is detected, executing a specific application according to the user touch.

When a user touch that simultaneously swipes the first curved auxiliary display area and the second curved auxiliary display area in the left or right direction is detected, a camera application may be executed according to the user touch.

The method may further include detecting the orientation of the electronic device while the camera application is running; and displaying a live view image on at least a part of the main display area and the plurality of auxiliary display areas based on the detected orientation of the electronic device.

According to various embodiments of the present invention as described above, a user can more conveniently input characters and perform various functions using the auxiliary display area of the electronic device.

FIG. 1 is a block diagram schematically illustrating the configuration of an electronic device, according to an embodiment of the present disclosure;
FIGS. 2A to 2C are views illustrating types of a touch display, according to various embodiments of the present disclosure;
FIG. 3 is a block diagram illustrating in detail the configuration of an electronic device, according to an embodiment of the present disclosure;
FIG. 4 is a diagram illustrating an example of a software configuration of an electronic device, according to an embodiment of the present disclosure;
FIGS. 5A to 10 are views illustrating embodiments for performing various functions using an auxiliary display area of an electronic device, according to various embodiments of the present disclosure; and
FIG. 11 is a flowchart illustrating a method of controlling an electronic device, according to an embodiment of the present invention.

The embodiments are capable of various modifications and may take various forms, and specific embodiments are illustrated in the drawings and described in detail in the description. It is to be understood, however, that the description is not intended to limit the scope to the specific embodiments, but rather includes all modifications, equivalents, and alternatives falling within the spirit and scope of the disclosure. In the following description of the embodiments, detailed descriptions of related known technologies are omitted when they would obscure the gist of the disclosure.

The terms first, second, and the like may be used to describe various elements, but the elements should not be limited by these terms. These terms are used only for the purpose of distinguishing one element from another.

The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the scope of the claims. Singular expressions include plural expressions unless the context clearly dictates otherwise. In this application, terms such as "comprise" and "comprising" specify the presence of a stated feature, number, step, operation, element, component, or combination thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

In the embodiments, a 'module' or 'unit' performs at least one function or operation, and may be implemented in hardware, software, or a combination of hardware and software. In addition, a plurality of 'modules' or a plurality of 'units' may be integrated into at least one module and implemented by at least one processor (not shown), except for a 'module' or 'unit' that needs to be implemented by specific hardware.

Also, in the embodiments, a UI (user interface) may include at least one of a configuration for receiving a user interaction and a configuration for indicating notification information. The UI may include UI elements, and a UI element includes not only an element capable of interacting with a user and providing notification information, but also an element capable of providing feedback based on user input, such as visual, auditory, or olfactory feedback. A UI element may be expressed in the form of at least one of an image, text, and a moving image, and even a region that does not display such information may be referred to as a UI element if it can provide feedback according to user input.
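By way of illustration only, the notion of a UI element above — something that corresponds to characters, receives user interaction, and renders feedback — can be captured in a small data model. The following Kotlin sketch is a minimal reading of the definition; all type and member names are hypothetical, not part of the disclosed invention or any real framework.

```kotlin
// Hedged sketch of the "UI element" notion defined above; names are hypothetical.
enum class Feedback { VISUAL, AUDITORY, OLFACTORY }

interface UiElement {
    val characters: List<Char>          // each element corresponds to at least one character
    fun onUserInput(): Feedback         // feedback produced when the user interacts
}

class CharacterKey(override val characters: List<Char>) : UiElement {
    override fun onUserInput(): Feedback = Feedback.VISUAL
}
```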

Hereinafter, the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a block diagram illustrating an example of a basic configuration of an electronic device 100 for describing various embodiments of the present invention. As shown in FIG. 1, the electronic device 100 includes a touch display 110 and a processor 120. The electronic device 100 of FIG. 1 may be a wearable device such as a smart watch, but may also be implemented as various devices such as a TV, a PC, a laptop PC, a mobile phone, a tablet PC, a PDA, an MP3 player, a kiosk, a table display device, and the like. Devices such as mobile phones, tablet PCs, PDAs, MP3 players, and laptop PCs may be referred to as portable terminals, but they will be collectively referred to as electronic devices in this specification.

The touch display 110 outputs image data and receives a user's touch. In particular, the touch display 110 may include a main display area and a plurality of auxiliary display areas. Here, the main display area means an area disposed on the front surface of the electronic device 100, and the auxiliary display areas may be disposed on the sides of the main display area. The main display area and the auxiliary display areas may also be defined in relative terms; for example, a region having a relatively large size may be referred to as the main display area, and a region having a relatively small size may be referred to as an auxiliary display area.

In particular, the touch display 110 may be implemented as a flexible display, and may include a main display area provided on the front surface of the electronic device 100, a first curved auxiliary display area extending from the main display area and bent toward a first side of the electronic device 100, and a second curved auxiliary display area extending from the main display area and bent toward a second side of the electronic device 100. Meanwhile, the first and second auxiliary display areas may include only the portion that is bent and fixed toward the side of the electronic device 100, but this is only an example; each auxiliary display area may include a flat portion located on the front of the electronic device 100 as well as the bent portion.

At this time, the area of each of the plurality of auxiliary display areas may be smaller than that of the main display area. Further, the plurality of curved auxiliary display areas may form surfaces different from that of the main display area. For example, if the main display area is disposed on the front surface of the electronic device 100, the plurality of auxiliary display areas may be disposed on surfaces such as a right side surface, a left side surface, an upper side surface, and a lower side surface among the surfaces forming the appearance of the electronic device 100. The surface including the main display area and a surface including at least one auxiliary display area may be fixed to form an obtuse angle. The shape, position, and number of the auxiliary display areas may be variously implemented according to the embodiment, as will be described later in detail with reference to the drawings. Meanwhile, when an auxiliary display area is located on a side surface among the surfaces forming the appearance of the electronic device 100, it may be referred to as an edge area. The auxiliary display area may also be called by various other terms, such as a sub area or a side area, according to the embodiment.
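As a concrete illustration of the partitioning just described, the sketch below models a display split into one main area and several side areas, enforcing the size constraint from the text. It is a minimal sketch only; the class names, the pixel-based size measure, and the `Side` enumeration are assumptions introduced for illustration.

```kotlin
// Sides on which an auxiliary (edge) area can sit relative to the main area.
enum class Side { LEFT, RIGHT, TOP, BOTTOM }

data class DisplayArea(val widthPx: Int, val heightPx: Int) {
    val sizePx: Int get() = widthPx * heightPx
}

data class AuxiliaryArea(val area: DisplayArea, val side: Side, val bent: Boolean)

class TouchDisplayLayout(val main: DisplayArea, val auxiliaries: List<AuxiliaryArea>) {
    init {
        // The text requires each auxiliary area to be smaller than the main area.
        require(auxiliaries.all { it.area.sizePx < main.sizePx }) {
            "auxiliary areas must be smaller than the main area"
        }
    }
}
```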

FIGS. 2A to 2C are views showing examples of an external configuration of the electronic device 100 including the touch display 110 divided into a main display area 111 and two auxiliary display areas 113-1 and 113-2.

As shown in FIG. 2A, the main display area 111 is disposed on the front side, and the auxiliary display areas 113-1 and 113-2 are disposed on the upper and lower sides, respectively. The main display area 111 and the auxiliary display areas 113-1 and 113-2 are separated by boundary lines 115-1 and 115-2. In FIG. 2A, each of the auxiliary display areas 113-1 and 113-2 may be disposed at an obtuse angle with respect to the main display area 111 so as to be viewable from the front. However, the curved auxiliary display area is only one embodiment, and the auxiliary display areas may be implemented in other forms. For example, according to FIG. 2B, the main display area 111 and the plurality of auxiliary display areas 113-1 and 113-2 may be disposed on one surface; that is, one display may be divided into the main display area 111 and the two auxiliary display areas 113-1 and 113-2 by the boundary lines 115-1 and 115-2. As another example, according to FIG. 2C, the first auxiliary display area 113-1 of the plurality of auxiliary display areas 113-1 and 113-2 may be disposed on the same plane as the main display area 111, and the second auxiliary display area 113-2 may be bent toward the lower side of the main display area 111.

In particular, the touch display 110 displays a UI for inputting characters in the auxiliary display areas 113-1 and 113-2. The UI includes a plurality of UI elements, and each of the plurality of UI elements may correspond to at least one character.

The processor 120 controls the overall operation of the electronic device 100 using the main display area and the plurality of auxiliary display areas of the touch display 110.

Specifically, when a situation requiring character input is detected, the processor 120 may control the touch display 110 to provide a UI for inputting characters in the plurality of auxiliary display areas 113-1 and 113-2 of the touch display 110. At this time, the processor 120 may provide various types of UI according to the language or the character input method, as will be described later.

In particular, the processor 120 may control the touch display 110 to provide a UI element corresponding to a vowel among the plurality of UI elements to the first curved auxiliary display area 113-1, and a UI element corresponding to a consonant among the plurality of UI elements to the second curved auxiliary display area 113-2.

In particular, when a user touch that touches one of the plurality of UI elements and then swipes in the direction of the main display area 111 is detected, the processor 120 may control the touch display 110 to provide the character corresponding to the touched UI element to the main display area 111 of the touch display 110. This will be described later in detail with reference to the drawings.

When a user touch that swipes the main display area 111 in a predetermined direction is detected, the processor 120 may delete the input character. When a user touch that swipes the main display area 111 in the direction opposite to the predetermined direction is detected, the processor 120 may perform a space bar function.

When a user touch that simultaneously swipes the first curved auxiliary display area 113-1 and the second curved auxiliary display area 113-2 is detected, the processor 120 may execute a specific application according to the user touch. For example, when the user touches the first curved auxiliary display area 113-1 and the second curved auxiliary display area 113-2 and simultaneously swipes to the right, the processor 120 may execute a camera application.

The processor 120 may also detect the orientation of the electronic device 100 while the camera application is running, and control the touch display 110 to display a live view image on at least a part of the main display area 111 and the plurality of auxiliary display areas 113-1 and 113-2 based on the detected orientation.

FIG. 3 is a block diagram showing an example of a detailed configuration of an electronic device 200 implemented as a smart watch. As shown in FIG. 3, the electronic device 200 includes a memory 210, a GPS chip 220, a communication unit 230, a video processor 240, an audio processor 250, a button 260, a microphone 270, a camera 280, a speaker 290, a touch display 300, a sensor 310, and a processor 320.

The memory 210 may store various programs and data necessary for the operation of the electronic device 200. Specifically, the memory 210 may store programs and data for configuring various screens to be displayed in the main display area and the auxiliary display areas. The processor 320 displays content in the main display area and the auxiliary display areas of the touch display 300 using the programs and data stored in the memory 210; in other words, the processor 320 may control the touch display 300 to display the content. In particular, the processor 320 may display a UI for inputting characters in the auxiliary display areas of the touch display 300. In addition, when a user touch is input on the main display area, an auxiliary display area, or the boundary between them, the processor 320 performs a control operation corresponding to the touch.

The GPS chip 220 is a component for receiving a GPS signal from a GPS (Global Positioning System) satellite and calculating the current position of the electronic device 200. The processor 320 may calculate the user's location using the GPS chip 220 when using a navigation program or when the user's current location is otherwise required.

The communication unit 230 is a component that communicates with various types of external devices according to various types of communication methods. The communication unit 230 includes a Wi-Fi chip 231, a Bluetooth chip 232, a wireless communication chip 233, an NFC chip 234, and the like. The processor 320 communicates with various external devices using the communication unit 230.

The Wi-Fi chip 231 and the Bluetooth chip 232 perform communication using Wi-Fi and Bluetooth, respectively. When the Wi-Fi chip 231 or the Bluetooth chip 232 is used, various connection information such as an SSID and a session key may be transmitted and received first, a communication connection may be established using the connection information, and various information may then be transmitted and received. The wireless communication chip 233 is a chip that performs communication according to various communication standards such as IEEE, Zigbee, 3G (3rd Generation), 3GPP (3rd Generation Partnership Project), and LTE (Long Term Evolution). The NFC chip 234 is a chip that operates in an NFC (Near Field Communication) mode using the 13.56 MHz band among various RF-ID frequency bands such as 135 kHz, 13.56 MHz, 433 MHz, 860 to 960 MHz, and 2.45 GHz.

The video processor 240 is a component for processing video data included in content received through the communication unit 230 or in content stored in the memory 210. The video processor 240 may perform various image processing on the video data, such as decoding, scaling, noise filtering, frame rate conversion, and resolution conversion.

The audio processor 250 is a component for processing audio data included in content received through the communication unit 230 or in content stored in the memory 210. The audio processor 250 may perform various processing on the audio data, such as decoding, amplification, and noise filtering.

When a playback program for multimedia content is executed, the processor 320 may drive the video processor 240 and the audio processor 250 to play the content. The touch display 300 may display the image frames generated by the video processor 240 in at least one of the main display area and the auxiliary display areas, and the speaker 290 outputs the audio data generated by the audio processor 250.

The button 260 may be any of various types of buttons, such as a mechanical button, a touch pad, or a wheel, formed in an arbitrary area such as the front or side of the body exterior of the electronic device 200.

The microphone 270 is a component for receiving a user's voice or other sound and converting it into audio data. The processor 320 may use the user's voice input through the microphone 270 during a call, or convert the user's voice into audio data and store it in the memory 210.

The camera 280 is a component for capturing a still image or a moving image under the control of the user. The camera 280 may be implemented as a plurality of cameras, such as a front camera, a rear camera, and a side camera. In addition, the camera 280 may be used as a means for acquiring an image of the user in an embodiment for tracking the user's gaze.

When the camera 280 and the microphone 270 are provided, the processor 320 may perform a control operation according to a user's voice input through the microphone 270 or a user's motion recognized by the camera 280. That is, the electronic device 200 may operate in a motion control mode or a voice control mode. When operating in the motion control mode, the processor 320 activates the camera 280 to capture the user, tracks changes in the user's motion, and performs the corresponding control operation. When operating in the voice control mode, the processor 320 may operate in a voice recognition mode in which the user's voice input through the microphone 270 is analyzed and a control operation is performed according to the analyzed voice.

In the electronic device 200 supporting the motion control mode or the voice control mode, a voice recognition technique or a motion recognition technique may be used in the various embodiments described above. For example, when the user makes a motion that selects an object displayed on the home screen or utters a voice command corresponding to the object, it is determined that the object is selected, and the control operation matched to the object is performed.

The touch display 300 may be divided into a main display area and a plurality of auxiliary display areas as described above. The touch display 300 may be implemented as various types of displays, such as a liquid crystal display (LCD), an organic light emitting diode (OLED) display, or a plasma display panel (PDP). The touch display 300 may also include a driving circuit, which may be implemented in the form of an a-Si TFT, a low temperature poly silicon (LTPS) TFT, or an organic TFT (OTFT), a backlight unit, and the like. Meanwhile, the touch display 300 may be implemented as a touch screen in combination with a touch sensor included in the sensor 310.

The touch sensor may include at least one of a touch panel and a pen recognition panel. The touch panel senses a user's finger gesture input and outputs a touch event value corresponding to the sensed touch signal. The touch panel may be mounted under both the main display area and the auxiliary display areas of the touch display 300, or only under the auxiliary display areas of the touch display 300. There are two ways the touch panel may sense the user's finger gesture input: the capacitive type and the resistive (pressure-sensitive) type. The capacitive type calculates touch coordinates by sensing the minute electricity generated by the user's body. The resistive type includes two electrode plates built into the touch panel, and calculates touch coordinates by detecting the current that flows when the upper and lower plates contact each other at the touched point.

The pen recognition panel senses the user's pen gesture input according to the operation of a touch pen (e.g., a stylus pen or a digitizer pen) and may output a pen proximity event value or a pen touch event value. The pen recognition panel may be mounted under at least one of the main display area and the plurality of auxiliary display areas of the touch display 300. The pen recognition panel may be implemented, for example, using the EMR method, and can detect a touch or proximity input according to a change in the intensity of an electromagnetic field caused by the proximity or touch of the pen. More specifically, the pen recognition panel may include an electromagnetic induction coil sensor (not shown) having a grid structure and an electronic signal processor (not shown) that sequentially provides an AC signal having a predetermined frequency to each loop coil of the electromagnetic induction coil sensor. When a pen incorporating a resonant circuit is present near a loop coil of the pen recognition panel, the magnetic field transmitted from the loop coil generates a current in the resonant circuit of the pen based on mutual electromagnetic induction. Based on this current, an induction magnetic field is generated from the coil constituting the resonant circuit in the pen, and the pen recognition panel detects this induction magnetic field in the loop coil in a signal receiving state, so that the approach position or touch position of the pen can be detected.

Meanwhile, although the touch display of the electronic device 200 according to an embodiment of the present invention has been described as being implemented as a single flexible display including a main display area and a plurality of auxiliary display areas, the main display area and the plurality of auxiliary display areas may also be formed using a plurality of general displays. When implemented with general displays, the plurality of displays may be connected to each other to form a curved display.

The sensor 310 senses various states of the electronic device 200 and user interactions. In particular, the sensor 310 may sense the state in which the user grips the electronic device 200. Specifically, the electronic device 200 may be rotated or tilted in various directions. At this time, the sensor 310 may use at least one of various sensors, such as a geomagnetic sensor, a gyro sensor, and an acceleration sensor, to detect the inclination of the electronic device 200 held by the user based on its rotational motion or the direction of gravity. In addition, the sensor 310 may sense the touch area and the grip pressure distribution on the auxiliary display areas.
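For illustration, one common way to derive an inclination from an accelerometer's gravity reading, as mentioned above, is shown below. This is a minimal sketch, not the patent's algorithm: the function name, axis convention, and the use of a near-static sample are assumptions.

```kotlin
import kotlin.math.atan2
import kotlin.math.sqrt

// Roll angle of the device derived from a near-static accelerometer sample
// (ax, ay, az in m/s^2). A nonzero angle means gravity leaks into the x axis,
// i.e., the device is tilted about its long axis. (Assumed convention.)
fun rollDegrees(ax: Double, ay: Double, az: Double): Double =
    Math.toDegrees(atan2(ax, sqrt(ay * ay + az * az)))

fun main() {
    println(rollDegrees(ax = 4.9, ay = 0.0, az = 8.5))  // ~30 degrees
}
```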

The processor 320 includes a RAM 321, a ROM 322, a CPU 323, a GPU (Graphic Processing Unit) 324, and a bus 325. The RAM 321, the ROM 322, the CPU 323, and the GPU 324 may be connected to one another via the bus 325.

The CPU 323 accesses the memory 210 and performs booting using the O/S stored in the memory 210, and performs various operations using the various programs, content, and data stored in the memory 210.

The ROM 322 stores a command set for booting the system. The CPU 323 copies the O/S stored in the memory 210 to the RAM 321 according to the commands stored in the ROM 322, executes the O/S, and boots the system. When booting is completed, the CPU 323 copies the various programs stored in the memory 210 to the RAM 321, executes the programs copied to the RAM 321, and performs various operations. When the booting of the electronic device 200 is completed, the GPU 324 displays a UI screen in the active areas of the main display area and the auxiliary display areas. Specifically, the GPU 324 may generate a screen including various objects, such as icons, images, and text, using an operation unit (not shown) and a rendering unit (not shown). The operation unit calculates attribute values such as the coordinates, shape, size, and color with which each object is to be displayed according to the layout of the screen. The rendering unit generates screens of various layouts including the objects based on the attribute values calculated by the operation unit. The screens generated by the rendering unit are provided to the touch display 300 and displayed in the main display area and the auxiliary display areas, respectively.

Although not shown in FIG. 3, the electronic device 200 may further include a USB port to which a USB connector can be connected, various external input ports for connecting to various external terminals such as a headset, a mouse, and a LAN, a DMB chip for receiving and processing a DMB (Digital Multimedia Broadcasting) signal, various sensors, and the like.

Meanwhile, as described above, the memory 210 may store various programs. FIG. 4 is a diagram for explaining the structure of the software stored in the electronic device 200. As shown in FIG. 4, software including an OS 410, a kernel 420, middleware 430, and applications 440 may be stored in the memory 210.

The operating system (OS) 410 controls and manages the overall operation of the hardware. That is, the OS 410 is a layer that performs basic functions such as hardware management, memory management, and security.

The kernel 420 serves as a channel for transmitting various signals including the touch signal sensed by the sensor 310 to the middleware 430.

The middleware 430 includes various software modules for controlling the operation of the electronic device 200. As shown in FIG. 4, the middleware 430 includes an X11 module 430-1, an APP manager 430-2, a connection manager 430-3, a security module 430-4, a system manager 430-5, a multimedia framework 430-6, a main UI framework 430-7, a window manager 430-8, and a sub UI framework 430-9.

The X11 module 430-1 is a module that receives various event signals from the various hardware provided in the electronic device 200. Here, an event may be variously defined, such as an event in which a user gesture is detected, an event in which a system alarm occurs, or an event in which a specific program is executed or terminated.

The APP manager 430-2 is a module that manages the execution states of the various applications 440 installed in the memory 210. When an application execution event is detected by the X11 module 430-1, the APP manager 430-2 calls and executes the application corresponding to the event.

The connection manager 430-3 is a module for supporting a wired or wireless network connection. The connection manager 430-3 may include various detailed modules such as a DNET module, a UPnP module, and the like.

The security module 430-4 is a module that supports hardware certification, request permission, secure storage, and the like.

The system manager 430-5 monitors the states of the components in the electronic device 200 and provides the monitoring results to the other modules. For example, when the remaining battery level is low, an error occurs, or a communication connection is lost, the system manager 430-5 may provide the monitoring result to the main UI framework 430-7 or the sub UI framework 430-9 to output a notification message or a notification sound.

The multimedia framework 430-6 is a module for playing multimedia content stored in the electronic device 200 or provided from an external source. The multimedia framework 430-6 may include a player module, a camcorder module, a sound processing module, and the like. Accordingly, it can play various multimedia content, generating and reproducing screens and sounds.

The main UI framework 430-7 is a module for providing the various UIs to be displayed in the main display area of the touch display 300, and the sub UI framework 430-9 is a module for providing the various UIs to be displayed in the auxiliary display areas. Each of the main UI framework 430-7 and the sub UI framework 430-9 may include an image compositor module for composing various objects, a coordinate compositor module for calculating the coordinates at which each object is to be displayed, a 2D/3D UI toolkit that provides tools for constructing a UI in 2D or 3D form, and the like.

The window manager 430-8 may detect a touch event made with the user's body or a pen, or other input events. When such an event is detected, the window manager 430-8 delivers an event signal to the main UI framework 430-7 or the sub UI framework 430-9 so that an operation corresponding to the event is performed.
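The routing the window manager performs — deciding whether a touch event belongs to the main-area framework or the sub-area framework — can be sketched as below. This is an illustrative assumption, not the disclosed implementation: the types stand in for modules 430-7 and 430-9, and the row-based region test is hypothetical.

```kotlin
// Hedged sketch of window-manager dispatch; all names are hypothetical.
data class TouchEvent(val x: Int, val y: Int)

interface UiFramework { fun handle(e: TouchEvent) }

class WindowManager(
    private val mainUi: UiFramework,    // stand-in for the main UI framework (430-7)
    private val subUi: UiFramework,     // stand-in for the sub UI framework (430-9)
    private val mainAreaHeight: Int,    // rows [0, mainAreaHeight) belong to the main area
) {
    // Route the event to whichever framework owns the touched region.
    fun dispatch(e: TouchEvent) =
        if (e.y in 0 until mainAreaHeight) mainUi.handle(e) else subUi.handle(e)
}
```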

In addition, various program modules may be stored, such as a writing module that draws a line along the drag trajectory when the user touches and drags the screen, and an angle calculation module that calculates a pitch angle, a roll angle, a yaw angle, and the like based on the sensor values sensed by the sensor 310.

The application module 440 includes applications 440-1 to 440-n for supporting various functions, for example, program modules providing various services such as a navigation program module, a game module, an electronic book module, a calendar module, and an alarm management module. These applications may be installed by default, or may be installed and used arbitrarily by the user. When an object is selected, the CPU 323 may execute the application corresponding to the selected object using the application module 440.

The software structure shown in FIG. 4 is merely an example and is not necessarily limited thereto; parts may be omitted, modified, or added as necessary. For example, the memory 210 may additionally store various programs such as a sensing module for analyzing signals sensed by various sensors, a messaging module including a messenger program, an SMS (Short Message Service) & MMS (Multimedia Message Service) program, and an e-mail program, a call info aggregator program module, a VoIP module, and a web browser module.

Meanwhile, as described above, the electronic device 200 may be implemented as various types of devices, such as a mobile phone, a tablet PC, a laptop PC, a PDA, an MP3 player, an electronic frame device, a TV, a PC, and a kiosk. Accordingly, the configurations described with reference to FIGS. 3 and 4 may be variously modified depending on the type of the electronic device 200.

Hereinafter, various embodiments of the present invention will be described with reference to the drawings.

When a situation requiring character input is detected, the processor 320 may control the touch display 300 to provide a UI for inputting characters in the plurality of auxiliary display areas. For example, as shown in FIG. 5A, when the user touches a character input window 505 provided in the main display area 510, the processor 320 may determine that a situation requiring character input has occurred and control the touch display 300 to provide a UI for inputting characters in the plurality of auxiliary display areas 520-1 and 520-2.

In the above-described embodiment, touching the character input window 505 is treated as the situation requiring character input, but this is only an example. For example, various other situations, such as inputting a telephone number or composing a text message, may also be judged to be situations requiring character input.

Meanwhile, the UI for inputting characters includes a plurality of UI elements, as shown in FIG. 5B, and each UI element may correspond to at least one character, symbol, or function. For example, the first UI element may correspond to ".", ",", "?", and "!"; the second UI element to "a", "b", and "c"; the third UI element to "d", "e", and "f"; the fourth UI element to "g", "h", and "i"; the fifth UI element to "j", "k", and "l"; the sixth UI element to "m", "n", and "o"; the seventh UI element to "p", "q", "r", and "s"; the eighth UI element to "t", "u", and "v"; the ninth UI element to "w", "x", "y", and "z"; and the tenth UI element to an "enter" function. The character arrangement shown in FIG. 5B is only one embodiment, and other character arrangements may also be applied to the present invention.
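This arrangement is essentially the familiar multi-tap phone keypad. The table below restates it as a lookup structure for illustration; the variable name is hypothetical, and modeling "enter" as a newline is an assumption.

```kotlin
// The character arrangement of FIG. 5B as a lookup table (index 0 = first UI element).
val keypad: List<List<String>> = listOf(
    listOf(".", ",", "?", "!"),   // 1st UI element: punctuation
    listOf("a", "b", "c"),        // 2nd
    listOf("d", "e", "f"),        // 3rd
    listOf("g", "h", "i"),        // 4th
    listOf("j", "k", "l"),        // 5th
    listOf("m", "n", "o"),        // 6th
    listOf("p", "q", "r", "s"),   // 7th
    listOf("t", "u", "v"),        // 8th
    listOf("w", "x", "y", "z"),   // 9th
    listOf("\n"),                 // 10th: "enter" function, modeled here as a newline
)
```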

The processor 320 can input characters according to the user's touch input through the UI displayed in the plurality of auxiliary display areas 520-1 and 520-2.

In particular, when one character corresponds to one UI element, the processor 320 may input the character corresponding to the touched UI element when a user touch that touches one of the plurality of UI elements and then swipes toward the main display area 510 is detected.

However, when a plurality of characters correspond to one UI element, the processor 320 may input a character through a combination of touch input and swipe input.

Specifically, as shown in FIG. 5B, while the UI for inputting characters is displayed in the plurality of auxiliary display areas 520-1 and 520-2, the processor 320 may detect, through the sensor 310, a user touch on a UI element.

When the user touches the fourth UI element once and then touches the fourth UI element and swipes in the direction of the main display area 510 as shown in FIG. 5D, the processor 320 may input "g" into the character input window 505. At this time, when the user has touched the fourth UI element once, the processor 320 may control the touch display 300 to provide an indicator on the fourth UI element indicating that "g" is selected.

However, when the user touches the fourth UI element twice and then touches the fourth UI element and swipes in the direction of the main display area 510 as shown in FIG. 5D, the processor 320 may input "h" into the character input window 505. At this time, when the user has touched the fourth UI element twice, the processor 320 may control the touch display 300 to provide an indicator on the fourth UI element indicating that "h" is selected.

When the user touches the fourth UI element three times and then touches the fourth UI element and swipes in the direction of the main display area 510 as shown in FIG. 5D, the processor 320 may input "i" into the character input window 505. At this time, when the user has touched the fourth UI element three times, the processor 320 may control the touch display 300 to provide an indicator on the fourth UI element indicating that "i" is selected.

As described above, the user can input characters with one hand through the touch input and the swipe input, and can perform character input without occluding the main display area 510. The swipe input can also provide an entertaining element, like flicking the beads of an abacus.
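The tap-then-swipe commit behavior of FIGS. 5C-5D reduces to a small state machine per UI element. The sketch below is a minimal, hypothetical rendering of that logic (class and method names are not from the patent), assuming taps cycle the candidate and a swipe toward the main area commits it:

```kotlin
// Multi-tap state for one UI element: each tap advances the candidate character
// (and would drive the on-key indicator); a swipe toward the main area commits it.
class MultiTapKey(private val chars: List<String>) {
    private var taps = 0

    fun onTap(): String {
        taps += 1
        return chars[(taps - 1) % chars.size]   // candidate shown as the indicator
    }

    fun onSwipeTowardMain(): String? {
        if (taps == 0) return null
        val committed = chars[(taps - 1) % chars.size]
        taps = 0
        return committed
    }
}

fun main() {
    val fourthElement = MultiTapKey(listOf("g", "h", "i"))
    fourthElement.onTap()                       // indicator: "g"
    fourthElement.onTap()                       // indicator: "h"
    println(fourthElement.onSwipeTowardMain())  // commits and prints "h"
}
```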

In addition, the processor 320 may perform various character input functions according to a user's touch input in the main display area while the UI for inputting characters is displayed in the plurality of auxiliary display areas.

For example, as shown in FIG. 6A, when a character has been input to the main display area 610 and a user touch that touches one area of the main display area 610 and then swipes in the left direction 630 is detected, the processor 320 may delete the most recently input character. At this time, if the swipe input is detected more than a predetermined number of times, the processor 320 may delete all of the input characters.

As another example, as shown in FIG. 6B, when a user touch that touches one area of the main display area 610 and then swipes in the right direction 640 is detected, the processor 320 may perform a spacing function corresponding to a space bar.

Meanwhile, although the deletion function and the space bar function have been described in the above embodiments, various other functions, such as a copy function and a cut function, may also be performed according to user input in the main display area.
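The main-area gestures of FIGS. 6A-6B amount to a mapping from swipe direction to an editing action. A minimal sketch follows; the enum, function name, and use of a StringBuilder as the composed text are assumptions for illustration:

```kotlin
enum class SwipeDirection { LEFT, RIGHT }

// Apply one main-area swipe to the text being composed.
fun applyEditGesture(text: StringBuilder, dir: SwipeDirection): StringBuilder = when (dir) {
    SwipeDirection.LEFT ->                    // delete the most recently input character
        text.apply { if (isNotEmpty()) deleteCharAt(length - 1) }
    SwipeDirection.RIGHT -> text.append(' ')  // space bar function
}

fun main() {
    val text = StringBuilder("hi")
    applyEditGesture(text, SwipeDirection.RIGHT)  // "hi "
    applyEditGesture(text, SwipeDirection.LEFT)   // back to "hi"
    println(text)
}
```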

In the above-described embodiment, the UI for inputting characters is for English; however, a UI for inputting characters of another language may be provided. For example, as shown in FIGS. 7A and 7B, a UI for inputting Korean can be provided.

In particular, FIG. 7A shows a UI for one Korean input method, in which UI elements for inputting consonants may be arranged in the first auxiliary display area 720-1 and UI elements for inputting vowels may be arranged in the second auxiliary display area 720-2. FIG. 7B shows a UI for another Korean input method.

In addition, a UI for inputting various text other than letters, such as numerals and special symbols, may be displayed in the plurality of auxiliary display areas.

Also, when a user touch that touches one point of each of the plurality of auxiliary display areas and then swipes in the same direction is detected, the processor 320 may execute a predetermined application according to the user touch. Specifically, as shown in FIG. 8A, when a swipe input 830 made while touching one point of each of the plurality of auxiliary display areas 820-1 and 820-2 is detected while a clock screen is displayed in the main display area 810, the processor 320 may execute the camera application and control the touch display 300 to display a live view image, as shown in FIG. 8B.

At this time, the processor 320 may execute a different application according to the position of the auxiliary display area in which the swipe input is detected or the direction of the swipe input.

For example, when a user touch that touches one point of each of the plurality of auxiliary display areas 820-1 and 820-2 and then swipes in the left direction is detected while the clock screen is displayed in the main display area 810, the processor 320 may execute another predetermined application.

As another example, when a user touch that touches one point of the first auxiliary display area 820-1 of the plurality of auxiliary display areas 820-1 and 820-2 and then swipes is detected while the clock screen is displayed in the main display area 810, the processor 320 may execute a schedule application.

As another example, when a user touch that touches one point of the second auxiliary display area 820-2 of the plurality of auxiliary display areas 820-1 and 820-2 and then swipes is detected while the clock screen is displayed in the main display area 810, the processor 320 may execute a memo application.

As described above, by executing a specific application through a swipe input sensed in the plurality of auxiliary display areas, the user can launch the application more quickly, as if using a shortcut key.

In the above-described embodiment, the type of user touch for quickly executing an application is a swipe input. However, the present invention is not limited thereto, and other types of user input, such as a long press input, may also be applied.

Also, the application executed according to a user touch may be set at the time of manufacture, but this is only an example; it may also be set by the user.
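Taken together, this shortcut behavior is a lookup from (touched areas, swipe direction) to an application, with user-editable bindings per the paragraph above. The sketch below is illustrative only: the names are hypothetical, and the direction assigned to the single-area gestures is an assumption, since the text does not specify it.

```kotlin
enum class Direction { LEFT, RIGHT }

// Which auxiliary areas were swiped and in what direction.
data class Gesture(val firstAux: Boolean, val secondAux: Boolean, val dir: Direction)

// Factory defaults; per the text, the user may reassign these bindings.
val shortcuts: MutableMap<Gesture, String> = mutableMapOf(
    Gesture(firstAux = true, secondAux = true, dir = Direction.RIGHT) to "camera",
    Gesture(firstAux = true, secondAux = false, dir = Direction.RIGHT) to "schedule",
    Gesture(firstAux = false, secondAux = true, dir = Direction.RIGHT) to "memo",
)

fun appFor(g: Gesture): String? = shortcuts[g]

fun main() {
    println(appFor(Gesture(firstAux = true, secondAux = true, dir = Direction.RIGHT))) // camera
}
```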

Further, while the camera application is being executed, the processor 320 may detect the orientation of the electronic device 200 through the sensor 310 and control the touch display 300 to display a live view image on at least a part of the main display area and the plurality of auxiliary display areas based on the detected orientation of the electronic device. As shown in FIG. 9A, the touch display 300 may include a first main display area 910-1, a second main display area 910-2, a first auxiliary display area 920-1, and a second auxiliary display area 920-2.

The processor 320 may detect the orientation of the electronic device 200 while the camera application is running. At this time, the processor 320 may detect the orientation of the electronic device 200 through a motion sensor, but this is merely an example; the camera 280 may also photograph the user, and the orientation of the electronic device 200 may be detected by analyzing the captured image.

The processor 320 may control the touch display 300 to display a live view image on at least a portion of the touch display 300 according to the orientation of the electronic device 200.

Specifically, when the camera application is first executed, the processor 320 may control the touch display 300 to display the live view image in the first main display area 910-1 and the second main display area 910-2, as shown in FIG. 9B.

When the user wearing the electronic device 200 on his or her left hand rotates the electronic device 200 counterclockwise, the processor 320 may control the touch display 300 to display the live view image in the first main display area 910-1 and the first auxiliary display area 920-1. As another example, when the user wearing the electronic device 200 on his or her left hand rotates the electronic device 200 clockwise, the processor 320 may control the touch display 300 to display the live view image in the second main display area 910-2 and the second auxiliary display area 920-2. In other words, the processor 320 may detect a change in the posture of the electronic device 200 and control the touch display 300 to move the live view image to another area of the touch display 300.

Meanwhile, in the above-described embodiment, the live view image is moved according to the orientation of the electronic device 200. However, the present invention is not limited thereto, and various types of screens (for example, a web page screen) may likewise be moved according to the orientation of the electronic device 200.

As described above, by moving and displaying the live view image according to the orientation of the electronic device, the user can more easily check the live view image and capture an image.
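The orientation-driven placement above can be keyed off an inclination angle such as the roll computed earlier. The sketch below is a hedged illustration, not the disclosed logic: the region names, the worn-on-left-wrist convention, and the 15-degree thresholds are all assumptions.

```kotlin
enum class Region { MAIN_1, MAIN_2, AUX_1, AUX_2 }

// Choose the live view placement from a roll angle in degrees (thresholds assumed).
// Level: both main areas (FIG. 9B). Counterclockwise rotation: first main + first
// auxiliary area. Clockwise rotation: second main + second auxiliary area.
fun liveViewRegions(rollDeg: Double): Set<Region> = when {
    rollDeg < -15.0 -> setOf(Region.MAIN_1, Region.AUX_1)
    rollDeg > 15.0  -> setOf(Region.MAIN_2, Region.AUX_2)
    else            -> setOf(Region.MAIN_1, Region.MAIN_2)
}
```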

When a plurality of cameras 280 are provided in the electronic device 200, the processor 320 may control the touch display 300 to display a plurality of live view images photographed by the plurality of cameras in a plurality of main display areas. For example, as shown in FIG. 10, the processor 320 may control the touch display 300 to display the live view image photographed by the first camera in the first main display area 1010-1, and the live view image photographed by the second camera in the second main display area 1010-2. In this case, a UI for capturing the live view image of the first camera may be displayed in the first auxiliary display area 1020-1, and a UI for capturing the live view image of the second camera may be displayed in the second auxiliary display area 1020-2.

As described above, by dividing the main display area 1010 of the electronic device into a plurality of areas and photographing a plurality of live view images, photographing can be performed in a greater variety of environments.
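For the two-camera case in FIG. 10, each camera's stream pairs with one main area for preview and one auxiliary area for its capture UI. A minimal sketch of that pairing, with hypothetical type names and string labels taken from the figure's reference numerals:

```kotlin
// FIG. 10 pairing: each camera previews in one main area and gets its capture
// UI in the matching auxiliary area. Region labels follow the figure's numerals.
data class CameraPane(val cameraId: Int, val previewRegion: String, val shutterUiRegion: String)

val dualCameraLayout = listOf(
    CameraPane(cameraId = 1, previewRegion = "1010-1", shutterUiRegion = "1020-1"),
    CameraPane(cameraId = 2, previewRegion = "1010-2", shutterUiRegion = "1020-2"),
)
```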

FIG. 11 is a flowchart for explaining a control method of the electronic device 100, according to an embodiment of the present invention. The electronic device 100 according to an embodiment of the present invention may include a touch display including a main display area and a plurality of auxiliary display areas located on the sides of the main display area.

First, the electronic device 100 provides a UI for inputting characters in the plurality of auxiliary display areas of the touch display (S1110). At this time, the UI for inputting characters includes a plurality of UI elements, and each of the plurality of UI elements may correspond to at least one character, number, symbol, or special character.

Then, the electronic device 100 determines whether a touch interaction is input through the UI (S1120). At this time, the touch interaction may be a swipe interaction that touches one of the plurality of UI elements and swipes in the direction of the main display area.

When the touch interaction is input through the UI (S1120-Y), the electronic device 100 inputs a character according to the touch interaction (S1130). That is, the electronic device 100 may provide the input character according to the touch interaction to the main display area 111 of the touch display 110.
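The flowchart of FIG. 11 reduces to a simple flow: show the UI (S1110), wait for a committing touch interaction (S1120), and provide the character to the main display area (S1130). A minimal sketch under that reading, with the interaction source left abstract and all names hypothetical:

```kotlin
// S1110: show the UI; S1120: each committed touch interaction yields a character;
// S1130: provide the character to the main display area.
interface CharacterInputUi {
    fun show()
    fun showInMainArea(c: Char)
}

fun runInputFlow(ui: CharacterInputUi, interactions: Sequence<Char>) {
    ui.show()                          // S1110
    for (c in interactions) {          // S1120
        ui.showInMainArea(c)           // S1130
    }
}
```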

According to the various embodiments of the present invention described above, a user can input characters more conveniently using the auxiliary display areas of the electronic device 100 without disturbing the view of the main display area.

Meanwhile, the control methods of the electronic device according to the various embodiments described above may be implemented as a program and provided to a display device or an input device. In particular, a program including a control method of a display device may be stored in a non-transitory computer readable medium.

A non-transitory readable medium is not a medium that stores data for a short period of time, such as a register, a cache, or a memory, but a medium that stores data semi-permanently and is readable by a device. Specifically, the various applications or programs described above may be stored in a non-transitory readable medium such as a CD, a DVD, a hard disk, a Blu-ray disc, a USB memory, a memory card, or a ROM.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments. It will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention.

110, 300: Touch display 120, 320: Processor
210: memory 220: GPS chip
230: communication unit 240: video processor
250: Audio Processor 260: Button
270: microphone 280: camera
290: speaker 310: sensor

Claims (18)

1. An electronic device comprising:
a touch display including a main display area and a plurality of auxiliary display areas located on sides of the main display area; and
a processor configured to control the touch display to provide a UI for inputting characters in the plurality of auxiliary display areas of the touch display and, when a touch interaction on the UI is input, to control the touch display to provide the input character according to the touch interaction to the main display area.
2. The electronic device according to claim 1, wherein the touch display includes:
a first curved auxiliary display area extending integrally from the main display area, bent toward a first side of the main display area, and smaller than the main display area; and
a second curved auxiliary display area extending integrally from the main display area, bent toward a second side of the main display area, and smaller than the main display area.

3. The electronic device according to claim 2, wherein the UI includes a plurality of UI elements, and each of the plurality of UI elements corresponds to at least one character.

4. The electronic device according to claim 3, wherein the processor controls the touch display to provide a UI element corresponding to a vowel among the plurality of UI elements in the first curved auxiliary display area and to provide a UI element corresponding to a consonant among the plurality of UI elements in the second curved auxiliary display area.

5. The electronic device according to claim 3, wherein, when a user touch that touches one of the plurality of UI elements and swipes in the direction of the main display area is detected, the processor controls the touch display to provide the character corresponding to the touched UI element to the main display area.

6. The electronic device according to claim 1, wherein the processor deletes a character provided in the main display area when a user touch that swipes the main display area in a preset direction is detected.

7. The electronic device according to claim 2, wherein the processor executes a specific application according to a user touch when the user touch that simultaneously swipes the first curved auxiliary display area and the second curved auxiliary display area is detected.

8. The electronic device according to claim 7, wherein the processor executes a camera application according to the user touch when the user touch simultaneously swipes the first curved auxiliary display area and the second curved auxiliary display area in a left-right direction.

9. The electronic device according to claim 8, wherein the processor detects an orientation of the electronic device during execution of the camera application and controls the touch display to display a live view image on at least a portion of the main display area and the plurality of auxiliary display areas based on the detected orientation of the electronic device.
10. A control method of an electronic device including a touch display including a main display area and a plurality of auxiliary display areas located at sides of the main display area, the control method comprising:
providing a UI for inputting characters in the plurality of auxiliary display areas of the touch display; and
when a touch interaction for the UI is input, providing a character input according to the touch interaction to the main display area of the touch display.

11. The control method according to claim 10, wherein the touch display includes:
a first curved auxiliary display area extending integrally from the main display area, bent toward a first side of the main display area, and smaller than the main display area; and
a second curved auxiliary display area extending integrally from the main display area, bent toward a second side of the main display area, and smaller than the main display area.

12. The control method according to claim 11, wherein the UI includes a plurality of UI elements, and each of the plurality of UI elements corresponds to at least one character.

13. The control method according to claim 12, wherein the providing of the UI includes providing a UI element corresponding to a vowel among the plurality of UI elements in the first curved auxiliary display area and providing a UI element corresponding to a consonant among the plurality of UI elements in the second curved auxiliary display area.

14. The control method according to claim 12, wherein the providing to the main display area includes, when a user touch that touches one of the plurality of UI elements and swipes in the direction of the main display area is detected, providing the character corresponding to the touched UI element to the main display area.

15. The control method according to claim 10, further comprising deleting a character provided in the main display area when a user touch that swipes the main display area in a preset direction is detected.

16. The control method according to claim 11, further comprising executing a specific application according to a user touch when the user touch that simultaneously swipes the first curved auxiliary display area and the second curved auxiliary display area is detected.

17. The control method according to claim 16, wherein the executing includes executing a camera application according to the user touch when the user touch simultaneously swipes the first curved auxiliary display area and the second curved auxiliary display area in a left-right direction.

18. The control method according to claim 17, further comprising:
detecting an orientation of the electronic device while the camera application is running; and
displaying a live view image on at least a part of the main display area and the plurality of auxiliary display areas based on the detected orientation of the electronic device.
KR1020150168552A 2015-07-17 2015-11-30 Electronic device and Method for controlling the electronic device thereof KR20170009688A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2016/007536 WO2017014475A1 (en) 2015-07-17 2016-07-12 Electronic device and control method therefor

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562193806P 2015-07-17 2015-07-17
US62/193,806 2015-07-17

Publications (1)

Publication Number Publication Date
KR20170009688A true KR20170009688A (en) 2017-01-25

Family

ID=57991676

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150168552A KR20170009688A (en) 2015-07-17 2015-11-30 Electronic device and Method for controlling the electronic device thereof

Country Status (1)

Country Link
KR (1) KR20170009688A (en)
