KR20160005416A - Mobile terminal - Google Patents
- Publication number
- KR20160005416A (application number KR1020140084290A)
- Authority
- KR
- South Korea
- Prior art keywords
- touch
- mobile terminal
- user
- information
- input
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B1/00—Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
- H04B1/38—Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
- H04B1/40—Circuits
Description
The present invention relates to a mobile terminal.
Terminals can be divided into mobile (portable) terminals and stationary terminals according to whether they can be moved. Mobile terminals can further be divided into handheld terminals and vehicle-mounted terminals according to whether the user can carry them directly.
The functions of mobile terminals are becoming diversified. Examples include data and voice communication, photographing and video recording through a camera, voice recording, music file playback through a speaker system, and outputting an image or video on a display unit. Some terminals additionally have an electronic game function or a multimedia player function. In particular, modern mobile terminals can receive multicast signals that provide visual content such as broadcasts, videos, and television programs.
As these functions become more diversified, such terminals are implemented in the form of multimedia devices with complex functions such as capturing photographs and moving pictures, playing music or video files, gaming, and receiving broadcasts.
In order to support and enhance the functions of such terminals, improvement of the structural and/or software aspects of the terminal may be considered.
The present invention proposes a terminal with which a user can easily enlarge the screen using a touch gesture for operating the terminal, easily check information about the corresponding content or application, and execute commands.
The mobile terminal of the present embodiment is a terminal capable of touch input. The terminal includes a camera for capturing images, a display unit for displaying content related to a running application or an image captured by the camera, and a control unit for controlling the content displayed through the display unit, the content being at least one of text, an image, and a moving picture. When a first touch and a second touch that includes at least part of the first touch area are performed, the control unit recognizes the input as a preset touch gesture and enlarges the displayed content or displays information about the content.
The effects of the mobile terminal according to the present invention are as follows.
There is an advantage that various operations can be performed from a touch gesture that the user inputs with one hand to operate the terminal.
Further, the touch gesture input by the user can be used as different commands according to the type of application being executed and the touch position, providing a terminal that can more accurately determine the user's intention.
FIG. 1A is a block diagram for explaining a mobile terminal according to the present invention, and FIGS. 1B and 1C are conceptual diagrams illustrating an example of a mobile terminal according to the present invention in different directions.
FIG. 2 is a conceptual diagram for explaining another example of the mobile terminal according to the present invention.
FIG. 3 is a view showing enlargement of a partial region by a touch gesture of a user in the terminal of the present invention.
FIG. 4 is a view showing enlargement of a partial region and movement of the enlarged region according to a touch gesture of a user in the terminal of the present invention.
FIG. 5 is a view showing enlargement of a partial region according to a touch gesture of a user input while the terminal of the present invention is playing a moving image.
FIG. 6 is a view showing information of a selected moving picture according to a touch gesture of a user when a moving picture list is displayed in the terminal of the present invention.
FIG. 7 is a diagram for explaining a case where a user's touch gesture is input while an e-mail application is being executed in the terminal of the present invention.
FIGS. 8 and 9 are diagrams for explaining a case where a user's touch gesture is input while a setting application that allows the user to set operations of the terminal of the present invention is being executed.
FIG. 10 is a diagram for explaining a case where a user's touch gesture is input while the camera application is being executed in the terminal of the present invention.
FIG. 11 is a diagram for explaining a case where a user's touch gesture is input while a gallery application is being executed in the terminal of the present invention.
FIG. 12 is a diagram for explaining a case where an application is edited using the touch gesture of this embodiment.
FIG. 13 is a diagram showing a UI displayed on the screen while the music application is being executed in the terminal of the present invention.
FIG. 14 is a diagram showing detailed information about music when a user's touch gesture is input during execution of the music application according to the present embodiment.
FIG. 15 is a diagram showing that sound output of the music can be set when a user's touch gesture is input during execution of the music application according to the present embodiment.
FIGS. 16 and 17 are diagrams showing display of the lyrics of the music or a lyrics search function when the user's touch gesture is input during execution of the music application according to the present embodiment.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements, and redundant description thereof is omitted. The suffixes "module" and "unit" for the components used in the following description are given or used interchangeably in consideration of ease of description only and do not themselves have distinct meanings or roles. In the following description of the embodiments of the present invention, detailed descriptions of related known art are omitted when it is determined that they would obscure the gist of the embodiments disclosed herein. The accompanying drawings are provided only to facilitate understanding of the embodiments disclosed herein; the technical idea disclosed herein is not limited by them, and the invention should be understood to cover all modifications, equivalents, and alternatives falling within its spirit and scope.
Terms including ordinals, such as first, second, and so on, may be used to describe various elements, but the elements are not limited by these terms. The terms are used only to distinguish one component from another.
It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no other elements in between.
The singular expressions include plural expressions unless the context clearly dictates otherwise.
In the present application, terms such as "comprises" and "having" are used to specify the presence of the features, numbers, steps, operations, elements, components, or combinations thereof stated in the specification, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
The mobile terminals described in this specification include mobile phones, smartphones, laptop computers, digital broadcasting terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigation devices, slate PCs, tablet PCs, ultrabooks, and wearable devices such as smartwatches, smart glasses, and head-mounted displays (HMDs).
However, it will be readily appreciated by those skilled in the art that, except for features applicable only to mobile terminals, the configurations according to the embodiments described herein may also be applied to fixed terminals such as digital TVs, desktop computers, and digital signage.
FIG. 1A is a block diagram for explaining a mobile terminal according to the present invention, and FIGS. 1B and 1C are conceptual diagrams showing an example of the mobile terminal according to the present invention viewed from different directions.
The
The
The
The
The
The
The
In addition, the
In addition to the operations related to the application program, the
In addition, the
The
At least some of the components may operate in cooperation with one another to implement a method of operation, control, or control of a mobile terminal according to various embodiments described below. In addition, the operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the
Hereinafter, the various components of the
First, referring to the
The mobile communication module 112 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network established according to technology standards or communication methods for mobile communication (e.g., GSM (Global System for Mobile communication), CDMA (Code Division Multi Access), CDMA2000, EV-DO (Enhanced Voice-Data Optimized or Enhanced Voice-Data Only), WCDMA (Wideband CDMA), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), LTE-A (Long Term Evolution-Advanced), and the like).
The wireless signal may include various types of data depending on a voice call signal, a video call signal or a text / multimedia message transmission / reception.
The wireless Internet module 113 is a module for wireless Internet access, and may be built in or externally attached to the
Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (Worldwide Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced); the wireless Internet module 113 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above.
Since wireless Internet access by WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, or LTE-A is performed through a mobile communication network, the wireless Internet module 113 performing wireless Internet access through the mobile communication network may be understood as a kind of the mobile communication module 112.
The short-range communication module 114 is for short-range communication and may support short-range communication using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, and Wireless Universal Serial Bus (Wireless USB) technologies. The short-range communication module 114 is connected to the
Here, the other
The
Next, the
The
The
Meanwhile, the
First, the
Examples of the
On the other hand, for convenience of explanation, the act of recognizing that an object is positioned in proximity to the touch screen without the object contacting it is referred to as a "proximity touch," and the act of an object actually contacting the touch screen is called a "contact touch." The proximity-touch position of an object on the touch screen is the position at which the object corresponds vertically to the touch screen when the proximity touch is made.
The touch sensor detects a touch (or touch input) applied to the touch screen (or the display unit 151) using at least one of various touch methods such as resistive film, capacitive, infrared, ultrasonic, and magnetic field types.
For example, the touch sensor may be configured to convert a change in a pressure applied to a specific portion of the touch screen or a capacitance generated in a specific portion to an electrical input signal. The touch sensor may be configured to detect a position, an area, a pressure at the time of touch, a capacitance at the time of touch, and the like where a touch object touching the touch screen is touched on the touch sensor. Here, the touch object may be a finger, a touch pen, a stylus pen, a pointer, or the like as an object to which a touch is applied to the touch sensor.
Thus, when there is a touch input to the touch sensor, the corresponding signal (s) is sent to the touch controller. The touch controller processes the signal (s) and transmits the corresponding data to the
On the other hand, the
On the other hand, the touch sensors and the proximity sensors discussed above can be used independently or in combination to sense various types of touches on the touch screen, such as a short touch (tap), long touch, multi touch, drag touch, flick touch, pinch-in touch, pinch-out touch, swipe touch, and hovering touch.
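The touch types listed above can be distinguished from raw contact samples by hold duration, travel distance, and speed. The following is a minimal illustrative sketch of such a classifier; the threshold constants and function names are assumptions for the example, not values given in this document.

```python
import math

# Assumed, device-dependent thresholds (illustrative only).
LONG_PRESS_S = 0.5      # hold time separating a short touch from a long touch
MOVE_PX = 20.0          # travel below this counts as a stationary touch
FLICK_PX_PER_S = 600.0  # speed separating a drag touch from a flick touch

def classify(events):
    """Classify one contact's (time, x, y) samples into a gesture name."""
    (t0, x0, y0) = events[0]
    (t1, x1, y1) = events[-1]
    duration = t1 - t0
    travel = math.hypot(x1 - x0, y1 - y0)
    if travel < MOVE_PX:
        return "long touch" if duration >= LONG_PRESS_S else "short touch"
    speed = travel / duration if duration > 0 else float("inf")
    return "flick touch" if speed >= FLICK_PX_PER_S else "drag touch"

print(classify([(0.0, 10, 10), (0.1, 12, 11)]))   # → short touch
print(classify([(0.0, 10, 10), (0.8, 12, 11)]))   # → long touch
print(classify([(0.0, 10, 10), (0.5, 10, 200)]))  # → drag touch
print(classify([(0.0, 10, 10), (0.1, 10, 200)]))  # → flick touch
```

Multi-touch, pinch, and hover types would additionally require pointer count and proximity state, which this sketch omits.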
The ultrasonic sensor can recognize the position information of the object to be sensed by using ultrasonic waves. Meanwhile, the
The
The
The
Also, the
In the stereoscopic display unit, a three-dimensional display system such as a stereoscopic system (glasses system), an autostereoscopic system (no-glasses system), and a projection system (holographic system) can be applied.
The sound output unit 152 may output audio data received from the
The
In addition to vibration, the
The
The
The signal output from the
The
The identification module is a chip for storing various information for authenticating the use right of the
The
The
The
Meanwhile, as described above, the
In addition, the
The
In addition, the
As another example, the
In the following, various embodiments may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.
Referring to FIGS. 1B and 1C, the disclosed
Here, the terminal body can be understood as a concept of referring to the
The
A
In some cases, electronic components may also be mounted on the
As shown, when the
These
The
Meanwhile, the
The
1B and 1C, a
However, these configurations are not limited to this arrangement. These configurations may be excluded or replaced as needed, or placed on different planes. For example, the
The
The
In addition, the
The
The touch sensor may be a film having a touch pattern and disposed between the
In this way, the
The first
The
The
The
The first and
In this figure, the
The contents input by the first and
On the other hand, a rear input unit (not shown) may be provided on the rear surface of the terminal body as another example of the
The rear input unit may be disposed so as to overlap with the
When a rear input unit is provided on the rear surface of the terminal body, a new type of user interface using the rear input unit can be realized. When the
Meanwhile, the
The
The
And a
The
The
And a second
The terminal body may be provided with at least one antenna for wireless communication. The antenna may be embedded in the terminal body or formed in the case. For example, an antenna constituting a part of the broadcast receiving module 111 (see FIG. 1A) may be configured to be able to be drawn out from the terminal body. Alternatively, the antenna may be formed in a film type and attached to the inner surface of the
The terminal body is provided with a power supply unit 190 (see FIG. 1A) for supplying power to the
The
The
The
Meanwhile, in the present invention, information processed in the mobile terminal can be displayed using a flexible display. Hereinafter, the present invention will be described in detail with reference to the accompanying drawings.
FIG. 2 is a conceptual diagram for explaining another example of the mobile terminal according to the present invention.
As shown, the
The deformation may be at least one of warping, bending, folding, twisting, and curling of the
A typical flexible display is a sturdy display that, like paper, is light and not easily broken, fabricated on a thin and flexible substrate that can be warped, bent, folded, twisted, or curled while retaining the characteristics of a conventional flat-panel display.
In addition, electronic paper is a display technology to which the characteristics of ordinary ink are applied, and it differs from a conventional flat-panel display in that it uses reflected light. Electronic paper can change the displayed information using twist balls or electrophoresis with capsules.
In a state in which the
The
Meanwhile, the
Meanwhile, the
The deformation detecting unit may be provided in the
The
Meanwhile, the
In addition, a battery (not shown) included in the
The state change of the
Meanwhile, the mobile terminal can be extended to a wearable device that can be worn on the body beyond the dimension that the user mainly grasps and uses. These wearable devices include smart watch, smart glass, and head mounted display (HMD). Hereinafter, examples of a mobile terminal extended to a wearable device will be described.
The wearable device can be made to be able to exchange (or interlock) data with another
Next, a communication system that can be implemented through the
First, the communication system may use different wireless interfaces and/or physical layers. For example, wireless interfaces usable by the communication system may include Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), Universal Mobile Telecommunications Systems (UMTS) (in particular, Long Term Evolution (LTE) and Long Term Evolution-Advanced (LTE-A)), and Global System for Mobile Communications (GSM).
Hereinafter, for the sake of convenience of description, the description will be limited to CDMA. However, it is apparent that the present invention can be applied to all communication systems including an OFDM (Orthogonal Frequency Division Multiplexing) wireless communication system as well as a CDMA wireless communication system.
A CDMA wireless communication system includes at least one
Each of the plurality of BSs may comprise at least one sector, and each sector may comprise an omnidirectional antenna or an antenna pointing to a particular direction of radial emission from the BS. In addition, each sector may include two or more antennas of various types. Each BS may be configured to support a plurality of frequency assignments, and a plurality of frequency assignments may each have a specific spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency assignment may be referred to as a CDMA channel. A BS may also be referred to as a base transceiver subsystem (BTS). In this case, one BSC and at least one BS may collectively be called a "base station." The base station may also indicate a "cell site." Alternatively, each of the plurality of sectors for a particular BS may be referred to as a plurality of cell sites.
A broadcast transmission unit (BT) transmits a broadcast signal to
In addition, a Global Positioning System (GPS) may be associated with the CDMA wireless communication system to identify the location of the
The
The
The WiFi Positioning System (WPS) is a system in which a
The WiFi location tracking system may include a Wi-Fi location server, a
The
The Wi-Fi position location server extracts information of the wireless AP connected to the
The information of the wireless AP to be extracted based on the location information request message of the
As described above, the Wi-Fi position location server can receive the information of the wireless AP connected to the
Then, the Wi-Fi location server can extract (or analyze) the location information of the
As a method for extracting (or analyzing) the position information of the
The Cell-ID method determines the position of the mobile terminal to be that of the wireless AP with the strongest signal among the neighboring wireless AP information collected by the mobile terminal. It is simple to implement, requires no extra cost, and can acquire location information quickly, but it has the disadvantage that positioning accuracy is lowered when the installation density of wireless APs is low.
The fingerprint method collects signal strength information at selected reference positions in a service area and estimates the position using the signal strength information transmitted from the mobile terminal, based on the collected information. To use the fingerprint method, the propagation characteristics must be stored in a database in advance.
The triangulation method calculates the position of the mobile terminal based on the coordinates of at least three wireless APs and the distances between the mobile terminal and those APs. To measure the distances between the mobile terminal and the wireless APs, signal strength converted into distance information, the time of arrival of a signal (Time of Arrival, ToA), the time difference of arrival (Time Difference of Arrival, TDoA), the angle of arrival (Angle of Arrival, AoA), and the like may be used.
The landmark method is a method of measuring the position of a mobile terminal using a landmark transmitter that knows the location.
Various algorithms can be utilized as a method for extracting (or analyzing) the location information of the mobile terminal.
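The triangulation (trilateration) method described above can be illustrated as follows: with three APs at known coordinates and estimated distances, subtracting the circle equations pairwise yields a linear system in the terminal's position. This is a sketch under the assumption of exact 2D distances; the function name and example coordinates are illustrative.

```python
def trilaterate(p1, d1, p2, d2, p3, d3):
    """Solve for (x, y) given three AP positions and distances to each."""
    x1, y1 = p1
    x2, y2 = p2
    x3, y3 = p3
    # Subtracting circle equations cancels the quadratic terms,
    # leaving two linear equations a*x + b*y = c.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = d2**2 - d3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

# Hypothetical APs; distances measured from a terminal actually at (3, 4).
print(trilaterate((0, 0), 5.0, (10, 0), 8.0622, (0, 10), 6.7082))
```

In practice ToA/TDoA distance estimates are noisy, so a real system would use more than three APs and a least-squares fit rather than this exact solve.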
The extracted location information of the
The
As shown in FIG. 1A, a mobile terminal according to the present invention includes Bluetooth ™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), UWB (Ultra Wideband), ZigBee, NFC Field Communication), and Wireless USB (Wireless Universal Serial Bus).
Among them, the NFC module provided in the mobile terminal supports the non-contact type short-range wireless communication between the terminals at a distance of about 10 cm. The NFC module may operate in either a card mode, a reader mode, or a P2P mode. In order for the NFC module to operate in the card mode, the
When the NFC module operates in the card mode, the mobile terminal can transmit stored card information to the outside like a conventional IC card. Specifically, if a mobile terminal storing the card information of a payment card, such as a credit card or a bus card, is brought close to a fare payment machine, the payment can be processed, and if a mobile terminal storing the card information of an access card is brought close to an entrance, the approval process for access may begin. Cards such as credit cards, transportation cards, and access cards are mounted on the security module in the form of an applet, and the security module can store the card information of the mounted cards. Here, the card information of a payment card may be at least one of a card number, a balance, and usage details, and the card information of an access card may be at least one of a name and a number.
When the NFC module operates in the reader mode, the mobile terminal can read data from an external tag. At this time, the data the mobile terminal receives from the tag may be coded in the NFC Data Exchange Format defined by the NFC Forum. In addition, the NFC Forum defines four record types. Specifically, the NFC Forum defines four Record Type Definitions (RTDs): Smart Poster, Text, Uniform Resource Identifier (URI), and General Control. If the data received from the tag is of the smart poster type, the control unit executes a browser (e.g., an Internet browser); if it is of the text type, the control unit can execute a text viewer. If the data received from the tag is of the URI type, the control unit executes the browser or places a telephone call, and if it is of the general control type, the control unit can execute an appropriate operation according to the control contents.
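The reader-mode behavior described above (smart poster to the browser, text to a text viewer, a telephone URI to a call, general control to a control handler) amounts to a dispatch on the record type. The sketch below is illustrative only; the handler strings stand in for the terminal's actual actions, which this document does not specify in code.

```python
def dispatch_record(record_type, payload):
    """Map an NFC Forum record type to an illustrative handler action."""
    handlers = {
        "Smart Poster": lambda p: f"open browser: {p}",
        "Text":         lambda p: f"open text viewer: {p}",
        # A URI record is either dialed (tel:) or opened in the browser.
        "URI":          lambda p: (f"place call: {p}" if p.startswith("tel:")
                                   else f"open browser: {p}"),
        "General Control": lambda p: f"execute control: {p}",
    }
    handler = handlers.get(record_type)
    return handler(payload) if handler else "unsupported record type"

print(dispatch_record("URI", "tel:+821012345678"))
print(dispatch_record("Text", "hello"))
```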
When the NFC module is operated in a peer-to-peer (P2P) mode, the mobile terminal can perform P2P communication with another mobile terminal. At this time, LLCP (Logical Link Control Protocol) may be applied to P2P communication. For P2P communication, a connection can be created between the mobile terminal and another mobile terminal. At this time, the generated connection can be divided into a connectionless mode in which one packet is exchanged and terminated, and a connection-oriented mode in which packets are exchanged consecutively. Through P2P communications, data such as business cards, contact information, digital photos, URLs in electronic form, and setup parameters for Bluetooth and Wi-Fi connectivity can be exchanged. However, since the usable distance of NFC communication is short, the P2P mode can be effectively used to exchange small-sized data.
Hereinafter, embodiments related to a control method that can be implemented in a mobile terminal configured as above will be described with reference to the accompanying drawings. It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.
FIG. 3 is a view showing enlargement of a partial region by a touch gesture of a user in the terminal of the present invention.
First, according to the present embodiment, when the user operates the terminal with one hand while holding the terminal, the terminal can perform various operations according to the touch gesture inputted from one finger.
The touch gesture of the user to be described below will be described with reference to FIG.
As shown in FIG. 3 (a), when a specific image (photograph) is displayed through the
When the
For example, the user grips the terminal with one hand, touches a specific region with one finger (a first touch), and then presses the thumb flatter to increase the contact area (a second touch). In this case, the
When image enlargement is performed, image enlargement can be performed with the image of the area corresponding to the
As described above, in this embodiment, the touch gesture serving as a user command consists of a first touch that touches a predetermined area and a second touch that touches a larger area including at least part of the first touch area; the time for which the first touch or the second touch must be held may be determined in advance.
Accordingly, the
Hereinafter, the touch gesture input as a user command in the embodiments of the present invention will be described using this first touch and second touch as an example.
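The two-stage recognition described above can be sketched as a simple predicate: the gesture is accepted when the second touch overlaps the first, covers a larger area, and is held for at least a preset time. Touch regions are modeled here as sets of grid cells; the hold-time constant and all names are illustrative assumptions, not values from this document.

```python
HOLD_TIME_S = 0.3  # assumed minimum hold time for the second touch

def is_preset_gesture(first_touch, second_touch, hold_time):
    """Return True when the second touch matches the preset gesture."""
    overlaps = bool(first_touch & second_touch)    # includes part of the first touch
    larger = len(second_touch) > len(first_touch)  # e.g. a thumb pressed flat
    held = hold_time >= HOLD_TIME_S
    return overlaps and larger and held

# A thumb tip, then the same thumb pressed flat over a wider region.
thumb_tip = {(5, 5), (5, 6), (6, 5), (6, 6)}
thumb_flat = thumb_tip | {(7, 5), (7, 6), (8, 5), (8, 6)}
print(is_preset_gesture(thumb_tip, thumb_flat, 0.5))  # → True
print(is_preset_gesture(thumb_tip, {(20, 20)}, 0.5))  # → False (no overlap)
```

A real controller would read contact-area estimates from the touch sensor rather than cell sets, but the acceptance logic is the same.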
FIG. 4 is a view showing enlargement of a partial region and movement of the enlarged region according to a touch gesture of a user in the terminal of the present invention.
As shown in FIG. 4, in the case where a predetermined image is displayed through the
As shown in FIG. 4 (b), a
In addition, the user can perform the
For example, the user performs the
FIG. 5 is a view showing an enlargement of a part of a region according to a touch gesture of a user inputted when a terminal of the present invention is playing a moving image.
The embodiment illustrated in Fig. 5 is a case in which a user's touch gesture is input when the application being executed is a moving picture application and the moving picture is being played back.
As shown in FIG. 5A, when a moving
When the
In the above-described embodiments, the area of the image or the moving image is enlarged through the gesture consisting of the first touch and the second touch of the user. However, this is not the case when the
According to another embodiment, the
FIG. 6 is a view showing information of a selected moving picture according to a touch gesture of a user when a moving picture list is displayed in the terminal of the present invention.
The terminal can display a list of moving pictures stored in the terminal or a list of moving pictures on the web on the screen as shown in FIG. 6 (a) according to an installed application. For example, in a case where an educational application is installed in a terminal, and a user can watch a video lecture through the application, an image of a lecturer for each lecture can be displayed as respective video information.
At this time, in order for a user to conveniently select a moving image to be viewed from a plurality of moving images, it is necessary for the terminal to easily provide information about the moving image.
When the
As shown in Fig. 6 (a), when a list of moving images is displayed as a content, a
That is, when the
If a touch gesture composed of the first touch and the second touch is input while a list of video contents that the user has viewed or intends to view is displayed, the control unit can determine that the user intends to acquire information about a specific content item in the list.
At this time, the
FIG. 7 is a diagram for explaining a case where a user's touch gesture is input when an email application is being executed in the terminal of the present invention.
FIG. 7 shows a case where a touch gesture consisting of the first touch and the second touch is performed in an e-mail application. However, the same gesture may be performed in a message application in which a plurality of messages are arranged by sender name or in time order.
That is, as shown in Fig. 7 (a), when the e-mail application is running in the terminal, a
At this time, when the user inputs the
Also, when the drag gesture is input while the
In this way, when a list of text contents is displayed, such as a message application or an e-mail application, the
FIG. 8 is a diagram for explaining a case in which a user's touch gesture is input when a setting application that allows the user to set the operation of the terminal of the present invention is being executed.
As described above, in the terminal of the embodiment, the same touch gesture of the user can be used as another command, depending on the application being executed or the type of UI of the application displayed on the screen.
In another embodiment, when the user's touch gesture is input in the setting application, the
The setting application of the terminal is an application that allows the user to select the brightness of the terminal, the sound of the terminal, the connection between the terminal and the external device, and the like.
As shown in FIG. 8 (a), the setting application may include a
When the user's first touch and a second touch that includes at least part of the first touch and covers a wider area are performed for a predetermined time, the control unit can enlarge the displayed image or screen according to the executed application, or, as shown in FIGS. 8 and 9, use the corresponding touch gesture as a command for confirming information.
In the case of a setting item such as the
However, if the setting item selected by the user's touch gesture can be additionally set or selected by the user, the
For example, as shown in FIG. 9A, the
Even if the
In this way, when the setting item selected by the user's touch gesture input contains an item that can be selected or adjusted by the user, the control unit may provide a window through which the user can directly perform that selection or adjustment.
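The branching described above, show detail information for a read-only item but open an adjustment window for a configurable one, can be sketched as a small dispatcher. The item representation and the returned action strings are hypothetical, chosen only to make the branch explicit.

```python
def on_setting_gesture(item: dict) -> str:
    """Dispatch the touch gesture on a settings item: adjustable items
    open a window for selection/adjustment, others show detail info."""
    if item.get("adjustable", False):
        return "open_adjust_window:" + item["name"]
    return "show_details:" + item["name"]
```

For example, a brightness item marked adjustable would open its adjustment window, while a read-only version-information item would only show its details.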
FIG. 10 is a diagram for explaining a case where a user's touch gesture is input while the camera application is being executed in the terminal of the present invention.
The terminal may also perform an operation specific to the camera application when the touch gesture is input while that application is running.
For example, as shown in FIG. 10 (a), when the camera application is being executed, the image captured by the camera may be displayed on the display unit as a preview.
In the case where the first touch and the second touch are performed on the preview, the control unit may zoom in on the image region corresponding to the touched area.
That is, when the first touch and the second touch of the user are performed on the photographing preview screen, the control unit may control the camera to zoom in on and focus the image region corresponding to the first touch, and then perform the photographing.
Through this process, the user has the advantage of being able to instruct the camera's zooming, focusing, and shutter input all at once with a single finger.
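The zoom-focus-shoot sequence triggered by the single gesture might look like the following sketch; `CameraController` is an invented stand-in that merely records the commands it receives, not an API from the patent.

```python
class CameraController:
    """Illustrative stand-in for the terminal's camera pipeline;
    it records each command so the order can be inspected."""

    def __init__(self) -> None:
        self.log = []

    def zoom_to(self, x: float, y: float) -> None:
        self.log.append(("zoom", x, y))

    def focus_on(self, x: float, y: float) -> None:
        self.log.append(("focus", x, y))

    def shoot(self) -> None:
        self.log.append(("shoot",))


def on_camera_gesture(cam: CameraController, x: float, y: float) -> None:
    """One touch gesture: zoom to the first-touch point, focus there,
    then trigger the capture, in that order."""
    cam.zoom_to(x, y)
    cam.focus_on(x, y)
    cam.shoot()
```

The point of the sketch is the fixed ordering: the single gesture replaces three separate interactions (pinch-zoom, tap-to-focus, shutter press).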
FIG. 11 is a diagram for explaining a case where a user's touch gesture is input while a gallery application is being executed in a terminal of the present invention.
A gallery application that can be installed and executed in a terminal provides a list of the photographs and moving pictures stored in the terminal or on the web, and displays on the screen the pictures and videos in the album or folder selected by the user.
In the case where the touch gesture according to the first touch and the second touch of the user is input during execution of the gallery application, the control unit may perform a different operation depending on the screen being displayed.
For example, as shown in FIGS. 11 (a) to 11 (c), when a list of photographs and moving pictures is displayed by album or by folder in the gallery application, the touch gesture consisting of the first touch and the second touch may be input on a displayed album or folder.
As shown in FIG. 11, when the user selects a specific album or folder and then a specific picture, so that a single picture is displayed on the display unit 181, and the touch gesture made of the first touch and the second touch is input, a partial area of the picture or moving picture is enlarged as shown in FIGS. 3 and 4.
Accordingly, even within the gallery application, the control unit may interpret the same touch gesture differently depending on whether a list or a single picture is being displayed.
FIG. 12 is a diagram for explaining a case where an application is edited using the touch gesture of this embodiment.
According to another embodiment of the present invention, when the touch gesture is input on a screen on which installed applications are listed, editing of those applications may be performed.
For example, on the home screen of the terminal, the installed applications are displayed as icon images, and the user can input the touch gesture on the home screen to delete an installed application or change the position of its icon image.
In addition, the Android operating system provides an application drawer in which the icon images of installed applications and widgets are listed. When the user is provided with such a UI for confirming installed applications and widgets, editing the arrangement of icon images, or deleting applications, the user's touch gesture enables such editing and deletion.
The screens of FIG. 12 are home screens or application drawers in which information about the installed applications is displayed as icon images.
At this time, as shown in FIG. 12 (a), the user may input the touch gesture on the icon image of a specific application.
In this case, the control unit may switch the screen to a mode for editing the applications.
Thus, when the touch gesture is input on the home screen on which the installed applications are arranged, a screen for editing or deleting the applications may be displayed.
At this time, when the user selects the icon image and then drags it, the position of the icon image may be changed, or the corresponding application may be deleted.
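A minimal model of the editing behavior described here, the gesture selects an icon and a subsequent drag either relocates it or removes the application, could be written as below. The dictionary layout and the `"trash"` sentinel are assumptions for illustration only.

```python
def edit_home_screen(icons: dict, selected: str, drag_to):
    """Return a new icon layout after the gesture: dragging to 'trash'
    deletes the application, dragging to a slot moves its icon, and
    no drag leaves the layout unchanged."""
    layout = dict(icons)  # do not mutate the caller's layout
    if drag_to == "trash":
        layout.pop(selected, None)
    elif drag_to is not None:
        layout[selected] = drag_to
    return layout
```

The immutable-input style mirrors how a launcher would typically commit the edit only when the drag ends, discarding it if the gesture is cancelled.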
Hereinafter, the various operations performed according to a user's touch gesture while the terminal of the present invention is executing a music application will be described.
In the above description, it has been explained that the same touch gesture can instruct different operations of the terminal depending on the type of application executed in the terminal. In particular, when an application in which a plurality of pieces of information and a plurality of settings are displayed together is executed, various operations can be performed according to the item on which the touch gesture is made.
FIG. 13 is a diagram showing the UI displayed on the screen when the music application is being executed in the terminal of the present invention.
A smartphone fully serves as a multimedia device; in particular, a variety of functions are provided, such as not only simple music reproduction but also customization of the sound output and display of synchronized lyrics during music reproduction.
As shown in FIG. 13, when the music application is being executed in the terminal, a UI containing areas in which information about the music is displayed may be shown on the screen.
Also, key areas that the user can directly select or adjust, such as a volume key, may be displayed together.
In the embodiment of the present invention, when areas in which specific information is displayed and key areas directly selectable or adjustable by the user are displayed together, the control unit may determine the item in the area where the touch gesture is made and perform an operation corresponding to that item.
FIG. 14 is a diagram showing detailed information about music when a user's touch gesture is input during execution of the music application according to the present embodiment.
As shown in FIG. 14 (a), the title or the artist name of the music being reproduced can be displayed on the screen (in the first information area 1301).
At this time, the touch gesture consisting of the first touch and the second touch may be input on the first information area 1301.
That is, after the first touch is made on the first information area 1301, the second touch including at least a part of the first touch region may be performed.
Accordingly, when the first touch of the user's touch input and the second touch including at least a part of the first touch region are performed for a predetermined time, the control unit may display detailed information about the music being played.
FIG. 15 is a diagram showing that, when a user's touch gesture is input during execution of a music application according to the present embodiment, sound output of the music can be set.
In an application program such as a music application, in addition to displaying information about the reproduced music, button keys may be displayed to allow various settings such as adjusting the volume, selecting the next track, or changing the reproduction speed. That is, in addition to displaying information about the content, the user may be able to directly set or select how the content is reproduced or displayed.
For example, as shown in FIGS. 13 and 15, in the case of a music application, a key for adjusting the volume may be displayed. When the user touches the key area for volume control, a volume control bar or the like is displayed so that the user can raise or lower the volume.
However, in the embodiment of the present invention, when the touch gesture is input on such a user-configurable area (key area), it is judged as a setting command for the sound output beyond simple volume adjustment, and a menu for setting the sound output may be displayed.
When an application is executed, if the UI element in the area where the user's touch gesture is performed, among the various UIs displayed on the screen, is an area in which information is displayed, detailed information about the content may be displayed. If the UI in the area where the touch gesture is made is a key area that can be manipulated or selected by the user, a menu allowing the user to choose a detailed setting related to that key may be displayed.
In this regard, in the case of the music application, when the touch gesture is performed on the volume key area, a menu for setting sound effects of the music being reproduced may be displayed instead of a simple volume bar.
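The per-region dispatch this section (and claims 9 to 11) describes for the music application amounts to a lookup from the touched UI region to an action. The region and action names below are invented for illustration and do not come from the patent.

```python
def on_music_gesture(region: str) -> str:
    """Map the UI region under the touch gesture to the action the
    description associates with it in the music application."""
    actions = {
        "track_info": "show_track_details",      # info area -> detail window
        "volume_key": "open_sound_effect_menu",  # key area -> sound settings
        "lyrics_area": "show_lyrics",
    }
    # elsewhere on the screen, the gesture defaults to enlarging the region
    return actions.get(region, "enlarge_region")
```

A table-driven dispatch like this keeps the "same gesture, different meaning" rule in one place instead of scattering conditionals through the touch handler.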
FIGS. 16 and 17 are diagrams showing the display of lyrics of the music or the function of searching for lyrics when the user's touch gesture is input during the execution of the music application according to the present embodiment.
When the music application is being executed, an area in which the lyrics of the music being played can be displayed may be included in the UI.
In this case, when the user's touch gesture is input to the area where the lyrics can be displayed, the lyrics of the music being played may be displayed.
That is, as shown in FIG. 17 (c), if the operation for displaying the lyrics by the user's touch gesture is performed but the lyrics are not stored in the terminal or in the music file, a window for searching for the lyrics may be displayed.
In other words, if the first touch and the second touch input by the user are performed in the area where the lyrics can be displayed, the control unit may display the stored lyrics, or may provide a lyrics search function when no lyrics are stored.
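The lyrics fallback, display stored lyrics if present, otherwise offer a search, is a simple conditional. The `track` dictionary shape and the returned action strings are assumed for the sketch.

```python
def lyrics_action(track: dict) -> str:
    """Show lyrics stored in the terminal or the music file; if none
    are stored, fall back to offering a lyrics search (FIG. 17 (c))."""
    lyrics = track.get("lyrics")
    if lyrics:
        return "display:" + lyrics
    return "search:" + track["title"]
```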
With the various embodiments of the present invention as described above, the user is able to control various operations of the terminal with one finger. This is because the control unit of the terminal can grasp the user's intention more accurately by judging the item displayed in the area where the touch gesture is made.
Claims (11)
A camera for capturing images,
A display unit for displaying an image captured by the camera or contents related to an application being driven,
And a control unit for controlling the content displayed through the display unit according to a touch input of a user,
Wherein the content is at least one of a text, an image, and a moving image,
Wherein the controller recognizes a touch input of the user as a touch gesture set in advance when a first touch for touching a predetermined area of the content and a second touch including at least a portion of the first touch are performed,
Wherein the control unit changes the display of the content or displays information about the content when the touch gesture is input.
Wherein the controller recognizes the touch input as the touch gesture when the first touch is made and the second touch including at least a part of the first touch is continued for a predetermined time or more.
Wherein the control unit recognizes the second touch as the touch gesture when the second touch is input while the first touch is maintained.
Wherein the control unit moves the display region of the content displayed through the display unit when the touch gesture is input and the touch gesture is dragged.
Wherein the content is an image or a moving image displayed through the display unit,
Wherein when the touch gesture is input, the controller performs a screen enlargement operation for an area where the first touch is performed in the image or the moving image.
When the camera is in operation and an image photographed by the camera is being displayed through the display unit,
Wherein the control unit controls the camera so that the photographing is performed after zooming and focusing an image region corresponding to the first touch as an operation for the touch gesture.
Wherein the content is text or image,
Wherein the control unit generates a window indicating information on a text or an image of an area corresponding to the first touch as an operation for the touch gesture.
Wherein the content is text or image,
Wherein the control unit generates a setting window of the mobile terminal as the operation for the touch gesture when the content of the area corresponding to the first touch corresponds to an item for setting the mobile terminal operation.
Wherein when an application driven by the control unit is an application capable of playing music and a UI associated with the application is displayed on the display unit,
Wherein the controller is configured to determine an item for the area where the first touch is made when the touch gesture is input, and to display information about the music being played according to the determination result.
When a first touch constituting the touch gesture is made for an area in which at least a part of information of music being reproduced among the UIs related to the application is displayed,
Wherein the control unit displays information on music being played through the application on the display unit.
When the first touch constituting the touch gesture is made with respect to the area indicating the setting of the volume size among the UIs related to the application,
Wherein the control unit displays a menu for setting a sound effect of music to be reproduced through the application on the display unit.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020140084290A KR20160005416A (en) | 2014-07-07 | 2014-07-07 | Mobile terminal |
PCT/KR2015/006026 WO2016006835A1 (en) | 2014-07-07 | 2015-06-15 | Mobile terminal |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020140084290A KR20160005416A (en) | 2014-07-07 | 2014-07-07 | Mobile terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20160005416A true KR20160005416A (en) | 2016-01-15 |
Family
ID=55064415
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020140084290A KR20160005416A (en) | 2014-07-07 | 2014-07-07 | Mobile terminal |
Country Status (2)
Country | Link |
---|---|
KR (1) | KR20160005416A (en) |
WO (1) | WO2016006835A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109845245B (en) * | 2016-08-19 | 2021-10-22 | 韩国斯诺有限公司 | Dynamic image processing method and computer-readable recording medium |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101473491B1 (en) * | 2008-06-16 | 2014-12-16 | 주식회사 케이티 | Method of zooming in/out of video processing apparatus with touch input device and video processing apparatus performing the same |
KR20110112980A (en) * | 2010-04-08 | 2011-10-14 | 삼성전자주식회사 | Apparatus and method for sensing touch |
KR20120018397A (en) * | 2010-08-23 | 2012-03-05 | 에스케이플래닛 주식회사 | Zoom in/out method for device having touch screen and device performing the same |
KR101385625B1 (en) * | 2011-10-19 | 2014-04-18 | 주식회사 네오위즈인터넷 | Method, apparatus, and recording medium for processing touch process |
KR20130143381A (en) * | 2012-06-21 | 2013-12-31 | 삼성전자주식회사 | Digital photographing apparatus and method for controlling the same |
- 2014-07-07: KR KR1020140084290A patent/KR20160005416A/en not_active Application Discontinuation
- 2015-06-15: WO PCT/KR2015/006026 patent/WO2016006835A1/en active Application Filing
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP4040261A1 (en) * | 2021-02-07 | 2022-08-10 | Beijing SuperHexa Century Technology Co. Ltd. | Method and apparatus for controlling video recording |
US11539878B2 (en) | 2021-02-07 | 2022-12-27 | Beijing SuperHexa Century Technology CO. Ltd. | Method and apparatus for controlling video recording |
Also Published As
Publication number | Publication date |
---|---|
WO2016006835A1 (en) | 2016-01-14 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
E601 | Decision to refuse application |