KR20170027136A - Mobile terminal and the control method thereof - Google Patents
- Publication number: KR20170027136A
- Application number: KR1020150123684A
- Authority
- KR
- South Korea
- Prior art keywords
- thumbnail
- image
- menu
- touch
- mobile terminal
- Prior art date
Classifications
- H04M1/72522
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2201/00—Electronic components, circuits, software, systems or apparatus used in telephone systems
- H04M2201/42—Graphical user interfaces
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/22—Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
- H04M2250/52—Details of telephonic subscriber devices including functional features of a camera
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Telephone Function (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
Description
The present invention relates to a mobile terminal and a control method thereof, and more particularly, to a mobile terminal and a control method thereof for easily editing an image through operations on a thumbnail.
Terminals can be divided into mobile (portable) terminals and stationary terminals according to whether they can be moved. Mobile terminals can be further divided into handheld terminals and vehicle-mounted terminals according to whether the user can carry the terminal directly.
The functions of mobile terminals are diversifying. Examples include data and voice communication, photo and video capture through a camera, voice recording, music file playback through a speaker system, and output of images or video on a display unit. Some terminals additionally provide an electronic game function or operate as multimedia players. In particular, recent mobile terminals can receive multicast signals that provide visual content such as broadcasts, videos, and television programs.
As these functions diversify, such a terminal is implemented in the form of a multimedia device with complex functions, such as capturing photos or videos, playing music or video files, gaming, and receiving broadcasts.
To support and enhance the functionality of such a terminal, improving the structural and/or software parts of the terminal may be considered.
SUMMARY OF THE INVENTION The present invention has been made in view of the above-mentioned needs, and an object of the present invention is to provide a mobile terminal and a control method thereof that allow an image to be easily edited by dragging and dropping a thumbnail.
To achieve the above object, a mobile terminal according to an embodiment of the present invention includes a camera, a touch screen, a storage unit for storing an image photographed through the camera, and a controller that, when a thumbnail formed in one area of a preview screen is touched for a predetermined time or longer while the preview screen from the camera is displayed on the touch screen, displays at least one menu for editing the image corresponding to the thumbnail on the preview screen, and, when the touched thumbnail is dragged and dropped onto one of the at least one menu, automatically performs a function corresponding to the dropped menu. The thumbnail may correspond to the image most recently stored in the storage unit.
Further, the control unit may display a different menu according to the type of the image corresponding to the touched thumbnail.
The control unit may display at least one of a menu for selecting an exposure value to be applied to the image and a menu for selecting a gain on one side of the touch screen.
If the image corresponding to the thumbnail is a still image, the control unit may display a plurality of still images having different visual effects on the still image on one side of the touch screen.
In addition, if the image corresponding to the thumbnail is a moving image, the control unit may display a menu for deleting a partial frame of the moving image on one side of the touch screen.
Also, when the thumbnail is touched for a predetermined time or longer, the control unit can switch the preview screen to a screen displaying the image corresponding to the touched thumbnail.
In addition, when the thumbnail is touched for a predetermined time or longer, the controller displays a floating thumbnail hovering over the preview screen, and when the floating thumbnail is dragged and dropped onto one of the at least one menu, the floating thumbnail may disappear.
According to at least one of the embodiments of the present invention, desired image editing can easily be performed in a one-touch manner without entering the folder in which the image is stored.
Further scope of applicability of the present invention will become apparent from the following detailed description. It should be understood, however, that the detailed description and specific examples, while indicating preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art.
FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention;
FIGS. 2 and 3 are conceptual diagrams of a mobile terminal according to an embodiment of the present invention;
FIG. 4 is a flowchart illustrating a method of controlling a mobile terminal according to an exemplary embodiment of the present invention;
FIGS. 5 to 31 illustrate various examples of a method of editing a still image according to an embodiment of the present invention; and
FIGS. 32 to 51 illustrate various examples of a method of editing a moving image according to an embodiment of the present invention.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements, and redundant descriptions thereof will be omitted. The suffixes "module" and "part" for the components used in the following description are given or used interchangeably in consideration of ease of drafting the specification, and do not by themselves have distinct meanings or roles. In the following description of the embodiments of the present invention, a detailed description of related art will be omitted when it is determined that it may obscure the gist of the embodiments disclosed herein. The accompanying drawings are intended only to facilitate understanding of the embodiments disclosed herein; the technical idea disclosed herein is not limited by them and should be understood to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.
Terms including ordinals, such as first, second, etc., may be used to describe various elements, but the elements are not limited to these terms. The terms are used only for the purpose of distinguishing one component from another.
When an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.
The singular expressions include plural expressions unless the context clearly dictates otherwise.
The terms "comprising" or "having," as used in this application, are intended to specify the presence of stated features, integers, steps, operations, elements, parts, or combinations thereof, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, parts, or combinations thereof.
The mobile terminals described in this specification include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and wearable devices such as a smartwatch, smart glasses, and a head mounted display (HMD).
However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as a digital TV, a desktop computer, and digital signage, except for cases applicable only to mobile terminals.
FIG. 1 is an exemplary block diagram of a mobile terminal according to an embodiment of the present invention.
More specifically, the mobile terminal 100 may include a wireless communication unit 110, an A/V input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a control unit 180, and a power supply unit 190.
The
The
The
The
The
In addition, the
In addition to the operations associated with the application program, the
In addition, the
The
At least some of the components may operate in cooperation with each other to implement a method of operation, control, or control of the mobile terminal according to various embodiments described below. The method of operation, control, or control of the mobile terminal may also be implemented on the mobile terminal by driving at least one application program stored in the
Hereinafter, the various components of the
First, referring to the
The
The wireless signal may include various types of data depending on a voice call signal, a video call signal, or a text / multimedia message transmission / reception.
The
Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), and WiBro (Wireless Broadband).
The
The short-
Here, the other
The
Next, the
The
The
Meanwhile, the
First, the
Examples of the
For convenience of explanation, the act of recognizing that an object is located on the touch screen without the object contacting the touch screen is referred to as a "proximity touch," while the act of the object actually contacting the touch screen is referred to as a "contact touch." The position at which an object is proximity-touched on the touch screen means the position at which the object corresponds vertically to the touch screen when the object is proximity-touched.
The touch sensor senses a touch (or touch input) applied to the touch screen (or the display unit 151) using at least one of various touch methods, such as a resistive film type, a capacitive type, an infrared type, and an ultrasonic type.
For example, the touch sensor may be configured to convert a change in a pressure applied to a specific portion of the touch screen or a capacitance generated in a specific portion into an electrical input signal. The touch sensor may be configured to detect a position, an area, a pressure at the time of touch, a capacitance at the time of touch, and the like where a touch object touching the touch screen is touched on the touch sensor. Here, the touch object may be a finger, a touch pen, a stylus pen, a pointer, or the like as an object to which a touch is applied to the touch sensor.
When there is a touch input to the touch sensor, the corresponding signal(s) are sent to the touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the control unit 180.
On the other hand, the
Meanwhile, the touch sensors and proximity sensors described above can be used independently or in combination to sense various types of touches on the touch screen, such as a short (tap) touch, a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.
The ultrasonic sensor can recognize the position information of the object to be sensed by using ultrasonic waves. Meanwhile, the
The
The
The
Also, the
In the stereoscopic display unit, a three-dimensional display system such as a stereoscopic system (glasses system), an autostereoscopic system (no-glasses system), a projection system (holographic system) can be applied.
The
The
In addition to the vibration, the
The
The
The signal output from the
The
Meanwhile, the identification module is a chip for storing various kinds of information for authenticating the usage right of the
The
The
The
Meanwhile, as described above, the
The
The
Also, the
As another example, the
In the following, the various embodiments may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.
FIGS. 2 and 3 are examples of conceptual diagrams of the mobile terminal 100 according to an embodiment of the present invention.
Here, the terminal body can be understood as a concept of referring to the
The
A
Electronic components may be mounted on the
As shown, when the
These
Unlike the above example in which the
Meanwhile, the
The
2 and 3, a
However, these configurations are not limited to this arrangement. These configurations may be excluded or replaced as needed or placed on another side. For example, the
The
The
In addition, the
The
On the other hand, the touch sensor is formed as a film having a touch pattern, and is disposed between the
In this way, the
The first
The
The
The
The first and
In this figure, the
The contents input by the first and
On the other hand, a back input unit (not shown) may be provided on the rear surface of the terminal body as another example of the
The rear input unit may be disposed so as to overlap with the
When a rear input unit is provided on the rear surface of the terminal body, a new type of user interface using the rear input unit can be realized. Also, when the
Meanwhile, the
The
The
And a
The
The
And a second
The terminal body may be provided with at least one antenna for wireless communication. The antenna may be embedded in the terminal body or formed in the case. For example, an antenna constituting a part of the broadcast receiving module 111 (see FIG. 1) may be configured to be able to be drawn out from the terminal body. Alternatively, the antenna may be formed in a film type and attached to the inner surface of the
The terminal body is provided with a power supply unit 190 (see FIG. 1) for supplying power to the
The
In the figure, the
The
Hereinafter, embodiments related to a control method that can be implemented in the
FIG. 4 is a flowchart illustrating a method of controlling a mobile terminal according to an embodiment of the present invention. Hereinafter, descriptions of widely known technical content will be omitted.
First, the
Specifically, the
Accordingly, when the storage of the photographed image is completed, the preview screen can be continuously displayed on the
Thereafter, a touch on the thumbnail may be performed (S420). Specifically, in a state in which the preview screen is being displayed, the user can touch the thumbnail formed on one side for a preset time. Hereinafter, this is referred to as a long-touch.
In this case, the
Specifically, when the user touches the thumbnail long, the
Also, the image to be edited may be the image corresponding to the long-touched thumbnail. Accordingly, the at least one menu displayed on the preview screen may be for editing the image corresponding to the long-touched thumbnail.
In addition, the
On the other hand, the
Meanwhile, the preview screen is described as being displayed both before and after the long touch on the thumbnail, but the present invention is not limited thereto. Accordingly, when a long touch is performed on the thumbnail while the preview screen is displayed, the image corresponding to the long-touched thumbnail may instead be displayed on the full screen; that is, the preview screen may be switched to a screen corresponding to the thumbnail.
Then, the
Specifically, in a state where at least one menu is displayed on the preview screen, the user can drag the long-touched thumbnail to the position of one displayed menu. In this case, the floating thumbnail displayed by the user's long touch can be moved in accordance with the drag direction of the user.
Accordingly, when a user drags and drops a floating thumbnail to a single displayed menu, the function corresponding to the menu in which the floating thumbnail is dropped can be automatically performed. In this case, the floating thumbnail may disappear. Here, the drop may mean to release the touch for the thumbnail or the floating thumbnail.
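The flow described above (long touch in S420, menu display in S430, drag-and-drop in S440) can be sketched as a small state model. This is an illustrative sketch only; the threshold value, class name, and menu names are hypothetical and not part of the disclosure.

```python
LONG_TOUCH_SECONDS = 0.5  # hypothetical threshold; the patent only says "a predetermined time"

class ThumbnailEditController:
    """Models the S410-S440 flow: preview shown, thumbnail long-touched,
    edit menus displayed, drag-and-drop runs the dropped menu's function."""

    def __init__(self, menu_actions):
        self.menu_actions = menu_actions      # menu name -> editing function
        self.menus_visible = False
        self.floating_thumbnail = False

    def on_touch(self, duration):
        # A touch held past the threshold counts as a long touch (S420).
        if duration >= LONG_TOUCH_SECONDS:
            self.menus_visible = True         # S430: show edit menus on the preview
            self.floating_thumbnail = True    # a floating copy follows the drag
            return True
        return False

    def on_drop(self, menu_name, image):
        # S440: dropping the floating thumbnail on a menu runs that function
        # automatically, and the floating thumbnail disappears.
        if not self.menus_visible or menu_name not in self.menu_actions:
            return None
        self.floating_thumbnail = False
        self.menus_visible = False
        return self.menu_actions[menu_name](image)
```

A short touch leaves the preview untouched; only a touch past the threshold arms the menus and enables the drop.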
According to the above description, the user does not need to separately enter a folder (gallery, album, or the like) in which the image is stored, so the process of editing an image is simplified. Further, because the user can edit by dragging and dropping the thumbnail while the preview screen is displayed or while an image is being captured, multiple operations can be performed at the same time.
Hereinafter, various embodiments according to the present invention will be described according to image type: still images in FIGS. 5 to 31 and moving images in FIGS. 32 to 51. Duplicate descriptions of portions already covered will be omitted.
FIGS. 5 to 31 show various examples of a method of editing a still image according to an embodiment of the present invention. Hereinafter, various methods of editing still images will be described; descriptions of widely known technical content will be omitted.
FIGS. 5 to 10 show a first embodiment relating to still image editing.
As shown in FIG. 5, a preview screen captured by the camera may be displayed on the
In addition, a
When the
Also, when a long touch is performed on the thumbnail while the preview screen is displayed, the image corresponding to the long-touched thumbnail can be displayed on the full screen; that is, the preview screen is switched to a screen corresponding to the thumbnail.
Here, the state in which the preview screen is displayed, as shown in FIG. 5, is referred to as a preview mode, and the state in which the image corresponding to the thumbnail is displayed, as shown in FIG. 6, is referred to as an image editing mode.
In addition, various menus for editing a video corresponding to the
Specifically, the
The
The
The more menu 135-1 may be for displaying other menus not shown on the full screen; thus, when the floating thumbnail is dropped onto it, additional menus can be displayed.
The other menus 135-2 may be menus displayed by selection of the more menu 135-1, as shown in FIG. 6. Therefore, various menus not displayed on the screen of FIG. 6 can be shown. Referring to FIG. 6, the more menu 135-1 is displayed alongside the shared menus.
The
The
FIGS. 11 to 13 show a second embodiment relating to still image editing. In the state of FIG. 6, when the user drags the floating
Referring to FIG. 11, an
When the user places the floating
Referring to FIG. 12, the overflow menu 137-1 may be displayed on one side of the
FIGS. 14 to 25 show a third embodiment relating to still image editing. In the state shown in FIG. 5, when the thumbnail is long-touched, an editing menu can be displayed.
As shown in FIG. 14, the user can drag the floating
The crop box B is displayed in the form of a rectangle on the displayed image, and the size of the area is changed by the user's touch and drag, so that the user can visually confirm the area to be set or selected. Accordingly, when the crop box B is displayed as shown in FIG. 15, the user can set the size to be selected by touching and dragging the two points in the diagonal direction among the four corners of the crop box B. In this case, if the touch for two points is released, the original image can be changed to the image set according to the crop box B at the time when the touch is released and stored.
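The geometry of the crop box B described above can be illustrated with a short sketch. The function names and the clamping at the image border are assumptions for illustration, not part of the disclosure; the idea is only that two diagonally opposite touch points define a rectangle, and releasing the touches commits the crop.

```python
def crop_box_from_corners(p1, p2):
    """Return (left, top, width, height) of the rectangular crop box B
    defined by two diagonally opposite corner points; point order does
    not matter, so the user may drag either corner."""
    (x1, y1), (x2, y2) = p1, p2
    left, top = min(x1, x2), min(y1, y2)
    return (left, top, abs(x1 - x2), abs(y1 - y2))

def apply_crop(image_size, box):
    """Clamp the crop box to the image bounds, since the committed crop
    replaces the stored image when the touches are released."""
    w, h = image_size
    left, top, bw, bh = box
    left, top = max(0, left), max(0, top)
    return (left, top, min(bw, w - left), min(bh, h - top))
```

With this model, dragging a corner simply moves one of the two defining points, and the rectangle follows.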
Alternatively, when the user drags the floating
Alternatively, the
Specifically, when the floating
On the other hand, as shown in FIG. 20, the user may drag each touch so that the crop box B is rotated while the first touch and the second touch are maintained. In this case, when the distance between the first point P1 and the second point P2 is maintained, only the rotation of the crop box B is performed, and the distance between the first point P1 and the second point P2 is changed The size and rotation of the crop box B can be simultaneously performed.
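The two-touch behavior above — rotation only when the inter-point distance is maintained, simultaneous rotation and resizing when it changes — can be sketched as follows. The tolerance constant and function name are hypothetical; the patent does not specify how "maintained" is judged.

```python
import math

DISTANCE_TOLERANCE = 2.0  # px; hypothetical slack for "distance maintained"

def two_finger_update(old_p1, old_p2, new_p1, new_p2):
    """From the old and new positions of two maintained touches, derive the
    rotation angle (degrees) and scale factor to apply to the crop box B.
    If the inter-point distance is (nearly) unchanged, only rotation occurs."""
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])
    def angle(a, b):
        return math.degrees(math.atan2(b[1] - a[1], b[0] - a[0]))
    d_old, d_new = dist(old_p1, old_p2), dist(new_p1, new_p2)
    rotation = angle(new_p1, new_p2) - angle(old_p1, old_p2)
    if abs(d_new - d_old) <= DISTANCE_TOLERANCE:
        return rotation, 1.0               # distance kept: rotate only
    return rotation, d_new / d_old         # distance changed: rotate and resize
```

Keeping one finger fixed and sweeping the other produces pure rotation; pinching in or out while sweeping combines the two transforms.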
On the other hand, FIG. 22 shows a case where only one of the first touch and the second touch is released while the first touch and the second touch are maintained as in the case of FIG. In this case, as shown in FIG. 23, the
On the other hand, in the image editing mode, the image corresponding to the selected thumbnail is displayed on the full screen as described above. Therefore, as shown in FIG. 24, the user can resize an image by touching and dragging two opposing edges of the entire screen. That is, as shown in FIG. 25, when a touch on the upper right portion is dragged in the lower direction, a new image in which the image is entirely reduced can be displayed on the crop box (B). In this case, when one touch is released, the
FIGS. 24 and 25 illustrate only the case where the image is reduced, but the present invention is not limited thereto. Thus, the image may also be enlarged, and it may be resized with the same aspect ratio or with a different one.
Fig. 26 shows a third embodiment relating to still image editing.
The
If a thumbnail corresponding to an image captured and stored by the
In this case, since the image corresponding to the thumbnail includes the face of the person, the
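The type-dependent menu display described here can be sketched as a simple dispatch. The concrete menu sets are hypothetical — the disclosure only states that different menus appear depending on whether the thumbnail's image is a still image or a moving image, and that a person-related option may appear when a face is included.

```python
# Hypothetical menu sets; the patent does not fix the exact entries.
STILL_IMAGE_MENUS = ["delete", "share", "crop", "filter"]
MOVING_IMAGE_MENUS = ["delete", "share", "trim frames"]

def menus_for(image):
    """Pick the edit menus to overlay on the preview screen based on the
    type of the image behind the long-touched thumbnail."""
    menus = list(MOVING_IMAGE_MENUS if image["type"] == "video" else STILL_IMAGE_MENUS)
    if image["type"] == "still" and image.get("has_face"):
        menus.append("contact photo")  # person-related option when a face is detected
    return menus
```

The controller would call this once when the long touch is recognized, then lay the returned entries along one side of the touch screen.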
FIGS. 27 to 29 show a fourth embodiment relating to still image editing.
One image may be formed by including a plurality of partial images. Here, the partial image can be used as a part of a whole image. For example, as shown in FIG. 27, an image corresponding to the long-touched
Accordingly, when the
Accordingly, when the user touches the
Similarly, when the user drags the floating thumbnail and drops the floating thumbnail onto the first storage button 453 formed on the first
FIGS. 30 and 31 show a fifth embodiment relating to still image editing.
As shown in FIG. 30, when the user touches the
For example, when the user touches the
In addition, a menu for adjusting various items such as the gain of the camera sensor as well as the exposure value can be displayed.
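Applying a camera setting by dropping the thumbnail on one of the displayed values can be sketched as below. The exposure scale and function name are assumptions; the patent only says that menus for choosing an exposure value and a sensor gain may be displayed.

```python
# Hypothetical value scale; the disclosure does not fix the steps.
EXPOSURE_STOPS = [-2.0, -1.0, 0.0, 1.0, 2.0]

def apply_exposure(camera, dropped_index):
    """Dropping the floating thumbnail on one exposure entry applies that
    value to the camera driving the preview screen."""
    camera["exposure"] = EXPOSURE_STOPS[dropped_index]
    return camera
```

A gain menu would work the same way, with its own value list and camera key.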
In the above description, various methods for editing a still image corresponding to a selected thumbnail are described. Hereinafter, various methods of editing a video corresponding to a selected thumbnail in the case of a moving image will be described.
FIGS. 32 to 51 show various examples of a method of editing moving images according to an embodiment of the present invention. Hereinafter, various methods of editing video will be described; descriptions of parts overlapping with the above will be omitted.
On the other hand, when a thumbnail is touched in a state where a preview screen is displayed, it is preferable that the preview screen is switched to a screen corresponding to a thumbnail.
FIGS. 32 to 37 show a first embodiment relating to moving image editing.
If the image corresponding to the long touch thumbnail is a moving image, the
Accordingly, when the user drags the floating
Thereafter, the user can drag the displayed
Then, as shown in Fig. 34, when the first frame 637-1 is located in the
On the other hand, if it is desired to further extend the range of the
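Deleting the frames placed in the delete box amounts to removing a contiguous range from the video's frame sequence. A minimal sketch, with a hypothetical function name and inclusive indices:

```python
def delete_frame_range(frames, start, end):
    """Remove the inclusive range [start, end] of frames dropped into the
    delete box, returning the edited frame sequence."""
    return frames[:start] + frames[end + 1:]
```

Extending the delete range before committing just widens `start`/`end`; the cut itself is a single slice-and-join.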
Figs. 38 to 51 show a second embodiment relating to editing of moving images. Hereinafter, a method for capturing a first moving image and editing a second moving image separately will be described.
Referring to FIG. 38, a moving picture is currently being recorded, and the video being shot is displayed in full screen. In this case, when the user long-touches the thumbnail, or long-touches it and then drags toward the lower part of the screen, a plurality of consecutive frames 737-1 to 737-5 of the moving picture corresponding to the selected thumbnail can be displayed on one side of the screen. If the user then drags left or right over the consecutive frames 737-1 to 737-5, the frame strip can be expanded in the same manner as described above.
For example, referring to FIG. 39 showing a frame over time, the user touch may be located between the fourth frame 737-4 and the fifth frame 737-5. Then, when the user drags in the left and right direction, a plurality of frames included between the fourth frame 737-4 and the fifth frame 737-5 can be displayed.
On the other hand, two touches may be dragged in opposite directions as shown in FIG. In this case, as shown in FIG. 41, the currently displayed continuous frames of 10 second intervals are further subdivided and can be displayed as continuous frames at intervals of 5 seconds.
For example, referring to FIG. 41 showing a frame over time, the first touch may be located between the first frame 737-1 to the fifth frame 737-5. Thereafter, when the second touch is also positioned between the first frame 737-1 to the fifth frame 737-5 and dragging in the left and right direction, a frame between the frames can be displayed.
Meanwhile, as shown in FIG. 42, it is also possible to drag in the vertical direction. In this case, as shown in FIG. 43, the frame on which the touch is located may gradually fade out as the vertical drag is repeated; if the vertical drag is performed a predetermined number of times, the frame can be deleted.
For example, when the user repeatedly drags vertically on the third frame 737-3 of the five displayed consecutive frames 737-1 to 737-5, as in FIG. 42, the third frame 737-3 may gradually fade out as shown in FIG. 43. In this case, a message indicating that the frame has been deleted may be displayed for a predetermined time.
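The fade-then-delete behavior can be modeled with a per-frame drag counter. The drag threshold and class name are hypothetical — the patent says only "a predetermined number of times" — and opacity is shown here as a linear function of the count for illustration.

```python
DRAGS_TO_DELETE = 3  # hypothetical; the patent says "a predetermined number of times"

class FrameFader:
    """Each vertical drag over a frame fades it further; once the preset
    number of drags is reached, the frame is removed from the strip."""

    def __init__(self, frames):
        self.frames = list(frames)
        self.drag_counts = {f: 0 for f in self.frames}

    def opacity(self, frame):
        # Linear fade: full opacity with no drags, zero at the delete threshold.
        n = self.drag_counts.get(frame, 0)
        return max(0.0, 1.0 - n / DRAGS_TO_DELETE)

    def vertical_drag(self, frame):
        self.drag_counts[frame] += 1
        if self.drag_counts[frame] >= DRAGS_TO_DELETE:
            self.frames.remove(frame)  # frame deleted; a notice could be shown
            return "deleted"
        return "fading"
```

The UI would redraw the strip with each frame at its current opacity after every drag.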
Meanwhile, as shown in FIG. 44, the user may also drag along a circular or elliptical path. In this case, as shown in FIG. 45, the frames from the one corresponding to the dragged position to the adjacent frames can gradually fade in or fade out.
For example, when the circular drag over the third frame 737-3 and the fourth frame 737-4 of the five displayed consecutive frames 737-1 to 737-5 is repeated several times as in FIG. 44, the degree of blurring of the third frame 737-3 and the fourth frame 737-4 may increase in proportion to the number of drags, as shown in FIG. 45. In this case, the degree of blurring may decrease toward the frame to the left of the third frame 737-3 and toward the frame to the right of the fourth frame 737-4.
In the case of FIG. 44, dragging clockwise may cause the frames to fade in and dragging counterclockwise may cause them to fade out, or vice versa.
On the other hand, as shown in FIG. 46, a moving picture is being recorded, and a
Also, a
The
The
When the touch is moved from the
The
The
The
The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes any type of recording device in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, and a floppy disk, and it may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). The computer may also include the control unit 180 of the terminal.
100: mobile terminal 110: wireless communication unit
120: AV input unit 130: user input unit
140: sensing unit 150: output unit
160: interface unit 170: memory
180: control unit 190: power supply unit
Claims (7)
1. A mobile terminal comprising:
a touch screen;
a storage unit for storing an image photographed through a camera; and
a controller configured to display, on a preview screen, at least one menu for editing an image corresponding to a thumbnail when the thumbnail, formed on one area of the preview screen, is touched for a predetermined time or longer while the preview screen of the camera is displayed on the touch screen,
and to automatically perform a function corresponding to a dropped menu when the touched thumbnail is dragged and dropped onto one of the at least one menu,
wherein the thumbnail corresponds to the image most recently stored in the storage unit.
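The controller behavior recited above amounts to a dispatch from the menu the thumbnail is dropped on to an editing function that runs automatically. A minimal sketch, in which the menu identifiers and handlers are hypothetical and not taken from the disclosure:

```python
def on_thumbnail_drop(menu_id, image, actions):
    """Run the editing function bound to the menu the thumbnail was
    dropped on ('automatically performing a function corresponding
    to the dropped menu'). A drop outside any menu is a no-op.
    """
    handler = actions.get(menu_id)
    if handler is None:
        return image
    return handler(image)

# hypothetical menu-to-function bindings; frames stand in for image data
actions = {
    "grayscale": lambda frames: [("mono", f) for f in frames],
    "delete_first_frame": lambda frames: frames[1:],
}
edited = on_thumbnail_drop("delete_first_frame", ["f1", "f2", "f3"], actions)
```

The point of the dispatch table is that dropping the thumbnail selects both the target function and triggers it in one gesture, with no separate confirmation step.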
2. The mobile terminal of claim 1,
wherein the controller displays a different menu according to the type of the image corresponding to the touched thumbnail.
3. The mobile terminal of claim 1,
wherein the controller displays, on one side of the touch screen, at least one of a menu for selecting an exposure value to be applied to the image and a menu for selecting a gain.
4. The mobile terminal of claim 1,
wherein, if the image corresponding to the thumbnail is a still image, the controller displays on one side of the touch screen a plurality of still images to which different visual effects have been applied.
5. The mobile terminal of claim 1,
wherein, if the image corresponding to the thumbnail is a moving image, the controller displays on one side of the touch screen a menu for deleting some of the frames of the moving image.
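The type-dependent menus in the claims above reduce to a selection by image type: effect variants for a still image, frame deletion for a moving image. A sketch in which the menu labels are illustrative assumptions:

```python
def menus_for(image_type):
    """Choose the edit menus to show for the touched thumbnail's
    image type: visual-effect variants for a still image, frame
    deletion for a moving image, nothing otherwise.
    """
    if image_type == "still":
        return ["effect:mono", "effect:sepia", "effect:vivid"]
    if image_type == "video":
        return ["delete-frames"]
    return []
```

In practice the controller would query the stored image's type when the thumbnail is long-touched and render the returned menus on one side of the touch screen.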
6. The mobile terminal of claim 1,
wherein the controller switches from the preview screen to a screen corresponding to the image corresponding to the touched thumbnail when the thumbnail is touched for a preset time or longer.
7. The mobile terminal of claim 1,
wherein the controller displays a floating thumbnail floating over the preview screen when the thumbnail is touched for a preset time or longer, and
causes the floating thumbnail to disappear when the floating thumbnail is dragged and dropped onto one of the at least one menu.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150123684A KR20170027136A (en) | 2015-09-01 | 2015-09-01 | Mobile terminal and the control method thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150123684A KR20170027136A (en) | 2015-09-01 | 2015-09-01 | Mobile terminal and the control method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20170027136A true KR20170027136A (en) | 2017-03-09 |
Family
ID=58402390
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150123684A KR20170027136A (en) | 2015-09-01 | 2015-09-01 | Mobile terminal and the control method thereof |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20170027136A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021252216A1 (en) * | 2020-06-10 | 2021-12-16 | Snap Inc. | Interface carousel for use with image processing sdk |
US11604562B2 (en) | 2020-06-10 | 2023-03-14 | Snap Inc. | Interface carousel for use with image processing software development kit |
US11972090B2 (en) | 2020-06-10 | 2024-04-30 | Snap Inc. | Interface carousel for use with image processing software development kit |
CN115212582A (en) * | 2022-06-20 | 2022-10-21 | 广州小鹏汽车科技有限公司 | Display method, vehicle, and computer-readable storage medium |