KR102014417B1 - Terminal and control method thereof - Google Patents
- Publication number
- KR102014417B1 (application KR1020130061517A)
- Authority
- KR
- South Korea
- Prior art keywords
- terminal
- video
- image
- mobile terminal
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Telephone Function (AREA)
Abstract
The present invention relates to a terminal and a control method thereof, and more particularly, to a terminal and a method for controlling a multi-window environment.
A terminal control method according to an embodiment of the present invention comprises the steps of: playing a video including at least one object; selecting at least one object from the played video; and displaying the selected object so that it is highlighted relative to the non-selected objects.
Description
The present invention relates to a terminal and a control method thereof, and more particularly, to a terminal and a method for controlling a multi-window environment.
Terminals may be divided into mobile / portable terminals and stationary terminals according to their mobility. The mobile terminal may be further classified into a handheld terminal and a vehicle mount terminal according to whether a user can directly carry it.
As terminal functions diversify, such terminals are being implemented as multimedia players with complex functions such as capturing photos or videos, playing music or video files, playing games, and receiving broadcasts.
For videos played on a terminal, technologies have been disclosed that, upon user selection, provide detailed information about a person or object appearing in the video, or extract person information via an external server. However, such functions either do not operate automatically or offer only limited applications triggered by explicit user selection.
Accordingly, an object of the present invention is to solve the above-mentioned problems of the related art by providing a terminal, and a method of controlling the same, that allow a user to easily search for objects shown in a video and to be provided with various applications for those objects while the terminal plays the video.
A terminal control method according to an embodiment of the present invention comprises the steps of: playing a video including at least one object; selecting at least one object from the played video; and displaying the selected object so that it is highlighted relative to the non-selected objects.
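The three steps just summarized — play a video containing objects, select one, and display it distinguishably from the rest — can be sketched as follows. This is an illustrative model only: `control_method`, the frame-as-list representation, and the `"highlight"`/`"normal"` tags are hypothetical names, not from the patent.

```python
def control_method(frames, selected_ids):
    """Model of the claimed method: play a video of frames, marking
    selected objects as highlighted.

    frames: list of frames, each frame a list of object ids.
    selected_ids: set of ids the user selected.
    Returns the frames with every object tagged so that selected objects
    are displayed distinguishably from non-selected ones."""
    rendered = []
    for frame in frames:
        rendered.append([
            ("highlight", obj) if obj in selected_ids else ("normal", obj)
            for obj in frame
        ])
    return rendered
```

A usage example: selecting object `"b"` while two frames play tags `"b"` as highlighted in every frame it appears in, leaving the other objects displayed normally.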
In addition, a terminal according to an embodiment of the present invention comprises: a memory for storing at least one object and a video including the object; a user interface for receiving a user control signal for selecting an object in a video playback mode; a controller configured to determine whether the stored object appears during the video playback mode and to display an image of the object when it appears in the video; and an output unit for playing the video according to the video playback mode and for highlighting and displaying the object under the control of the controller.
Various embodiments of the present disclosure allow a user to easily control a window even in an environment without an external input device such as a keyboard or a mouse.
In a terminal and a method of controlling the same according to an embodiment of the present invention, the terminal can selectively display or manage various objects required by the user when playing a video or outputting an image, so that the user can easily watch and search the video.
In addition, a terminal and a method of controlling the same according to an embodiment of the present invention allow various objects in a video played on the terminal to be reproduced in response to a user's request.
1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
2 is a flowchart illustrating an object selection operation of a mobile terminal according to an exemplary embodiment of the present invention.
3 is an exemplary diagram for explaining an object selecting operation according to an exemplary embodiment of the present invention.
4 is a flowchart illustrating an object display operation in a mobile terminal according to an embodiment of the present invention.
5 to 6 are diagrams illustrating screens on which an object is displayed in a mobile terminal according to one embodiment of the present invention.
7 is a flowchart illustrating an operation of displaying an object in a mobile terminal according to another embodiment of the present invention.
8 to 9 are diagrams illustrating screens on which an object is displayed in a mobile terminal according to another embodiment of the present invention.
Hereinafter, a mobile terminal according to the present invention will be described in more detail with reference to the accompanying drawings. The suffixes "module" and "unit" for components used in the following description are given only for ease of description and do not carry meanings or roles that distinguish them from each other.
The mobile terminal described herein may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, and the like. However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as digital TVs and desktop computers, except where a configuration is applicable only to mobile terminals.
Next, a structure of a mobile terminal according to an embodiment of the present invention will be described with reference to FIG. 1.
1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
The
Hereinafter, the components will be described in order.
The
The
The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may mean a server that generates and transmits a broadcast signal and / or broadcast related information or a server that receives a previously generated broadcast signal and / or broadcast related information and transmits the same to a terminal. The broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal having a data broadcast signal combined with a TV broadcast signal or a radio broadcast signal.
The broadcast related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the
The broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).
The
The broadcast signal and / or broadcast related information received through the
The
The
The short
The
Referring to FIG. 1, the A /
The image frame processed by the
The
The
The
The
The
The
Some of these displays can be configured to be transparent or light-transmissive so that the outside can be seen through them. Such a display may be referred to as a transparent display; a representative example is the TOLED (Transparent OLED). The rear structure of the
There may be two or
When the
The touch sensor may be configured to convert a change in pressure applied to a specific portion of the
If there is a touch input to the touch sensor, the corresponding signal (s) is sent to the touch controller. The touch controller processes the signal (s) and then transmits the corresponding data to the
Referring to FIG. 1, a proximity sensor 141 may be disposed in an inner region of a mobile terminal surrounded by the touch screen or near the touch screen. The proximity sensor 141 refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface or an object present in the vicinity without using a mechanical contact by using an electromagnetic force or infrared rays. The proximity sensor 141 has a longer life and higher utilization than a contact sensor.
Examples of the proximity sensor 141 include a transmission photoelectric sensor, a direct reflection photoelectric sensor, a mirror reflection photoelectric sensor, a high frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, the touch screen is configured to detect the proximity of the pointer by the change of the electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.
Hereinafter, for convenience of description, the act of bringing the pointer close to the touch screen without contact so that it is recognized as located on the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position of a proximity touch by the pointer on the touch screen is the position at which the pointer is perpendicular to the touch screen during the proximity touch.
The proximity sensor detects a proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch movement state). Information corresponding to the sensed proximity touch operation and proximity touch pattern may be output on the touch screen. The
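The proximity-touch pattern enumerated above (distance, direction, speed, time, position, movement state) could be derived from hover samples roughly as follows. The `(t, x, y, z)` sample format and the feature names are assumptions for illustration, not an API of any real sensor framework.

```python
import math

def proximity_pattern(samples):
    """Derive the proximity-touch pattern features the text enumerates
    from a sequence of hover readings.

    samples: list of (t, x, y, z) tuples, where z is the pointer's
    height above the touch screen. Assumed illustrative format."""
    t0, x0, y0, _ = samples[0]
    t1, x1, y1, _ = samples[-1]
    dt = t1 - t0
    path = math.hypot(x1 - x0, y1 - y0)   # straight-line travel over screen
    return {
        "distance": min(s[3] for s in samples),    # closest approach to screen
        "direction": math.atan2(y1 - y0, x1 - x0), # radians
        "speed": path / dt if dt else 0.0,
        "time": dt,
        "position": (x1, y1),                      # final hover position
        "moving": path > 0,                        # movement state
    }
```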
The
The
The
In addition to the vibration, the
The
The
The
The
The identification module is a chip that stores various types of information for authenticating the use authority of the
The interface unit may be a passage through which power from the cradle is supplied to the
The
The
According to an embodiment of the present disclosure, the
The
Various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware or a combination thereof.
According to a hardware implementation, the embodiments described herein include application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), and the like. It may be implemented using at least one of processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. These may be implemented by the
In a software implementation, embodiments such as procedures or functions may be implemented with separate software modules that allow at least one function or operation to be performed. The software code may be implemented by a software application written in a suitable programming language. The software code may be stored in the
With the configuration of the
2 is a flowchart illustrating an object selection operation of a mobile terminal according to an exemplary embodiment of the present invention, and FIG. 3 is a diagram illustrating an example of an object selection operation according to an exemplary embodiment of the present disclosure.
2 to 3, the
The
When the object selection request signal is input, the
The
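The selection flow described in this section — set a selection area on the playback screen, then store the image of the object inside it — can be sketched as below. Representing a frame as a nested list of pixels and the `crop` helper are hypothetical simplifications; a real terminal would crop a decoded video frame.

```python
def crop(frame, area):
    """Store the image of the object inside the selection area.

    frame: 2D list of pixel values (rows of columns).
    area: (top, left, height, width) of the selection area set by the
    user control signal. Both representations are illustrative."""
    top, left, h, w = area
    return [row[left:left + w] for row in frame[top:top + h]]
```

The returned sub-image is what the controller would keep in memory and later compare against incoming frames.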
4 is a flowchart illustrating an object display on a mobile terminal according to an embodiment of the present invention, and FIGS. 5 to 6 are diagrams illustrating screens on which an object is displayed on a mobile terminal according to an embodiment of the present invention.
4 to 6, the
The
The
In addition, as illustrated in FIG. 5B, the
If a predetermined object is displayed during video playback as shown in FIGS. 5A and 5B, the
If the selection signal for the displayed object is not input, the
On the other hand, when a user input signal for selecting the displayed object is detected, the
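Detecting that a stored object has appeared in the video can, in the simplest case, be modeled as searching each frame for the stored image. Exact sub-array matching is a deliberate simplification for illustration; a real terminal would use a proper image-recognition module tolerant of scale and lighting changes.

```python
def appears_in(frame, stored):
    """Return the (top, left) position where the stored object image
    occurs in the frame, or None if the object does not appear.

    frame and stored are 2D lists of pixel values; exact matching is an
    assumed simplification."""
    fh, fw = len(frame), len(frame[0])
    sh, sw = len(stored), len(stored[0])
    for top in range(fh - sh + 1):
        for left in range(fw - sw + 1):
            if all(frame[top + r][left:left + sw] == stored[r]
                   for r in range(sh)):
                return (top, left)
    return None
```

When a position is returned, the controller would highlight the object or display an icon there; `None` means playback continues unannotated.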
The
The controller may zoom out the
As described above, according to an embodiment of the present invention, for an object preset by the user's selection, the appearance of that object may be detected and indicated (by an icon, highlight, etc.) in the video playback mode. In addition, the object may be displayed zoomed in, or zoomed out back to the original image, to match the size of the screen.
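The zoom-to-screen behavior described above can be illustrated by computing a scale factor that fits the object's bounding box to the screen. The `fill` parameter and the clamp to 1.0 (so that zooming out simply returns to the original image) are assumed tuning choices, not specified by the patent.

```python
def zoom_factor(obj_w, obj_h, screen_w, screen_h, fill=0.5):
    """Scale to apply when zooming in on an object so it occupies about
    `fill` of the limiting screen dimension, clamped so the scale never
    drops below the original image (1.0). `fill` is an assumption."""
    scale = fill * min(screen_w / obj_w, screen_h / obj_h)
    return max(scale, 1.0)
```

For example, a 100x50 object on a 1000x500 screen yields a scale of 5.0, while an object already filling most of the screen stays at the original 1.0.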
7 is a flowchart illustrating an operation of displaying an object in a mobile terminal according to another embodiment of the present invention. 8 to 9 are diagrams illustrating screens on which an object is displayed in a mobile terminal according to another embodiment of the present invention.
7 to 9, the
The
The
When the
On the other hand, when the search request signal for the displayed object is input, the
When the
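Grouping the extracted appearances into a new clip, as described above, amounts to collecting the frame indices where the object is present into contiguous segments; concatenating those segments yields the grouped video of the object. The boolean flag-list input is an illustrative simplification of per-frame detection results.

```python
def group_appearances(flags):
    """Group consecutive frame indices where the object appears
    (flags[i] is True) into (start, end) segments, inclusive."""
    segments = []
    start = prev = None
    for i, present in enumerate(flags):
        if not present:
            continue
        if start is None or i - prev > 1:
            if start is not None:
                segments.append((start, prev))  # close the previous run
            start = i
        prev = i
    if start is not None:
        segments.append((start, prev))
    return segments
```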
According to an embodiment of the present invention, the above-described method may be implemented as processor-readable code on a medium on which a program is recorded. Examples of processor-readable media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and also include implementations in the form of a carrier wave (for example, transmission over the Internet).
The above-described mobile terminal is not limited to the configurations and methods of the embodiments described above; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.
Claims (20)
An output unit for outputting a video;
A user interface unit receiving a user control signal for object selection while the video is output;
Setting an object selection area on a screen on which the video is output according to the user control signal input through the user interface unit;
Storing an image of an object existing in the set selection area in the memory,
While the video is output, it is determined whether an object existing in the selection area is present using the stored image,
And a controller configured to control the output unit to highlight and display an image of the object when an object existing in the selection area appears.
terminal.
And a sensing unit configured to detect a position and a gaze of a user for setting the object selection area.
terminal.
The control unit
When an object existing in the selection area appears, controlling to display the object by zooming in for a predetermined time.
terminal.
The control unit
When an object existing in the selection area appears, the object is controlled to be highlighted.
terminal.
The control unit
When an object existing in the selection area appears, an icon indicating that the appeared object is an object existing in the selection area is displayed.
terminal.
The control unit
Extracting images including an object existing in the selection region from the moving image, grouping the extracted images to generate a new image, and storing the generated image in the memory
terminal.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020130061517A KR102014417B1 (en) | 2013-05-30 | 2013-05-30 | Terminal and control method thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020130061517A KR102014417B1 (en) | 2013-05-30 | 2013-05-30 | Terminal and control method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20140140752A KR20140140752A (en) | 2014-12-10 |
KR102014417B1 true KR102014417B1 (en) | 2019-08-26 |
Family
ID=52458436
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020130061517A KR102014417B1 (en) | 2013-05-30 | 2013-05-30 | Terminal and control method thereof |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR102014417B1 (en) |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101254037B1 (en) * | 2009-10-13 | 2013-04-12 | 에스케이플래닛 주식회사 | Method and mobile terminal for display processing using eyes and gesture recognition |
-
2013
- 2013-05-30 KR KR1020130061517A patent/KR102014417B1/en active IP Right Grant
Also Published As
Publication number | Publication date |
---|---|
KR20140140752A (en) | 2014-12-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9509959B2 (en) | Electronic device and control method thereof | |
US10001910B2 (en) | Mobile terminal and controlling method thereof for creating shortcut of executing application | |
US8452339B2 (en) | Mobile terminal and method of controlling the same | |
US10042596B2 (en) | Electronic device and method for controlling the same | |
US9081541B2 (en) | Mobile terminal and method for controlling operation thereof | |
US20110096024A1 (en) | Mobile terminal | |
US20140189518A1 (en) | Mobile terminal | |
KR20130044770A (en) | Searching method and mobile device using the method | |
US20140354536A1 (en) | Electronic device and control method thereof | |
KR20110131439A (en) | Mobile terminal and method for controlling thereof | |
KR20130005174A (en) | Mobile device and control method for the same | |
KR20110016337A (en) | Method for displaying data and mobile terminal thereof | |
KR20150056353A (en) | The mobile terminal and the control method thereof | |
KR20140045060A (en) | Mobile terminal and method for controlling thereof | |
KR20140033896A (en) | Mobile terminal and method for controlling of the same | |
US9411411B2 (en) | Wearable electronic device having touch recognition unit and detachable display, and method for controlling the electronic device | |
KR20100099828A (en) | Mobile terminal for displaying three-dimensional menu and control method using the same | |
KR20150127842A (en) | Mobile terminal and control method thereof | |
KR20120078396A (en) | Mobile terminal and method for searching location information using touch pattern recognition thereof | |
KR20110133713A (en) | Mobile terminal and method for controlling the same | |
KR20110037064A (en) | Mobile terminal and method for controlling the same | |
US9336242B2 (en) | Mobile terminal and displaying method thereof | |
KR20110064289A (en) | Method for transmitting and receiving data and mobile terminal thereof | |
KR20120062165A (en) | Mobile terminal and method for controlling the same | |
KR20110041864A (en) | Method for attaching data and mobile terminal thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |