KR102014417B1 - Terminal and control method thereof - Google Patents

Terminal and control method thereof

Info

Publication number
KR102014417B1
Authority
KR
South Korea
Prior art keywords
terminal
video
image
mobile terminal
Prior art date
Application number
KR1020130061517A
Other languages
Korean (ko)
Other versions
KR20140140752A (en)
Inventor
박문희
서문수
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to KR1020130061517A
Publication of KR20140140752A
Application granted granted Critical
Publication of KR102014417B1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The present invention relates to a terminal and a control method thereof, and more particularly, to a terminal and a method for controlling a multi-window environment.
A terminal control method according to an embodiment of the present invention comprises the steps of: playing a video including at least one object; selecting at least one object from the played video; and displaying the selected object so that it is highlighted relative to the non-selected objects.

Description

Terminal and its control method {TERMINAL AND CONTROL METHOD THEREOF}

The present invention relates to a terminal and a control method thereof, and more particularly, to a terminal and a method for controlling a multi-window environment.

Terminals may be divided into mobile / portable terminals and stationary terminals according to their mobility. The mobile terminal may be further classified into a handheld terminal and a vehicle mount terminal according to whether a user can directly carry it.

As terminal functions diversify, terminals are being implemented as multimedia players with complex functions such as taking pictures or videos, playing music or video files, playing games, and receiving broadcasts.

For videos played on a terminal, technologies have been disclosed that, upon user selection, provide detailed information about a person or object appearing in the video, or extract person information through an external server. However, such functions operate only upon explicit user selection, and few varied applications are provided.

Accordingly, an object of the present invention is to solve the above-mentioned problems of the related art by providing a terminal, and a method of controlling the same, that allow the user to easily search for an object shown in a video and to be provided with various applications for it while the terminal plays the video.

A terminal control method according to an embodiment of the present invention comprises the steps of: playing a video including at least one object; selecting at least one object from the played video; and displaying the selected object so that it is highlighted relative to the non-selected objects.

In addition, a terminal according to an embodiment of the present invention comprises: a memory for storing at least one object and a video including the object; a user interface for receiving a user control signal for selecting an object in a video playback mode; a controller configured to determine whether the stored object appears in the video playback mode and to display an image of the object when the object appears in the video; and an output unit for reproducing the video according to the video playback mode and for highlighting the object under the control of the controller.

Various embodiments of the present disclosure allow a user to easily control a window even in an environment without an external input device such as a keyboard or a mouse.

A terminal, and a method of controlling the same, according to an embodiment of the present invention have the effect that the terminal can selectively display or manage the various objects requested by the user when playing a video or outputting an image, so that the user can easily watch and search the video.

In addition, the terminal and the method of controlling the same according to an embodiment of the present invention have the effect of allowing various objects in a video played on the terminal to be reproduced in response to a user's request.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
FIG. 2 is a flowchart illustrating an object selection operation of a mobile terminal according to an exemplary embodiment of the present invention.
FIG. 3 is an exemplary diagram for explaining an object selection operation according to an exemplary embodiment of the present invention.
FIG. 4 is a flowchart illustrating an object display operation in a mobile terminal according to an embodiment of the present invention.
FIGS. 5 and 6 are diagrams illustrating screens on which an object is displayed in a mobile terminal according to one embodiment of the present invention.
FIG. 7 is a flowchart illustrating an operation of displaying an object in a mobile terminal according to another embodiment of the present invention.
FIGS. 8 and 9 are diagrams illustrating screens on which an object is displayed in a mobile terminal according to another embodiment of the present invention.

Hereinafter, a mobile terminal according to the present invention will be described in more detail with reference to the accompanying drawings. The suffixes "module" and "unit" for the components used in the following description are given or used interchangeably only for ease of writing the specification, and do not by themselves have distinct meanings or roles.

The mobile terminal described herein may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, and the like. However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as digital TVs and desktop computers, except where applicable only to mobile terminals.

Next, a structure of a mobile terminal according to an embodiment of the present invention will be described with reference to FIG. 1.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.

The mobile terminal 100 may include a wireless communication unit 110, an A/V input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190. The components shown in FIG. 1 are not essential, so a mobile terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and the wireless communication system or between the mobile terminal 100 and a network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short range communication module 114, a location information module 115, and the like.

The broadcast receiving module 111 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may mean a server that generates and transmits a broadcast signal and / or broadcast related information or a server that receives a previously generated broadcast signal and / or broadcast related information and transmits the same to a terminal. The broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal having a data broadcast signal combined with a TV broadcast signal or a radio broadcast signal.

The broadcast related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.

The broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).

The broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may be configured to be suitable not only for the digital broadcasting systems described above but also for other broadcasting systems.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The wireless signals may include various types of data according to the transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.

The wireless internet module 113 refers to a module for wireless internet access and may be internal or external to the mobile terminal 100. Wireless internet technologies may include Wireless LAN (Wi-Fi), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like.

The short range communication module 114 refers to a module for short range communication. As a short range communication technology, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and the like may be used.

The location information module 115 is a module for obtaining the location of the mobile terminal, a representative example of which is the Global Positioning System (GPS) module.

Referring to FIG. 1, the A/V input unit 120 is for inputting an audio signal or a video signal and may include a camera 121 and a microphone 122. The camera 121 processes image frames, such as still images or moving images, obtained by the image sensor in a video call mode or a photographing mode. The processed image frames may be displayed on the display unit 151.

The image frame processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided according to the use environment.

The microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data may be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and output. The microphone 122 may implement various noise removal algorithms for removing noise generated in the process of receiving an external sound signal.

The user input unit 130 generates input data for the user to control the operation of the terminal. The user input unit 130 may include a keypad, a dome switch, a touch pad (static pressure/capacitance), a jog wheel, a jog switch, and the like.

The sensing unit 140 detects the current state of the mobile terminal 100, such as its open/closed state, its location, the presence or absence of user contact, its orientation, and its acceleration or deceleration, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is a slide phone, the sensing unit 140 may sense whether the slide phone is open or closed. It may also sense whether the power supply unit 190 is supplying power and whether the interface unit 170 is coupled to an external device. The sensing unit 140 may include a proximity sensor 141.

The output unit 150 generates output related to sight, hearing, or touch, and may include a display unit 151, a sound output module 152, an alarm unit 153, and a haptic module 154.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, when the mobile terminal is in a call mode, the mobile terminal displays a user interface (UI) or a graphic user interface (GUI) related to the call. When the mobile terminal 100 is in a video call mode or a photographing mode, the mobile terminal 100 displays photographed and / or received images, a UI, and a GUI.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.

Some of these displays can be configured to be transparent or light-transmissive so that the outside can be seen through them. Such a display may be called a transparent display, a representative example of which is the transparent OLED (TOLED). The rear structure of the display unit 151 may also be configured as a light-transmissive structure. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.

There may be two or more display units 151 according to the implementation form of the mobile terminal 100. For example, a plurality of display units may be spaced apart or integrally disposed on one surface of the mobile terminal 100, or may be disposed on different surfaces, respectively.

When the display unit 151 and a sensor for detecting a touch operation (hereinafter, a "touch sensor") form a mutual layer structure (hereinafter, a "touch screen"), the display unit 151 can be used as an input device in addition to an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in the pressure applied to a specific portion of the display unit 151, or a change in the capacitance generated at a specific portion of the display unit 151, into an electrical input signal. The touch sensor may be configured to detect not only the position and area of a touch but also the pressure of the touch.

If there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the controller 180. In this way, the controller 180 can know which area of the display unit 151 was touched.
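
For illustration only, this signal path can be sketched as follows (a minimal Python sketch; the class and callback names are hypothetical illustrations, not the patent's implementation):

    from dataclasses import dataclass

    @dataclass
    class TouchEvent:
        x: int            # touched x position on the display, in pixels
        y: int            # touched y position on the display, in pixels
        pressure: float   # pressure reported by the touch sensor

    class TouchScreenController:
        """Stands in for the touch controller: converts raw sensor
        signals into data for the main controller (e.g. controller 180)."""
        def __init__(self, notify_controller):
            self.notify_controller = notify_controller

        def handle_raw_signal(self, x, y, pressure):
            # A real touch controller would calibrate and debounce here;
            # this sketch simply packages the signal and forwards it.
            self.notify_controller(TouchEvent(x, y, pressure))

    # The main controller learns which area of the display was touched:
    tc = TouchScreenController(lambda ev: print(f"touched ({ev.x}, {ev.y})"))
    tc.handle_raw_signal(120, 340, 0.8)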

Referring to FIG. 1, a proximity sensor 141 may be disposed in an inner region of the mobile terminal surrounded by the touch screen, or near the touch screen. The proximity sensor 141 refers to a sensor that detects, without mechanical contact, the presence or absence of an object approaching a predetermined detection surface or existing nearby, using electromagnetic force or infrared rays. The proximity sensor 141 has a longer lifespan and higher utility than a contact sensor.

Examples of the proximity sensor 141 include a transmission photoelectric sensor, a direct reflection photoelectric sensor, a mirror reflection photoelectric sensor, a high frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, the touch screen is configured to detect the proximity of the pointer by the change of the electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of bringing the pointer close to the touch screen so that it is recognized as being located on the touch screen without actually touching it is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position of a proximity touch on the touch screen is the position at which the pointer is perpendicular to the touch screen when the pointer makes the proximity touch.

The proximity sensor detects a proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch movement state). Information corresponding to the sensed proximity touch operation and proximity touch pattern may be output on the touch screen. The sensing unit 140 according to an embodiment of the present invention may include a sensor unit for recognizing a face or eyes of a user. That is, the sensing unit 140 may detect a line of sight or direction in which the user is looking at the screen and recognize the object displayed in the corresponding area.

The sound output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output module 152 may also output a sound signal related to a function (eg, a call signal reception sound, a message reception sound, etc.) performed in the mobile terminal 100. The sound output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying the occurrence of an event in the mobile terminal 100. Examples of events occurring in the mobile terminal include call signal reception, message reception, key signal input, and touch input. The alarm unit 153 may output a signal for notifying the occurrence of an event in a form other than a video or audio signal, for example, as vibration. Since the video or audio signal may also be output through the display unit 151 or the sound output module 152, the display unit 151 and the sound output module 152 may be classified as part of the alarm unit 153.

The haptic module 154 generates various haptic effects that a user can feel. Vibration is a representative example of the haptic effect generated by the haptic module 154. The intensity and pattern of vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be synthesized and output or may be sequentially output.

In addition to vibration, the haptic module 154 may generate various tactile effects, such as stimulation by a pin array moving vertically against the contacted skin surface, a jetting or suction force of air through an outlet or an inlet, grazing of the skin surface, contact of an electrode, stimulation by electrostatic force, and the reproduction of a sensation of warmth or coldness using an element capable of absorbing or generating heat.

The haptic module 154 may not only deliver a haptic effect through direct contact, but may also be implemented so that the user can feel the haptic effect through a muscle sense of, for example, a finger or an arm. Two or more haptic modules 154 may be provided depending on the configuration of the mobile terminal 100.

The memory 160 may store a program for the operation of the controller 180 and may temporarily store input / output data (for example, a phone book, a message, a still image, a video, etc.). The memory 160 may store data regarding vibration and sound of various patterns output when a touch input on the touch screen is performed.

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 100 may operate in connection with web storage that performs the storage function of the memory 160 over the Internet. The memory 160 according to an embodiment of the present invention may store information about an object selected by the user, and may also store at least one image (an edited image) including the selected and stored object.

The interface unit 170 serves as a path to all external devices connected to the mobile terminal 100. The interface unit 170 receives data or power from an external device and transfers it to each component inside the mobile terminal 100, or transmits data from inside the mobile terminal 100 to an external device. For example, the interface unit 170 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like.

The identification module is a chip that stores various types of information for authenticating the usage rights of the mobile terminal 100, and includes a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device equipped with an identification module (hereinafter, an 'identification device') may be manufactured in the form of a smart card. Therefore, the identification device may be connected to the terminal 100 through a port.

When the mobile terminal 100 is connected to an external cradle, the interface unit 170 may be a passage through which power from the cradle is supplied to the mobile terminal 100, or through which various command signals input by the user at the cradle are transferred to the mobile terminal. Such command signals or power input from the cradle may serve as signals for recognizing that the mobile terminal is correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the mobile terminal. For example, it performs the control and processing related to voice calls, data communications, video calls, and the like. The controller 180 may include a multimedia module 181 for playing multimedia. The multimedia module 181 may be implemented within the controller 180 or separately from it.

The controller 180 may perform a pattern recognition process for recognizing a writing input or a drawing input performed on the touch screen as text and an image, respectively.

According to an embodiment of the present disclosure, the controller 180 recognizes a predetermined area, or at least one object, selected according to user input from the user input unit 130 or the display unit 151 (in the case of a display unit with a touch panel), and saves the object. Based on the stored object information, the controller 180 may extract or display the same object included in a program, video, or image. The controller 180 may also enlarge or reduce the object for display to match the resolution of the display unit 151. Furthermore, only the images including the selected object may be extracted, and the extracted images may be grouped to generate a new image; the generated images may be stored as a new list in the memory 160.
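
As one way to picture the enlargement and reduction to the display resolution mentioned above, the following Python sketch scales a stored object image to fit a display while preserving aspect ratio (it assumes OpenCV; the function name and the use of OpenCV are illustrative assumptions, not the patent's prescribed method):

    import cv2

    def fit_to_display(obj_img, disp_w, disp_h):
        # Largest scale at which the object still fits on the display.
        h, w = obj_img.shape[:2]
        scale = min(disp_w / w, disp_h / h)
        new_size = (int(w * scale), int(h * scale))
        return cv2.resize(obj_img, new_size, interpolation=cv2.INTER_LINEAR)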

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies the power required for the operation of each component.

Various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. In some cases, these embodiments may be implemented by the controller 180.

In a software implementation, embodiments such as procedures or functions may be implemented with separate software modules that allow at least one function or operation to be performed. The software code may be implemented by a software application written in a suitable programming language. The software code may be stored in the memory 160 and executed by the controller 180.

With the configuration of the mobile terminal 100 described above, a terminal control operation according to an embodiment of the present invention will be described in detail with reference to FIGS. 2 to 9.

FIG. 2 is a flowchart illustrating an object selection operation of a mobile terminal according to an exemplary embodiment of the present invention, and FIG. 3 is an exemplary diagram for explaining an object selection operation according to an exemplary embodiment of the present disclosure.

Referring to FIGS. 2 and 3, the controller 180 of the mobile terminal 100 according to an embodiment of the present invention may perform a video playback mode in which a video stored in the memory 160, or received from the wireless communication unit 110 or the interface unit 170, is output from the output unit 150 (S202).

In the video playback mode, the controller 180 determines whether a selection signal for selecting at least one object included in the played video is received through the user input unit 130 or the sensing unit 140 (S204).

When the object selection request signal is input, the controller 180 sets a predetermined region 320 for object selection on the video playback screen 310 based on input from the user input unit 130 or the sensing unit 140, as shown in FIG. 3; the position and direction of the user's face or line of sight may be recognized and the predetermined region 320 selected accordingly (S206).

The controller 180 may extract an image 321 to be set as an object from the selected area 320 (S208), and set the extracted image 321 as an object and store it in the memory 160 (S210).
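
A minimal Python sketch of steps S206 to S210 follows (frames are NumPy images; the store dictionary stands in for the memory 160, and all names are illustrative assumptions):

    import numpy as np

    object_store = {}  # object id -> template image (stands in for memory 160)

    def select_object(frame: np.ndarray, region: tuple, obj_id: str) -> np.ndarray:
        # region = (x, y, w, h): the area 320 chosen by touch input or by
        # the recognized face/gaze direction (S206).
        x, y, w, h = region
        template = frame[y:y + h, x:x + w].copy()  # extract the image 321 (S208)
        object_store[obj_id] = template            # set as object and store (S210)
        return template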

FIG. 4 is a flowchart illustrating an object display operation in a mobile terminal according to an embodiment of the present invention, and FIGS. 5 and 6 are diagrams illustrating screens on which an object is displayed in a mobile terminal according to an embodiment of the present invention.

Referring to FIGS. 4 to 6, the controller 180 of the mobile terminal 100 according to an embodiment of the present invention may perform a video playback mode in which a video stored in the memory 160, or received from the wireless communication unit 110 or the interface unit 170, is output from the output unit 150 (S402).

The controller 180 may determine whether an appearance of a preset and stored object is detected in the played video (S404).

The controller 180 may display a notification of the object's appearance when the appearance of the preset object is detected in the played video (S406). In other words, as in the screen example of FIG. 5A, when the preset object 520 is detected on the video screen 510 while the video playback mode is running, an indicator (icon) 530 showing that the corresponding object was previously stored and set is output.

In addition, as illustrated in FIG. 5B, the object 540 may be highlighted on the video playback screen 510, or may be displayed so as to stand out from the unselected images (objects).
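
One plausible way to realize the detection and highlighting of steps S404 to S406 is simple template matching; the patent does not prescribe a particular matching algorithm, so the OpenCV-based Python sketch below is an assumption:

    import cv2

    def detect_and_highlight(frame, template, threshold=0.8):
        # Score every location of the frame against the stored object image.
        result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(result)
        if max_val < threshold:
            return frame, None  # the preset object did not appear (S404: no)
        h, w = template.shape[:2]
        bottom_right = (max_loc[0] + w, max_loc[1] + h)
        # Highlight the detected object on the playback screen (S406).
        cv2.rectangle(frame, max_loc, bottom_right, (0, 255, 0), 2)
        return frame, (max_loc, bottom_right)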

If a preset object is displayed during video playback as shown in FIGS. 5A and 5B, the controller 180 may determine whether a signal for selecting the object is input (S408).

If a selection signal for the displayed object is not input, the controller 180 may turn off the object notification display 530 or 540 after a predetermined time (S410).

On the other hand, when a user input signal for selecting the displayed object is detected, the controller 180 may zoom in on the corresponding object and display it. That is, when the appearance of the preset object 612 is detected while the video is playing, as shown in FIG. 6(a), the controller 180 may display the object 620 enlarged to match the size of the display unit 151, as in the screen example of FIG. 6(b).

The controller 180 may display the enlarged object 620 and determine whether a predetermined time has elapsed or a user control signal has been input (S414).

The controller 180 may zoom out from the enlarged object 620 when the predetermined time elapses or a user control signal is input, and display the previous video screen again (FIG. 6(a)) (S416).

As described above, according to an embodiment of the present invention, the appearance of an object preset by the user's selection may be detected and indicated (by an icon, a highlight, etc.) in the video playback mode. In addition, the object may be displayed zoomed in to match the size of the screen, or zoomed out back to the original image.
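
The zoom-in and zoom-out behaviour can be pictured with a small state holder (a sketch assuming the fit_to_display helper sketched earlier; the timeout value and all names are illustrative assumptions):

    import time

    class ZoomState:
        def __init__(self, disp_w, disp_h, timeout_s=3.0):
            self.disp_w, self.disp_h = disp_w, disp_h
            self.timeout_s = timeout_s
            self.zoomed_since = None

        def zoom_in(self, obj_img):
            # Enlarge the selected object to the display size (FIG. 6(b)).
            self.zoomed_since = time.monotonic()
            return fit_to_display(obj_img, self.disp_w, self.disp_h)

        def maybe_zoom_out(self, original_frame, user_cancel=False):
            # Return to the previous screen after the predetermined time
            # or on a user control signal (S414-S416); otherwise keep the zoom.
            expired = (self.zoomed_since is not None and
                       time.monotonic() - self.zoomed_since >= self.timeout_s)
            if user_cancel or expired:
                self.zoomed_since = None
                return original_frame
            return None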

FIG. 7 is a flowchart illustrating an operation of displaying an object in a mobile terminal according to another embodiment of the present invention, and FIGS. 8 and 9 are diagrams illustrating screens on which an object is displayed in a mobile terminal according to another embodiment of the present invention.

Referring to FIGS. 7 to 9, the controller 180 of the mobile terminal 100 according to an exemplary embodiment of the present invention may perform a video playback mode in which a video stored in the memory 160, or received from the wireless communication unit 110 or the interface unit 170, is output from the output unit 150 (S702).

The controller 180 may determine whether an appearance of a preset and stored object is detected in the played video (S704).

The controller 180 may display a notification of the object's appearance when the appearance of the preset object is detected in the reproduced video (S706). That is, as illustrated in the screen example of FIG. 5, when the preset object 520 is detected on the video screen 510 while the playback mode is running, an indicator (an icon, a highlight, etc.) showing that the corresponding object is a preset object may be output (S706).

When the controller 180 detects a selection signal for the displayed object or for the object notification display, it may request a user input signal for deciding whether to search the object in detail. The controller 180 may determine whether a detailed search request signal for the object is input (S708) and, if it is not, turn off the object notification display (S710).

On the other hand, when a search request signal for the displayed object is input, the controller 180 may display a search menu 820 for the object 810, as shown in the screen example illustrated in FIG. 8 (S712).

When the controller 180 receives a selection of one of the displayed menu items (S714), it may execute an operation corresponding to the selected menu (S716). That is, in response to the search request signal for the object 810, a menu 820 including "search detailed information", "view only search images", "view only images except search images", and "cancel" for the corresponding object may be displayed. The detailed information search may read information about the object from the memory 160 or from an external server (not shown) and provide it to the user. The "view only search images" and "view only images except search images" menus may display only the scenes including the corresponding object 810 or only the scenes not including it, respectively. For example, if the user selects the "view only search images" menu, the images including the corresponding object may be displayed (910) in the form of thumbnails, as shown in the screen of FIG. 9, and the video may be played from the time point of the item selected among the displayed items.
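
As a sketch of the "view only search images" and "view only images except search images" menus, the frames of the video can be split by whether the stored object is detected in them (detect_fn stands in for a matcher such as the detect_and_highlight sketch above; names are illustrative assumptions):

    def split_scenes(frames, template, detect_fn, include=True):
        # Keep frames that do (include=True) or do not (include=False)
        # contain the stored object, together with their frame indices.
        selected = []
        for idx, frame in enumerate(frames):
            _, hit = detect_fn(frame, template)
            if (hit is not None) == include:
                selected.append((idx, frame))
        return selected

For the thumbnail list of FIG. 9, each selected frame could simply be downscaled (for example with cv2.resize(frame, (160, 90))) and shown as an entry that, when chosen, starts playback at its frame index.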

According to an embodiment of the present invention, the above-described method may be implemented as processor-readable code on a medium in which a program is recorded. Examples of processor-readable media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and also include implementation in the form of a carrier wave (for example, transmission over the Internet).

The above-described mobile terminal is not limited to the configurations and methods of the embodiments described above; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.

Claims (20)

Claims 1 to 13. (Deleted)

14. A terminal comprising:
a memory for storing information;
an output unit for outputting a video;
a user interface unit for receiving a user control signal for object selection while the video is output; and
a controller configured to set an object selection area on the screen on which the video is output according to the user control signal input through the user interface unit, to store an image of an object existing in the set selection area in the memory, to determine, while the video is output, whether an object existing in the selection area appears by using the stored image, and to control the output unit to highlight and display an image of the object when an object existing in the selection area appears.

15. The terminal of claim 14, further comprising a sensing unit configured to detect the position and gaze of the user for setting the object selection area.

16. (Deleted)

17. The terminal of claim 14, wherein the controller controls the object to be displayed zoomed in for a predetermined time when an object existing in the selection area appears.

18. The terminal of claim 14, wherein the controller controls the object to be highlighted when an object existing in the selection area appears.

19. The terminal of claim 14, wherein the controller controls an icon to be displayed indicating that the appeared object is an object existing in the selection area, when an object existing in the selection area appears.

20. The terminal of claim 14, wherein the controller extracts images including an object existing in the selection area from the video, groups the extracted images to generate a new image, and stores the generated image in the memory.
KR1020130061517A 2013-05-30 2013-05-30 Terminal and control method thereof KR102014417B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130061517A KR102014417B1 (en) 2013-05-30 2013-05-30 Terminal and control method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020130061517A KR102014417B1 (en) 2013-05-30 2013-05-30 Terminal and control method thereof

Publications (2)

Publication Number Publication Date
KR20140140752A KR20140140752A (en) 2014-12-10
KR102014417B1 2019-08-26

Family

ID=52458436

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130061517A KR102014417B1 (en) 2013-05-30 2013-05-30 Terminal and control method thereof

Country Status (1)

Country Link
KR (1) KR102014417B1 (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101254037B1 (en) * 2009-10-13 2013-04-12 에스케이플래닛 주식회사 Method and mobile terminal for display processing using eyes and gesture recognition

Also Published As

Publication number Publication date
KR20140140752A (en) 2014-12-10


Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant