KR20130082260A - Computing device for performing at least one function and controlling the same - Google Patents

Computing device for performing at least one function and controlling the same

Info

Publication number
KR20130082260A
KR20130082260A (Application No. KR1020120003365A)
Authority
KR
South Korea
Prior art keywords
function
image
option
module
application
Prior art date
Application number
KR1020120003365A
Other languages
Korean (ko)
Inventor
김운영
박준식
이고은
김정은
이강섭
이건식
이형남
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Application filed by LG Electronics Inc.
Priority to KR1020120003365A
Priority to EP12189336.6A (EP2615564A1)
Priority to US13/679,360 (US9582605B2)
Priority to CN201210592947.7A (CN103209349B)
Publication of KR20130082260A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/147Digital output to display device ; Cooperation and interconnection of the display device with other functional units using display panels
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

PURPOSE: A computing device for performing at least one function and a method of controlling the same are provided to offer a shortcut function more quickly to a user of a multimedia device, and to define a protocol to which a shortcut item can be added irrespective of the types of the multiple functions provided by the computing device. CONSTITUTION: An interface module receives a command signal while a specific function is being executed, and a capture module (451) captures a screen image displayed according to the specific function. An adjustment module adjusts the size and location of the captured screen image. If the image having the adjusted size and location is selected, a controller performs a function corresponding to one of the metadata mapped to the image. A decision module (452) determines the type of the specific function, and a calculation module changes the size and location of the captured image according to the determination result. [Reference numerals] (451) Capture module; (452) Decision module; (453) Location adjustment module; (454) Size adjustment module; (455) Function extracting module; (456) Mapping module; (457) Memory; (458) Update information monitoring module

Description

BACKGROUND OF THE INVENTION

1. Field of the Invention

[0001] The present invention relates to a computing device that performs at least one function and to a method of controlling the same.

The present invention relates to computing device technology and, more particularly, to a computing device that performs at least one function and a method of controlling the same. The computing device may be, for example, a network TV, a smart TV, an Internet TV, an IPTV, a web TV, a mobile device, or a smartphone.

Due to recent technological advances, various devices capable of performing multiple functions are emerging. Although such multifunctional computing devices can benefit the user, among the hundreds or thousands of available functions there are inevitably many that a given user never uses at all.

In addition, a method in which a user pre-registers a desired function, for example as a shortcut button or a favorite, and accesses it at a later time has also been discussed.

However, the conventional multi-function computing device has a problem in that the user has to navigate through several menu depths in order to set a shortcut button for a specific desired function.

Furthermore, even for functions that have already been set as favorites, the user must go through an inconvenient process to reach the favorite item, resulting in a significant loss of time.

In addition, the functions that can be added to a favorites list or assigned to shortcut buttons are very limited, or they are split across groups, so that the user must remember where each one is located.

Finally, since the options included in the favorite items according to the prior art are provided in a text form, there is a need for a solution for visually displaying items desired by the user.

One embodiment of the present invention seeks to provide an entirely new type of solution that solves the problems described above and provides quicker shortcuts to users of the computing device.

Another embodiment of the present invention seeks to define a protocol by which a shortcut item can be added, regardless of the type of the multiple functions that the computing device provides.

According to another embodiment of the present invention, there is provided a technology for automatically recognizing the path of a shortcut service according to the function currently being executed on the computing device.

A control method of a computing device for performing at least one function according to an embodiment of the present invention includes executing a specific function among the at least one function, receiving a command signal during execution of the specific function, capturing an output screen according to the execution of the specific function, adjusting the size and position of the captured image, and, if the image having the adjusted size and position is selected, executing a function corresponding to any one of at least one metadata mapped to the image.

Also, a computing device that performs at least one function according to an embodiment of the present invention may include an interface module that receives a command signal while a specific function among the at least one function is being executed, a capture module for capturing an output screen according to the execution of the specific function, an adjustment module for adjusting the size and position of the captured image, and a controller for controlling, when the image having the adjusted size and position is selected, execution of a function corresponding to any one of at least one metadata mapped to the image.

According to an embodiment of the present invention, a completely new type of solution is provided that provides a faster shortcut to a user using a computing device.

Further, according to another embodiment of the present invention, there is an effect of defining a protocol by which a shortcut item can be added, regardless of the type of the multiple functions provided by the computing device.

According to another embodiment of the present invention, there is an advantage of providing a technology for automatically recognizing a path of a shortcut service according to a function of a currently executed computing device.

More specific effects of the invention will be described in detail in the following description.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a diagram illustrating an overall system including a computing device according to an embodiment of the present invention.
FIG. 2 is a block diagram illustrating a computing device in greater detail according to an embodiment of the present invention.
FIG. 3 is an embodiment showing the controller shown in FIG. 2 in more detail.
FIG. 4 is another embodiment showing the controller shown in FIG. 2 in more detail.
FIG. 5 is a diagram illustrating the outer appearance of a remote controller for externally controlling a computing device according to an embodiment of the present invention.
FIG. 6 is a block diagram showing the internal configuration of the remote controller shown in FIG. 5 in more detail.
FIG. 7 is a diagram illustrating an example of a database necessary for executing a capture function according to an embodiment of the present invention.
FIG. 8 is a diagram showing another example of a database necessary for executing a capture function according to an embodiment of the present invention.
FIGS. 9A to 9C are diagrams for explaining a process of executing a capture function according to an embodiment of the present invention while viewing an arbitrary channel.
FIGS. 10A to 10C are views showing a process of moving the captured image in more detail.
FIGS. 11A and 11B are diagrams illustrating a process of executing an application generated according to an embodiment of the present invention.
FIG. 12 is a diagram illustrating an option list for selecting a specific function among a plurality of functions when executing a capture function according to an embodiment of the present invention.
FIGS. 13A and 13B are diagrams for explaining a process of mapping the current channel and an application when the first option of the option list shown in FIG. 12 is selected.
FIGS. 14A and 14B are diagrams for explaining a process of mapping the current content and an application when the second option of the option list shown in FIG. 12 is selected.
FIGS. 15A and 15B are diagrams for explaining a process of mapping a "view next" function and an application when the third option of the option list shown in FIG. 12 is selected.
FIGS. 16A and 16B are diagrams illustrating a process of mapping a video call service and an application.
FIGS. 17A to 17C are diagrams for explaining a process of executing the application generated in FIG. 16.
FIGS. 18A to 18C are diagrams for explaining a process of mapping a general menu function to an application and executing it.
FIGS. 19A to 19C are views for explaining a process of arbitrarily cropping a captured image to change the representative image of an application according to an embodiment of the present invention.
FIGS. 20A to 20C are views for explaining a process of rotating an application in the left-right direction when an application according to an embodiment of the present invention is updated.
FIGS. 21A to 21C are views for explaining a process of rotating an application in the vertical direction when an application according to an embodiment of the present invention is updated.
FIG. 22 is a flowchart illustrating a process for creating an application using a captured image according to an embodiment of the present invention.
FIG. 23 is a flowchart for explaining another embodiment supplementing FIG. 22.
FIG. 24 is a flowchart for explaining another embodiment supplementing FIG. 22.
FIG. 25 is a flowchart for explaining another embodiment supplementing FIG. 22.
FIGS. 26 to 30 are diagrams for explaining a process of setting an application position corresponding to a captured image according to an embodiment of the present invention.

Hereinafter, the present invention will be described in more detail with reference to the drawings.

The suffix "module" and " part "for components used in the following description are given merely for ease of description, and the" module "and" part "

The above and other features and advantages of the present invention will become more apparent from the following detailed description taken in conjunction with the accompanying drawings.

The terms used in the present invention are selected, in consideration of their functions in the present invention, from general terms that are currently in wide use; however, these may vary depending on the intention or customary usage of those skilled in the art or the emergence of new technologies. In addition, in certain cases there may be terms arbitrarily selected by the applicant, in which case their meanings are described in the corresponding part of the description. Therefore, the terminology used herein should be interpreted based on the meaning of each term and the overall content of this specification, rather than on the name of the term alone.

FIG. 1 is a diagram illustrating an overall system including a computing device according to an embodiment of the present invention. Hereinafter, the overall system including a computing device according to an embodiment of the present invention will be described with reference to FIG. 1.

As shown in FIG. 1, the overall system includes a computing device 100, a broadcast station 110, a server 120, an external device 130, an external TV 140, a mobile device 150, and the like. The computing device 100 according to an exemplary embodiment of the present invention may be connected to the broadcast station 110 by, for example, terrestrial, cable, or satellite broadcasting, or by a network such as the Internet.

Meanwhile, the computing device 100 may be connected to the server 120 via a wired / wireless network, and the external device 130 may be a USB memory, an HDD, or the like. Further, the external TV 140 and the mobile device 150 are designed to perform wireless communication at a remote location from the computing device 100 and to enable, for example, telephone communication.

In particular, when a user of the computing device 100 according to an embodiment of the present invention finds a preferred channel, program, or specific function, for example while watching TV or executing a general TV function, the computing device captures it and creates an application unique to that user.

More specifically, for example, when the capture button is selected while the user is watching TV, the screen is captured and a unique application is automatically generated. It is also a feature of the present invention that, when there are a plurality of functions that can be mapped, an option for the user to select among them is provided.

Furthermore, if the capture button according to an embodiment of the present invention is selected during execution of a general TV function, an application that can directly execute the corresponding function is created. Thus, without navigating through complicated menu depths, the user can jump to the captured screen or the corresponding function simply by selecting the newly created application.

FIG. 2 is a block diagram illustrating a computing device in greater detail according to an embodiment of the present invention. Hereinafter, a computing device according to an embodiment of the present invention will be described in detail with reference to FIG. 2.

As shown in FIG. 2, the computing device 200 according to an exemplary embodiment of the present invention includes a broadcast receiver 210, a demodulator 240, a network interface 220, an external device interface 230, a controller 250, a video output unit 260, an audio output unit 270, a power supply unit 280, and a user interface 290. The computing device 200 is designed to perform data communication with the remote controller 300, and the remote controller 300 will be described later in detail with reference to FIGS. 5 and 6.

The broadcast receiver 210 can be designed, for example, as an RF tuner or as an interface for receiving broadcast data from an external device such as a STB. The broadcast receiver 210 may receive an RF broadcast signal of a single carrier according to an Advanced Television System Committee (ATSC) scheme or an RF broadcast signal of a plurality of carriers according to a DVB (Digital Video Broadcasting) scheme.

The demodulator 240 receives the digital IF signal (DIF) converted by the broadcast receiver 210 and performs a demodulation operation. For example, when the digital IF signal output from the broadcast receiver 210 is of the ATSC scheme, the demodulator 240 performs 8-VSB (8-Vestigial Side Band) demodulation.

The external device interface 230 is an interface that enables data communication between an external device and the computing device 200. The external device interface 230 can be connected to an external device such as a DVD (Digital Versatile Disc) player, a Blu-ray player, a game device, a camera, a camcorder, a computer (notebook), or the like.

The external device interface 230 may include a USB terminal, a CVBS (Composite Video Banking Sync) terminal, a component terminal, an S-video terminal (analog), a DVI (Digital Visual Interface) terminal, an HDMI (High Definition Multimedia Interface) terminal, an RGB terminal, a D-SUB terminal, and the like.

The network interface 220 provides an interface for connecting the computing device 200 with a wired or wireless network including the Internet. The network interface 220 may include an Ethernet terminal or the like for connection to a wired network, and may use communication standards such as WLAN (Wireless LAN), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), and HSDPA (High Speed Downlink Packet Access) for connection to a wireless network.

The user interface 290 transmits a signal input by the user to the control unit 250 or transmits a signal from the control unit 250 to an external device (for example, the remote controller 300). For example, the user interface 290 receives and processes control signals such as power on/off, channel selection, and screen setting from the remote controller 300 in accordance with various communication methods such as a radio frequency (RF) communication method or an infrared (IR) communication method, or transmits control signals from the control unit 250 to the remote controller 300.

The control unit 250 will be described later in detail with reference to FIGS. 3 and 4. FIGS. 3 and 4 may be designed as separate embodiments, or may be implemented as a single embodiment combining both.

The video output unit 260 converts the video signal, data signal, and OSD signal processed by the control unit 250, or the video signal and data signal received from the external device interface 230, into R, G, and B signals, thereby generating a driving signal. The audio output unit 270 receives an audio-processed signal, for example a stereo signal, a 3.1-channel signal, or a 5.1-channel signal, from the control unit 250 and outputs it as sound.

The power supply unit 280 supplies power throughout the computing device 200. In particular, it supplies power to the control unit 250, which may be implemented in the form of a system on chip (SoC), to the video output unit 260 for video display, and to the audio output unit 270 for audio output.

FIG. 3 is an embodiment showing the controller shown in FIG. 2 in more detail.

As shown in FIG. 3, the control unit 350 of the computing device includes a demultiplexing unit 351, a video decoder 352, a scaler 353, an OSD generating unit 357, a mixer 354, a frame rate converter 355, a formatter 356, and the like. It is also within the scope of the present invention to further include a voice processing unit (not shown) and a data processing unit (not shown).

The demultiplexer 351 demultiplexes the input stream. For example, when an MPEG-2 TS is input, it can be demultiplexed into video, audio, and data signals, respectively.

The video decoder 352 decodes the demultiplexed video signal, and the scaler 353 performs scaling so that the resolution of the decoded video signal can be output from the video output unit.

The OSD generation unit 357 generates an OSD signal according to a user input or by itself. Accordingly, the mixer 354 can mix the OSD signal generated by the OSD generation unit 357 and the decoded video signal processed by the image processing units 352 and 353.

A frame rate converter (FRC) 355 can convert a frame rate of an input image. For example, a frame rate of 60 Hz is converted to 120 Hz or 240 Hz.

The formatter 356 receives the output signal of the frame rate conversion unit 355, changes the format of the signal to be suitable for the video output unit, and outputs the signal. For example, the R, G, and B data signals may be output as low voltage differential signaling (LVDS) signals or mini-LVDS signals.

FIG. 4 is another embodiment showing the controller shown in FIG. 2 in more detail. The controller of FIG. 4 may be implemented so as to include the controller shown in FIG. 3, or may be implemented separately from FIG. 3. In addition, the scope of rights of the present invention should, in principle, be determined by the claims.

First, it is assumed that a computing device according to an embodiment of the present invention is executing any one of the at least one function. At this time, it is designed to receive the command signal through the user interface 290 shown in FIG. 2. Here, the command signal corresponds to a command for starting the capture function according to the embodiment of the present invention.

At this time, the capture module 451 shown in FIG. 4 is designed to capture a screen output according to the execution of the specific function. For example, if the function of outputting channel 11 is being executed, the current screen image of channel 11 is captured.

Further, the position adjustment module 453 and the size adjustment module 454 shown in FIG. 4 adjust the size and position of the captured image. When the image having the adjusted size and position is selected using the remote controller 300 shown in FIG. 2, the control unit 250 shown in FIG. 2 executes a function corresponding to any one of the at least one metadata mapped to the image.

Meanwhile, the determination module 452 shown in FIG. 4 is designed to determine the type of the specific function described above, and a calculation module (not shown) recalculates the size and position of the captured image to different values according to the determination result. For example, if the currently executing function determined by the function extracting module 455 is a simple channel screen, the captured image is reduced in size by 10% and placed in the first layer; if the function being executed concerns the aspect ratio, the image is reduced by 50% and placed in the second layer. A more detailed description will be given later with reference to FIGS. 7 and 8.

Further, the mapping module 456 generates a new application by mapping the image having the adjusted size and position to the specific function determined by the function extraction module 455. The generated application is then stored in the memory 457. In addition, the update information monitoring module 458 is designed to determine whether update information related to an application generated according to an embodiment of the present invention exists, and to transmit the update information in real time. This will be described later in detail with reference to FIGS. 20 and 21.
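
Purely as an informal illustration of the mapping and update-monitoring behavior described above, a Python-style sketch might look like the following; the data layout and every name in it are assumptions for illustration, not elements of the patent.

```python
# Illustrative sketch only; the application record layout is an assumption.
import time

class MappingModule:
    def __init__(self, memory):
        self.memory = memory  # stands in for the memory 457 holding the application list

    def create_application(self, adjusted_image, function_descriptor):
        app = {
            "image": adjusted_image,             # the resized/repositioned capture
            "function": function_descriptor,     # e.g. {"type": "channel", "number": 7}
            "created_at": time.time(),
            "update_info": None,
        }
        self.memory.append(app)
        return app

class UpdateInfoMonitor:
    """Flags an application tile when related update information arrives."""
    def notify(self, app, update_info):
        app["update_info"] = update_info         # pushed to the UI in real time
```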

For example, when the user interface 290 receives the command signal while the first content is being received from the first channel and output according to the specific function described above, the control unit 250 shown in FIG. 2 controls the video output unit 260 to display at least one option. A more detailed description will be given later with reference to FIG. 12.

For example, if the first option is selected from among the displayed at least one option, the controller 250 controls switching to the first channel. A more detailed description will be given later with reference to FIGS. 13A and 13B. If the second option is selected, the controller 250 controls access to a content provider (CP) that provides an additional service related to the first content. A more detailed description will be given later with reference to FIGS. 14A and 14B. If the third option is selected, the controller 250 controls the video output unit 260 to display a part of the first content stored in the memory from the capture time. A more detailed description will be given later with reference to FIGS. 15A and 15B.
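
The three-way choice above amounts to a simple dispatch on the selected option. The sketch below is only illustrative; the tuner/network/player objects and their methods are hypothetical stand-ins, not an API from the patent.

```python
# Illustrative dispatch of the three options described above; all names are assumed.
def execute_option(option, ctx):
    if option == 1:
        # First option: switch back to the first channel that was being watched.
        ctx["tuner"].set_channel(ctx["captured_channel"])
    elif option == 2:
        # Second option: access the content provider (CP) offering an additional
        # service related to the first content.
        ctx["network"].open_cp(ctx["content_id"])
    elif option == 3:
        # Third option: play the part of the first content stored in memory,
        # starting from the capture time.
        ctx["player"].play(ctx["recorded_content"], start=ctx["capture_time"])
    else:
        raise ValueError("unknown option: %r" % option)
```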

The network interface 220 of the computing device according to an embodiment of the present invention is designed to provide a video telephone service with at least one other user according to the specific function. When the video telephone service is being performed through the network interface 220, the control unit 250 extracts identification information corresponding to the at least one other user and is designed to transmit a calling signal using the extracted identification information. This will be described in more detail with reference to FIGS. 16 and 17.
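
As a rough sketch of this video-call case, under the assumption that the counterpart's identification information is available from the ongoing call session, one might store it alongside the captured face; every field and method name below is invented for illustration.

```python
# Illustrative only: remember who was on the call when the capture key was pressed,
# so that selecting the tile later can transmit a calling signal directly.
def create_call_shortcut(call_session, captured_face):
    return {
        "image": captured_face,
        "callee_id": call_session["counterpart_id"],   # e.g. a phone number or account ID
    }

def redial(network_interface, shortcut):
    # Transmit the calling signal using the stored identification information.
    network_interface.send_calling_signal(shortcut["callee_id"])
```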

FIG. 5 is a diagram illustrating the outer appearance of a remote controller for externally controlling a computing device according to an embodiment of the present invention. Hereinafter, the appearance of a remote controller for externally controlling a computing device according to an embodiment of the present invention will be described with reference to FIG. 5.

As shown in FIG. 5(a), a pointer 501 corresponding to the movement of the remote controller 510 is displayed on the screen of the computing device 500. The user can move or rotate the remote controller 510 left and right (FIG. 5(b)) or up and down (FIG. 5(c)). Such a remote controller 510 may be called a spatial remote controller because the pointer 501 is moved and displayed according to its movement in 3D space.

In addition, according to the embodiment of the present invention, the capturing should be performed quickly in an arbitrary environment, so that a shortcut key 511 for performing a capture function may be added to the remote controller 510 in hardware.

As shown in FIG. 5(b), when the user moves the remote controller 510 to the left, the pointer 501 displayed on the screen of the computing device 500 also moves to the left. Information about the movement of the remote controller 510, sensed through a sensor of the remote controller 510, is transmitted to the computing device 500. The computing device 500 can calculate the coordinates of the pointer 501 from the information on the motion of the remote controller 510 and display the pointer 501 so as to correspond to the calculated coordinates.
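
A simplified sketch of how sensed motion might be turned into pointer coordinates, clamped to the screen; the gain value and function names are assumptions and not taken from the patent.

```python
# Illustrative only: translate relative motion reported by the remote controller
# into on-screen pointer coordinates, clamped to the display bounds.
def update_pointer(current_xy, motion_dxdy, screen_size, gain=3.0):
    x, y = current_xy
    dx, dy = motion_dxdy
    width, height = screen_size
    # A gain lets small wrist movements cover the whole screen; clamp to stay on-screen.
    new_x = min(max(x + dx * gain, 0), width - 1)
    new_y = min(max(y + dy * gain, 0), height - 1)
    return new_x, new_y

# Moving the remote to the left (negative dx) moves the pointer left, as in FIG. 5(b).
print(update_pointer((960, 540), (-40, 0), (1920, 1080)))  # (840.0, 540.0)
```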

As shown in FIG. 5(c), when the user moves the remote controller 510 downward, the pointer 501 displayed on the screen of the computing device 500 also moves downward. Therefore, it is possible to quickly select a specific area on the screen of the computing device 500 using the remote controller 510 according to an embodiment of the present invention.

6 is a block diagram showing the internal configuration of the remote controller shown in FIG. 5 in more detail.

As shown in FIG. 6, the remote controller 700 includes a wireless communication unit 740, a user input unit 750, a sensor unit 760, an output unit 770, a power supply unit 710, a storage unit 720, a control unit 730, and the like.

The wireless communication unit 740 is designed to be capable of communicating with any external device.

Further, according to an embodiment of the present invention, the remote controller 700 transmits to the computing device 600 a signal containing information on the motion of the remote controller 700, etc., through the RF module 741.

Further, the remote controller 700 can receive the signal transmitted by the computing device 600 through the RF module 741. In addition, the remote controller 700 can transmit commands to the computing device 600 via the IR module 742, such as power on/off, channel change, volume change, and the like, as needed.

The user input unit 750 may include a keypad, a button, a touch pad, or a touch screen.

The sensor unit 760 may include a gyro sensor 761 or an acceleration sensor 762. The gyro sensor 761 can sense information about the motion of the remote controller 700. For example, the gyro sensor 761 can sense information about the operation of the remote controller 700 on the basis of the x, y, and z axes. The acceleration sensor 762 can sense information about the moving speed of the remote controller 700 and the like. A distance measuring sensor may further be included, thereby sensing the distance to the computing device 600.

The output unit 770 may output an image or voice signal corresponding to an operation of the user input unit 750 or corresponding to a signal transmitted from the computing device 600. The output unit 770 may include an LED module 771 that is turned on when the user input unit 750 is operated or when a signal is transmitted to or received from the computing device 600 through the wireless communication unit 740, an audio output module 773 for outputting sound, and a display module 774 for outputting an image.

The power supply unit 710 supplies power to each component of the remote controller 700. The power supply unit 710 can reduce power waste by interrupting the power supply when the remote controller 700 has not moved for a predetermined time.

The storage unit 720 may store various kinds of programs, application data, and the like necessary for the control or operation of the remote controller 700. The control unit 730 controls various matters related to the control of the remote controller 700. For example, the control unit 730 transmits a signal corresponding to a predetermined key operation of the user input unit 750, or a signal corresponding to the motion of the remote controller 700 sensed by the sensor unit 760, to the computing device 600 through the wireless communication unit 740. The operation of the remote controller 700 described above will be described later in detail with reference to FIGS. 26 to 30.

FIG. 7 is a diagram illustrating an example of a database necessary for executing a capture function according to an embodiment of the present invention. Hereinafter, an example of a database necessary for executing a capture function according to an embodiment of the present invention will be described with reference to FIG. 7.

One feature of the present invention is that, when the capture button is pressed during execution of an arbitrary function, an application represented by the captured image is automatically generated. As described above, the newly created application is mapped to the function that was being executed at the time of the capture. According to another aspect of the present invention, the size and position of the application within the application list are readjusted according to the type of the function being executed.

For example, as shown in FIG. 7, when the capture function is executed during execution of an A-type function, the size of the application to be included in the list is reduced by 10%, and the application is designed to be located in the first layer of the list. When the capture function is executed during execution of a B-type function, the size of the application to be included in the list is reduced by 5%, and the application is designed to be located in the second layer of the list. When the capture function is executed during execution of a C-type function, the size of the application to be included in the list is reduced by 3%, and the application is designed to be located in the third layer of the list.
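
For illustration, a database like that of FIG. 7 could be held as a simple lookup table keyed by function type; the percentages and layer numbers below merely restate the examples in the text, and everything else is an assumption.

```python
# Illustrative representation of the FIG. 7 database described above.
CAPTURE_DB = {
    "A": {"size_reduction": 0.10, "layer": 1},  # reduce by 10%, first layer of the list
    "B": {"size_reduction": 0.05, "layer": 2},  # reduce by 5%,  second layer of the list
    "C": {"size_reduction": 0.03, "layer": 3},  # reduce by 3%,  third layer of the list
}

def place_in_list(function_type, original_size):
    entry = CAPTURE_DB[function_type]
    width, height = original_size
    scale = 1.0 - entry["size_reduction"]
    return {"size": (round(width * scale), round(height * scale)), "layer": entry["layer"]}

print(place_in_list("B", (400, 300)))  # {'size': (380, 285), 'layer': 2}
```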

FIG. 8 is a diagram showing another example of a database necessary for executing a capture function according to an embodiment of the present invention. Hereinafter, another example of a database necessary for executing the capture function according to an embodiment of the present invention will be described with reference to FIG. 8.

For example, as shown in FIG. 8, when the capture function is executed during execution of a D-type function, the application to be included in the list is resized to a size larger than that of a general application and designed to belong to the first group in the list. When the capture function is executed during execution of an E-type function, the application is resized to the same size as a general application and designed to belong to the second group in the list. When the capture function is executed during execution of an F-type function, the application is resized to a size smaller than that of a general application and designed to belong to the third group in the list.

Therefore, as shown in FIG. 7 or FIG. 8, it is possible to provide a more customized access function by adjusting the size and position of the application according to the state of the computing device at the moment the capture function is executed. For example, if the capture function is executed while a channel is being viewed, the captured image is resized to a size larger than that of a normal application, so the user can easily recognize it later. Furthermore, by displaying the one-click access group for channels separately from the one-click access group for general TV functions, the user can quickly find the desired application among the many new applications that will continue to be generated.

FIGS. 9A to 9C are diagrams for explaining a process of executing a capture function according to an exemplary embodiment of the present invention while viewing an arbitrary channel.

First, as shown in FIG. 9A, it is assumed that a user using a computing device (for example, a network TV) according to an embodiment of the present invention is viewing channel 11. Therefore, the content 950 of channel 11 is being output on the screen.

At this time, if the user wishes to set up the shortcut function for the channel currently being watched, the remote controller described above is moved to place the indicator 920 in the capture function area 930. In addition, a message 940 for guiding the main function expected when the above-described capture function area 930 is clicked may be displayed in advance. On the other hand, a list 910 of general applications stored in the memory is displayed at the bottom of the screen.

When the user places the indicator 920 in the capture function area 930 using the remote controller and transmits a confirm signal, the screen changes as shown in FIG. 9B.

That is, the image 951 of channel 11 captured at the time when the user transmits the confirm signal is designed to be ready to move to a certain area 952 of the application list, as shown in FIG. 9B. Then, as shown in FIG. 9C, the captured image 953 of channel 11 is automatically arranged in a predetermined region 954 of the application list.

FIGS. 10A to 10C are views showing a process of moving the captured image in more detail.

As described with reference to FIG. 9A, when the capture function area or the capture button is confirmed, the current channel screen 1000 that the user is viewing is automatically captured (see FIG. 10A). At this time, the initially captured image data has a length of a along the vertical axis and a length of b along the horizontal axis.

Meanwhile, in the process of moving to the application list, the captured image is changed to a relatively reduced image 1010, as shown in FIG. 10B. When it is finally included in the application list, it is changed to a size 1020 similar to that of the general applications already in the list, as shown in FIG. 10C. However, if a user wants to use a channel shortcut function after capturing an arbitrary channel broadcast screen, it is desirable to design the application to be relatively larger than a general application. This is because, in contrast to an application created and distributed by a company, the user's own application of the present invention needs to be identified by its captured image. Of course, it is also within the scope of the present invention to display information indicating the newly created application and the mapped shortcut function (for example, a channel number, a content name, etc.) together.

11A and 11B are diagrams illustrating a process of executing an application generated according to an embodiment of the present invention. As described above with reference to FIGS. 9 and 10, it is assumed that a user captures a specific image and generates a user-specific application.

For example, as shown in FIG. 11A, when the user selects a broadcast program called "LOVE" 1120 on channel 7, the corresponding video data 1130 is displayed on the screen. At this time, it is assumed that the user wishes to watch the program "I Am a Singer" on the MBC channel, for which an application was previously created.

Unlike the prior art, it suffices for the user to simply move the indicator 1100, which follows the movement of the remote controller, onto the user-specific application 1110 displayed as the captured image in the application list and select it.

Accordingly, when the user selects the specific application displayed with the captured image, the screen is switched to the mapped program "I Am a Singer" 1140 on channel 11 (the MBC channel), and the program 1150 is displayed on the screen as shown in FIG. 11B. However, when the capture function according to an exemplary embodiment of the present invention is executed while watching an arbitrary channel, it is not possible to determine accurately whether the user's intention is a shortcut setting for the channel or a shortcut setting for the content, so either of these can be defined as the default. Of course, a service that provides a selection option to the user is also possible, which will be described with reference to FIG. 12 and the following figures.

FIG. 12 is a diagram illustrating an option list for selecting a specific function among a plurality of functions when executing a capture function according to an exemplary embodiment of the present invention.

It is assumed that a user using a computing device according to an embodiment of the present invention is viewing the "LOVE" program 1230 on channel 7. At this time, when the user places the indicator 1200 in the capture button area 1210 using the remote controller, all options 1220 for which an application can be generated are provided, as shown in FIG. 12. Each of the options will be described in detail below with reference to FIGS. 13 to 15.

FIGS. 13A and 13B are diagrams for explaining a process of mapping the current channel and an application when the first option of the option list shown in FIG. 12 is selected.

When the first option of the option list 1220 shown in FIG. 12 is selected, so that the current channel is mapped to a new application, the captured image 1310 together with the channel information 1320 at the time of capture is included in the application list, as shown in FIG. 13A.

Furthermore, suppose that the user switches to another channel, channel 11 (1330), and watches the current broadcast program 1340 of channel 11. In this case, when the application represented by the captured image 1310 is executed using the indicator 1300, the screen is switched to the screen shown in FIG. 13B, which is a feature of the present invention.

That is, since the application displayed with the captured image is mapped to the function of switching to channel 7, the current broadcast program 1360 of channel 7 (1350) is output as shown in FIG. 13B. This corresponds to the case where, in creating an application from a captured image, the user gives priority to the channel at the time of capture over the content itself.

FIGS. 14A and 14B are diagrams for explaining a process of mapping the current content and an application when the second option of the option list shown in FIG. 12 is selected.

When the second option of the option list 1220 shown in FIG. 12 is selected, so that the content currently being watched is mapped to a new application, the captured image 1410 together with content information 1420 (e.g., the broadcast program title) is included in the application list, as shown in FIG. 14A.

Further, suppose that the user switches to another channel, channel 11 (1430), and watches the current broadcast program 1440 of channel 11. In this case, when the application displayed with the captured image 1410 is executed using the indicator 1400, the screen is switched to the screen shown in FIG. 14B, which is a feature of the present invention.

That is, since the application displayed with the captured image is mapped to the function of accessing the CP providing the "LOVE" broadcast program, a "LOVE" broadcast program viewing service 1450 is provided as shown in FIG. 14B. For example, each episode is output in the form of a list 1460. Alternatively, the device may be connected directly to a web site providing the corresponding broadcast program ("LOVE"). This corresponds to the case where, in creating an application from a captured image, the user gives priority to the content itself at the time of capture over the channel.

FIGS. 15A and 15B are diagrams for explaining a process of mapping a "view next" function and an application when the third option of the option list shown in FIG. 12 is selected.

When the third option of the option list 1220 shown in FIG. 12 is selected, so that the "view next" function is mapped to a new application, the captured image 1510 together with a text 1520 guiding the "view next" function is included in the application list, as shown in FIG. 15A.

Further, suppose that the user switches to another channel, channel 11 (1530), and watches the current broadcast program 1540 of channel 11. In this case, when the application represented by the captured image 1510 is executed using the indicator 1500, the screen is switched to the screen shown in FIG. 15B, which is a feature of the present invention. Note that when an application is created by the capture function and the "view next" function is mapped, the content of the corresponding channel is automatically stored in the memory from the point of capture.

That is, since the application displayed with the captured image is mapped to the "view next" function, the recorded "LOVE" broadcast program of channel 7 is played back from the time of capture, as shown in FIG. 15B. Therefore, by merely executing the capture function according to the embodiment of the present invention, the user can resume viewing without any problem even after surfing to another channel. Moreover, because the captured image is displayed in the list as an application before the channel is switched, there is a technical effect of preventing the user from accidentally forgetting that the recording was made.

FIGS. 16A and 16B are diagrams illustrating a process of mapping a video call service and an application.

Recent network TVs are capable of video call services with remote parties. However, as networks develop, the number of parties with whom a video call service can be made will increase exponentially, and it becomes burdensome to remember each phone number or to search for the other party every time.

For example, as shown in FIG. 16A, it is assumed that a computing device according to an embodiment of the present invention is providing a video call service 1600. The face of the user of the computing device is generally displayed in the lower left corner 1640, and the face 1630 of the other party occupies the entire screen. At this time, the capture button 1620 according to an embodiment of the present invention is clicked using an indicator 1610 corresponding to the movement of the remote controller.

Accordingly, the computing device according to an embodiment of the present invention creates a new application 1650, as shown in FIG. 16B, by mapping the identification information of the other party of the ongoing video call service with the captured image of that party. Two alternative solutions are available for capturing the other party's image: when the user selects the capture button, the entire screen image may be captured, or, since the image of the other party is located in the middle region of the screen, only the middle region of the captured image may be automatically cropped and saved.
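
As a rough illustration of the second alternative (keeping only the middle region where the other party's face is assumed to appear), something like the following could be done with the Pillow library; the 50% crop fraction is an assumption, not a value from the patent.

```python
# Illustrative sketch: crop the central region of the captured video-call frame.
from PIL import Image

def crop_center(frame, fraction=0.5):
    width, height = frame.size
    crop_w, crop_h = int(width * fraction), int(height * fraction)
    left = (width - crop_w) // 2
    top = (height - crop_h) // 2
    return frame.crop((left, top, left + crop_w, top + crop_h))

# A dummy 1280x720 frame stands in for the captured call screen.
captured = Image.new("RGB", (1280, 720))
print(crop_center(captured).size)  # (640, 360)
```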

In addition, a user who later sees the captured image of the application shown in FIG. 16 has the advantage of being able to recognize the other party more clearly from the other party's face.

FIGS. 17A to 17C are diagrams for explaining the process of executing the application generated in FIG. 16.

First, as shown in FIG. 17A, it is assumed that an application including a captured image 1720 and a message 1730 guiding the telephone service is included in the list. Suppose that a user using a computing device according to an embodiment of the present invention is watching the content 1750 of channel 11 (1740), notices the captured image in the application list, and wants to call the person shown in the captured image. In this case, the user merely places the indicator 1710 on the captured image 1720.

Then, as shown in FIG. 17B, the screen is switched to the telephone service mode 1760: the user's own image 1780, photographed by the camera activated for the video call service, is displayed in a first area, and a message is displayed on the entire screen 1770. At the same time, a calling signal is transmitted to the identification number of the other party mapped to the application of the captured image.

When the other party responds to the calling signal, the current camera image 1790 of the other party is displayed on the entire screen, as shown in FIG. 17C. Therefore, the problem of having to individually memorize or search for the telephone number of a video call counterpart is solved.

FIGS. 18A to 18C are diagrams for explaining a process of mapping a general menu function to an application and executing it.

The present invention may be applied not only while the video telephone service is in use or while an arbitrary channel broadcast is being viewed, as described above, but also during execution of general TV functions.

For example, as shown in FIG. 18A, a computing device according to an embodiment of the present invention executes a menu function 1810 and displays the associated list. A capture button is displayed together with each of the functions shown in the list, so the user can move the indicator 1800 to perform the capture function. Assume that the user clicks the capture button for the "aspect ratio" function, as shown in FIG. 18A.

At this time, a new application having a text 1840 and a representative image 1830 guiding aspect ratio adjustment is added to the application list located at the bottom of FIG. 18B. Of course, the captured image may be used as in the previous description, but for conventional TV functions, existing image data stored in the memory may be employed instead.

As shown in FIG. 18B, while the broadcast program 1860 of channel 11 (1850) is being output, a user who wants to adjust the aspect ratio simply places the indicator 1820 on the image 1830 of the newly created application.

Thus, as shown in FIG. 18C, the aspect ratio of the broadcast program is automatically changed (1870). According to the related art, in order to adjust the aspect ratio, a menu item had to be clicked and detailed lists individually checked. However, according to the embodiment of the present invention described above, the user can perform the desired function with only a one-click operation.

FIGS. 19A through 19C are views for explaining a process of arbitrarily cropping a captured image to change a representative image of an application according to an embodiment of the present invention.

In the description of the previous drawings, a case was described in which the current broadcast screen is captured and used as it is, or an image pre-stored in the memory is borrowed and used. With reference to FIG. 19, a tool allowing the user to edit the captured image as desired will be described.

As shown in FIG. 19A, suppose that a Korean professional baseball program is being shown. When the capture button 1900 is clicked, unlike in the previous drawings, an option 1910 asking whether to resize the captured image is displayed.

As shown in FIG. 19B, when the user clicks two positions 1920 and 1930 defining the region to be cropped, only the rectangular portion between them is automatically cropped and used as the image for the application. That is, as shown in FIG. 19C, a text 1950 guiding the Korean professional baseball program and an image 1940 containing only the portion cropped by the user are included in the application list.
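
A minimal sketch of this two-point crop: the rectangle is derived from the two clicked positions regardless of the order in which they were clicked, and that region becomes the representative image. The coordinates and the use of Pillow here are illustrative assumptions.

```python
# Illustrative: build the crop rectangle from the two user-selected positions and
# cut that region out of the captured frame to use as the application's image.
from PIL import Image

def rect_from_two_points(p1, p2):
    (x1, y1), (x2, y2) = p1, p2
    left, right = sorted((x1, x2))
    top, bottom = sorted((y1, y2))
    return left, top, right, bottom

def crop_region(frame, p1, p2):
    return frame.crop(rect_from_two_points(p1, p2))

frame = Image.new("RGB", (1920, 1080))               # stand-in for the captured screen
logo = crop_region(frame, (300, 400), (120, 150))    # two clicks, in any order
print(logo.size)  # (180, 250)
```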

There are two main reasons why such a design is necessary. First, the entire captured image may differ from the user's intention, or it may not carry enough immediately recognizable visual information to serve as a shortcut in the future. Second, the user may want to capture and keep only a specific part of the image according to personal taste. For example, capturing only the logo of the team the user supports, among the many teams of Korean professional baseball, reflects the needs of fans who regard that team as representative of Korean baseball.

FIGS. 20A to 20C are views for explaining a process of rotating an application in the left-right direction when an application according to an embodiment of the present invention is updated.

First, as described above, it is assumed that an application represented by the captured image 2000 is automatically generated. At this time, as shown in FIG. 20(a), it is assumed that the face of the other party (for example, the user's mother) was captured while the video call service was being executed. If the other party whose identification information is mapped to the generated application logs out and then logs in again later, a user interface is required to promptly notify the user of this.

Therefore, as shown in FIG. 20(b), the original image 2010 gradually moves out to the left while a new image 2020, indicating that the mapped party (the mother) has been recognized again, is gradually displayed. Finally, as shown in FIG. 20(c), the original image disappears completely and only the image 2030 displaying the update information is displayed. Further, as shown in FIGS. 20(a) to 20(c), by adopting a UI that gradually changes from the first image to the second image instead of suddenly replacing the image, there is a technical effect that the user can easily and promptly notice the update.
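
The gradual left-right replacement can be thought of as a horizontal offset advancing over a fixed number of animation steps, as in the small sketch below; the step count and names are assumptions.

```python
# Illustrative only: at step t, the original tile image is drawn shifted left by
# `offset` pixels and the update image slides in from the right, so the tile
# gradually changes from the first image to the second (FIGS. 20(a) to 20(c)).
def slide_offsets(tile_width, steps=10):
    for t in range(steps + 1):
        offset = tile_width * t // steps
        yield {"original_x": -offset, "update_x": tile_width - offset}

for frame in slide_offsets(200, steps=4):
    print(frame)
# {'original_x': 0, 'update_x': 200} ... {'original_x': -200, 'update_x': 0}
```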

FIGS. 21A to 21C are views for explaining a process of rotating an application in the vertical direction when an application according to an embodiment of the present invention is updated.

FIG. 21 shows another embodiment that differs from FIG. 20 in two respects. First, the application generated according to this embodiment of the present invention is mapped to a web address or the like provided by a CP; second, the update information is rotated vertically rather than horizontally.

First, a user who selects an arbitrary channel executes a capture function according to an embodiment of the present invention, and maps a function of accessing a CP providing content at the time of capturing to a new application. Therefore, as shown in Fig. 21 (a), the image 2100 at the time of capture is displayed as a representative image of the new application.

Further, when the content-related information from the time of capture is updated, the original image 2110 is designed to disappear while moving upward, and the image 2120 displaying the update information is gradually displayed, as shown in FIG. 21(b).

Therefore, as shown in FIG. 21(c), only the image 2120 displaying the update information is finally displayed. Although not shown in FIG. 21, it is also within the scope of the present invention to design the image of FIG. 21(c) to return to the image of FIG. 21(a) depending on a user selection or the lapse of a predetermined time.

The reason the rotation direction of FIG. 20 is designed differently from that of FIG. 21 is that the application generated in FIG. 21 has a horizontal length greater than its vertical length, so the vertical rotation is more likely to improve its visibility to the user.

FIG. 22 is a flowchart illustrating a process for creating an application using a captured image according to an embodiment of the present invention. FIG. 22 illustrates one embodiment of the present invention; it is also within the scope of the present invention to delete some steps or add other steps according to the needs of those skilled in the art. Further, FIG. 22 may be interpreted in a supplementary manner with reference to the description of FIGS. 1 to 21 above.

First, a control method of a computing device that performs at least one function according to an embodiment of the present invention executes a specific function among the at least one function (S2201). Then, a command signal is received during execution of the specific function (S2202). The command signal may be generated, for example, by pressing the capture button described previously.

An output screen according to the execution of the specific function is captured (S2203), and the size and position of the captured image are adjusted (S2204). Then, the executed specific function and the adjusted captured image are mapped to each other and stored in the memory (S2205).

When the image having the adjusted size and position is selected, a function corresponding to one of the at least one metadata mapped to the image is executed. For example, when there is only one function that can be mapped, the mapping can be performed automatically. If there are a plurality of functions that can be mapped, a menu for selecting one of them may be displayed as shown in FIG. 12, or a function set as a default may be selected automatically; both designs are within the scope of the present invention.
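The flow of steps S2201 to S2205, together with the selection behavior just described, might be sketched as follows. The class and method names are assumptions for illustration only; the patent does not prescribe any particular implementation.

```python
# A minimal sketch of the flow of FIG. 22 (S2201 to S2205) plus the selection behavior.
class CaptureController:
    def __init__(self, device, memory):
        self.device = device      # assumed to expose execute() and capture_screen()
        self.memory = memory      # assumed to behave like a dict

    def run(self, function_id, command_signal):
        self.device.execute(function_id)                      # S2201: execute the specific function
        if command_signal != "CAPTURE":                       # S2202: receive command signal
            return None
        image = self.device.capture_screen()                  # S2203: capture the output screen
        image = self.adjust(image, function_id)               # S2204: adjust size and position
        self.memory[image.id] = {"image": image,
                                 "function": function_id}     # S2205: map and store
        return image

    def adjust(self, image, function_id):
        # Size and position may be computed differently per function type
        # (cf. the embodiment referencing FIGS. 7 and 8).
        return image

    def on_select(self, image_id):
        # When the stored image is selected, execute a function corresponding to
        # one of the mapped metadata; if several functions can be mapped, a
        # selection menu (FIG. 12) or a default function may be used instead.
        entry = self.memory[image_id]
        self.device.execute(entry["function"])
```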

According to another embodiment of the present invention, a step of determining the type of the specific function and a step of calculating the size and position of the captured image as different values according to the determination result are further included. In this regard, since a detailed description was given above with reference to FIGS. 7 and 8, a duplicate description will be omitted.

According to another embodiment of the present invention, the method further includes generating an application to which the image having the adjusted size and position and the specific function are mapped, and storing the generated application in a memory.

FIG. 23 is a flowchart for explaining another embodiment that supplements FIG. 22. FIGS. 22 and 23 may be combined to define another scope of rights of the present invention.

Between steps S2203 and S2204 shown in Fig. 22, the two steps shown in Fig. 23 are added.

That is, when the first content is received from the first channel and output according to the specific function, it is determined whether a plurality of mappable functions exist (S2301). At this time, at least one option for selecting a specific function is displayed (S2302). In this regard, since a full description was given with reference to FIG. 12, those skilled in the art will be able to implement FIG. 23 with reference to that drawing.

For example, when the first option (item 1220-1 in FIG. 12) is selected from among the displayed at least one option, a function corresponding to one of the at least one metadata is executed; in this case, the controller controls switching to the first channel.

On the other hand, when the second option (item 1220-2 in FIG. 12) is selected from among the displayed at least one option, the step of executing a function corresponding to any one of the at least one metadata accesses the CP providing an additional service related to the first content.

Finally, when the third option (item 1220-3 in FIG. 12) is selected from among the displayed at least one option, the step of executing a function corresponding to any one of the at least one metadata displays a part of the first content stored in the memory from the time of capture.
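A hypothetical dispatch over the three options of FIG. 12, as used in FIG. 23, could look like the following; the option identifiers, the device methods, and the keys of the stored entry are all assumptions.

```python
# Hypothetical dispatch for the three options (items 1220-1 to 1220-3 in FIG. 12).
def execute_mapped_function(option, device, entry):
    """`entry` holds what was stored at capture time: channel, CP address, buffered content."""
    if option == 1:      # item 1220-1: switch back to the captured channel
        device.switch_channel(entry["channel"])
    elif option == 2:    # item 1220-2: access the CP providing related additional services
        device.open_url(entry["cp_address"])
    elif option == 3:    # item 1220-3: play content stored from the capture time onward
        device.play(entry["buffered_content"], start=entry["capture_time"])
    else:
        raise ValueError(f"unknown option: {option}")
```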

FIG. 24 is a flowchart for explaining another embodiment that supplements FIG. 22. FIGS. 22 and 24 may be combined to define another scope of rights of the present invention.

Between steps S2203 and S2204 shown in FIG. 22, the three steps shown in FIG. 24 are added. FIGS. 23 and 24 may be implemented together as one embodiment, or the two flows may be implemented as two separate embodiments.

First, as shown in FIG. 24, an OSD inquiring whether or not to edit the size of the captured image is displayed (S2401). Next, a signal by which the user selects two points of the area to be captured is received (S2402). The device according to an embodiment of the present invention crops only the specific area defined by the two points (positions) selected by the user and uses the cropped image as the representative image of the application (S2403). A more detailed description can be understood by those skilled in the art with reference to FIG. 19 above.
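Steps S2401 to S2403 amount to cropping the captured image to the rectangle spanned by the two user-selected points. A minimal sketch, assuming the Pillow imaging library purely for illustration, is shown below.

```python
# Sketch of S2401 to S2403: crop the captured image to the rectangle defined by
# two user-selected points. Pillow is used only for illustration; the patent
# does not name any particular library.
from PIL import Image

def crop_to_points(captured: Image.Image, p1, p2) -> Image.Image:
    """p1 and p2 are (x, y) tuples chosen by the user (S2402)."""
    left, right = sorted((p1[0], p2[0]))
    top, bottom = sorted((p1[1], p2[1]))
    # The cropped region becomes the representative image of the application (S2403).
    return captured.crop((left, top, right, bottom))
```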

FIG. 25 is a flowchart for explaining another embodiment that supplements FIG. 22. FIGS. 22 and 25 may be combined to define another scope of rights of the present invention. After step S2206 shown in FIG. 22, the two steps shown in FIG. 25 are added.

First, it is determined whether additional information related to the application stored in the memory has been updated (S2501). If an update has been made as a result of the determination, the first image having the adjusted size and position is readjusted to a second image (S2502). In step S2502, the second image is designed to replace the first image, for example, and the second image is associated with the updated additional information. Since this was described above with reference to FIGS. 20 and 21, a person skilled in the art will be able to implement it.
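Steps S2501 and S2502 can be sketched as an update check followed by an image swap. The function and key names below are hypothetical, and the update source is assumed to be supplied by the caller.

```python
# Sketch of S2501/S2502: when additional information tied to the stored application
# is updated, a second image carrying the update replaces the first representative image.
def refresh_representative_image(app, fetch_update, render_update_image):
    update = fetch_update(app["id"])                 # S2501: check whether additional info was updated
    if update is None:
        return app
    second_image = render_update_image(update)       # image associated with the updated information
    app["previous_image"] = app["image"]             # kept for an optional return to the original image
    app["image"] = second_image                      # S2502: the second image replaces the first
    app["additional_info"] = update
    return app
```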

FIGS. 26 to 30 are diagrams for explaining a process of setting the position of an application corresponding to a captured image according to an embodiment of the present invention. As described above, an application created with a captured image according to an embodiment of the present invention is automatically moved to an application list (which may be referred to as a "launch bar"). With reference to FIGS. 26 to 30, a solution for providing a more dynamic user interface while the user moves the application to a desired position will be described in detail.

As shown in FIG. 26, when the user confirms a favorite scene 2600 while viewing a broadcast and wants to create it as a new application at that moment, the user places the pointer 2620 of the motion remote controller 2610 on the capture option 2630. Since the motion remote controller 2610 has been described in detail with reference to FIGS. 5 and 6, a duplicate description will be omitted.

Next, as shown in FIG. 27, the captured image 2710 is automatically converted to a size (width: a, length: b) smaller than the original image 2600 and displayed at the center of the screen. Since the screen 2700 continuously displays the broadcast of the current channel in real time, the user is not disturbed in viewing the current channel even when the capture function according to an embodiment of the present invention is performed. Furthermore, because the user can naturally recognize that the captured image has been automatically resized and moved to the center, the user can quickly move the captured image to the list.

At this time, as shown in FIG. 28, the user moves the pointer corresponding to the motion of the remote controller 2810 from the first position 2820 to the second position 2830. As a result, the captured image 2840 becomes smaller than in FIG. 27: in FIG. 27 the captured image has the dimension values a and b, whereas in FIG. 28 it has the dimension values c and d, which are relatively small compared to a and b.

In addition, since movement of the pointer toward the list area (downward direction) means that the pointer is moving in the proper direction, there is an advantage that the user is naturally informed that this is the direction in which the image enters the list. Also in this case, the broadcast screen 2800 is not in a frozen state; the broadcast of the current channel continues to be displayed in real time.

Next, as shown in FIG. 29, the user moves the pointer corresponding to the motion of the remote controller 2910 from the first position 2920 to the second position 2930. Accordingly, the captured image 2940 is resized (width: e, length: f) to approach the size of the existing application images included in the displayed list. At this time, however, since the image is not yet completely included in the list, it is set slightly larger than the image size of the existing applications, thereby inducing the user to move the pointer slightly further downward. As described repeatedly, the broadcast 2900 of the current channel continues to be displayed in real time.

Finally, as shown in FIG. 30, the user moves the pointer corresponding to the motion of the remote controller 3010 from the first position 3020 to the second position 3030. Thus, the size of the final captured image 3040 is designed to have the same value as, or a slightly different value (for example, by 5% or 3%) from, the representative images of the existing applications. As described repeatedly, the broadcast 3000 of the current channel continues to be displayed in real time.

As shown in FIGS. 26 to 30, the user can drag the captured image in an arbitrary direction. In particular, as the user drags the image in the proper direction (toward the area where the application list exists), the captured image gradually becomes smaller. Therefore, even if no special indicator is provided, there is an advantage that the user knows in which direction the captured image should be moved.
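The progressive shrinking of FIGS. 26 to 30 can be modeled as an interpolation of the tile size between its initial dimensions (a, b) and the size of the existing list images, driven by how far the pointer has moved toward the list area. The numeric details (such as the 5% margin) are assumptions, since the patent only gives the qualitative progression a, b to c, d to e, f.

```python
# Sketch of the resize behavior of FIGS. 26 to 30: the captured image shrinks as
# the pointer approaches the application list, ending near the size of the
# existing representative images. Parameters and margins are assumptions.
def scaled_size(initial, final, pointer_y, start_y, list_y):
    """Interpolate the captured image size from `initial` (width, height) at
    start_y down to `final` near the list area at list_y."""
    if list_y <= start_y or pointer_y <= start_y:
        return initial
    progress = min(1.0, (pointer_y - start_y) / float(list_y - start_y))
    w = initial[0] + (final[0] - initial[0]) * progress
    h = initial[1] + (final[1] - initial[1]) * progress
    # Keep the tile slightly larger than the list images until it is dropped in,
    # nudging the user to keep moving downward (cf. FIG. 29).
    if progress < 1.0:
        w, h = w * 1.05, h * 1.05
    return int(w), int(h)
```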

Although FIGS. 1 to 30 are described separately for convenience of explanation, it is obvious that other embodiments of the present invention may be implemented by combining some of the features of the respective drawings.

In this specification, both the apparatus invention and the method invention are described, and the descriptions of the two inventions may be applied supplementarily to each other as necessary.

The method inventions according to the present invention can all be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer readable medium.

The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and constructed for the present invention, or they may be known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include not only machine language code such as that produced by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.

While the invention has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Therefore, the scope of the present invention should not be limited to the described embodiments, but should be determined by the appended claims and their equivalents.

100: computing device
110: Station
120: Server
130: External device
140: External TV
150: Mobile device

Claims (19)

A control method of a computing device for performing at least one function,
Executing a specific function among the at least one function;
Receiving a command signal while executing the specific function;
Capturing an output screen according to execution of the specific function;
Adjusting a size and position of the captured image; And
Executing a function corresponding to any one of the at least one metadata mapped to the image when the image having the adjusted size and position is selected.
The method according to claim 1, further comprising:
Determining a type of the specific function; And
Calculating a size and a position of the captured image as different values according to a result of the determination.
The method according to claim 1, further comprising:
Creating an application to which the image having the adjusted size and position and the specific function are mapped; And
Storing the generated application in a memory.
The method according to claim 1, further comprising:
When the first content is received from the first channel and output according to the specific function,
Displaying at least one selectable option upon receipt of the command signal.
5. The method of claim 4,
Wherein, when the first option is selected from among the displayed at least one option,
The step of executing a function corresponding to any one of the at least one metadata
Comprises controlling the channel to be switched to the first channel.
6. The method of claim 4,
Wherein, when the second option is selected from among the displayed at least one option,
The step of executing a function corresponding to any one of the at least one metadata
Comprises controlling access to a content provider (CP) that provides an additional service related to the first content.
7. The method of claim 4,
Wherein, when the third option is selected from among the displayed at least one option,
The step of executing a function corresponding to any one of the at least one metadata
Comprises displaying a part of the first content stored in the memory from the time of capture.
The method according to claim 1,
Wherein, when a video telephone service with at least one user is provided according to the specific function,
The step of executing a function corresponding to any one of the at least one metadata comprises:
Extracting identification information corresponding to the at least one user; And
Transmitting a calling signal using the extracted identification information.
The method of claim 3, further comprising:
Determining whether additional information associated with the application stored in the memory has been updated; And
Readjusting the first image having the adjusted size and position to a second image if an update is made as a result of the determination.
10. The method of claim 9,
Wherein, in the readjusting step,
The second image is designed to replace the first image, and the second image is associated with the updated additional information.
11. A computing device that performs at least one function, the computing device comprising:
An interface module for receiving a command signal during execution of a specific function among the at least one function;
A capture module for capturing an output screen according to execution of the specific function;
An adjustment module for adjusting the size and position of the captured image; And
A controller configured to execute a function corresponding to any one of the at least one metadata mapped to the image when the image having the adjusted size and position is selected.
12. The computing device of claim 11, further comprising:
A determination module for determining the type of the specific function; And
A calculation module for calculating the size and the position of the captured image as different values according to the determination result.
13. The computing device of claim 11, further comprising:
A generator for generating an application to which the image having the adjusted size and position and the specific function are mapped; And
A memory for storing the generated application.
14. The computing device of claim 11,
When the interface module receives the command signal while receiving and outputting the first content from the first channel according to the specific function,
Wherein the controller controls the display module to display at least one or more options.
15. The computing device of claim 14,
If the first option is selected from among the displayed at least one option,
Wherein the controller controls the channel switching to the first channel.
16. The computing device of claim 14,
If the second option is selected from among the displayed at least one option,
Wherein the controller controls access to a content provider (CP) that provides an additional service related to the first content.
17. The computing device of claim 14,
If the third option is selected from among the displayed at least one option,
Wherein the controller displays a portion of the first content stored in the memory from the time of the capture.
18. The computing device of claim 11,
And a network interface for providing a video telephone service with at least one user according to the specific function.
19. The computing device of claim 18,
Wherein, when the video telephone service is being performed through the network interface,
The controller is configured to:
Extract identification information corresponding to the at least one user, and
Transmit a calling signal using the extracted identification information.
KR1020120003365A 2012-01-11 2012-01-11 Computing device for performing at least one of function and controlling the same KR20130082260A (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
KR1020120003365A KR20130082260A (en) 2012-01-11 2012-01-11 Computing device for performing at least one of function and controlling the same
EP12189336.6A EP2615564A1 (en) 2012-01-11 2012-10-19 Computing device for performing at least one function and method for controlling the same
US13/679,360 US9582605B2 (en) 2012-01-11 2012-11-16 Generating user specific applications for performing functions in a user device and method for controlling the same
CN201210592947.7A CN103209349B (en) 2012-01-11 2012-12-31 Computing device and its control method for performing at least one function

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020120003365A KR20130082260A (en) 2012-01-11 2012-01-11 Computing device for performing at least one of function and controlling the same

Publications (1)

Publication Number Publication Date
KR20130082260A true KR20130082260A (en) 2013-07-19

Family

ID=48993609

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120003365A KR20130082260A (en) 2012-01-11 2012-01-11 Computing device for performing at least one of function and controlling the same

Country Status (1)

Country Link
KR (1) KR20130082260A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160039996A (en) * 2014-10-02 2016-04-12 엘지전자 주식회사 Mobile terminal and method for controlling the same

Similar Documents

Publication Publication Date Title
CN107770627B (en) Image display apparatus and method of operating the same
CN107567713B (en) Television and method for controlling television
US9137476B2 (en) User-defined home screen for ultra high definition (UHD) TV
CN107852531B (en) Display device and control method thereof
CN107637089B (en) Display device and control method thereof
KR102035134B1 (en) Image display apparatus and method for operating the same
KR101852818B1 (en) A digital receiver and a method of controlling thereof
US9088814B2 (en) Image display method and apparatus
KR101774316B1 (en) Image display device and method of managing conents using the same
US9582605B2 (en) Generating user specific applications for performing functions in a user device and method for controlling the same
US8397258B2 (en) Image display apparatus and method for operating an image display apparatus
US9332300B2 (en) Apparatus and method for controlling display of information on a television
KR20160060846A (en) A display apparatus and a display method
CN111726673B (en) Channel switching method and display device
CN111669634A (en) Video file preview method and display equipment
KR20120065689A (en) Image processing apparatus, user interface providing method thereof
CN109922364B (en) Display device
KR20170022333A (en) Digital device and method of processing data the same
KR20170017606A (en) Digital device and method of processing data the same
KR102311249B1 (en) Display device and controlling method thereof
CN113259733B (en) Display device
KR102243213B1 (en) Image display device and operation method of the image display device
KR20130082260A (en) Computing device for performing at least one of function and controlling the same
KR20160148875A (en) Display device and controlling method thereof
CN112243147A (en) Video image zooming method, video image zooming service device and display equipment

Legal Events

Date Code Title Description
E902 Notification of reason for refusal
E601 Decision to refuse application