CN107743710B - Display device and control method thereof - Google Patents


Publication number: CN107743710B
Authority: CN (China)
Application number: CN201680030475.2A
Other versions: CN107743710A (Chinese)
Inventors: 李桂林, 罗楠·布鲁莱, 厄万·布鲁莱, 盖尔·雨果, 崔松雅
Assignee (original and current): Samsung Electronics Co Ltd
Legal status: Expired - Fee Related
Application filed by Samsung Electronics Co Ltd
Priority claimed from PCT/KR2016/005604 (WO2016190691A1)
Publication of application CN107743710A; application granted; publication of CN107743710B

Classifications

    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD] (under H ELECTRICITY > H04 ELECTRIC COMMUNICATION TECHNIQUE > H04N PICTORIAL COMMUNICATION, e.g. TELEVISION)
    • H04N 21/42204: User interfaces specially adapted for controlling a client device through a remote control device; remote control devices therefor
    • H04N 21/42653: Internal components of the client for processing graphics
    • H04N 21/431: Generation of visual interfaces for content selection or interaction; content or additional data rendering
    • H04N 21/4312: Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N 21/4402: Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N 21/47: End-user applications
    • H04N 21/4722: End-user interface for requesting additional data associated with the content
    • H04N 21/482: End-user interface for program selection
    • H04N 21/485: End-user interface for client configuration
    • H04N 21/8153: Monomedia components involving graphical data, comprising still images, e.g. texture, background image

Abstract

A display device includes: a display; a user input device configured to receive a command for executing a graphical user interface (GUI); and a processor configured to process a first image including an object to be displayed on the display and, in response to receiving the command to execute the GUI from the user input device while the first image is displayed, to process the GUI corresponding to an outline of the object to be displayed on the display.

Description

Display device and control method thereof
Technical Field
Apparatuses and methods consistent with exemplary embodiments relate to a display apparatus that displays various types of content, including images of broadcast programs and images of various preset additional services, and control methods thereof, and more particularly, to a display apparatus that reduces the visual fatigue of a user when video content is switched to an additional service, and control methods thereof.
Background
An image processing apparatus processes image signals and/or video data received from an external device according to various video processing procedures. The image processing apparatus may display an image based on the processed video data on its own display panel, or may output the processed image signal to another display apparatus that has a panel, so that the image is displayed on that display apparatus. That is, an image processing apparatus may or may not include a panel capable of displaying an image, as long as it can process video data. For example, the former includes a display device such as a television (TV), and the latter includes a set-top box.
As technology develops and demand increases, display devices implemented as TVs, tablet computers, mobile phones, etc. have been proposed that display video content and also provide various additional services. For example, a display device may provide various image-based services, such as game play; Web page display; time or weather notification based on the installation location; day and date notification; still image display, e.g., of photographs; text display; activation and use of hardware components installed in the display device; configuration settings of the display device; and the like. Such additional services may be provided as network-based services when the display apparatus is connected to an external network through wired or wireless communication, or as client-based services regardless of any connection to an external network.
When video content is displayed on the display device, the user may generate a particular trigger event instructing the display device to switch from the video content to an additional service. When the display device initiates the additional service in response to the trigger event, an abrupt change occurs from the image of the video content to the image of the additional service. Such a sudden change may cause the user visual fatigue, and the fatigue may become severe when the image of the video content includes large-scale animation or when the image is displayed full screen.
Disclosure of Invention
Technical scheme
Exemplary embodiments may address at least the above problems and/or disadvantages and other disadvantages not described above. Further, the exemplary embodiments need not overcome the above disadvantages and may not overcome any of the above problems.
According to an aspect of an example embodiment, there is provided a display apparatus including: a display; a user input device configured to receive a command for executing a graphical user interface (GUI); and at least one processor configured to: process a first image including at least one object for display on the display and, if a command to execute the GUI is received from the user input device while the first image is displayed, process the GUI corresponding to an outline of the at least one object for display on the display. The at least one processor may generate a second image corresponding to the outline of the at least one object by processing image information acquired from the first image, and may process the second image to be displayed as a background image of the GUI. Accordingly, even when the first image of the video content is not displayed because the GUI is displayed, visual feedback of the first image is continuously provided to the user, thereby preventing, as much as possible, discontinuity in the user's viewing experience of the first image. Further, an abrupt change is prevented when the first image is switched to the GUI, thereby reducing the user's visual fatigue.
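The patent does not specify how the outline of an object is extracted from the first image. As a minimal sketch of the idea, the fragment below marks outline pixels wherever the luminance jumps sharply between horizontal neighbors; the luma weights, the threshold, and the function names are all illustrative assumptions, not the patent's method.

```python
def luma(pixel):
    # Rec. 601 luma weights; the patent does not specify a conversion,
    # so this weighting is an assumption.
    r, g, b = pixel
    return 0.299 * r + 0.587 * g + 0.114 * b

def outline_row(row, threshold=40):
    """Mark pixels where luminance jumps sharply from the previous pixel."""
    marks = [False]  # the first pixel has no left neighbor
    for prev, cur in zip(row, row[1:]):
        marks.append(abs(luma(cur) - luma(prev)) > threshold)
    return marks

# A dark-to-bright step produces a single outline mark at the transition.
row = [(0, 0, 0), (0, 0, 0), (255, 255, 255), (255, 255, 255)]
```

Running the full detector over every row (and, symmetrically, every column) of a frame would yield a contour mask that a GUI background could then be built from.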
The at least one processor may acquire, in real time, image information that changes as the first image is reproduced, and may process the second image so that it changes according to the image information acquired in real time. Therefore, because the second image changes even while the first image is not displayed, visual feedback that tracks the reproduction of the first image is not interrupted.
The at least one processor may initially generate the second image in gray scale, and may then gradually change the red, green, and blue (RGB) values of the second image so that the second image shifts toward a preset color over time. Therefore, when the first image is switched to the second image based on a specific color, the user is protected from visual fatigue due to an abrupt change in color.
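A per-pixel sketch of this gray-to-preset-color transition, assuming Rec. 601 luma weights for the grayscale step and plain linear interpolation for the gradual color change (the patent names neither):

```python
def to_grayscale(rgb):
    """Collapse an (R, G, B) pixel to gray using Rec. 601 luma weights (an assumption)."""
    r, g, b = rgb
    y = int(0.299 * r + 0.587 * g + 0.114 * b)
    return (y, y, y)

def blend_toward(rgb, target, t):
    """Linearly interpolate each channel toward the preset color; t runs 0.0 -> 1.0."""
    return tuple(round(c + (tc - c) * t) for c, tc in zip(rgb, target))

# Example: a reddish pixel, first neutralized to gray, then faded toward a
# preset blue over 5 steps.
start = to_grayscale((200, 60, 60))      # -> (101, 101, 101)
steps = [blend_toward(start, (30, 60, 200), i / 4) for i in range(5)]
```

Applying `blend_toward` to every pixel once per frame, with `t` advancing a little each frame, produces the gradual shift the passage describes.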
The image information may include the RGB values of each pixel included in a specific image frame of the first image. The second image may be generated by: dividing the image frame into a plurality of portions; applying, to each divided portion, a luminance and a transparency derived from the RGB values of the pixels included in that portion; adjusting the lateral width of each divided portion according to its luminance; and applying blur filtering to each divided portion. The at least one processor may adjust the lateral width of a divided portion with higher luminance to be narrower and the lateral width of a divided portion with lower luminance to be wider. The at least one processor may apply the blur filtering to the left and right edges among the four edges (upper, lower, left, and right) of each divided portion. Accordingly, when the first image is switched to the second image, a soft wave effect may be provided to reduce the user's visual fatigue.
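The three per-portion steps above can be sketched as small helpers: a mean luminance per strip, an inverse luminance-to-width mapping, and a box blur applied only at the left and right edges. All parameter values (base width, blur radius) are illustrative assumptions; the patent fixes none of them.

```python
def strip_luminance(pixels):
    """Mean luma of a strip's (R, G, B) pixels (Rec. 601 weights, an assumption)."""
    lumas = [0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels]
    return sum(lumas) / len(lumas)

def strip_width(luma, base_width=40, max_luma=255.0):
    """Brighter strips become narrower, darker strips wider (inverse mapping)."""
    return round(base_width * (2.0 - luma / max_luma))

def blur_lr_edges(row, radius=1):
    """Box-blur only the leftmost/rightmost samples of a row of gray values,
    leaving the interior untouched, per the left-and-right-edges-only rule."""
    out = list(row)
    edge_indices = list(range(radius + 1)) + list(range(len(row) - radius - 1, len(row)))
    for i in edge_indices:
        lo, hi = max(0, i - radius), min(len(row), i + radius + 1)
        out[i] = sum(row[lo:hi]) / (hi - lo)
    return out
```

With these, a frame split into vertical strips maps each strip to a width and opacity, and the softened strip edges give the wave-like background the passage describes.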
One or more GUIs may be provided for each of a plurality of preset services, and the one or more GUIs provided for each service may be preset such that switching between GUIs is performed in response to commands, issued through the user input device, for movement in the four directions: up, down, left, and right. The at least one processor may switch between GUIs of different services in response to a command issued through the user input device to move in the upward or downward direction, and may switch between GUIs of one service in response to a command issued through the user input device to move in the leftward or rightward direction. Accordingly, the user can use the additional services more easily, because switching services and displaying images are achieved solely through intuitive commands for movement in the four directions.
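This four-direction model reduces to a two-index state machine: one index selects the service, the other selects a GUI within it. A minimal sketch, in which the service names and the reset-to-first-GUI behavior on a service change are illustrative assumptions:

```python
class GuiNavigator:
    def __init__(self, services):
        # services: ordered mapping of service name -> list of GUI names
        self.services = list(services.items())
        self.svc = 0   # index of the current service
        self.gui = 0   # index of the current GUI within that service

    def move(self, direction):
        if direction in ("up", "down"):
            # Up/down switches between services.
            step = -1 if direction == "up" else 1
            self.svc = (self.svc + step) % len(self.services)
            self.gui = 0  # start at the service's first GUI (an assumption)
        elif direction in ("left", "right"):
            # Left/right cycles the GUIs of the current service.
            guis = self.services[self.svc][1]
            step = -1 if direction == "left" else 1
            self.gui = (self.gui + step) % len(guis)
        return self.current()

    def current(self):
        name, guis = self.services[self.svc]
        return name, guis[self.gui]

nav = GuiNavigator({"clock": ["time", "date"], "photos": ["album"]})
```

Every remote-control arrow press maps to exactly one `move` call, which is what makes the navigation intuitive.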
According to an aspect of another example embodiment, there is provided a method of controlling a display apparatus, the method including: displaying a first image including at least one object; receiving a command for executing a GUI from a user while the first image is displayed; and displaying the GUI corresponding to an outline of the at least one object. Displaying the GUI may include: generating a second image corresponding to the outline of the at least one object by processing preset image information acquired from the first image; and processing the second image to be displayed as a background image of the GUI. Accordingly, even when the first image of the video content is not displayed because the GUI is displayed, visual feedback of the first image is continuously provided to the user, thereby preventing, as much as possible, discontinuity in the user's viewing experience of the first image. Further, an abrupt change is prevented when the first image is switched to the GUI, thereby reducing the user's visual fatigue.
Displaying the GUI may include acquiring, in real time, image information that changes as the first image is reproduced, and processing the second image so that it changes according to the image information acquired in real time. Therefore, because the second image changes even while the first image is not displayed, visual feedback that tracks the reproduction of the first image is not interrupted.
Displaying the GUI may include initially generating the second image in gray scale, and gradually changing the red, green, and blue (RGB) values of the second image so that the second image shifts toward a preset color over time. Therefore, when the first image is switched to the second image based on a specific color, the user is protected from visual fatigue due to an abrupt change in color.
The image information may include the RGB values of each pixel included in a specific image frame of the first image. Generating the second image may include: dividing the image frame into a plurality of portions; applying, to each divided portion, a luminance and a transparency derived from the RGB values of the pixels included in that portion; adjusting the lateral width of each divided portion according to its luminance; and applying blur filtering to each divided portion. Adjusting the lateral width of each divided portion according to its luminance may include: adjusting the lateral width of a divided portion with a higher luminance value to be narrower; and adjusting the lateral width of a divided portion with a lower luminance value to be wider. Applying the blur filtering to each divided portion may include applying blur filtering to the left and right edges among the four edges (upper, lower, left, and right) of the divided portion. Accordingly, when the first image is switched to the second image, a soft wave effect may be provided to reduce the user's visual fatigue.
One or more GUIs may be provided for each of a plurality of preset services, and the one or more GUIs provided for each service may be preset such that switching between GUIs is performed in response to commands, issued through a user input device, for movement in the four directions: up, down, left, and right. The method may further include: switching between GUIs of different services in response to a command issued through the user input device to move in the upward or downward direction; and switching between GUIs of one service in response to a command issued through the user input device to move in the leftward or rightward direction. Accordingly, the user can use the additional services more easily, because switching services and displaying images are achieved solely through intuitive commands for movement in the four directions.
Drawings
Fig. 1 illustrates an example of a display device according to an exemplary embodiment;
fig. 2 illustrates an example of switching from a content image to a service image when a service switching event occurs in the display apparatus of fig. 1;
FIG. 3 is a block diagram of a display device according to an exemplary embodiment;
FIG. 4 is a block diagram of a signal processor in the display device of FIG. 3;
fig. 5, 6, 7, 8, 9, 10, 11, 12, and 13 illustrate examples of generating a background image of an additional service in the display apparatus of fig. 3;
fig. 14 is a flowchart for displaying a service image in the display apparatus of fig. 3;
fig. 15 is a flowchart of generating a background image of a service image in the display apparatus of fig. 3;
FIG. 16 shows an example of user input according to an example embodiment;
FIG. 17 shows an example of user input according to an example embodiment;
FIG. 18 illustrates a content depiction of an additional service image according to an exemplary embodiment;
fig. 19 illustrates switching of a title image corresponding to a specific category of an additional service to a content image in a display device according to an exemplary embodiment;
fig. 20 illustrates switching a service image between two categories of additional services in a display device according to an exemplary embodiment;
fig. 21 is a flowchart of displaying a service image of an additional service in a display device according to an exemplary embodiment;
fig. 22 illustrates an example of overlaying a content image with a UI menu in a display device according to an exemplary embodiment;
fig. 23 is a flowchart of displaying a UI menu in the display device of fig. 22;
fig. 24 illustrates an example of displaying a service image on a content image in a picture-in-picture (PIP) form in a display apparatus according to an exemplary embodiment;
fig. 25 is a flowchart for displaying a service image in the display device of fig. 24;
fig. 26 illustrates an example of an image displayed in response to a user input in a display device according to an exemplary embodiment;
fig. 27 illustrates an example of a User Interface (UI) displayed in response to execution of a TV icon in a display apparatus according to an exemplary embodiment;
fig. 28 illustrates an example of a UI displayed in response to execution of an application icon in a display device according to an exemplary embodiment;
fig. 29 illustrates an example of a UI displayed in response to execution of a speaker icon in a display device according to an exemplary embodiment;
fig. 30 illustrates an example of a UI showing output levels of speakers in a display apparatus according to an exemplary embodiment;
fig. 31 illustrates an example of a UI displayed in response to execution of a photo icon in a display device according to an exemplary embodiment;
fig. 32 shows an example of displaying an image in a display device according to an exemplary embodiment;
fig. 33 illustrates an example of a UI displayed in response to execution of a clock icon in a display device according to an exemplary embodiment;
fig. 34 illustrates an example of a UI displayed in response to execution of a background image setting icon in a display device according to an exemplary embodiment; and
fig. 35 illustrates an example of a UI for function adjustment provided in a display apparatus according to an exemplary embodiment.
Detailed Description
Certain exemplary embodiments will be described in more detail below with reference to the accompanying drawings.
In the following description, the same reference numerals are used for the same elements even in different drawings. The matters defined in the description such as a detailed construction and elements are provided to assist in a comprehensive understanding of exemplary embodiments. It should be apparent, however, that the exemplary embodiments can be practiced without these specifically defined matters. In other instances, well-known functions or constructions are not described in detail since they would obscure the description in unnecessary detail.
In the description of the exemplary embodiments, ordinal numbers used in terms such as first element, second element, etc., are used to describe various elements, and these terms are used to distinguish one element from another. Accordingly, the meaning of the elements is not limited by the terms, and the terms are also used only to illustrate the corresponding embodiments and are not limiting.
Furthermore, the exemplary embodiments describe only the elements directly related thereto. However, it should be understood that omitting the description of an element does not mean that the element is unnecessary for implementing an apparatus or system according to the exemplary embodiments. In the following description, terms such as "including" or "having" refer to the presence of features, numbers, steps, operations, elements, and combinations thereof, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, and combinations thereof.
Fig. 1 illustrates an example of a display device 100 according to an exemplary embodiment.
As shown in fig. 1, the display apparatus 100 according to an exemplary embodiment is implemented as a TV, but is not limited thereto. As other examples, the display device may be implemented as a tablet computer, a mobile phone, a multimedia player, an electronic photo frame, a digital billboard, or a similar electronic device capable of displaying images. In addition, the display device 100 may be a fixed device installed in one place or a mobile device freely carried and used by a user. Further, the exemplary embodiments can also be applied to an image processing apparatus that uses a separate monitor for displaying images, as well as to an apparatus, like the display apparatus 100, capable of displaying images by itself.
The display apparatus 100 processes data of video contents received from an external device and displays an image. The video content may be transmitted as a radio frequency broadcast signal from a transmitter 10 of a broadcasting station, transmitted as data packets from a server 20 through a network 8, or reproduced and transmitted from a multimedia player 30 locally connected to a display device. In addition, the video content may be stored as digital data in the display device 100.
Display apparatus 100 includes a user input device 130 that allows a user to select video content from one of video sources 10, 20, and 30. For example, a remote controller physically separated from the main body of the display apparatus 100 may be used as the user input device 130. If the user selects one of the video sources 10, 20, and 30 through the user input device 130, the display apparatus 100 processes video content received from the selected one of the video sources 10, 20, and 30 and displays an image based on the processed video content. For example, if a user selects a specific broadcast channel through the user input device 130, the display apparatus 100 is tuned to the selected broadcast channel and receives a broadcast signal transmitted from the transmitter 10 of a broadcasting station corresponding to the tuned broadcast channel, thereby displaying a broadcast image.
The display apparatus 100 has a function of displaying images based on video contents received from the video sources 10, 20, and 30 and providing various additional services to a user. Here, the additional service refers to various services provided by the display apparatus 100 in addition to a service of processing video content to be displayed as an image, and is not limited to a specific service. Further, additional services may be provided based on the network or the client.
For example, the additional service includes various types of services, for example, a service that sets the configuration of the display apparatus 100 or the user environment; a service of notifying date, time, weather, and the like based on an installation location or a use location of the display device 100; a service that displays photos, pictures, etc. that are accessible over a network or stored locally; a service that displays a particular Web page through a Web browser, and so on.
When displaying images of video content, a user may issue a command through user input device 130 to view images of a particular additional service. In response to the user command, the display apparatus 100 operates to display an image of the additional service instead of the image of the video content.
Fig. 2 illustrates an example of switching from a content image to a service image when a service switching event occurs in the display apparatus of fig. 1.
As shown in fig. 2, the display apparatus 100 processes data of video contents received from an external device and displays a content image 210. While the content image 210 is displayed, the display apparatus 100 may detect an event for switching to an additional service. As an example of this event, the user may request an additional service displaying time and date information of an area where the display apparatus 100 is currently located.
In response to the event, the display apparatus 100 switches from the content image 210 to the service image 220. The service image 220 shows time and date information corresponding to the current location. In an exemplary embodiment, the content image 210 and the service image 220 are each displayed full screen, over the entire displayable area of the display device 100. Accordingly, the user can obtain the desired service through the service image 220 displayed on the display device 100.
When the content image 210 is switched to the service image 220, the display apparatus 100 directly switches between the content image 210 and the service image 220 without a separate visual effect.
The user may feel visual fatigue due to the abrupt switching from the content image 210 to the service image 220. In particular, when the display apparatus 100 has a large screen size, when the content image 210 and the service image 220 are displayed as a full screen, and when the animation in the content image 210 is a large-scale animation, the visual fatigue becomes severe.
Further, since the content image 210 is not displayed while the service image 220 is displayed, the viewing experience of the user for the content image 210 is interrupted. That is, the user may be reluctant to use the additional service because the visual representation of the content image 210 is interrupted.
Fig. 3 is a block diagram of the display apparatus 100 according to an exemplary embodiment.
As shown in fig. 3, the display device 100 includes: a signal receiver 110 for receiving a video signal from an external device; a display 120 for displaying an image based on the video signal received in the signal receiver 110; a user input device 130 for receiving user input; a storage device 140 for storing data and/or information; and a signal processor 150 for processing a video signal to be displayed as an image on the display 120 and controlling the operation of the display apparatus 100.
The signal receiver 110 receives video signals from the various video sources 10, 20, and 30 (see fig. 1). The signal receiver 110 may also transmit signals to an external device, thereby performing interactive communication. The signal receiver 110 may include communication ports or communication modules corresponding to respective communication standards, and the protocols and communication targets it supports are not limited to any one kind. For example, the signal receiver 110 may include a radio frequency integrated circuit (RFIC, not shown) for receiving RF signals, a wireless fidelity (Wi-Fi) communication module (not shown) for wireless network communication, an Ethernet module (not shown) for wired network communication, a Universal Serial Bus (USB) port for locally connecting USB memory, and the like.
The display 120 displays an image based on the video signal processed by the signal processor 150. For example, the display 120 displays a broadcast image based on the tuned broadcast signal output from the signal processor 150. The display 120 may include at least one of a liquid crystal, a plasma, a light emitting diode, an organic light emitting diode, a surface conduction electron emitter, a carbon nanotube, a nanocrystal, and the like, but is not limited thereto.
In addition, the display 120 may include additional elements consistent with the type of panel as well as a display panel. For example, if the display 120 includes liquid crystal, the display 120 includes a Liquid Crystal Display (LCD) panel (not shown), a backlight unit (not shown) for emitting light to the LCD panel, and a panel driver (not shown) for driving the LCD panel.
The user input device 130 transmits various preset control commands or information to the signal processor 150 according to the control or input of the user. That is, the user input device 130 transmits various events, generated by the user's control according to the user's intention, to the signal processor 150. The user input device 130 may be variously implemented according to the information input method. For example, the user input device 130 may include: keys and/or buttons provided on the outside of the display apparatus 100, a remote controller separate from the main body of the display apparatus 100, a touch screen integrally formed with the display 120, and an input device provided to communicate with the display apparatus 100.
The storage device 140 stores various pieces of data processed and controlled by the signal processor 150. The storage device 140 is accessed by the signal processor 150, which performs reading, writing, editing, deleting, updating, etc. on the data. The storage device 140 includes a nonvolatile memory such as a flash memory, a hard disk drive, or the like to retain data regardless of whether system power is supplied to the display apparatus 100.
The signal processor 150 performs processing on data and/or signals received in the signal receiver 110. When a video signal is received in the signal receiver 110, the signal processor 150 applies video processing to the video signal and outputs the processed video signal to the display 120, thereby displaying an image on the display 120.
The kinds of video processing performed by the signal processor 150 are not limited, and may include, for example: demultiplexing, for separating the stream into sub-streams such as a video signal, an audio signal, and additional data; decoding, corresponding to the video format of the video stream; de-interlacing, for converting a video stream from interlaced to progressive; scaling, for adjusting the video stream to have a preset resolution; noise reduction, for improving image quality; detail enhancement; frame refresh rate conversion; etc.
Since the signal processor 150 may perform various processes according to the kind and characteristics of signals or data, the processes that may be performed by the signal processor 150 are not limited to video processing. Furthermore, the data that can be processed by the signal processor 150 is not limited to the data received in the signal receiver 110. For example, if a user voice is input to the display apparatus 100, the signal processor 150 may process the voice according to a preset audio process. The signal processor 150 includes a System On Chip (SOC) in which many functions are integrated, or an image processing board (not shown) in which respective chipsets for independently performing processing are mounted on a printed circuit board.
The display apparatus 100 may have different hardware components according to the type of the display apparatus 100 and the functions it supports. For example, if the display apparatus 100 is a TV, it may include a hardware component to be tuned to a specific frequency to receive a broadcast signal, but if the display apparatus 100 is a tablet computer, such a hardware component may not be included.
The signal processor 150 of the display apparatus 100 implemented as a TV is described in detail below.
Fig. 4 is a block diagram showing details of the signal processor 150 included in the display device 100, which may include more or fewer elements than those described below.
As shown in fig. 4, the signal receiver 110 includes a tuner 111 to be tuned to a specific frequency to receive a broadcast signal. Further, the signal processor 150 includes: a Demultiplexer (DEMUX) 151 for dividing the broadcast signal received from the tuner 111 into a plurality of sub-signals; a decoder 152 for decoding the plurality of sub-signals output from the DEMUX 151; a scaler 153 for scaling the video signal among the decoded sub-signals and outputting the scaled video signal to the display 120; a Central Processing Unit (CPU) 154 for performing calculation and control for the operation of the signal processor 150; and a buffer 155 for temporarily storing signals or data while the signal processor 150 processes them.
When a broadcast signal is received in an RF antenna (not shown), the tuner 111 is tuned to a frequency of a designated channel to receive the broadcast signal and convert the broadcast signal into a transport stream. The tuner 111 converts a high frequency of a carrier wave received via an antenna (not shown) into an intermediate frequency band and converts the carrier wave into a digital signal, thereby generating a transport stream. To this end, the tuner 111 has an analog/digital (a/D) converter (not shown). As another example, the a/D converter may be included in a demodulator (not shown).
The DEMUX 151 performs an inverse operation of a multiplexer (not shown). That is, the DEMUX 151 connects one input terminal with a plurality of output terminals, and distributes a stream input to the input terminal to each output terminal according to a selection signal. For example, if there are four output terminals with respect to one input terminal, the DEMUX 151 may select each of the four output terminals by a combination of selection signals having two levels of 0 and 1.
In the display apparatus 100, the DEMUX 151 divides a transport stream received from the tuner 111 into sub-signals of a video stream, an audio stream, and an additional data stream, and outputs the sub-signals to the respective output terminals.
The DEMUX 151 may divide a transport stream into sub-signals using various methods. For example, the DEMUX 151 divides a transport stream into sub-signals according to Packet Identifiers (PIDs) provided to packets in the transport stream. Sub-signals in a transport stream are independently compressed and packetized according to channels, and packets corresponding to one channel are given the same PID so as to distinguish them from packets corresponding to another channel. The DEMUX 151 classifies packets of a transport stream according to PIDs and extracts sub-signals having the same PID.
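The PID-based division described above can be sketched as follows. This is an illustrative sketch, not the actual implementation of the DEMUX 151; it assumes standard 188-byte MPEG-2 transport stream packets, in which the 13-bit PID occupies the low 5 bits of the second byte and all 8 bits of the third byte.

```python
def split_by_pid(ts_data: bytes, packet_size: int = 188) -> dict:
    """Group MPEG-2 transport stream packets by their 13-bit PID."""
    streams = {}
    for offset in range(0, len(ts_data) - packet_size + 1, packet_size):
        packet = ts_data[offset:offset + packet_size]
        if packet[0] != 0x47:  # every valid TS packet starts with sync byte 0x47
            continue
        # PID = low 5 bits of byte 1, followed by all 8 bits of byte 2
        pid = ((packet[1] & 0x1F) << 8) | packet[2]
        streams.setdefault(pid, []).append(packet)
    return streams

# Two fabricated packets for illustration: PID 0x100 (video) and PID 0x101 (audio)
pkt_video = bytes([0x47, 0x01, 0x00]) + bytes(185)
pkt_audio = bytes([0x47, 0x01, 0x01]) + bytes(185)
streams = split_by_pid(pkt_video + pkt_audio + pkt_video)
print(sorted(streams))      # [256, 257]
print(len(streams[0x100]))  # 2
```

In a real receiver this classification is done in hardware; the sketch only shows the header arithmetic that makes packets of the same channel recoverable.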
The decoder 152 decodes each sub-signal output from the DEMUX 151. Fig. 4 shows a single decoder 152, but is not limited thereto. As another example, a plurality of decoders may be provided to decode the sub-signals separately. That is, the decoder 152 may include a video decoder for decoding a video signal, an audio decoder for decoding an audio signal, and an additional data decoder for decoding additional data.
Since the sub-signal transmitted to the decoder 152 is encoded by a specific format, the decoder 152 performs an inverse operation with respect to the encoding process, thereby restoring the sub-signal to the original signal before encoding. If the sub-signal output from the DEMUX 151 is not encoded (e.g., not compressed), the decoder 152 transmits the sub-signal to the scaler 153 without processing the sub-signal, or the sub-signal bypasses the decoder 152 and is thus transmitted to the scaler 153.
The scaler 153 scales the video signal decoded by the decoder 152 to have a resolution suitable for the display 120 or a different designated resolution. Thereby, the scaled video signal is displayed as an image on the display 120.
The CPU 154 is an element for performing central calculation to operate elements in the signal processor 150, and plays a central role in analyzing and calculating data. The CPU 154 internally includes: a processor register (not shown) in which a command to be processed is stored; an Arithmetic Logic Unit (ALU) (not shown) responsible for comparison, determination, and calculation; a control unit (not shown) for internally controlling the CPU 154 to analyze and execute commands; an internal bus (not shown), a cache (not shown), etc.
The CPU 154 performs calculations required to operate elements of the signal processor 150, such as the DEMUX 151, the decoder 152, and the scaler 153. As another example, some elements of the signal processor 150 may be designed to operate without data computation by the CPU 154 or by a separate microcontroller (not shown).
When the signal processor 150 processes the broadcast signal, the buffer 155 temporarily stores data to be processed by the respective elements of the signal processor 150. In contrast to the storage device 140 (see fig. 3), which needs to retain data even when the system is powered off, the buffer 155 includes a volatile memory because data is only temporarily loaded into it during signal processing. In fig. 4, the input/output of the buffer 155 is connected to the CPU 154, but is not limited thereto. As another example, elements of the signal processor 150 may be directly connected to the buffer 155 without passing through the CPU 154.
The display apparatus 100 processes the received video content and displays an image based on the processed video content. Further, when a command for executing the additional service is received from the user input device 130 while the image based on the video content is displayed on the display 120, the display apparatus 100 displays the image based on the additional service instead of the image based on the video content.
The display apparatus 100 generating a background image (e.g., a second image) for a content image (e.g., a first image) based on an additional service according to an exemplary embodiment is described in detail below.
Fig. 5 to 13 illustrate examples of generating a background image of an additional service in the display apparatus 100 according to an exemplary embodiment. In the following description, an image based on a video signal corresponding to video content will be referred to as a content image or a first image, and an image corresponding to an additional service will be referred to as a service image.
When displaying a service image including a background image generated according to an exemplary embodiment, the display apparatus 100 does not display a content image but continues to process video content.
As shown in fig. 5, when an event indicating that an additional service is to be performed occurs while a content image is displayed, the display apparatus 100 captures a specific image frame 310 of the content image. These processes are performed by the CPU 154 (see fig. 4), but separate elements for performing them may be provided in the display device 100.
Further, data generated in the following processes and the captured image frame 310 are stored in the buffer 155 (see fig. 4).
As shown in fig. 6, the display apparatus 100 divides an image frame 310 captured from a content image into a grid 320 in a matrix form of M × N. Here, M and N are preset numerical values, and may vary according to the resolution of the image frame 310.
The image frame 310 is divided into a plurality of quadrilaterals 330 by a grid 320. Further, each quadrangle 330 includes a plurality of pixels.
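The division into an M × N grid can be sketched as follows. The frame representation (a list of rows of (R, G, B) tuples) and the assumption that the resolution divides evenly are illustrative choices, not details from the embodiment.

```python
def divide_into_grid(frame, m, n):
    """Split a frame (list of rows of (R, G, B) pixels) into an m x n grid;
    each cell of the grid is the list of pixels of one quadrangle."""
    height, width = len(frame), len(frame[0])
    cell_h, cell_w = height // m, width // n  # assumes an even division
    grid = []
    for i in range(m):
        row = []
        for j in range(n):
            pixels = [frame[y][x]
                      for y in range(i * cell_h, (i + 1) * cell_h)
                      for x in range(j * cell_w, (j + 1) * cell_w)]
            row.append(pixels)
        grid.append(row)
    return grid

# A 4x4 frame of one solid color, divided into a 2x2 grid
frame = [[(10, 20, 30)] * 4 for _ in range(4)]
grid = divide_into_grid(frame, 2, 2)
print(len(grid), len(grid[0]))  # 2 2
print(len(grid[0][0]))          # 4 pixels per quadrangle
```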
As shown in fig. 7, the display apparatus 100 extracts and stores the RGB values of each pixel included in each quadrangle 350. Further, the display apparatus 100 converts the entire image frame 340 into a frame expressed in gray.
As shown in fig. 8, the display apparatus 100 calculates brightness and transparency based on RGB values extracted and stored according to the respective quadrangles 350 (see fig. 7). The display apparatus 100 reflects the calculated brightness and transparency to the corresponding quadrangle 360. Fig. 8 illustrates a partial region of the display device 100 for easy understanding.
Here, the luminance Br and the transparency α of the quadrangle 360 are calculated based on RGB values of pixels of each quadrangle 360, as described below.
[ equation 1]
Br = sqrt(R² * 0.241 + G² * 0.691 + B² * 0.068); (0 < Br < 255)
α = 1 - Br/255; (0 < α < 1)
where the variables R, G, and B are the averages of the R, G, and B values of the pixels in each quadrangle 360, and sqrt is a function that returns the square root of the provided value.
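Equation 1 can be sketched as follows. The per-quadrangle averaging of R, G, and B follows the description above; the pixel-list representation is an illustrative assumption.

```python
import math

def brightness_and_alpha(pixels):
    """Compute luminance Br and transparency alpha of one quadrangle
    from the averaged R, G, B of its pixels (Equation 1)."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    br = math.sqrt(r**2 * 0.241 + g**2 * 0.691 + b**2 * 0.068)
    alpha = 1 - br / 255
    return br, alpha

# A pure white quadrangle: the coefficients sum to 1, so Br reaches 255
br, alpha = brightness_and_alpha([(255, 255, 255), (255, 255, 255)])
print(round(br))        # 255 (maximum brightness)
print(round(alpha, 3))  # 0.0 (maximum brightness -> fully opaque)
```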
As shown in fig. 9, the display device 100 adjusts each width of the quadrangles 361 and 362 based on the luminance calculated from the quadrangles 361 and 362 of the portion 358 of the screen. The display device 100 does not adjust the heights of the quadrangles 361 and 362.
The display apparatus 100 determines each width w of the quadrangles 361 and 362 by the following equation.
[ equation 2]
w=1-Br/255
Because Br ranges from 0 to 255, w has a maximum value of 1 and ranges over 0 < w < 1. Referring to equation 2, the widths of the quadrangles 361 and 362 become smaller as the luminance increases and larger as the luminance decreases.
For comparison, the first quadrangle 361 has a relatively low brightness, and the second quadrangle 362 has a relatively high brightness. Before the width adjustment, the width w1 of the first quadrangle 361 is equal to the width w2 of the second quadrangle 362.
Based on the above equation, the width of the first quadrangle 361 changes from w1 to w1', and the width of the second quadrangle 362 changes from w2 to w2', where the width w1' is greater than the width w2'. That is, the larger the luminance, the smaller the width becomes.
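Equation 2 reduces to a one-line computation; the two sample brightness values below are arbitrary illustrations of a dark and a bright quadrangle.

```python
def quad_width(brightness):
    """Width factor of a quadrangle per Equation 2: brighter -> narrower."""
    return 1 - brightness / 255

w_dark = quad_width(51)     # relatively low brightness
w_bright = quad_width(204)  # relatively high brightness
print(round(w_dark, 2))     # 0.8 (dark quadrangle stays wide)
print(round(w_bright, 2))   # 0.2 (bright quadrangle becomes narrow)
```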
As shown in fig. 10, if the above-described adjustment is applied to all quadrangles in the image frame 370, the quadrangles of the image frame 370 have the same height but have different widths according to brightness. Specifically, a quadrangle having relatively low luminance has a relatively wide width, but a quadrangle having relatively high luminance has a relatively narrow width.
As shown in fig. 11, the display apparatus 100 applies the blurring process only to the vertical lines of each quadrangle in the image frame 380. The blurring applied only to the vertical lines of the quadrangle means that the blurring is not applied to the top and bottom edges of the quadrangle, but is applied to the left and right edges of the quadrangle.
Blurring processing (i.e., blurring) refers to a technique for blurring or softening an image by removing detailed portions of the image. Here, the detailed portion of the image corresponds to a portion where the image abruptly changes in units of pixels (i.e., an edge of an object in the image). If the image is represented in the frequency domain, the edge portion of the object corresponds to the high frequency component. In blurring, such high frequency components are filtered out, and thus edge portions of an object in an image are blurred. For example, the blurring may include low pass filtering.
In an exemplary embodiment, various techniques may be used to perform the obfuscation. For example, gaussian blur may be applied to image frame 380. Gaussian blur is an image blur filter that uses a gaussian function. The one-dimensional (1D) gaussian function is as follows.
[ equation 3]
G(x) = [1/√(2πσ²)] * e^(-x²/(2σ²))
The two-dimensional (2D) gaussian function is as follows.
[ equation 4]
G(x, y) = [1/(2πσ²)] * e^(-(x²+y²)/(2σ²))
Where x is the distance from the origin in the direction of the horizontal axis, y is the distance from the origin in the direction of the vertical axis, and σ is the standard deviation of the Gaussian distribution.
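A horizontal-only Gaussian blur of the kind described above (softening the left and right edges of the quadrangles while leaving horizontal edges sharp) can be sketched as follows. The kernel radius, σ, and edge clamping are illustrative choices, not values taken from the embodiment.

```python
import math

def gaussian_kernel(radius, sigma):
    """1D Gaussian kernel per Equation 3, normalized to sum to 1."""
    raw = [math.exp(-x * x / (2 * sigma * sigma))
           for x in range(-radius, radius + 1)]
    total = sum(raw)
    return [v / total for v in raw]

def blur_rows(image, radius=1, sigma=1.0):
    """Apply the kernel along each row only, i.e. a horizontal blur."""
    kernel = gaussian_kernel(radius, sigma)
    out = []
    for row in image:
        blurred = []
        for i in range(len(row)):
            acc = 0.0
            for k, weight in enumerate(kernel):
                j = min(max(i + k - radius, 0), len(row) - 1)  # clamp at borders
                acc += weight * row[j]
            blurred.append(acc)
        out.append(blurred)
    return out

# A sharp vertical edge: left half 0, right half 255
image = [[0, 0, 255, 255]] * 2
blurred = blur_rows(image, radius=1, sigma=1.0)
print([round(v) for v in blurred[0]])  # [0, 70, 185, 255] -- the edge is softened
```

The high-frequency jump from 0 to 255 is spread over neighboring pixels, which is the low-pass filtering effect described above.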
The display apparatus 100 repeats the above-described process while displaying the service image. These processes are performed in real time on the content image reproduced in the background, a preset number of times per second. That is, the display apparatus 100 captures an image frame from the content image, which is reproduced but not displayed, in real time and applies the above-described process to the captured image frame, thereby displaying the background image of the service image.
The number of times the display apparatus 100 performs processing per second can be variously determined in the design stage. For example, the processing may be performed at a frequency of 15 frames per second (fps) (i.e., 15 times per second).
As shown in fig. 12, the display device 100 sequentially changes the RGB values of each quadrangle in each repetition of the process, so that the background image 390 may have a preset color.
This changing method will be described by an example of changing one black quadrangle into an orange quadrangle. The RGB values of the black quadrangle are (0, 0, 0), and the RGB values of the orange quadrangle are (152, 83, 44). Therefore, if there is a quadrangle whose RGB values are (0, 0, 0) in the first process, the display apparatus 100 increases each RGB value by 1 in each process until the RGB values of the corresponding quadrangle reach (152, 83, 44).
For example, each time the process is repeated, the RGB values of the corresponding quadrangle change as follows: (0, 0, 0) -> (1, 1, 1) -> (2, 2, 2) -> ... -> (44, 44, 44) -> (45, 45, 44) -> ... -> (83, 83, 44) -> (84, 83, 44) -> ... -> (152, 83, 44). If a specific value among the RGB values reaches its target value first, the display apparatus 100 fixes that value but continues to change the other values in each process.
On the other hand, if the RGB values of a quadrangle are (255, 255, 255), the display apparatus 100 decreases each RGB value by 1 in each process until the RGB values of the corresponding quadrangle reach (152, 83, 44).
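The per-process convergence toward the preset color can be sketched as follows. The function name and the loop driving it are illustrative; in the embodiment, one step happens each time the repeated process runs, not in a tight loop.

```python
def step_toward(rgb, target=(152, 83, 44)):
    """Move each RGB channel one step toward the target color;
    a channel that has reached its target value stays fixed."""
    return tuple(c + 1 if c < t else c - 1 if c > t else c
                 for c, t in zip(rgb, target))

# Starting from a black quadrangle (0, 0, 0)
color = (0, 0, 0)
steps = 0
while color != (152, 83, 44):
    color = step_toward(color)
    steps += 1
print(color)  # (152, 83, 44)
print(steps)  # 152 -- the largest channel gap determines the number of processes
```

Blue fixes after 44 steps, green after 83, and red after 152, matching the sequence given in the example above.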
Further, the display apparatus 100 may increase the blur value by 1 in each process, from 1 up to a preset value (for example, 20), and the blur value is applied to the corresponding image frame each time the process is repeated. The higher the blur value, the stronger the blurring effect. That is, the display device increases the weight of the blur each time the process is repeated.
As shown in fig. 13, when the processing is repeated in units of image frames, the boundary between quadrangles longitudinally adjacent to each other in the image frames disappears and looks like a line. Accordingly, the content image gradually becomes abstract, and thus serves as the background image 400 based on the specific color of the service image.
While the service image is displayed, the background image 400 changes in real time according to the reproduction of the content image, and thus the lines in the background image 400 continuously change like waves. The real-time changes in the background image 400 correspond to changes in the image frames of the content image. Accordingly, when the service image is displayed and the content image is not displayed, the user may continuously receive visual feedback corresponding to the content image. Further, when the content image is switched to the service image or when the service image is switched to the content image again, visual fatigue due to a sudden change of the image can be reduced.
The display apparatus 100 displaying a service image according to an exemplary embodiment is described in detail below.
Fig. 14 is a flowchart of displaying a service image in the display device 100.
As shown in fig. 14, the display apparatus 100 receives a video content signal at operation S110, and the display apparatus 100 processes the video content signal to reproduce and display a content image at operation S120.
At operation S130, the display apparatus 100 determines whether a command to execute an additional service is issued.
If it is determined that a command to perform an additional service is issued, the display apparatus 100 acquires an image frame from the content image at the current time at operation S140. At operation S150, the display apparatus 100 acquires preset image information from an image frame. At operation S160, the display apparatus 100 changes image information according to a preset algorithm, thereby generating a background image. At operation S170, the display device displays a service image including a background image.
At operation S180, the display apparatus 100 determines whether a preset time has elapsed after displaying the service image.
If it is determined that the preset time has elapsed after the service image is displayed, the display apparatus 100 returns to operation S140 and repeats the process of generating the background image.
Therefore, as time passes, the background image of the service image changes according to the image information of the reproduced content image.
The display apparatus 100 generating a background image of a service image according to an exemplary embodiment is described in detail below.
Fig. 15 is a flowchart of generating a background image of a service image in the display device 100.
As shown in fig. 15, the display apparatus 100 captures an image frame from a content image at operation S210.
The display apparatus 100 divides an image frame into a mesh including a preset number of quadrangles at operation S220.
At operation S230, the display device 100 calculates luminance and transparency based on RGB values of pixels included in each quadrangle.
The display apparatus 100 applies the calculated brightness and transparency to the corresponding quadrangle at operation S240.
At operation S250, the display apparatus 100 adjusts a lateral width of each quadrangle based on the brightness.
At operation S260, the display apparatus 100 performs blurring with respect to a vertical line of each quadrangle.
The display apparatus 100 performs the above-described processing in real time while displaying the service image. When the process is repeated, the display apparatus 100 changes the RGB value of each quadrangle to approximate the RGB value of the preset color, and also increases the blur value to the preset value. Accordingly, the display device 100 can display the background image of the service image, which varies according to the image information of the reproduced content image.
For example, if a content image (e.g., a first image) includes one or more objects, a background image (e.g., a second image) of a service image is displayed as an image corresponding to a portion of image information of the objects (e.g., corresponding to outlines of the objects).
Additional services using the background image generated according to the exemplary embodiment are described in detail below.
The user may control the user input device 130 (see fig. 3) to move a cursor displayed on the display apparatus 100 or to move a GUI displayed on the display apparatus 100 in a specific direction. In this case, the user can intuitively and easily recognize a specific direction with respect to the upward, downward, leftward and rightward directions (i.e., four directions). If the cursor is moved in a particular direction from the origin, that direction can be represented in two dimensions (i.e., on the horizontal and vertical axes). Accordingly, the user can easily recognize the two-dimensional movement direction with respect to the left and right directions corresponding to the lateral direction and the up and down directions corresponding to the longitudinal direction.
In this regard, the user input device 130 (see fig. 3) provides an input environment for the user to issue commands regarding movement in the four directions.
FIG. 16 shows an example of a user input device 410 according to an example embodiment.
As shown in fig. 16, the user input device 410 includes a remote controller that is easily carried by the user. The user input device 410 includes an up arrow key 411, a down arrow key 412, a left arrow key 413, and a right arrow key 414, which correspond to four directions, respectively. The respective arrow keys 411, 412, 413, and 414 are physically or mechanically separated from each other. If the user presses one of the arrow keys, a command is issued in the user input device 410 regarding movement in a direction corresponding to the pressed arrow key.
The commands issued by the user input device 410 are sent to the display apparatus.
FIG. 17 shows an example of a user input device 420 according to an exemplary embodiment.
As shown in fig. 17, the user input device 420 includes a touch pad 421 for touch input using a user's finger, a stylus (not shown), or the like. Although the user can make an input by a drag operation in various directions via the 2D touch pad 421, the directions that the user can input most accurately are the most intuitive up, down, left, and right directions (i.e., four directions).
As another example, the user input device 420 may have a built-in motion sensor (not shown) for sensing its own motion, such that the user may issue commands corresponding to the direction in which the user input device 420 is shaken or moved.
Since the user can most intuitively recognize the up, down, left, and right directions (i.e., four directions) and make corresponding inputs, mappings of the categories of the additional services displayed on the display device are provided corresponding to the up, down, left, and right directions (i.e., four directions) for the user's convenience.
Fig. 18 illustrates content mapping of an additional service image according to an exemplary embodiment.
As shown in fig. 18, the additional services 500 include the following categories: for example, "clock" for displaying the current time, "weather" for displaying the weather in the current area, "calendar" for displaying the date of the day, "photo" for displaying images such as photos and pictures, and "speaker" for activating a speaker installed in the display device. These categories are provided as examples and do not limit the exemplary embodiments.
Each frame shown in fig. 18 represents a service image to be displayed on the display device. The horizontal rows refer to the categories of the additional services, and the vertical columns refer to the functions of the service images.
The service images 510 and 515 on the first horizontal row correspond to the initial category for initiating additional services for the first time. The service images 520, 521 and 525 on the second horizontal row correspond to the "clock" category. The service images 530, 531, 532, and 535 on the third horizontal row correspond to weather categories. The service images 540, 541, 542, and 545 on the fourth horizontal row correspond to a "calendar" category. The service images 550 and 551 on the fifth horizontal row correspond to the "photos" category. The service images 560 and 561 on the sixth horizontal row correspond to the "speaker" category.
The "setting" corresponds to the setting images 515, 525, 535, and 545 for setting the display environments of the respective categories. The "title page" corresponds to the title images 510, 520, 530, 540, 550, and 560 for respectively initiating the services of these categories. The "contents" correspond to the images 521, 531, 532, 541, 542, 551, and 561 containing actual information for providing the respective categories of services.
According to an exemplary embodiment, the service images have background images. Of course, according to an exemplary embodiment, all, some, or only one of the service images may have a background image (e.g., a second image). As another example, according to an exemplary embodiment, a specific image such as the title images 510, 520, 530, 540, 550, and 560 in the category may have a background image. However, the background images of these categories are displayed in different colors in order to distinguish the categories.
Further, the arrows around the service images indicate which service image the current service image will be switched to if the user issues a command for moving in one of the up, down, left, and right directions (i.e., four directions) while that service image is displayed on the display device. "Move to the head" means switching to the first service image on the corresponding row, "move to the end" means switching to the last service image on the corresponding row, and "move to setting" means switching to the setting image of the corresponding horizontal row.
For example, in response to an operation of initiating the additional service for the first time, the content image is switched to the initial title image 510. In this state, if the user issues a command for moving in the left direction or the right direction, a setting image 515 for adjusting settings applied to, for example, all the additional services is displayed. On the other hand, if the user issues a command for moving in the downward direction, the title image 520 corresponding to the "clock" category is displayed.
In a state where the title image 520 corresponding to the "clock" category is displayed, if the user issues a command for moving in the right direction, the content image 521 corresponding to the "clock" category is displayed, showing the current time. On the other hand, if the user issues a command for moving in the downward direction, the title image 530 corresponding to the "weather" category is displayed.
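The directional switching walked through above can be modeled as a lookup table keyed by the current image and the movement direction. The image names below and this partial table are hypothetical, covering only the "clock" and "weather" transitions just described, not the full mapping of fig. 18.

```python
# Hypothetical, partial navigation table modeled on the mapping of fig. 18
NAVIGATION = {
    ("title_clock", "right"):   "content_clock",    # show current time
    ("title_clock", "down"):    "title_weather",    # next category
    ("title_weather", "right"): "content_weather",  # show weather info
    ("title_weather", "up"):    "title_clock",      # previous category
}

def switch_image(current, direction):
    """Return the next service image; stay put if no mapping exists."""
    return NAVIGATION.get((current, direction), current)

print(switch_image("title_clock", "right"))  # content_clock
print(switch_image("title_clock", "down"))   # title_weather
print(switch_image("title_clock", "left"))   # title_clock (no mapping -> unchanged)
```

A table-driven design like this keeps the category layout declarative, so rows and columns can be added without touching the switching logic.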
If a command for executing an additional service is issued while a content image is displayed on a display device, the display device first displays an initial image 510 for initiating the additional service and switches a service image in response to a user command regarding a direction.
Further, if execution commands are separately provided for the respective categories of the additional services and an execution command is issued with respect to a specific category, the display device may first display the title image 520, 530, 540, 550, or 560 of the corresponding category.
Fig. 19 illustrates switching of a title image 530 corresponding to a specific category of an additional service to a content image 531a in a display device according to an exemplary embodiment.
As shown in fig. 19, if a title image 530 corresponding to the "weather" category is launched in response to a user command, the display device displays a weather information image 531a according to one of two methods, but the described category (i.e., "weather") and method are not restrictive.
One of the two methods is: even if there is no user input in a state where the display device displays the title image 530, the weather information image 531a is automatically displayed after a preset time elapses from the time when the title image 530 is displayed. The preset time may have various values. For example, after 1 second has elapsed from the time when the header image 530 is displayed, the display device displays the weather information image 531 a.
Another method is to display the weather information image 531a in response to a user command in the right direction, but this is not limitative. As shown in the mapping of fig. 18, the weather information image 531a is mapped to the right of the title image 530, and thus the display device displays the switched service image based on the corresponding mapping.
Fig. 20 illustrates switching a service image between two categories of additional services in a display device according to an exemplary embodiment.
As shown in fig. 20, the display device switches or moves between categories of additional services in response to a user command for moving in an upward direction and a downward direction. Further, in response to a user command for moving in the left and right directions, the display apparatus switches or moves between the title image 520 and the information images 521a and 521b within one category, the information images 521a and 521b including information provided by the corresponding category service.
Switching between categories may be implemented between the title images 520 and 530 of the corresponding categories.
Further, if the information images 521a, 521b, 531a, and 531b are related to each other between categories, the vertical switching between the categories may be realized between the information images 521a and 531a and between the information images 521b and 531b.
For example, the first information image 521a of the "clock" category may include time information on the area of "city 1", and the first information image 531a of the "weather" category may include weather information on the area of "city 1". The first information image 521a and the first information image 531a are different in the category of the service information, but they are both related to the same area. Since the images 521a and 531a are both related, the display device may display the first information image 531a switched from the first information image 521a if there is a user command for moving in the downward direction, or vice versa, and the display device may display the first information image 521a switched from the first information image 531a if there is a user command for moving in the upward direction.
Similarly, the second information image 521b of the "clock" category may include time information on the area of "city 2", and the second information image 531b of the "weather" category may include weather information on the area of "city 2". Since both the second information image 521b and the second information image 531b are related to the same area of "city 2", the display device may display the second information image 531b switched from the second information image 521b if there is a user command for moving in the downward direction, or vice versa, and the display device may display the second information image 521b switched from the second information image 531b if there is a user command for moving in the upward direction.
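The four-direction navigation among categories and related information images can be sketched as a two-dimensional mapping (an illustrative model; the grid contents and the clamping-at-edges behavior are hypothetical — rows stand for categories such as "clock" and "weather", and columns keep the same area, e.g. "city 1", aligned vertically as described above):

```python
# Rows are categories; column 0 is the title image, columns 1+ are the
# per-city information images, so vertical moves keep the same city
# (e.g. 521a <-> 531a for "city 1", 521b <-> 531b for "city 2").
GRID = [
    ["clock_title",   "clock_city1_521a",   "clock_city2_521b"],
    ["weather_title", "weather_city1_531a", "weather_city2_531b"],
]

def move(pos, direction):
    """Return the new (row, col) after an up/down/left/right command,
    clamped at the grid edges (no wrap-around assumed)."""
    row, col = pos
    if direction == "up":
        row = max(row - 1, 0)
    elif direction == "down":
        row = min(row + 1, len(GRID) - 1)
    elif direction == "left":
        col = max(col - 1, 0)
    elif direction == "right":
        col = min(col + 1, len(GRID[0]) - 1)
    return row, col
```

Under this model, a downward command on the first clock information image lands on the first weather information image, matching the "city 1" relation described above.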
According to an exemplary embodiment, the display device displays a service image of an additional service and switches the service image in response to user commands for moving in the upward, downward, leftward, and rightward directions (i.e., four directions), so that the user can intuitively use the additional service.
A display apparatus displaying a service image of an additional service according to an exemplary embodiment is described in detail below.
Fig. 21 is a flowchart of displaying a service image of an additional service in a display device according to an exemplary embodiment.
As shown in fig. 21, the display apparatus receives a command issued by a user to provide an additional service at operation S310.
At operation S320, the display apparatus invokes a mapping of service images, in which the service images are mapped according to the moving direction; i.e., the mapping refers to data specifying the matching between service images in response to the upward, downward, leftward, and rightward directions (i.e., four directions), as shown in fig. 18, although this example of the mapping is not limiting.
At operation S330, the display apparatus generates a background image based on image information of the content image. The method of generating the background image is the same as the method described above with reference to the exemplary embodiment.
At operation S340, the display device displays a service image including a background image. The displayed service image may include an initial image for initiating an additional service or a title image of a specific service category.
At operation S350, the display apparatus determines whether a user issues a command for moving in one of upward, downward, leftward and rightward directions (i.e., four directions).
If it is determined that the user has issued a command for moving in one of the up, down, left, and right directions (i.e., four directions), the display apparatus selects a service image corresponding to the direction according to the mapping of the service images at operation S360.
At operation S370, the display device displays the selected service image.
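The flow of operations S350 through S370 can be sketched as a lookup into the direction mapping (an illustrative model; the table contents and image names are hypothetical, and only two entries are shown for brevity):

```python
# Hypothetical mapping invoked at S320: for each service image, the image
# reached by a move in each of the four directions (None = no transition).
SERVICE_MAP = {
    "initial": {"right": "weather_title", "left": None,
                "up": None, "down": None},
    "weather_title": {"right": "weather_info_531a", "left": "initial",
                      "up": "clock_title", "down": None},
}

def handle_command(current, direction):
    """S350-S370: if a directional command arrives, select the mapped
    service image for display; otherwise stay on the current image."""
    target = SERVICE_MAP.get(current, {}).get(direction)
    return target if target is not None else current
```

A command with no mapped target leaves the displayed image unchanged, which models the flowchart looping back to operation S350 when no valid move exists.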
According to an exemplary embodiment, the display apparatus provides an intuitive environment so that a user can easily use additional services.
In an exemplary embodiment, both the content image and the service image are displayed as a full screen on the display device, and thus the content image is not displayed while the service image is displayed. Further, in the exemplary embodiment, an image generated based on the image information of the content image is used as the background image of the service image. However, this is not restrictive, and the exemplary embodiments may be variously implemented by modification.
Fig. 22 illustrates an example of overlaying a content image 660 with a UI menu 670 (e.g., a GUI menu screen) in the display device 600 according to an exemplary embodiment.
As shown in fig. 22, the display apparatus 600 according to an exemplary embodiment displays a content image 660 on the display 620. If the user issues a command to display the UI menu 670 through the user input device 630 while the display 620 displays the content image 660, the display apparatus 600 displays the content image 660 overlaid by the UI menu 670.
Here, if the content image 660 is displayed as a full screen on the display 620, the UI menu 670 is displayed to cover a partial area of the content image 660.
When the UI menu 670 is displayed, the display apparatus 600 acquires image information of the content image 660 and changes the image information based on a preset algorithm, thereby generating a background of the UI menu 670 as described in detail above. The algorithm may be derived from the method of the above exemplary embodiment, and a detailed description thereof will be omitted.
Here, the image information of the content image 660 is image information corresponding to a pixel area to be covered by the UI menu 670 within the entire pixel area of the content image 660.
According to the present exemplary embodiment, the UI menu 670 is displayed together with the content image 660 while the content image 660 continues to be displayed. That is, the background of the UI menu 670 on the portion 672 of the display 620 changes in real time according to the display of the content image 660. Accordingly, the user can recognize a motion in the content image 660 covered by the UI menu 670 through a real-time change of the background of the UI menu 670.
In an exemplary embodiment, an image generated based on image information of the content image 660 is used as a background (e.g., a second image) of the UI menu 670, but is not limited thereto. As another example, an image generated based on image information of the content image 660 may be used as an icon, a selection item, or the like of the UI menu.
Fig. 23 is a flowchart of displaying the UI menu 670 in the display device 600 according to an exemplary embodiment.
As shown in fig. 23, the display apparatus 600 displays a content image at operation S410.
At operation S420, the display apparatus 600 determines whether there is a command for displaying a UI menu.
If it is determined that there is a command to display the UI menu 670, the display apparatus 600 designates a specific area of the content image to be covered by the UI menu at operation S430.
At operation S440, the display apparatus 600 generates a background image by considering the image information of the specific region.
At operation S450, the display apparatus 600 displays the UI menu together with the generated background image so that the content image can be overlaid by the UI menu.
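Operations S430 and S440 — designating the covered pixel area and generating a background from its image information — can be sketched as follows (an illustrative model; the grayscale frame representation and the simple box blur are hypothetical stand-ins for the "preset algorithm", which the embodiment does not fix to any particular filter):

```python
def region_background(frame, rect, radius=1):
    """Crop the pixel area to be covered by the UI menu (S430) and apply a
    simple box blur as a stand-in background generator (S440).
    frame: 2-D list of grayscale values; rect: (top, left, height, width)."""
    top, left, h, w = rect
    crop = [row[left:left + w] for row in frame[top:top + h]]
    blurred = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Average over the neighborhood, clipped at the crop borders.
            vals = [crop[j][i]
                    for j in range(max(0, y - radius), min(h, y + radius + 1))
                    for i in range(max(0, x - radius), min(w, x + radius + 1))]
            blurred[y][x] = sum(vals) // len(vals)
    return blurred
```

Re-running this on each frame models the real-time background change described above, since the crop tracks the moving content behind the UI menu.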
Fig. 24 illustrates an example of displaying a service image 770 on a content image 760 in a picture-in-picture (PIP) form in a display apparatus 700 according to an exemplary embodiment.
As shown in fig. 24, when the display 720 displays the content image 760, the display apparatus 700 displays a service image 770 in response to a command for executing an additional service issued through the user input device 730.
In some exemplary embodiments, the content image and the service image are displayed as a full screen, whereby the display apparatus displays the service image by switching the content image to the service image.
On the other hand, the display apparatus 700 according to the present exemplary embodiment displays the service image 770 within the content image 760 by the PIP mode. That is, the display apparatus 700 displays the content image 760 as a main image of the PIP mode and displays the service image 770 as a sub image of the PIP mode.
The display apparatus 700 determines an area corresponding to a position for displaying the service image 770 within the entire area of the content image 760, and acquires image information of the determined area. The display device 700 generates a background image of the service image 770 based on the acquired image information.
Fig. 25 is a flowchart of displaying a service image in the display device 700.
As shown in fig. 25, the display device 700 displays a content image at operation S510. In the present exemplary embodiment, the content image is displayed as a full screen.
At operation S520, the display apparatus 700 determines whether there is a command for executing the additional service.
If it is determined that there is a command to perform the additional service, the display apparatus 700 acquires image information of a content image corresponding to a sub image of the PIP mode at operation S530.
At operation S540, the display apparatus 700 generates a background image based on the acquired image information. The method of generating the background image may be derived according to the method of the above-described exemplary embodiment, and thus a detailed description thereof will be omitted.
The display device 700 displays a service image including a background image as a sub image of a PIP mode and displays a content image as a main image of the PIP mode at operation S550.
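The determination of the sub-image area in the PIP mode (the region of the content image 760 whose image information feeds the background of the service image 770 at operation S530) can be sketched as follows (an illustrative geometry only; the scale, margin, and corner placement are hypothetical values not specified by the embodiment):

```python
def pip_sub_rect(screen_w, screen_h, scale=0.25, margin=0.05,
                 corner="bottom_right"):
    """Compute the sub-image rectangle (x, y, w, h) of the PIP mode within
    the full-screen main image; the same rectangle selects the content-image
    pixels used to generate the service image's background."""
    w, h = int(screen_w * scale), int(screen_h * scale)
    mx, my = int(screen_w * margin), int(screen_h * margin)
    if corner == "bottom_right":
        x, y = screen_w - w - mx, screen_h - h - my
    elif corner == "top_left":
        x, y = mx, my
    else:
        raise ValueError(f"unsupported corner: {corner}")
    return x, y, w, h
```

For a 1920x1080 display with the assumed defaults, this places a 480x270 sub-image inset from the bottom-right corner, fully inside the main image.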
A UI including a background image according to an exemplary embodiment is described in more detail below.
Fig. 26 illustrates an example of an image 800 displayed in response to a user input in a display device according to an exemplary embodiment.
As shown in fig. 26, when a user presses a specific button on a remote controller to generate a preset input, or when the display device is turned on, the display device displays a main image 800 including at least one UI.
The entire main image 800 changes in real time according to the reproduction of the content image and includes a background image 810, and an inner line of the background image 810 may continuously change like a wave according to the contour of at least one moving object in the content image. The background image 810 according to the present exemplary embodiment may be generated based on the same method as described above. The background image 810 according to the present exemplary embodiment may also be applied to various UIs and the main image 800, and may also be a still image.
If it is determined that no user input is received within the preset time while the main image 800 is displayed, the display apparatus may display only the background image 810 by switching the main image 800 to the background image 810. If a user input is received while only the background image 810 is displayed, the display apparatus returns to displaying the main image 800, i.e., the GUI combined with the background image 810.
In order to perform the preset service, the main image 800 includes a plurality of icons 820, 830, 840, 850, 860, and 870 corresponding to the services, respectively. The respective icons 820, 830, 840, 850, 860, and 870 are positioned at the lower side of the main image 800, but there is no limitation on the positions of the icons 820, 830, 840, 850, 860, and 870.
The icons 820, 830, 840, 850, 860, and 870 may be labeled with titles that briefly describe the corresponding services. For example, the icons 820, 830, 840, 850, 860, and 870 according to the exemplary embodiment include a TV icon 820, an application icon 830, a speaker icon 840, a photo icon 850, a clock icon 860, and a background image setting icon 870. The TV icon 820, the application icon 830, the speaker icon 840, the photo icon 850, and the clock icon 860 are arranged in a row along the lower side edge on the right side of the main image 800. The background image setting icon 870, which is characteristically different from these services, is arranged along the lower side edge on the left side of the main image 800.
The exemplary embodiment shows the icons 820, 830, 840, 850, 860, and 870, each corresponding to a particular service, and one arrangement of these icons. However, this is only one example among the various types of GUIs that can be displayed as the main image 800, and the example is not limiting.
The TV icon 820, the application icon 830, the speaker icon 840, the photo icon 850, and the clock icon 860 are respectively arranged in a plurality of areas 821, 831, 841, 851, and 861 formed by a plurality of columns extending longitudinally upward from the lower side of the main image 800.
Each column may have various visual effects. For example, each column may have a predetermined color, which blends into the background image 810 as it fades upward from the lower side of the main image 800.
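The fade effect described above can be sketched as a per-row linear blend between the column color and the background color (an illustrative computation; the orientation convention, row count, and RGB values are hypothetical):

```python
def column_color(row, num_rows, column_rgb, background_rgb):
    """Linearly blend a column's color toward the background as the column
    fades upward: row 0 is the top (fully background) and the last row is
    the bottom edge of the main image (fully the column color)."""
    t = row / (num_rows - 1)  # 0.0 at the top -> 1.0 at the bottom
    return tuple(round(b + (c - b) * t)
                 for c, b in zip(column_rgb, background_rgb))
```

Evaluating the blend for every row of the column produces the smooth upward fade into the background image.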
Each of the areas 821, 831, 841, 851 and 861 may be highlighted to indicate the selection state of the icons 820, 830, 840, 850 and 860 in the corresponding areas 821, 831, 841, 851 and 861. That is, if the user presses an arrow key of the remote controller, the highlight indicating the currently selected icon may move. For example, if the TV icon 820 is selected, the TV icon 820 is displayed larger than the other icons, and the area 821 of the TV icon 820 is highlighted. Accordingly, the user can easily recognize that the TV icon 820 is currently selected.
In this state, if the user moves the highlight to the application icon 830, the TV icon 820 returns to its original size and the highlight on the region 821 including the TV icon 820 is released. Further, the application icon 830 is enlarged, and the region 831 including the application icon 830 is highlighted.
In a state where one of the icons 820, 830, 840, 850, 860, and 870 is selected, the user may issue a command to execute the selected icon 820, 830, 840, 850, 860, or 870 by pressing an enter key of the remote controller or the like. In response to a command for executing the selected icon 820, 830, 840, 850, 860, or 870, the display device executes a service corresponding to the selected icon 820, 830, 840, 850, 860, or 870.
The following describes services of the respective icons 820, 830, 840, 850, 860, and 870 according to an exemplary embodiment.
The display apparatus processes a currently received video signal and displays an image based on the video signal in response to execution of the TV icon 820. The display device may receive video signals from various video sources. For example, the display apparatus may display a broadcast image based on a broadcast signal tuned to a specific channel and received from a transmitter of a broadcasting station, or display an image based on a video signal received from an external device such as an optical media player.
In response to execution of the application icon 830, the display device displays items of the applications installed therein so that the applications can be executed.
In response to execution of the speaker icon 840, the display device displays a setting image of a speaker for outputting sound, an image showing a sound output state of the speaker, and the like.
The display device displays the accessible photo images in response to execution of the photo icon 850.
The display device displays a clock showing the current time in response to execution of the clock icon 860.
The display device displays a UI for changing the setting of the currently displayed background image 810 in response to execution of the background image setting icon 870.
When each of the icons 820, 830, 840, 850, 860, and 870 is selected and executed, the display device displays UIs displayed corresponding to the respective icons 820, 830, 840, 850, 860, and 870. If one of the TV icon 820, the application icon 830, the speaker icon 840, the photo icon 850, and the clock icon 860 biased to the right of the main image 800 is selected, a service is provided corresponding to the executed icon 820, 830, 840, 850, or 860. Such services may be displayed in various forms. For example, the main image 800 may be switched to an image providing a service corresponding to the icon 820, 830, 840, 850, or 860. As another example, one of regions 821, 831, 841, 851, and 861 including icons 820, 830, 840, 850, and 860, respectively, may extend toward the left side (i.e., in the X direction) of the main image 800, and a UI corresponding to the icon 820, 830, 840, 850, or 860 may be displayed in the extended region 821, 831, 841, 851, or 861.
Fig. 27 illustrates an example of a UI displayed in response to execution of a TV icon in a display apparatus according to an exemplary embodiment.
As shown in fig. 27, when the user selects and executes a TV icon 820 (see fig. 26), the display apparatus processes a currently received content video signal and displays a content image 910. The content image 910 may vary depending on which of the video signals received in the display device is selected as currently being processed. While displaying the content image 910, the display apparatus determines the setting state of the video signal.
For example, the display device directly acquires the setting state information of the video signal before displaying the main image 800 (refer to fig. 26). The setting state information includes processing information for displaying the content image 910. For example, the setting state information may include identification information of a video signal to be processed by the display apparatus among many video signals receivable in the display apparatus, identification information of a channel when the video signal relates to a plurality of channels (such as a broadcast signal), reproduction information such as resolution of an image or volume of sound, or the like. The display apparatus switches the main image 800 (see fig. 26) to the content image 910 based on the setting state information.
Fig. 28 illustrates an example of a UI displayed in response to execution of an application icon in a display device according to an exemplary embodiment.
As shown in fig. 28, in response to execution of the application icon 830, the display device expands a region 831 including the application icon 830 within the main image 800. For example, the display device may expand region 831 in such a way: the TV icon 820 and the application icon 830 biased to the right of the main image 800 move to the left within the main image 800 and keep the positions of the speaker icon 840, the photo icon 850, and the clock icon 860 unchanged.
The display device displays execution icons 832 of one or more application programs currently stored or installed therein, the execution icons 832 being arranged within the expanded region 831. Each execution icon 832 includes a thumbnail image or similar identifier so that a user can visually identify the corresponding application, and may additionally include title information or brief description information of the corresponding application. When the user selects and executes one of the execution icons 832, the display device executes the application corresponding to that execution icon 832.
Here, the display device may highlight the selected execution icon 832 or enlarge the selected execution icon 832, so that the user may easily distinguish the currently selected execution icon 832 among the plurality of execution icons 832.
The display device may also display an application addition icon 833 together with the execution icons 832 within the expanded region 831. When the user executes the application addition icon 833, the display device may display a UI for adding an application to be executed in the display device, and may additionally display an execution icon 832 corresponding to the added application in the expanded region 831.
Fig. 29 illustrates an example of a UI displayed in response to execution of a speaker icon in a display device according to an exemplary embodiment.
As shown in fig. 29, in response to execution of the speaker icon 840, the display device expands a region 841 that includes the speaker icon 840. Region 841 may be expanded as described above with reference to application icon 830.
The display apparatus determines whether an external device is connected to the display apparatus. Here, the external device is a device that stores audio data to be output through a speaker provided in the display apparatus. If it is determined that no external device is connected to the display apparatus, the display apparatus displays, on the expanded region 841, a message 842 requesting the user to connect an external device.
While displaying the message 842, the display device detects a connection with an external device. The display apparatus receives audio data from the external device and processes the audio data to be output through the speaker if it is determined that the external device is connected. The display device displays a UI showing an output state of audio data while outputting the audio data through a speaker.
Fig. 30 illustrates an example of a UI showing an output level of a speaker in a display apparatus according to an exemplary embodiment.
As shown in fig. 30, the display apparatus displays a UI 920 showing output state information of audio data while outputting the audio data through a speaker. For example, the output state information of the UI 920 may include an equalizer visualization 921 showing a level of audio data corresponding to a frequency, title information 922 of the audio data, information 923 about an author, an actor, or an artist, a time bar 924 in which an indicator indicating a current reproduction point moves in real time, and the like. In addition, the UI 920 shows various pieces of information related to audio data.
When the audio data is reproduced, the equalizer visualization 921 and the indicator of the time bar 924 in the UI 920 change in real time according to the reproduction state. If the audio data is paused, the equalizer visualization 921 and the indicator of the time bar 924 in the UI 920 also stop. Further, when the audio data is paused, the display device may display a message notifying the paused state on the UI 920.
Fig. 31 illustrates an example of a UI displayed in response to execution of a photo icon in a display device according to an exemplary embodiment.
As shown in fig. 31, in response to execution of the photo icon 850, the display apparatus expands an area 851 including the photo icon 850. Here, region 851 can be expanded as described above with reference to application icon 830.
The display apparatus determines whether an external device or an external memory is connected to the display apparatus. Here, the external device or the external memory is a device in which images to be displayed by the display apparatus are stored. If it is determined that no external memory storing images is connected to the display apparatus, the display apparatus displays, on the expanded region 851, a message 852 guiding the user to connect an external memory storing images to the display apparatus.
If the display apparatus detects a connection with the external memory while displaying the message 852, the display apparatus receives images from the external memory and displays a UI showing the received images.
If the external memory connected to the display apparatus does not include an image, the display apparatus may display a message notifying that there is no image in the external memory on the extension area 851. Here, the presence of the image may be determined based on whether there is an image in a format that the display device can process.
Fig. 32 illustrates an example of displaying an image in a display device according to an exemplary embodiment.
As shown in fig. 32, the display device buffers the images 931, 932, and 933 received from the external memory. Here, the number or data amount of the images 931, 932, and 933 to be buffered may vary according to the configuration of the display device or settings customized by the user.
The display device may display the buffered images 931, 932, and 933 in various modes. For example, the display device may display the buffered images 931, 932, and 933 in a slide show mode. The display device displays a slide show UI 930 and moves the images 931, 932, and 933 at a preset speed within the UI 930. In the slide show mode, the plurality of images 931, 932, and 933 are automatically scrolled without user input for switching the images 931, 932, and 933. Accordingly, the display device can sequentially display the plurality of images 931, 932, and 933.
When the display device displays a plurality of images 931, 932, and 933 in the slide show mode, the images 931, 932, and 933 can move at a uniform speed without stopping. As another example, if the specific image 932 is located at the center, the display device may stop moving the images 931, 932, and 933 for a preset time, and then move the images 931, 932, and 933 again after the preset time elapses. In this case, the user can easily view the images 931, 932, and 933 because the images 931, 932, and 933 are stopped for the preset time while being respectively located at the centers.
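The two scrolling behaviors described above — uniform motion versus pausing each image at the center — can be sketched as a per-tick schedule (an illustrative model; the tick granularity and mode names are hypothetical):

```python
def slideshow_schedule(images, mode="pause", pause_ticks=2):
    """Return the image centered in the slide-show UI at each tick.
    mode='uniform': images pass at a constant rate (one tick each);
    mode='pause'  : each image additionally holds at the center for
                    `pause_ticks` extra ticks before moving on."""
    ticks_per_image = 1 if mode == "uniform" else 1 + pause_ticks
    return [img for img in images for _ in range(ticks_per_image)]
```

In the 'pause' mode each image occupies the center for several consecutive ticks, modeling the preset stopping time that lets the user view each image before scrolling resumes.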
Fig. 33 illustrates an example of a UI displayed in response to execution of a clock icon in a display device according to an exemplary embodiment.
As shown in fig. 33, the display apparatus displays a UI 940 showing the current time in response to execution of a clock icon 860 (see fig. 26). The UI 940 may show various types of clocks such as an analog clock, a digital clock, and the like as long as the clocks can show information about the current time.
The display device may blink at least a portion of the UI 940 so that the user can recognize that the UI 940 is correctly showing the current time. For example, the display device may blink the separator between the number corresponding to the hour and the number corresponding to the minute at one-second intervals.
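The blinking-separator behavior can be sketched as follows (an illustrative formatting function; the even/odd-second convention and the space placeholder are hypothetical choices):

```python
def clock_text(hour, minute, second):
    """Format the UI 940 time with a separator that blinks once per second:
    shown on even seconds, hidden (replaced by a space) on odd seconds, so
    the user can tell the clock is running."""
    sep = ":" if second % 2 == 0 else " "
    return f"{hour:02d}{sep}{minute:02d}"
```

Rendering this string once per second alternates the separator on and off while the hour and minute digits stay in place.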
Fig. 34 illustrates an example of a UI displayed in response to execution of a background image setting icon in a display device according to an exemplary embodiment.
As shown in fig. 34, the display device displays a UI 950 that provides options for changing the settings of the background image in response to execution of the background image setting icon 870 (see fig. 26). The UI 950 may provide three options to set "color", "grid", and "auto start". However, the options set through the UI 950 according to the exemplary embodiment are only examples, and this is not limiting.
The "color" item is used to select the main color of the background image, which provides a curtain effect showing the outline of the content image. In addition to the default value, the "color" item allows selection of one of red, green, blue, and gray. The display device displays the background image of the UI based on the color selected in the "color" item.
The "grid" item is used to specify the size of the grid for dividing the image when the background image is generated.
The "auto start" item is used to determine whether or not the main image 800 (see fig. 26) is designated as an initial image displayed when the display device is turned on. If the "auto start" item is set to "on", the display device sets the main image as the initial image. On the other hand, if the "auto start" item is set to "off", the display device does not set the main image as the initial image, and may display the content image, for example, when the system is powered on.
As described above, the display device may display various UIs with background images having a curtain effect.
A display apparatus providing a UI for adjusting functions of the display apparatus according to an exemplary embodiment is described in detail below.
Fig. 35 illustrates an example of a UI for function adjustment provided in a display apparatus according to an exemplary embodiment.
As shown in fig. 35, the display apparatus displays a UI 960 for adjusting various functions of the display apparatus on a content image 970 in response to a preset user input (e.g., a button input of a remote controller).
When the display device displays the UI 960 on the content image 970, there is no limitation on the position of the UI 960. As shown in fig. 35, the display device may display a UI 960 slid from the right side of the content image 970, or may display a UI 960 slid from the left, upper, or lower side of the content image 970.
The UI 960 may have a transparency of a predetermined value. Thus, the content image 970 behind the UI 960 is visible through the UI 960. Here, the UI 960 is divided into a first area 961 and a second area 962, and the first area 961 and the second area 962 may be different in transparency, thereby being easily distinguished from each other.
In the first region 961, adjustable function items of the display device are arranged to be selectable. In the second region 962, a setting value regarding a specific item selected among a plurality of items displayed on the first region 961 is arranged.
For example, the display device displays a plurality of items such as "picture mode" for selecting picture mode, "sound mode" for selecting sound mode, "sleep timer" for selecting automatic expiration time, and the like on the first region 961. When the user presses the arrow button on the remote control, the cursor may move between items on the first region 961.
Further, the display device displays, on the second region 962, a plurality of setting values provided for the specific item currently selected in the first region 961. As the user moves between the items in the first region 961, the selected item changes, and thus the setting values displayed on the second region 962 also change.
For example, if "picture mode" is selected in the items in the first region 961, the display device displays "standard", "dynamic", "movie", and "natural" as setting values of "picture mode" on the second region 962. If the user selects one of the setting values, the display apparatus adjusts the function of the display apparatus based on the selected setting value.
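The relation between the first region 961 (items) and the second region 962 (per-item setting values) can be sketched as follows (an illustrative model; the table of items and values merely mirrors the examples given above, and the state-dictionary representation is hypothetical):

```python
# Item -> setting values, as shown on the second region 962 when the item
# is selected on the first region 961.
SETTINGS = {
    "picture mode": ["standard", "dynamic", "movie", "natural"],
    "sound mode":   ["standard", "music", "movie"],
    "sleep timer":  ["off", "30 min", "60 min", "120 min"],
}

def second_region_values(selected_item):
    """Values shown on the second region 962 for the selected item."""
    return SETTINGS.get(selected_item, [])

def apply_setting(state, item, value):
    """Adjust the display function only if the value is valid for the item;
    an invalid value leaves the state unchanged."""
    if value in SETTINGS.get(item, []):
        state = dict(state, **{item: value})
    return state
```

Moving the cursor in the first region simply changes which key of the table is read, which models how the second region's contents switch as the selection moves.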
The methods according to the above-described exemplary embodiments may be implemented in the form of program commands that can be executed by various computers and recorded in computer-readable media. Such computer-readable media may include program instructions, data files, data structures, etc., or a combination thereof. For example, the program may be stored in a volatile or non-volatile memory, such as a read-only memory (ROM), a random access memory (RAM), memory chips, memory-like devices, or integrated circuits (ICs), whether removable or rewritable, or in an optically or magnetically recordable and machine-readable (e.g., computer-readable) storage medium, such as a compact disc (CD), a digital versatile disc (DVD), a magnetic disk, or a magnetic tape. It should be understood that a memory that may be included in a mobile terminal is an example of a machine-readable storage medium suitable for storing a program having instructions for implementing the exemplary embodiments. The program commands recorded in the storage medium may be specially designed and configured according to the exemplary embodiments, or may be well known and available to those skilled in the computer software art.
While certain exemplary embodiments have been shown and described, those skilled in the art will recognize that: modifications to the exemplary embodiments may be made without departing from the spirit and scope of the present invention, which is defined by the appended claims and their equivalents.
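The background-image generation described in the exemplary embodiments — dividing a frame into vertical portions, adjusting each portion's lateral width according to its brightness, and applying blur filtering — can be sketched as follows. This is an illustrative reading, not the patented implementation; the Rec. 601 luma weights, the portion count, and the linear width mapping are all assumptions.

```python
# Illustrative sketch (not the patented implementation): split a frame into
# vertical portions, derive a luminance per portion from its average RGB
# values, make brighter portions narrower and darker portions wider, then
# blur-filter the result. Luma weights and the scale formula are assumed.

def average_rgb(frame, x0, x1):
    """Average (R, G, B) over the vertical portion spanning columns x0..x1-1."""
    n = len(frame) * (x1 - x0)
    sums = [0, 0, 0]
    for row in frame:
        for r, g, b in row[x0:x1]:
            sums[0] += r; sums[1] += g; sums[2] += b
    return [s / n for s in sums]

def portion_widths(frame, portions):
    """Divide the frame into equal vertical portions, then adjust each
    portion's lateral width inversely to its luminance."""
    base = len(frame[0]) // portions
    widths = []
    for i in range(portions):
        r, g, b = average_rgb(frame, i * base, (i + 1) * base)
        luma = 0.299 * r + 0.587 * g + 0.114 * b   # 0..255, Rec. 601 weights
        scale = 1.5 - luma / 255.0                  # bright -> 0.5x, dark -> 1.5x
        widths.append(max(1, round(base * scale)))
    return widths

def box_blur_row(row, radius=1):
    """A minimal horizontal box blur over one row of RGB pixels."""
    out = []
    for i in range(len(row)):
        window = row[max(0, i - radius):i + radius + 1]
        out.append(tuple(sum(p[k] for p in window) // len(window) for k in range(3)))
    return out

# A toy 4x8 frame: left half bright white, right half near-black.
frame = [[(255, 255, 255)] * 4 + [(20, 20, 20)] * 4 for _ in range(4)]
print(portion_widths(frame, portions=2))  # [2, 6]: bright half narrower
```

Claims 6 and 7 describe the same inverse relationship between luminance and lateral width; the concrete scale factor here is invented for illustration, and claim 8 restricts the blur to the left and right edges of each portion rather than the whole row.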

Claims (15)

1. A display device, comprising:
a display,
a user input device configured to receive a user command; and
at least one processor configured to:
control the display to display a first image comprising a plurality of image frames;
obtain an image frame from the plurality of image frames in response to receiving, from the user input device, a user command to display a graphical user interface (GUI) while the first image is displayed;
divide the obtained image frame into a plurality of portions;
adjust a lateral width of each of the plurality of portions based on red, green, and blue (RGB) values of pixels in each of the plurality of portions;
apply blur filtering to the plurality of portions having the adjusted lateral widths; and
control the display to display the GUI as overlaid on a background image including the plurality of portions to which the blur filter is applied,
wherein the background image is a full screen image.
2. The display device of claim 1, wherein the at least one processor is configured to: generate a second image corresponding to an outline of at least one object in the first image by processing image information acquired from the first image, and process the second image to be displayed as the background image of the GUI.
3. The display device of claim 2, wherein the at least one processor is configured to: acquire, in real time, image information that changes as the first image is continuously reproduced, and process the second image to change according to the image information acquired in real time.
4. The display device of claim 3, wherein the at least one processor is configured to: gradually change the red, green, and blue (RGB) values of the second image so that the color of the second image changes over a period of time.
5. The display device of claim 2, wherein the image information comprises red, green, and blue (RGB) values of pixels included in the image frame of the first image.
6. The display device of claim 5, wherein the at least one processor is configured to generate the second image by: determining a luminance value and a transparency value based on average RGB values of pixels respectively included in the plurality of portions; adjusting lateral widths of the plurality of portions based on the determined luminance values, respectively; and applying blur filtering to the plurality of portions.
7. The display device of claim 6, wherein the at least one processor is configured to: adjust the lateral width of a portion having a higher luminance value among the plurality of portions to be narrower, and adjust the lateral width of a portion having a lower luminance value among the plurality of portions to be wider.
8. The display device according to claim 6, wherein the plurality of portions are surrounded by four edges including an upper edge, a lower edge, a left edge, and a right edge, respectively, and
the at least one processor is configured to apply blur filtering only to left and right edges of the plurality of portions.
9. The display device according to claim 1, wherein the GUI comprises a plurality of graphical user interfaces (GUIs) respectively corresponding to services provided by the display device, and
the at least one processor is configured to switch between the GUIs in response to a user command received through the user input device, the user command indicating a movement in one of an up, down, left, and right direction relative to a screen of the display.
10. The display device of claim 9, wherein the at least one processor is configured to: switch between GUIs of services different from each other in response to a user command for movement in an upward or downward direction received through the user input device, and switch between GUIs of the same service in response to a user command for movement in a leftward or rightward direction received through the user input device.
11. A method of controlling a display device, the method comprising:
displaying a first image including a plurality of image frames;
obtaining an image frame from the plurality of image frames in response to receiving a user command from a user input device to display a Graphical User Interface (GUI) while displaying the first image;
dividing the obtained image frame into a plurality of portions;
adjusting a lateral width of each of the plurality of portions based on red, green, and blue RGB values of pixels in each of the plurality of portions;
applying blur filtering to the plurality of portions having the adjusted lateral width; and
displaying the GUI as overlaid on a background image including the plurality of portions to which the blur filtering is applied,
wherein the background image is a full screen image.
12. The method of claim 11, wherein displaying the GUI comprises:
generating a second image corresponding to a contour of at least one object in the first image by processing image information acquired from the first image; and
processing the second image to be displayed as the background image of the GUI.
13. The method of claim 12, wherein displaying the GUI further comprises:
acquiring, in real time, image information that changes as the first image is continuously reproduced; and
processing the second image to change according to the image information acquired in real time.
14. The method of claim 13, wherein displaying the GUI further comprises:
gradually changing the red, green, and blue (RGB) values of the second image so that the color of the second image changes over a period of time.
15. The method of claim 12, wherein the image information comprises red, green, and blue (RGB) values of pixels included in the image frame of the first image.
CN201680030475.2A 2015-05-28 2016-05-27 Display device and control method thereof Expired - Fee Related CN107743710B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
KR10-2015-0074965 2015-05-28
KR20150074965 2015-05-28
KR10-2016-0054921 2016-05-03
KR1020160054921A KR101799381B1 (en) 2015-05-28 2016-05-03 Display apparatus and control method thereof
PCT/KR2016/005604 WO2016190691A1 (en) 2015-05-28 2016-05-27 Display apparatus and control method thereof

Publications (2)

Publication Number Publication Date
CN107743710A CN107743710A (en) 2018-02-27
CN107743710B true CN107743710B (en) 2020-09-25

Family

ID=57573662

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201680030475.2A Expired - Fee Related CN107743710B (en) 2015-05-28 2016-05-27 Display device and control method thereof

Country Status (2)

Country Link
KR (1) KR101799381B1 (en)
CN (1) CN107743710B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102523672B1 (en) * 2017-11-14 2023-04-20 삼성전자주식회사 Display apparatus, control method thereof and recording media
CN109164954A (en) * 2018-07-28 2019-01-08 北京旺马科技有限公司 Selective method for application, system, car-mounted terminal, handheld device and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102186110A (en) * 2011-05-11 2011-09-14 深圳市茁壮网络股份有限公司 Method for processing pictures and STB (set top box)
CN102215428A (en) * 2011-06-23 2011-10-12 深圳市茁壮网络股份有限公司 Picture processing method and STB (Set Top Box)
CN102640509A (en) * 2009-11-30 2012-08-15 Lg电子株式会社 A network television and a method of controlling the same
CN102714763A (en) * 2009-11-17 2012-10-03 Lg电子株式会社 A method of providing menu for network television

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020109734A1 (en) * 1997-10-10 2002-08-15 Satoshi Umezu GUI processing system for performing an operation of an application which controls testing equipment
JP2012215852A (en) * 2011-03-25 2012-11-08 Semiconductor Energy Lab Co Ltd Image processing method and display device


Also Published As

Publication number Publication date
CN107743710A (en) 2018-02-27
KR20160140375A (en) 2016-12-07
KR101799381B1 (en) 2017-11-22

Similar Documents

Publication Publication Date Title
US11726645B2 (en) Display apparatus for classifying and searching content, and method thereof
EP3099081B1 (en) Display apparatus and control method thereof
KR100736076B1 (en) Portable device and method providing scene capture
US8397262B2 (en) Systems and methods for graphical control of user interface features in a television receiver
JP5783245B2 (en) How to display a video stream according to a customized format
JP4999649B2 (en) Display device
US20070200953A1 (en) Method and Device for Displaying the Content of a Region of Interest within a Video Image
CN107743710B (en) Display device and control method thereof
US20120147025A1 (en) Image processing apparatus and user interface providing method thereof
EP2897362A1 (en) Apparatus for displaying image, driving method thereof, and method for displaying image
CN111954043B (en) Information bar display method and display equipment
CN112783380A (en) Display apparatus and method
JP2008122507A (en) Screen display processor, video display device, and osd display method
CN112788387A (en) Display apparatus, method and storage medium
KR20090059433A (en) Image processing apparatus and control method thereof
JP2006252276A (en) Video display device
CN114173187A (en) Method for determining dynamic contrast and display device
CN114979773A (en) Display device, video processing method, and storage medium
KR100697467B1 (en) Television receiver and method for processing mosaic
JP2009200937A (en) Digital broadcast receiver
KR100731357B1 (en) Method of controlling picturequality and display processing apparatus thereof
JP2012220941A (en) Image processing device and image processing method
KR20130082260A (en) Computing device for performing at least one of function and controlling the same
JP2010193530A (en) Video processor, and video display device
KR20090074623A (en) Method of extending an image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200925