WO2007114664A1 - Apparatus and method for multiple screen - Google Patents

Apparatus and method for multiple screen

Info

Publication number
WO2007114664A1
Authority
WO
WIPO (PCT)
Prior art keywords
screen
audio content
audio
output
service
Prior art date
Application number
PCT/KR2007/001668
Other languages
French (fr)
Inventor
Jong-Ho Lee
Kwang-Kee Lee
Sung-Wook Byun
Glenn A. Adams
Un-Gyo Jung
Original Assignee
Samsung Electronics Co., Ltd.
Priority date
Filing date
Publication date
Priority claimed from KR1020070033459A (KR20070100136A)
Application filed by Samsung Electronics Co., Ltd.
Priority to MX2008012862A
Priority to CA002648467A (CA2648467A1)
Publication of WO2007114664A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/443OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N21/4438Window management, e.g. event handling following interaction with the user interface
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4122Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/443OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/44Receiver circuitry for the reception of television signals according to analogue transmission standards
    • H04N5/445Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
    • H04N5/45Picture in picture, e.g. displaying simultaneously another television channel in a region of the screen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/44Receiver circuitry for the reception of television signals according to analogue transmission standards
    • H04N5/60Receiver circuitry for the reception of television signals according to analogue transmission standards for the sound signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/641Multi-purpose receivers, e.g. for auxiliary information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4316Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications

Definitions

  • the present invention relates to the configuration of multiple screens, and more particularly, to a method of dynamically configuring multiple screens which provide multiple contents on a single physical display device and an apparatus for providing the multiple screens.
  • Conventional broadcast receivers such as digital TVs or digital set-top boxes provide only one content element on a single physical display device or simultaneously display a main screen and a sub-screen on a single physical display device.
  • Even though conventional broadcast receivers can simultaneously display both the main screen and the sub-screen on the same display screen, they can only arrange the main screen and the sub-screen in a limited number of manners.
  • In the case of a content displayed within the main screen, all elements of the content, i.e., video data, audio data, and other data, are displayed.
  • In the case of a content displayed within the sub-screen, only some of the elements of the content are displayed.
  • Content sources include a broadcast service such as a satellite broadcaster, a terrestrial broadcaster, or a cable broadcaster, a storage medium such as DVDs, or an external device connected to an input terminal.
  • MHP Multimedia Home Platform
  • ACAP Advanced Common Application Platform
  • OCAP Open Cable Application Platform
  • HAVi Home Audio/Video Interoperability
  • UI user interface
  • An object of the present invention is to provide the dynamic configuration of multiple screens which provide a plurality of contents on a physical display device.
  • Another object of the present invention is to provide a method of independently selecting and outputting an audio content provided in individual services.
  • an apparatus for providing multiple screens including a service processing module providing a plurality of services to which a plurality of first audio contents are respectively allocated, a user/application interface module receiving a command to designate one of the first audio contents as a second audio content (where the second audio content is an audio content to be focused), and an output module outputting the first audio content that is designated as the second audio content and is focused in response to the received command.
  • an apparatus for providing multiple screens including a service processing module providing a plurality of services to which a plurality of first audio contents are respectively allocated, and an output module outputting one of the first audio contents designated as a second audio content with reference to output attributes of the first audio contents (where the second audio content is an audio content to be output).
  • a method of providing multiple screens including providing a plurality of services to which a plurality of first audio contents are respectively allocated, receiving a command to designate one of the first audio contents as a second audio content (where the second audio content is an audio content to be focused), and outputting the first audio content that is designated as the second audio content and is focused in response to the received command.
  • a method of providing multiple screens including providing a plurality of services to which a plurality of first audio contents are respectively allocated, and outputting one of the first audio contents designated as a second audio content with reference to output attributes of the first audio contents (where the second audio content is an audio content to be output).
  • FIGS. 1 to 8 are diagrams illustrating configurations of PiP screens according to an exemplary embodiment of the present invention.
  • FIG. 9 is a diagram illustrating the relationship between a logical screen and a display screen according to an exemplary embodiment of the present invention
  • FIGS. 10 to 14 are diagrams illustrating a configuration of a screen including a mapper according to an exemplary embodiment of the present invention
  • FIG. 15 is a block diagram illustrating service sources according to an exemplary embodiment of the present invention
  • FIGS. 16 to 17 are diagrams illustrating a non-abstract service and an abstract service according to an exemplary embodiment of the present invention
  • FIG. 18 is a diagram illustrating examples of the types of attribute information and interfaces of a logical screen and a display screen
  • FIG. 19 is a diagram illustrating an attribute 'z-order' of a logical screen according to an exemplary embodiment of the present invention.
  • FIGS. 20 and 21 are diagrams each illustrating an attribute 'Display Area' of a logical screen according to exemplary embodiments of the present invention.
  • FIG. 22 is a diagram illustrating a method of mapping two services to a display screen according to an exemplary embodiment of the present invention
  • FIG. 23 is a block diagram illustrating a configuration of an apparatus for providing multiple screens according to an exemplary embodiment of the present invention
  • FIG. 24 is a flowchart illustrating a method of dynamically configuring multiple screens according to an exemplary embodiment of the present invention
  • FIGS. 25 and 26 are diagrams illustrating PiP service providing modes according to an exemplary embodiment of the present invention
  • FIG. 27 is a flowchart illustrating a method of independently selecting an audio content and outputting the selected audio content according to an exemplary embodiment of the present invention
  • FIG. 28 is a diagram illustrating an example of a software architecture for providing multiple screens according to an exemplary embodiment of the present invention
  • FIG. 29 is a diagram illustrating the relationships among modules constituting an application program interface (API) layer according to an exemplary embodiment of the present invention
  • FIG. 30 is a flowchart illustrating a method of displaying a plurality of services that are displayed on respective corresponding logical screens on a display screen by the modules illustrated in FIG. 29;
  • API application program interface
  • FIG. 31 is a flowchart illustrating the output of audio content according to an embodiment of the present invention.
  • FIG. 32 is a flowchart illustrating the output of audio content according to another embodiment of the present invention.
  • These computer program instructions may also be stored in a computer usable or computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer usable or computer-readable memory produce an article of manufacture including instruction means that implement the function specified in the flowchart block or blocks.
  • the computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions that execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.
  • each block of the flowchart illustrations may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the blocks may occur out of the order noted. For example, two blocks illustrated in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. For a better understanding of the present invention, the terms used in this disclosure will now be defined.
  • Service components are elements of a service and include a video component, an audio component, and a data component.
  • a data component is an application program in a service.
  • Applications can be classified into unbound applications and service bound applications.
  • the unbound application is not related to a specific channel, and thus the execution of the application is not affected by channel switching. Further, the unbound application usually has a high priority, and therefore resources for executing the application are smoothly assigned to the application.
  • a monitor application corresponds to one of specific unbound applications capable of having the highest priority.
  • the service bound application relates to a transport stream and usually has a low priority as compared to the unbound application.
  • the service bound application does not perform any critical system function. For this reason, when competition for resources occurs, the service bound application is more likely to abandon resource assignment than the unbound application.
  • the service bound applications can be classified into a linked type of operating in cooperation with a stream being currently transmitted and an unlinked type of operating regardless of a stream being currently transmitted.
  • the term 'service context' indicates an object which can control the executing of a service and includes various resources, devices, and execution state information needed for providing a service.
  • the term 'physical display device' indicates a physical space which actually displays the content of a service, an external output port outputting the content of a service, or a storage device storing a service.
  • the term 'display screen' indicates a screen actually displayed on a physical display device.
  • An arbitrary service may be directly set in the display screen, and the display screen may be displayed on a physical display device.
  • at least one logical screen which is mapped to a certain area of the display screen may be displayed on the physical display device.
  • the term 'logical screen' indicates a space in which an arbitrary service is displayed.
  • a logical screen is a virtual screen before being mapped to a display screen and thus is not displayed on a physical display device.
  • the logical screen and the display screen may be a combination of a background still image, a video raster, and a graphic raster.
  • the graphic raster may be a combination of text, lines, colors, and images or a mixture of video frames.
  • 'main service' indicates a service that is selected as a main service through a menu displayed on the physical display device or a remote controller by a user or through an API by an application, and the screen on which the main service is displayed is referred to as a 'main screen'.
  • 'Picture-in-Picture (PiP) service' indicates a service that is selected as a sub-service in the main service through a menu displayed on a physical display device or a remote controller by a user or through an API by an application, and the PiP service may be displayed on a picture-in-picture screen (PiP screen) or a main screen.
  • the PiP screen includes a screen that occupies a part of another screen as illustrated in FIGS. 1 to 4 and a screen that is simultaneously displayed with another screen without overlapping the other screen as illustrated in FIGS. 5 to 6.
  • the PiP screen may include a screen that is displayed on an arbitrary location or area in the physical display device or overlaps another screen, as illustrated in FIGS. 7 and 8.
  • FIG. 9 is a diagram illustrating the relationship between a logical screen and a display screen according to an exemplary embodiment of the present invention.
  • a service may be displayed using logical screens 210, 212, and 214.
  • the logical screens 210, 212, and 214 are mapped to display screens 220, 222, and 224 through a mapping block 230.
  • the logical screens 210 and 212 are mapped to the display screen 220
  • the logical screens 212 and 214 are mapped to the display screen 222
  • the logical screens 210, 212, and 214 are mapped to the display screen 224.
  • at least one logical screen which displays a service is mapped to an arbitrary area of a display screen.
  • the mapping block 230 is a group of various pieces of information needed for mapping a logical screen to a display screen.
  • the various pieces of information include coordinate information of a predetermined area on a display screen to which each of a plurality of logical screens is mapped, identification information of the logical screens and the display screen, and information specifying in what order the logical screens are displayed on the display screen.
  • the mapping block 230 can change the size of the logical screen so that the logical screen can be allocated to an arbitrary area of the display screen. That is, the mapping block 230 can perform scaling of the logical screen and allocation of its position. FIGS. 10 to 14 are diagrams illustrating configurations of the screen including a mapper as the mapping block.
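  • As an illustration of the information described above, the following Java sketch shows the kind of entry the mapping block 230 could maintain for each logical screen; the class and field names are assumptions for illustration only and are not defined in this disclosure.

```java
// Hypothetical sketch of one mapping-block entry: which logical screen is mapped to
// which display screen, into what normalized area, and in what display order.
public final class ScreenMappingEntry {
    final String logicalScreenId;     // identification information of the logical screen
    final String displayScreenId;     // identification information of the display screen
    final float x, y, width, height;  // coordinates of the target area on the display screen (0.0 - 1.0)
    final int zOrder;                 // order in which mapped logical screens are displayed

    public ScreenMappingEntry(String logicalScreenId, String displayScreenId,
                              float x, float y, float width, float height, int zOrder) {
        this.logicalScreenId = logicalScreenId;
        this.displayScreenId = displayScreenId;
        this.x = x; this.y = y; this.width = width; this.height = height;
        this.zOrder = zOrder;
    }
}
```

Scaling a logical screen then amounts to choosing the normalized rectangle (x, y, width, height) of the display screen into which the logical screen is drawn, as in the mapper examples of FIGS. 10 to 14.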
  • the main screen including a combination of a background still image B, a video raster V, and a graphic raster G is mapped to the entire display screen by a mapper with a normal size.
  • the PiP screen including only video components is mapped to the entire display screen by the mapper with a reduced size.
  • the mapped PiP screen is displayed on the main screen, which is determined depending on a Z value.
  • the reference character Z refers to z-order value which will be described later.
  • An overlay screen may be combined with the display screen.
  • the overlay screen is a specific screen disposed at the outermost side, and may be used when providing a caption function.
  • the PiP screen may have only a video component as illustrated in FIG.
  • the main screen including the combination of the background still image B, the video raster V, and the graphic raster G is mapped to the entire display screen by the mapper with a normal size.
  • Two PiP screens #1 and #2 having only video components are mapped to an arbitrary area of the display screen by the mapper with a reduced size.
  • the mapped PiP screen is disposed on the main screen and the Z value can be constantly maintained.
  • the overlay screen may be combined with the display screen.
  • the configuration of the screen may have a plurality of PiP screens including only video components as illustrated in FIG.
  • FIG. 14 illustrates a plurality of PiP screens including a combination of the background still image B, the video raster V, and the graphic raster G.
  • POP screens are illustrated in FIG. 14. It can be understood that a known PiP screen is displayed inside the main screen and the POP screen is displayed outside the main screen.
  • the plurality of PiP screens #1 and #2 including a combination of the background still image B, the video raster V, and the graphic raster G are mapped to arbitrary areas of the display screen by the mapper with a reduced size. In this case, the Z value of the mapped POP screens #1 and #2 may be constantly maintained. Further, the overlay screen may be combined with the display screen.
  • the mapping block 230 may be realized by interfaces or functions written in various programming languages, which are executed to create or change the relationship between the logical screen and the display screen using the above information as parameters.
  • mapping block 230 may be realized by hardware to perform a mapping function between the logical screen and the display screen.
  • services provided by various service sources may be displayed on a display screen, and the display screen may be displayed on a physical display device, as illustrated in FIG. 15.
  • Examples of service sources include service sources which provide broadcast services, such as a terrestrial broadcaster 320 and a cable broadcaster 330; service sources which provide services stored in a storage medium, such as a personal video recorder (PVR) 340; and service sources (not illustrated in FIG. 15) which provide services via a wired network or a wireless network.
  • PVR personal video recorder
  • a broadcast receiver 310 receives services from the service sources and produces logical screens displaying each of the received services.
  • an arbitrary service is directly set on the display screen to be displayed on a physical display device using a predefined method or a method set by a user or an application. Otherwise, at least one logical screen that is mapped to an arbitrary area on the display screen is displayed on a physical display device 350.
  • services provided by the terrestrial broadcaster 320, the cable broadcaster 330, and the PVR are displayed on the physical display device 350.
  • the terrestrial broadcaster 320, the cable broadcaster 330, and the PVR 340 are illustrated in FIG. 15 as being service sources, but the present invention is not limited thereto. Any type of multimedia content source which provides multimedia contents that can be displayed together can be a service source according to an exemplary embodiment of the present invention.
  • Services according to an exemplary embodiment of the present invention can be classified into abstract services and non-abstract services, as illustrated in FIGS. 16 and 17.
  • the abstract services are not services provided by broadcast signals transmitted in real time but services independent of broadcast channels.
  • the abstract services include only data components, i.e., application programs, without video components and audio components.
  • Examples of the abstract services include services having unbound applications based on the open cable application platform (OCAP) standard.
  • the non-abstract services are understood as services other than abstract services.
  • both abstract services and non-abstract services have independency.
  • abstract services may be directly set on the physical display device not through logical screens and non-abstract services may be displayed on the logical screens.
  • the logical screens may be mapped to the display screen in which the abstract services are set. Thereafter, the display screen may be output through the physical display device.
  • the abstract services can be displayed on the display screen independently of the non-abstract services.
  • the abstract services and non-abstract services may be mapped to different logical screens. Thereafter, the logical screens may be mapped to a single display screen. In other words, the abstract services can be displayed on the display screen independently of non-abstract services.
  • the logical screen and the display screen may be categorized as being different objects.
  • a screen may serve as a logical screen or a display screen according to attribute information of one screen object.
  • Attribute information of the screen object includes a plurality of attributes such as 'Type', 'z-Order', 'Display_Area', 'Visibility', 'Associated_Display_Screen', 'Associated_Service_Contexts', and 'OutputPort'.
  • FIG. 18 illustrates the attribute information on the screen object and the types of interface for processing the attribute information.
  • An attribute 'Type' 510 represents whether the screen is a logical screen or a display screen.
  • An attribute 'z-Order' 520 is for determining in what order a plurality of logical screens are arranged along the z-axis.
  • FIG. 19 illustrates different configurations of logical screens on a physical display device for different combinations of the values of attributes 'z-Order' of the logical screens.
  • first and second logical screens 620 and 630 are respectively mapped to predetermined areas of a display screen 610.
  • the first logical screen 620 is displayed on the display screen 610
  • the second logical screen 630 is displayed on the display screen partially overlapping the first logical screen 620.
  • the display screen 610, the first logical screen 620, and the second logical screen 630 are sequentially arranged in the direction of the z-axis.
  • an attribute 'z-Order' of the first logical screen 620 may be set to a value of 1
  • an attribute 'z-Order' of the second logical screen 630 may be set to a value of 2.
  • the attributes 'z-Order' of the first and second logical screens 620 and 630 may be set to any numbers or characters as long as they can represent a certain order in which the first and second logical screens 620 and 630 are to be arranged along the z-axis.
  • An attribute 'Display_Area' 530 is information regarding a display screen area of a logical screen, as illustrated in FIGS. 20 and 21.
  • FIG. 20 illustrates that a logical screen 710 is mapped to an entire area of the display screen 720
  • FIG. 21 illustrates that a logical screen 730 is mapped to a partial area of the display screen 740.
  • the attribute 'Display_Area' may include information specifying the 2-dimensional coordinates of a predetermined portion of a display screen to which the logical screen is to be mapped or may include information specifying a predetermined location on the display screen and an offset value indicating how much the logical screen deviates from the predetermined location on the display screen.
  • An attribute 'Visibility' 540 determines whether a logical screen is to be visibly or invisibly displayed on a display screen. It is possible to make a logical screen appear on or disappear from a display screen by altering the value of the attribute 'Visibility' 540.
  • An attribute 'Associated_Display_Screen' 550 is information regarding display screens associated with a logical screen.
  • a logical screen which is not associated with any display screen may neither be displayed on a physical display device nor be transmitted to external output devices.
  • An attribute 'Associated_Service_Contexts' 560 is information regarding service contexts connected to a logical screen or a display screen. Services set in such service contexts may be displayed on a logical screen or a display screen.
  • An attribute 'OutputPort' 570 is information regarding devices by which a display screen is to be output, and such devices include display screens, wired/wireless communication media, and various storage media.
  • Interfaces for identifying or altering the values of the attributes illustrated in FIG. 18 may be provided.
  • the interfaces may include an interface 'SET' for setting attribute values or connecting a logical screen to a display screen, an interface 'ADD' for adding attribute values or connecting a logical screen to a service, an interface 'GET' for identifying attribute values, and an interface 'REMOVE' for deleting attribute values.
  • These interfaces may include processes, functions, procedures, or methods that perform their functions, respectively.
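  • By way of a hedged illustration, the attributes of FIG. 18 and the 'SET', 'ADD', 'GET', and 'REMOVE' interfaces described above could be expressed as a Java-style interface such as the following; the concrete signatures are assumptions, as this disclosure names only the attributes and the kinds of interface.

```java
// Illustrative sketch only: attribute names follow FIG. 18, method signatures are assumed.
interface ScreenObjectAttributes {
    ScreenType getType();                              // 'Type': logical screen or display screen

    int getZOrder();                                   // 'z-Order': position along the z-axis
    void setZOrder(int zOrder);                        // SET: alter the z-order value

    boolean getVisibility();                           // 'Visibility': shown or hidden on the display screen
    void setVisibility(boolean visible);

    void addServiceContext(Object serviceContext);     // ADD: connect a service context to the screen
    void removeServiceContext(Object serviceContext);  // REMOVE: delete the connection

    Object getAssociatedDisplayScreen();               // 'Associated_Display_Screen'
    String getOutputPort();                            // 'OutputPort': device via which the screen is output
}

enum ScreenType { LOGICAL, DISPLAY }
```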
  • a method 'getDisplayScreen(void)' returns a display screen associated with the current screen. If the current screen is a logical screen associated with a display screen, the method 'getDisplayScreen(void)' returns the associated display screen. If the current screen is itself a display screen, the method 'getDisplayScreen(void)' returns reference information regarding the current screen. If the current screen is a logical screen that is not associated with any display screen, the method 'getDisplayScreen(void)' returns a value of 'NULL'.
  • a method 'public void setDisplayArea(HScreenRectangle rect) throws SecurityException, IllegalStateException' provides a function for mapping the current logical screen to a predetermined area of the associated display screen.
  • An instance that is provided as a parameter is of a class 'HScreenRectangle' of a package 'org.havi.ui', and has 2-dimensional position information.
  • the execution of the methods 'SecurityException' and 'IllegalStateException' may be conducted as an exceptional operation for the method 'setDisplayScreen(HScreen screen)'.
  • the method 'IllegalStateException' may be executed when the current screen is a logical screen or when a portion of a display screen associated with a current logical screen cannot change due to the characteristics of a host platform.
  • a method 'getOutputArea(void)' returns regional information of a current screen as HScreenRectangle information. If the current screen corresponds to a display screen, the method 'getOutputArea(void)' returns HScreenRectangle information having the same value as HScreenRectangle (0,0, 1,1). If the current screen is a logical screen, the method 'getOutputArea(void)' returns information regarding an area on a display screen occupied by the current screen. If the current screen is a logical screen but is not associated with any display screen, the method 'getOutputArea(void)' returns a value 'NULL'. Certain terms are used throughout the following description to refer to particular interfaces. However, one skilled in the art will appreciate that a particular function is named just to indicate its functionality. This document does not intend to distinguish between functions that differ in name but not function.
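  • The behavior of the methods just described can be summarized in the Java-style sketch below; 'HScreen' and 'HScreenRectangle' come from the 'org.havi.ui' package mentioned above, while the enclosing interface name is an assumption.

```java
import org.havi.ui.HScreen;
import org.havi.ui.HScreenRectangle;

// Sketch of the screen-context methods described above (signatures per the description).
interface ScreenContextSketch {
    // Returns the display screen associated with the current screen, a reference to the
    // current screen if it is itself a display screen, or null if a logical screen is
    // not associated with any display screen.
    HScreen getDisplayScreen();

    // Maps the current logical screen to the given area of the associated display screen;
    // 'rect' carries normalized 2-dimensional position information.
    void setDisplayArea(HScreenRectangle rect) throws SecurityException, IllegalStateException;

    // Returns (0,0,1,1) for a display screen, the occupied display-screen area for a
    // mapped logical screen, or null for a logical screen with no associated display screen.
    HScreenRectangle getOutputArea();
}
```

For example, a call such as setDisplayArea(new HScreenRectangle(0.75f, 0.0f, 0.25f, 0.25f)) would map a logical screen into an upper-right corner region of the display screen, matching the PiP layouts of FIGS. 1 to 8.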
  • FIG. 22 is a diagram illustrating a process in which two services are set on two logical screens and mapped to a single display screen.
  • a first service includes all the three service components, i.e., video, audio, and data components
  • a second service includes only video and audio components.
  • the present invention does not impose any restrictions on service components, and the first and second services illustrated in FIG. 22 are exemplary.
  • the first and second services are displayed on a physical display device in almost the same manner as in the related art. According to the current embodiment of the present invention, it is possible to display a plurality of services on a physical display device independently of one another without imposing any restrictions on the number of services that can be displayed on a single display screen.
  • FIG. 23 is a block diagram of an apparatus for providing multiple screens according to an exemplary embodiment of the present invention.
  • an apparatus 900 for providing multiple screens includes a digital signal processing module 940, a service processing module 950, an output module 960, and a user/application interface module 965. Also, the apparatus 900 includes a broadcast signal reception module 910, a storage medium 920, and an external input module 930 as service sources, and includes a display screen 970, a storage medium 980, and an external output module 990 as service output media.
  • 'module' means, but is not limited to, a software or hardware component, such as a Field Programmable Gate Array (FPGA) or Application Specific Integrated Circuit (ASIC), which performs certain tasks.
  • a module may advantageously be configured to reside on the addressable storage medium and configured to execute on one or more processors.
  • a module may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.
  • the functionality provided for in the components and modules may be combined into fewer components and modules or further separated into additional components and modules.
  • the digital signal processing module 940 receives various information of a service such as a multimedia content, e.g., video information, audio information, or data information, from the broadcast signal reception module 910, the storage medium 920, or the external input module 930.
  • the broadcast signal reception module 910 receives a satellite, terrestrial, or cable broadcast signal and transmits the received broadcast signal
  • the storage medium 920 stores video information, audio information, or data information of a service
  • the external input module 930 receives video information, audio information, or data information of a service from an external device such as a network interface module connected to a network.
  • the digital signal processing module 940 restores a plurality of services using received service components.
  • the restored services include abstract or non-abstract services.
  • the term 'a plurality of services' refers to two or more services transmitted by the broadcast signal reception module 910 or two or more services respectively transmitted by the broadcast signal reception module 910 and the storage medium 920.
  • the digital signal processing module 940 may restore services according to selection by a user or an application with the aid of the user/application interface module 965.
  • the user or the application may select the connection between an arbitrary service and a screen.
  • the service processing module 950 produces a logical screen and a display screen to display a service restored by the digital signal processing module 940.
  • the output module 960 maps a plurality of logical screens produced by the service processing module 950 to the display screen.
  • the mapping of the logical screens to the display screen may be conducted using a predefined method or a method set by the user with the aid of the user/application interface module 965.
  • a service restored by the digital signal processing module 940 may not be processed by the service processing module 950. Instead, a service restored by the digital signal processing module 940 may be directly mapped to a certain portion of a display screen produced by the output module 960.
  • a display screen provided by the output module 960 may be displayed on the physical display device 970 or may be stored in the storage medium 980. Examples of the storage medium 980 include computer-readable floppy discs, hard discs, CD-ROMs, DVDs, DVD-ROMs, BDs (Blu-ray Discs), and semiconductor memories.
  • a display screen provided by the output module 960 may be transmitted to an external device connected to a network via the external output module 990.
  • the output module 960 may include a plurality of output ports via which a display screen can be provided.
  • a display screen can be provided via an output port set in advance as a default or an output port chosen by the user with the aid of the user/application interface module 965.
  • the user or the application can choose one of a plurality of services or restore desired services using the user/application interface module 965. Also, the user can choose one of a plurality of display screens using the user/application interface module 965. Since the modules illustrated in FIG. 23 are divided according to their functions, each module may be connected to the other modules.
  • FIG. 24 is a flowchart illustrating a method of dynamically configuring multiple screens according to an exemplary embodiment of the present invention.
  • video information, audio information, and data information constituting a multimedia content are transmitted in a predetermined format, for example, an MPEG stream format.
  • an apparatus for providing a service such as a multimedia content service receives video information, audio information, and data information and restores a service based on the video information, the audio information, and the data information.
  • the service restored in operation S1010 may be selected or previously determined by a user or an application.
  • the user may use a menu displayed on the display device or a remote controller to select the connections between an arbitrary service and a screen.
  • the application may select the connections using an API.
  • data information includes application information regarding an application program for a service, and this application information includes signal information indicating whether the application program can be executed on a PiP screen.
  • Examples of the application information include an application information table (AIT) based on the MHP standard and an extended application information table (XAIT) based on the OCAP standard.
  • the signal information may be added to the application information.
  • the restored service is set such that it can be displayed on a logical screen.
  • the logical screen is mapped to a display screen.
  • the display screen is provided to the user using a physical display device, a storage medium, or a network.
  • the restored service is illustrated in FIG. 24 as being displayed on a physical display device via a logical screen. However, the restored service may be directly displayed on a physical display device without passing through the logical screen.
  • FIG. 24 illustrates a method of mapping only one service to a display screen for simplicity. However, a plurality of services may be mapped to a display screen with or without passing through a plurality of logical screens. When a display screen is provided to the user in this manner, the user can perform a plurality of services.
  • the apparatus 900 for providing multiple screens provides the PiP service in two modes.
  • FIG. 25 illustrates a first mode of the two modes and
  • FIG. 26 illustrates a second mode of the two modes.
  • In the first mode, only the video component of the PiP service selected on the main screen is provided, without creating a separate logical screen for the PiP service, that is, a PiP screen.
  • In the first mode, no application related to the PiP service is executed, or the operation of an application related to the PiP service becomes inactive.
  • In the second mode, a separate logical screen for the PiP service is created, and the selected PiP service is provided on the created PiP screen.
  • the PiP screen provided in the second mode may include a background video serving as a background screen, or a video component.
  • an application related to the PiP service can be executed. Whether an application related to the PiP service can be executed or not may be determined on the basis of the signal information described above. It is preferable that the first mode and the second mode cannot be executed at the same time.
  • the PiP service providing mode may be selected by input of the user or the application through the user/application interface module 965.
  • the digital signal processing module 940 restores only the video component of the selected PiP service.
  • the restored video component is mapped to the main screen produced by the service processing module 950 and is then displayed on a display screen produced by the output module 960.
  • the digital signal processing module 940 restores the selected PiP service.
  • the restored service is mapped to the PiP screen created by the service processing module 950 and is then displayed on the display screen produced by the output module 960.
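  • The two PiP service providing modes can be sketched in Java as follows; every type and helper name is a placeholder standing in for the modules of FIG. 23, since the disclosure describes the modes functionally rather than as a concrete API.

```java
// Illustrative sketch of the two PiP-providing modes; all names are placeholders.
class PipModeSketch {
    enum PipMode { VIDEO_ONLY_ON_MAIN_SCREEN, SEPARATE_PIP_SCREEN }

    interface Service {}
    interface Screen {}

    void providePipService(PipMode mode, Service pipService) {
        if (mode == PipMode.VIDEO_ONLY_ON_MAIN_SCREEN) {
            // First mode: restore only the video component and show it on the main screen;
            // applications related to the PiP service are not executed or become inactive.
            mapVideoComponentToMainScreen(pipService);
        } else {
            // Second mode: create a separate logical screen (PiP screen), provide the selected
            // PiP service on it, and map it to the display screen; related applications may run
            // if the signal information (e.g., AIT/XAIT) allows execution on a PiP screen.
            Screen pipScreen = createLogicalScreen();
            setServiceOnScreen(pipService, pipScreen);
            mapToDisplayScreen(pipScreen);
        }
    }

    // Placeholder hooks standing in for the digital signal processing module 940,
    // the service processing module 950, and the output module 960.
    void mapVideoComponentToMainScreen(Service s) {}
    Screen createLogicalScreen() { return new Screen() {}; }
    void setServiceOnScreen(Service s, Screen scr) {}
    void mapToDisplayScreen(Screen scr) {}
}
```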
  • the user or the application can select an audio content of a specific one of a plurality of services provided on the logical screen through the user/application interface module 965 and enables the selected audio content to be independently output. This process is illustrated in FIG. 27.
  • the user or the application selects a specific service through the user/application interface module 965 (S1310).
  • the digital signal processing module 940 extracts an audio content from the selected service, and the extracted audio content is mapped to a logical screen or a display screen produced by the service processing module 950 and is independently output on the corresponding screen by the output module 960 (S1320).
  • 'independent output' can be understood as, for example, a concept including that an audio content of a specific service is selected and transmitted through an output port mapped to the display screen displaying the specific service simultaneously or exclusively with audio contents of the other services provided on the same screen. It can also be understood as the 'independent output' that audio contents of a plurality of services are output, provided, or stored through different media, respectively. In other words, an audio content need not be provided with other components of the same service and may be independently provided by the user or the application.
  • the output module 960 is selected by the user or the application and outputs the audio content through a predetermined external output module 990.
  • the individual audio contents may be independently output through the separate external output modules 990 at the same time.
  • the user or the application can select a desired audio content through the user/application module 965 independently of the other components in the same service.
  • FIG. 28 is a diagram illustrating a software architecture for providing multiple screens according to an exemplary embodiment of the present invention.
  • a software architecture 1400 includes a device driver layer
  • API application program interface
  • the device driver layer 1410 receives service components from various multimedia content sources and decodes the received service components. Examples of the received service components include video information, audio information, and data information.
  • the API layer 1420 generates a logical screen and a display screen and maps a service, the logical screen, and the display screen to one another.
  • the application layer 1430 provides a user interface so that a user can dynamically configure a logical screen which displays a service or transmits a user command to the API layer 1420 so that the API layer 1420 can execute the user command.
  • the user enables the device driver layer 1410 with the aid of the application layer 1430 to provide a display screen via a physical display device or to store the display screen in a storage medium.
  • the user can enable the device driver layer 1410 to transmit a display screen to an external device via a network.
  • the device driver layer 1410 may include a plurality of output ports which can provide a display screen.
  • API layer 1420 may include the plurality of output ports.
  • the API layer 1420 may include a plurality of software modules, e.g., a multi-screen manager module 'MultiscreenManager' , a multi-screen context module 'MultiscreenContext', a multiscreen context listener module 'MultiscreenContextListener', and a multi-screen context event module 'MultiscreenContextEvent', as illustrated in FIG. 29.
  • the multi-screen manager module 1510 manages the multi-screen context module 1530, searches for a desired screen, displays information specifying what devices are shared by screens, registers the multi-screen context listener module 1550, or cancels the registration of the multi-screen context listener module 1550.
  • the multi-screen context module 1530 is an interface object associated with a screen object 1520 and determines whether the screen object 1520 is to become a logical screen or a display screen according to an interface operation performed by the multi-screen context module 1530.
  • Various attributes illustrated in FIG. 18 may be set in the multi-screen context module 1530.
  • the multi-screen context module 1530 can provide the functions 'SET', 'ADD', 'GET', and 'REMOVE' described above with reference to FIG. 18.
  • the multi-screen context event module 1540 serves as an event class announcing that the attribute information of the screen object 1520 has been changed, and the multi-screen context listener module 1550 serves as a listener interface object which can be realized in a predetermined application class which attempts to receive an event prompted by the multi-screen context event module 1540.
  • An application 1560 is a module which is driven on the application layer 1430.
  • the application 1560 allows the user to choose a desired service and to freely arrange a plurality of logical screens on a display screen.
  • the application 1560 transmits various commands which allow the user to dynamically configure and manage logical screens to the multi-screen manager module 1510, and the multi-screen manager module 1510 controls operations corresponding to the various commands to be executed through the multi-screen context module 1530.
  • the multi-screen context module 1530 is associated with the screen object 1520 and manages the attribute information of the screen object 1520 illustrated in FIG. 18. In order to manage the attribute information of the screen object 1520, the multi-screen context module 1530 may include a variety of functions or methods.
  • the multi-screen manager module 1510 can receive service components provided by various service sources from the device driver layer 1410 and can operate to display the received service components on a logical screen or a display screen. Such a function may be performed by a separate module not illustrated.
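  • The relationships among these API-layer modules could be rendered in Java roughly as follows; only the module names come from the description above, and the method signatures are assumptions.

```java
// Sketch of the API-layer modules of FIG. 29; signatures are illustrative assumptions.
interface MultiscreenContext {}                                  // interface object associated with a screen object

class MultiscreenContextEvent {
    final MultiscreenContext source;                             // the screen context whose attributes changed
    MultiscreenContextEvent(MultiscreenContext source) { this.source = source; }
}

interface MultiscreenContextListener {
    void screenContextChanged(MultiscreenContextEvent event);    // receives attribute-change events
}

interface MultiscreenManager {
    java.util.List<MultiscreenContext> getScreenContexts();      // manage and search screen contexts
    void addContextListener(MultiscreenContextListener l);       // register a listener
    void removeContextListener(MultiscreenContextListener l);    // cancel the registration
}
```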
  • FIG. 30 is a flowchart illustrating a method of displaying, on a display screen, a plurality of services that are displayed on respective corresponding logical screens, performed by the modules illustrated in FIG. 29, according to an exemplary embodiment of the present invention.
  • In operation S1610, the multi-screen manager module 1510 produces a display screen and a number of logical screens corresponding to the number of services to be performed.
  • the multi-screen manager module 1510 connects the logical screens to respective corresponding services received from the device driver layer 1410.
  • the multi-screen manager module 1510 may call a method 'addServiceContext' for each of the logical screens by setting service context objects of the received services as parameters for the respective logical screens.
  • the method 'addServiceContext' connects a logical screen to a service and may be provided by the multi-screen context module 1530.
  • the multi-screen manager module 1510 connects the logical screens to the display screen.
  • the multi-screen manager module 1510 may call a method 'setDisplayScreen' for each of the logical screens by setting a display screen object to which the logical screens are connected as a parameter.
  • the method 'setDisplayScreen' connects a logical screen to a display screen and may be provided by the multi-screen context module 1530.
  • a method 'setDisplayScreen' may be defined as 'public void setDisplayScreen(HScreen screen) throws SecurityException, IllegalStateException', and this method allows an instance 'HScreen' that is provided as a parameter to be associated with the current logical screen.
  • the instance 'HScreen' is preferably a display screen.
  • a parameter of the method 'setDisplayScreen(HScreen screen)' may include a value of 'NULL'.
  • if the method 'setDisplayScreen(HScreen screen)' is executed without exception handling, the current logical screen is no longer associated with the display screen.
  • the execution of the methods 'SecurityException' and 'IllegalStateException' may be conducted as an exceptional operation for the method 'setOutputScreen(HScreen screen)'.
  • the method 'IllegalStateException' may be executed when a current screen is a logical screen or when a portion of a display screen associated with a current logical screen cannot change due to the characteristics of a host platform.
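  • Put together, the flow of FIG. 30 can be sketched as the following Java fragment; 'addServiceContext' and 'setDisplayScreen' are the methods named above, while the remaining types and the screen-creation hook are placeholders declared inline for illustration.

```java
// Usage sketch of the FIG. 30 flow: one logical screen per service, each connected to
// its service and then to the shared display screen.
class Fig30Sketch {
    interface HScreen {}                                  // stand-in for org.havi.ui.HScreen
    interface ServiceContext {}                           // stand-in for a service context object
    interface LogicalScreenContext {
        void addServiceContext(ServiceContext sc);        // connect the logical screen to a service
        void setDisplayScreen(HScreen displayScreen);     // connect the logical screen to a display screen
    }

    void showServices(java.util.List<ServiceContext> services,
                      HScreen displayScreen,
                      java.util.function.Supplier<LogicalScreenContext> newLogicalScreen) {
        for (ServiceContext service : services) {         // S1610: one logical screen per service
            LogicalScreenContext logical = newLogicalScreen.get();
            logical.addServiceContext(service);           // service context object as parameter
            logical.setDisplayScreen(displayScreen);      // display screen object as parameter
        }
    }
}
```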
  • FIG. 31 is a flowchart illustrating the output of an audio content according to an embodiment of the present invention.
  • the service processing module 950 may represent a plurality of services, and an audio content allocated to each of the services may be focused by a user or an application.
  • a user or an application may input a command to designate a predetermined audio content as a focus target. Then, in operation S1710, the user/application interface module 965 receives the command input by the user or the application.
  • a method 'assignAudioFocus' may be called.
  • the method 'assignAudioFocus' is a method for enabling one of a plurality of audio contents to be focused, and may be provided by the multi-screen context module 1530.
  • the output module 960 outputs the audio content that is designated as the focus target and is focused (S1730). More specifically, the output module 960 may examine which of a plurality of audio contents respectively allocated to a plurality of services is designated as the focus target (S1720) and then output the audio content designated as the focus target according to the result of the examination (S1730).
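  • A minimal sketch of this focus-based selection is given below; 'assignAudioFocus' is the method named above, while the 'AudioContent' type and its focus flag are assumptions not defined in this disclosure.

```java
// Sketch of the audio-focus behaviour of FIG. 31; the AudioContent type is assumed.
class AudioFocusSketch {
    static class AudioContent {
        final String serviceId;       // service to which this audio content is allocated
        boolean focused;              // whether this content is the current focus target
        AudioContent(String serviceId) { this.serviceId = serviceId; }
    }

    // Called when a user or application designates one audio content as the focus target.
    void assignAudioFocus(java.util.List<AudioContent> contents, String serviceId) {
        for (AudioContent a : contents) {
            a.focused = a.serviceId.equals(serviceId);    // exactly one content keeps focus
        }
    }

    // Output module: examine which content is designated as the focus target (S1720)
    // and return it for output (S1730); null when nothing is focused.
    AudioContent selectForOutput(java.util.List<AudioContent> contents) {
        for (AudioContent a : contents) {
            if (a.focused) return a;
        }
        return null;
    }
}
```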
  • FIG. 32 is a flowchart illustrating the output of an audio content according to another embodiment of the present invention.
  • the output module 960 may output a plurality of audio contents at the same time. More specifically, referring to FIG. 32, the output module 960 examines the attributes of each of a plurality of audio contents respectively allocated to a plurality of services (S1810) and outputs whichever of the audio contents is designated as an output target (S1820).
  • the output module 960 may output the audio contents designated as the output target along with an audio content currently being focused. For example, assume that audio contents A, B, C, and D are allocated to respective corresponding services, and that the audio content A is currently being focused. If the audio content B is set not to be able to be simultaneously output along with other audio contents and the audio contents C and D are set to be able to be simultaneously output along with other audio contents, the output module 960 may simultaneously output the audio content A along with the audio contents C and D.
  • the audio content B may be output when being focused by a user or an application.
  • the output module 960 may output an audio content designated as an output target by referencing the output attributes of a plurality of audio contents respectively allocated to a plurality of services.
  • the output attributes of the audio contents may be set by a user or an application using the user/application interface module 965.
  • a method 'addAudioSources' may be called.
  • the method 'addAudioSources' may set a number of audio contents to be able to be simultaneously output.
  • Parameters of the method 'addAudioSources' may include a matrix 'device' having the same type as a class 'HScreenDevice[]' of the package 'org.havi.ui' and a Boolean flag 'mixWithAudioFocus'. Whether a plurality of audio contents can be simultaneously output may be determined based on the value of the Boolean flag 'mixWithAudioFocus'.
  • a user or an application may select at least one device to which an audio content is connected and set the audio content to be able to be simultaneously output to the selected device along with another audio content by using the method 'addAudioSources'. For example, if the Boolean flag 'mixWithAudioFocus' of the method 'addAudioSources' is set to 1, the audio content of the selected device may be simultaneously output along with an audio content currently being focused. On the other hand, if the Boolean flag 'mixWithAudioFocus' of the method 'addAudioSources' is set to 0, the audio content of the selected device may not be simultaneously output along with the audio content currently being focused. In this case, the audio content of the selected device may be output only when being focused.
  • a user or an application may cancel the simultaneous output setting of an audio content that is set to be able to be simultaneously output along with another audio content.
  • a method 'removeAudioSources' may be called.
  • the method 'removeAudioSources' may include, as a parameter, a matrix 'device' which has the same type as the class 'HScreenDevice[]' of the package 'org.havi.ui'.
  • the simultaneous output setting of an audio content of at least one device designated by the matrix 'device' of the method 'removeAudioSources' is canceled.
  • the method 'removeAudioSources' may not include any parameter. In this case, the simultaneous output setting of audio contents of all devices may be cancelled.
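  • A hedged Java sketch of this simultaneous-output setting follows; the method names and the 'HScreenDevice[]'/'mixWithAudioFocus' parameters follow the description above, while the internal map used to remember the setting is an assumption.

```java
import org.havi.ui.HScreenDevice;

// Sketch of addAudioSources/removeAudioSources as described above; storage is assumed.
class AudioMixSketch {
    private final java.util.Map<HScreenDevice, Boolean> mixWithFocus = new java.util.HashMap<>();

    // Set whether audio contents connected to the given devices may be output
    // simultaneously with the audio content currently being focused.
    void addAudioSources(HScreenDevice[] device, boolean mixWithAudioFocus) {
        for (HScreenDevice d : device) {
            mixWithFocus.put(d, mixWithAudioFocus);
        }
    }

    // Cancel the simultaneous-output setting of the designated devices.
    void removeAudioSources(HScreenDevice[] device) {
        for (HScreenDevice d : device) {
            mixWithFocus.remove(d);
        }
    }

    // Without a parameter, the simultaneous-output setting of all devices is cancelled.
    void removeAudioSources() { mixWithFocus.clear(); }

    // True when this device's audio may be mixed with the focused audio content
    // (flag set to 1/true); false otherwise, in which case it is output only when focused.
    boolean canMixWithFocusedAudio(HScreenDevice d) {
        return mixWithFocus.getOrDefault(d, false);
    }
}
```

Under such a setting, the earlier example would output the focused audio content A together with the mixable contents C and D, while content B would be output only when it receives focus.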
  • According to the present invention, it is possible to perform a plurality of services provided by various sources, such as cable broadcasts, terrestrial broadcasts, various storage media, and external inputs, in various manners using a single physical display screen.

Abstract

The present invention relates to an apparatus and method for providing multiple screens. The apparatus includes a service processing module providing a plurality of services to which a plurality of first audio contents are respectively allocated, a user/application interface module receiving a command to designate one of the first audio contents as a second audio content (where the second audio content is an audio content to be focused), and an output module outputting the first audio content that is designated as the second audio content and is focused in response to the received command.

Description

APPARATUS AND METHOD FOR MULTIPLE SCREEN
Technical Field The present invention relates to the configuration of multiple screens, and more particularly, to a method of dynamically configuring multiple screens which provide multiple contents on a single physical display device and an apparatus for providing the multiple screens.
Background Art
Conventional broadcast receivers such as digital TVs or digital set-top boxes provide only one content element on a single physical display device or simultaneously display a main screen and a sub-screen on a single physical display device.
Even though conventional broadcast receivers can simultaneously display both the main screen and the sub-screen on the same display screen, they can only arrange the main screen and the sub-screen in a limited number of manners. In the case of a content displayed within the main screen, all elements of the content, i.e., video data, audio data, and other data, are displayed. On the other hand, in the case of a content displayed within the sub-screen, only some of the elements of the content are displayed. Content sources include a broadcast service such as a satellite broadcaster, a terrestrial broadcaster, or a cable broadcaster, a storage medium such as a DVD, or an external device connected to an input terminal. However, it is quite difficult to display contents provided by such various content sources on a display screen using existing broadcast receivers.
In an interactive TV application program environment such as Multimedia Home Platform (MHP), Advanced Common Application Platform (ACAP), or Open Cable Application Platform (OCAP), it is assumed that only one screen is output on a physical display device.
In the interactive TV application program environment, for example, a Home Audio/Video Interoperability (HAVi)-based user interface (UI) is adopted. According to the HAVi UI standard, even though no restriction is imposed on the number of screens displayed on a physical display device, only one screen is generally displayed on a physical display device.
DISCLOSURE OF THE INVENTION
Technical Problem
In such an environment, it is difficult to perform operations such as decoding, digital signal processing, and user interaction processing with respect to one of a plurality of multimedia contents while displaying the multimedia contents on independent screens. In addition, it is also difficult to dynamically control the life cycles of application programs and the use of resources in units of screens.
Accordingly, there exists a need for a method of displaying a variety of contents on a dynamically configured screen.
Technical Solution
An object of the present invention is to provide the dynamic configuration of multiple screens which provide a plurality of contents on a physical display device.
Another object of the present invention is to provide a method of independently selecting and outputting an audio content provided in individual services.
The above and other objects of the present invention will be described in or be apparent from the following description of the preferred embodiments.
According to an aspect of the present invention, there is provided an apparatus for providing multiple screens, the apparatus including a service processing module providing a plurality of services to which a plurality of first audio contents are respectively allocated, a user/application interface module receiving a command to designate one of the first audio contents as a second audio content (where the second audio content is an audio content to be focused), and an output module outputting the first audio content that is designated as the second audio content and is focused in response to the received command.
According to another aspect of the present invention, there is provided an apparatus for providing multiple screens, the apparatus including a service processing module providing a plurality of services to which a plurality of first audio contents are respectively allocated, and an output module outputting one of the first audio contents designated as a second audio content with reference to output attributes of the first audio contents ( where the second audio content is an audio content to be output).
According to another aspect of the present invention, there is provided a method of providing multiple screens, the method including providing a plurality of services to which a plurality of first audio contents are respectively allocated, receiving a command to designate one of the first audio contents as a second audio content (where the second audio content is an audio content to be focused), and outputting the first audio content that is designated as the second audio content and is focused in response to the received command.
According to another aspect of the present invention, there is provided a method of providing multiple screens, the method including providing a plurality of services to which a plurality of first audio contents are respectively allocated, and outputting one of the first audio contents designated as a second audio content with reference to output attributes of the first audio contents (where the second audio content is an audio content to be output).
Description of Drawings
The above and other features and advantages of the present invention will become more apparent by describing in detail preferred embodiments thereof with reference to the attached drawings in which:
FIGS. 1 to 8 are diagrams illustrating configurations of PiP screens according to an exemplary embodiment of the present invention;
FIG. 9 is a diagram illustrating the relationship between a logical screen and a display screen according to an exemplary embodiment of the present invention;
FIGS. 10 to 14 are diagrams illustrating a configuration of a screen including a mapper according to an exemplary embodiment of the present invention;
FIG. 15 is a block diagram illustrating service sources according to an exemplary embodiment of the present invention;
FIGS. 16 and 17 are diagrams illustrating a non-abstract service and an abstract service according to an exemplary embodiment of the present invention;
FIG. 18 is a diagram illustrating examples of the types of attribute information and interfaces of a logical screen and a display screen;
FIG. 19 is a diagram illustrating an attribute 'z-order' of a logical screen according to an exemplary embodiment of the present invention;
FIGS. 20 and 21 are diagrams each illustrating an attribute 'Display Area' of a logical screen according to exemplary embodiments of the present invention;
FIG. 22 is a diagram illustrating a method of mapping two services to a display screen according to an exemplary embodiment of the present invention;
FIG. 23 is a block diagram illustrating a configuration of an apparatus for providing multiple screens according to an exemplary embodiment of the present invention;
FIG. 24 is a flowchart illustrating a method of dynamically configuring multiple screens according to an exemplary embodiment of the present invention;
FIGS. 25 and 26 are diagrams illustrating PiP service providing modes according to an exemplary embodiment of the present invention;
FIG. 27 is a flowchart illustrating a method of independently selecting an audio content and outputting the selected audio content according to an exemplary embodiment of the present invention;
FIG. 28 is a diagram illustrating an example of a software architecture for providing multiple screens according to an exemplary embodiment of the present invention;
FIG. 29 is a diagram illustrating the relationships among modules constituting an application program interface (API) layer according to an exemplary embodiment of the present invention;
FIG. 30 is a flowchart illustrating a method of displaying, on a display screen, a plurality of services that are displayed on respective corresponding logical screens, performed by the modules illustrated in FIG. 29;
FIG. 31 is a flowchart illustrating the output of audio content according to an embodiment of the present invention; and
FIG. 32 is a flowchart illustrating the output of audio content according to another embodiment of the present invention.
<Reference Names of Major Components Shown in the Drawings>
900 : Apparatus for providing multiple screens
910 : Broadcast Signal Reception Module
920 : Storage Medium
930 : External Input Module
940 : Digital Signal Processing Module
950 : Service Processing Module
960 : Output Module
965 : User/Application Interface Module
970 : Display Screen
980 : Storage Medium
990 : External Output Module
Mode for Invention
Advantages and features of the present invention and methods of accomplishing the same may be understood more readily by reference to the following detailed description of preferred embodiments and the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as being limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the concept of the invention to those skilled in the art, and the present invention will only be defined by the appended claims. Like reference numerals refer to like elements throughout the specification. The present invention is described hereinafter with reference to flowchart illustrations of user interfaces, methods, and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations, and combinations of blocks in the flowchart illustrations, can be implemented by computer program instructions. These computer program instructions can be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart block or blocks.
These computer program instructions may also be stored in a computer usable or computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer usable or computer-readable memory produce an article of manufacture including instruction means that implement the function specified in the flowchart block or blocks.
The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions that execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.
Each block of the flowchart illustrations may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of order. For example, two blocks illustrated in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. For a better understanding of the present invention, the terms used in this disclosure will now be defined.
The term 'service' indicates a group of multimedia contents displayed together, i.e., a group of service components. Service components are elements of a service and include a video component, an audio component, and a data component. A data component is an application program in a service.
Applications can be classified into unbound applications and service bound applications. An unbound application is not related to a specific channel, and thus its execution is not affected by channel switching. Further, an unbound application usually has a high priority, and therefore resources for executing the application are readily assigned to it. A monitor application is a specific unbound application that can have the highest priority.
Meanwhile, a service bound application relates to a transport stream and usually has a lower priority than an unbound application. A service bound application does not perform any critical system function. For this reason, when competition for resources occurs, a service bound application is more likely to give up its resource assignment than an unbound application. Service bound applications can be classified into a linked type, which operates in cooperation with the stream currently being transmitted, and an unlinked type, which operates regardless of the stream currently being transmitted.
The term 'service context' indicates an object which can control the execution of a service and includes various resources, devices, and execution state information needed for providing a service. The term 'physical display device' indicates a physical space which actually displays the content of a service, an external output port outputting the content of a service, or a storage device storing a service.
The term 'display screen' indicates a screen actually displayed on a physical display device. An arbitrary service may be directly set in the display screen, and the display screen may be displayed on a physical display device. Alternatively, at least one logical screen which is mapped to a certain area of the display screen may be displayed on the physical display device.
The term 'logical screen' indicates a space in which an arbitrary service is displayed. A logical screen is a virtual screen before being mapped to a display screen and thus is not displayed on a physical display device.
The logical screen and the display screen may be a combination of a background still image, a video raster, and a graphic raster. The graphic raster may be a combination of text, lines, colors, and images or a mixture of video frames.
The term 'main service' indicates a service that is selected as a main service through a menu displayed on the physical display device or a remote controller by a user or through an API by an application, and the screen on which the main service is displayed is referred to as a 'main screen'.
The term 'Picture-in-Picture service (PiP service)' indicates a service that is selected as a sub-service in the main service through a menu displayed on a physical display device or a remote controller by a user, or through an API by an application, and the PiP service may be displayed on a picture-in-picture screen (PiP screen) or a main screen.
The PiP screen includes a screen that occupies a part of another screen, as illustrated in FIGS. 1 to 4, and a screen that is simultaneously displayed with another screen without overlapping the other screen, as illustrated in FIGS. 5 and 6. It is also understood that the PiP screen may include a screen that is displayed at an arbitrary location or area on the physical display device or that overlaps another screen, as illustrated in FIGS. 7 and 8.
FIG. 9 is a diagram illustrating the relationship between a logical screen and a display screen according to an exemplary embodiment of the present invention. Referring to FIG. 9, a service may be displayed using logical screens 210, 212, and 214. The logical screens 210, 212, and 214 are mapped to display screens 220, 222, and 224 through a mapping block 230.
In detail, the logical screens 210 and 212 are mapped to the display screen 220, the logical screens 212 and 214 are mapped to the display screen 222, and the logical screens 210, 212, and 214 are mapped to the display screen 224. In short, at least one logical screen which displays a service is mapped to an arbitrary area of a display screen.
The mapping block 230 is a group of various pieces of information needed for mapping a logical screen to a display screen. Examples of the various pieces of information include coordinate information of a predetermined area on a display screen to which each of a plurality of logical screens is mapped, identification information of the logical screens and the display screen, and information specifying in what order the logical screens are displayed on the display screen.
The mapping block 230 can change the size of the logical screen so to be allocated in an arbitrary area of the display screen. That is, the mapping block 230 can perform scaling of the logical screen and allocating of the position thereof, and FIGS. 10 to 14 are diagrams illustrating a configuration of the screen including a mapper as the mapping block.
Referring to FIG. 10, the main screen, including a combination of a background still image B, a video raster V, and a graphic raster G, is mapped to the entire display screen by a mapper at its normal size. The PiP screen, including only a video component, is mapped to the display screen by the mapper at a reduced size. In this case, the mapped PiP screen is displayed on the main screen, which is determined depending on a Z value. The reference character Z refers to the z-order value, which will be described later. An overlay screen may be combined with the display screen. The overlay screen is a specific screen disposed at the outermost side and may be used when providing a caption function. The PiP screen may have only a video component as illustrated in FIG. 10, or may have a combination of the background still image B, the video raster V, and the graphic raster G as illustrated in FIG. 11.
Referring to FIG. 12, the main screen including the combination of the background still image B, the video raster V, and the graphic raster G is mapped to the entire display screen by the mapper at its normal size. Two PiP screens #1 and #2 having only video components are mapped to arbitrary areas of the display screen by the mapper at a reduced size. In this case, the mapped PiP screens are disposed on the main screen, and the Z value can be constantly maintained. Further, the overlay screen may be combined with the display screen. The configuration of the screen may have a plurality of PiP screens including only video components as illustrated in FIG. 12, or a plurality of PiP screens including a combination of the background still image B, the video raster V, and the graphic raster G.
POP screens are illustrated in FIG. 14. It can be understood that a known PiP screen is displayed inside the main screen, whereas a POP screen is displayed outside the main screen. Referring to FIG. 14, the plurality of POP screens #1 and #2 including a combination of the background still image B, the video raster V, and the graphic raster G are mapped to arbitrary areas of the display screen by the mapper at a reduced size. In this case, the Z value of the mapped POP screens #1 and #2 may be constantly maintained. Further, the overlay screen may be combined with the display screen.
The mapping block 230 may be realized by interfaces or functions prepared by various computer program languages to be executed and create or change the relationship between the logical screen and the display screen by using the above information as parameters.
Alternatively, the mapping block 230 may be realized by hardware to perform a mapping function between the logical screen and the display screen.
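For illustration only, the kind of information the mapping block 230 manages for each logical screen could be sketched in Java as follows; the class, field, and method names below are assumptions chosen for this sketch and do not come from org.havi.ui or any published API.

    // Hypothetical sketch of one entry managed by a mapping block such as 230.
    public final class ScreenMappingEntry {
        private final String logicalScreenId;      // identification of the logical screen
        private final String displayScreenId;      // identification of the target display screen
        private final double x, y, width, height;  // normalized area of the display screen (0.0 to 1.0)
        private final int zOrder;                  // order along the z-axis among mapped logical screens

        public ScreenMappingEntry(String logicalScreenId, String displayScreenId,
                                  double x, double y, double width, double height, int zOrder) {
            this.logicalScreenId = logicalScreenId;
            this.displayScreenId = displayScreenId;
            this.x = x; this.y = y; this.width = width; this.height = height;
            this.zOrder = zOrder;
        }

        public String getLogicalScreenId() { return logicalScreenId; }
        public String getDisplayScreenId() { return displayScreenId; }
        public int getZOrder() { return zOrder; }

        // Scaling a logical screen to its allocated area amounts to mapping its
        // full extent (0,0)-(1,1) onto the (x, y, width, height) rectangle above.
        public double[] toDisplayCoordinates(double lx, double ly) {
            return new double[] { x + lx * width, y + ly * height };
        }
    }

For example, an entry mapping a logical screen to the upper-left quarter of a display screen would use the area (0, 0, 0.5, 0.5).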
Further, services provided by various service sources may be displayed on a display screen, and the display screen may be displayed on a physical display device, as illustrated in FIG. 15.
There are service sources which provide broadcast services such as a terrestrial broadcaster 320 and a cable broadcaster 330, service sources which provide services stored in a storage medium such as a personal video recorder (PVR) 340, and service sources (not illustrated in FIG. 15) which provide services via a wired network or a wireless network.
A broadcast receiver 310 receives services from the service sources and produces logical screens displaying each of the received services.
Then, an arbitrary service is directly set on the display screen to be displayed on a physical display device using a predefined method or a method set by a user or an application. Alternatively, at least one logical screen that is mapped to an arbitrary area on the display screen is displayed on the physical display device 350. In short, services provided by the terrestrial broadcaster 320, the cable broadcaster 330, and the PVR 340 are displayed on the physical display device 350.
The terrestrial broadcaster 320, the cable broadcaster 330, and the PVR 340 are illustrated in FIG. 15 as service sources, but the present invention is not limited thereto. Any type of multimedia content source which provides multimedia contents that can be displayed together can be a service source according to an exemplary embodiment of the present invention.
Services according to an exemplary embodiment of the present invention can be classified into abstract services and non-abstract services, as illustrated in FIGS. 16 and 17.
The abstract services are not services provided by broadcast signals transmitted in real time but services independent of broadcast channels. The abstract services include only data components, i.e., application programs, without video components and audio components. Examples of the abstract services include services having unbound applications based on the open cable application platform (OCAP) standard.
The non-abstract services are understood as services other than abstract services.
According to the current embodiment of the present invention, both abstract services and non-abstract services are independent of each other. For example, abstract services may be directly set on the display screen without passing through logical screens, while non-abstract services may be displayed on logical screens. Then, the logical screens may be mapped to the display screen in which the abstract services are set. Thereafter, the display screen may be output through the physical display device. By doing so, the abstract services can be displayed on the display screen independently of the non-abstract services. In addition, the abstract services and non-abstract services may be mapped to different logical screens, and these logical screens may then be mapped to a single display screen. In other words, the abstract services can be displayed on the display screen independently of the non-abstract services.
According to the current embodiment of the present invention, the logical screen and the display screen may be categorized as being different objects. Alternatively, a screen may serve as a logical screen or a display screen according to attribute information of one screen object.
In detail, whether a screen is a logical screen or a display screen can be known through type information included in the attribute information of the screen object.
The attribute information of the screen object includes a plurality of attributes 'Type', 'z-Order', 'Display_Area', 'Visibility', 'Associated_Display_Screen', 'Associated_Service_Contexts', 'Associated_Logical_Screens', and 'OutputPort'.
FIG. 18 illustrates the attribute information on the screen object and the types of interface for processing the attribute information. An attribute 'Type' 510 represents whether the screen is a logical screen or a display screen.
An attribute 'z-Order' 520 is for determining in what order a plurality of logical screens are arranged along the z-axis. FIG. 19 illustrates different configurations of logical screens on a physical display device for different combinations of the values of attributes 'z-Order' of the logical screens.
Referring to FIG. 19, first and second logical screens 620 and 630 are respectively mapped to predetermined areas of a display screen 610. In detail, the first logical screen 620 is displayed on the display screen 610, and the second logical screen 630 is displayed on the display screen partially overlapping the first logical screen 620. In other words, the display screen 610, the first logical screen 620, and the second logical screen 630 are sequentially arranged in the direction of the z-axis. In this case, an attribute 'z-Order' of the first logical screen 620 may be set to a value of 1, and an attribute 'z-Order' of the second logical screen 630 may be set to a value of 2. The attributes 'z-Order' of the first and second logical screens 620 and 630 may be set to any numbers or characters as long as they can represent a certain order in which the first and second logical screens 620 and 630 are to be arranged along the z-axis.
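Purely as an informal illustration of the 'z-Order' attribute (all names below are assumptions, not a published interface), the arrangement of FIG. 19 could be expressed as follows, where the display screen composes logical screens from the smallest z-order value to the largest.

    import java.util.ArrayList;
    import java.util.Comparator;
    import java.util.List;

    // Illustrative sketch of composing logical screens according to 'z-Order'.
    class LogicalScreen {
        final String name;
        final int zOrder;
        LogicalScreen(String name, int zOrder) { this.name = name; this.zOrder = zOrder; }
    }

    public class ZOrderExample {
        public static void main(String[] args) {
            List<LogicalScreen> screens = new ArrayList<>();
            screens.add(new LogicalScreen("second", 2)); // partially overlaps the first screen
            screens.add(new LogicalScreen("first", 1));  // closest to the display screen
            // Sort by z-order so that the screen with the larger value is drawn last, on top.
            screens.sort(Comparator.comparingInt(s -> s.zOrder));
            screens.forEach(s -> System.out.println("compose " + s.name));
        }
    }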
An attribute 'Display_Area' 530 is information regarding the display screen area of a logical screen, as illustrated in FIGS. 20 and 21. FIG. 20 illustrates that a logical screen 710 is mapped to the entire area of a display screen 720, and FIG. 21 illustrates that a logical screen 730 is mapped to a partial area of a display screen 740.
The attribute 'Display_Area' may include information specifying the 2-dimensional coordinates of a predetermined portion of a display screen to which the logical screen is to be mapped or may include information specifying a predetermined location on the display screen and an offset value indicating how much the logical screen deviates from the predetermined location on the display screen.
An attribute 'Visibility' 540 determines whether a logical screen is to be visibly or invisibly displayed on a display screen. It is possible to make a logical screen appear on or disappear from a display screen by altering the value of the attribute 'Visibility' 540.
An attribute 'Associated_Display_Screen' 550 is information regarding display screens associated with a logical screen. A logical screen which is not associated with any display screen may neither be displayed on a physical display device nor transmitted to external output devices.
An attribute 'Associated_Service_Contexts' 560 is information regarding service contexts connected to a logical screen or a display screen. Services set in such service contexts may be displayed on a logical screen or a display screen.
An attribute 'OutputPort' 570 is information regarding devices by which a display screen is to be output, and such devices include physical display devices, wired/wireless communication media, and various storage media.
Interfaces for identifying or altering the values of the attributes illustrated in FIG. 18 may be provided. Referring to FIG. 18, the interfaces may include an interface 'SET' for setting attribute values or connecting a logical screen to a display screen, an interface 'ADD' for adding attribute values or connecting a logical screen to a service, an interface 'GET' for identifying attribute values, and an interface 'REMOVE' for deleting attribute values. These interfaces may include processes, functions, procedures, or methods that perform their functions, respectively.
For example, a method 'getDisplayScreen(void)' returns a display screen associated with the current screen. In detail, if the current screen is a logical screen, the method 'getDisplayScreen(void)' returns the associated display screen. If the current screen is a display screen, the method 'getDisplayScreen(void)' returns reference information regarding the current screen itself. Further, if the current screen is a logical screen but there is no associated display screen, the method 'getDisplayScreen(void)' returns a value of 'NULL'.
According to another example, a method 'public void setDisplayArea(HScreenRectangle rect) throws SecurityException, IllegalStateException' provides a function for mapping the current logical screen to a predetermined area of the associated display screen. The instance that is provided as a parameter is of a class 'HScreenRectangle' of a package 'org.havi.ui' and has 2-dimensional position information. The exceptions 'SecurityException' and 'IllegalStateException' may be thrown as exceptional operations of the method 'setDisplayArea(HScreenRectangle rect)'. The exception 'IllegalStateException' may be thrown when the current screen is not a logical screen, or when the portion of the display screen associated with the current logical screen cannot change due to the characteristics of the host platform.
According to still another example, a method 'getOutputArea(void)' returns regional information of a current screen as HScreenRectangle information. If the current screen corresponds to a display screen, the method 'getOutputArea(void)' returns HScreenRectangle information having the same value as HScreenRectangle(0, 0, 1, 1). If the current screen is a logical screen, the method 'getOutputArea(void)' returns information regarding the area on a display screen occupied by the current screen. If the current screen is a logical screen but is not associated with any display screen, the method 'getOutputArea(void)' returns a value of 'NULL'.
Certain terms are used throughout the following description to refer to particular interfaces. However, one skilled in the art will appreciate that a particular function is named just to indicate its functionality. This document does not intend to distinguish between functions that differ in name but not in function.
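To make the behavior described for these three methods concrete, the following Java sketch implements the same return and exception rules for a simple screen object. The types 'HScreen' and 'HScreenRectangle' below are minimal stand-ins for the org.havi.ui classes of the same names, and the whole class is an assumption for illustration only, not an implementation mandated by this disclosure; the host-platform restriction mentioned above is omitted for brevity.

    // Illustrative sketch only; stand-in types are used instead of org.havi.ui.
    class HScreenRectangle {
        final float x, y, width, height;
        HScreenRectangle(float x, float y, float width, float height) {
            this.x = x; this.y = y; this.width = width; this.height = height;
        }
    }

    class HScreen {
        private final boolean display;            // true if this is a display screen
        private HScreen associatedDisplayScreen;  // set only for logical screens
        private HScreenRectangle displayArea;     // mapped area on the associated display screen

        HScreen(boolean display) { this.display = display; }

        // Returns the associated display screen, the screen itself if it is a
        // display screen, or null for a logical screen with no association.
        public HScreen getDisplayScreen() {
            if (display) return this;
            return associatedDisplayScreen; // may be null
        }

        // Maps the current logical screen to an area of the associated display screen.
        public void setDisplayArea(HScreenRectangle rect)
                throws SecurityException, IllegalStateException {
            if (display) throw new IllegalStateException("not a logical screen");
            this.displayArea = rect;
        }

        // Returns the occupied area: (0,0,1,1) for a display screen, the mapped area
        // for a logical screen, or null if there is no associated display screen.
        public HScreenRectangle getOutputArea() {
            if (display) return new HScreenRectangle(0, 0, 1, 1);
            if (associatedDisplayScreen == null) return null;
            return displayArea;
        }
    }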
FIG. 22 is a diagram illustrating a process in which two services are set on two logical screens that are mapped to a single display screen. Referring to FIG. 22, a first service includes all three service components, i.e., video, audio, and data components, and a second service includes only video and audio components. However, the present invention does not impose any restrictions on service components, and the first and second services illustrated in FIG. 22 are merely exemplary. As illustrated in FIG. 22, the first and second services are displayed on a physical display device in almost the same manner as in the related art. According to the current embodiment of the present invention, it is possible to display a plurality of services on a physical display device independently of one another without imposing any restrictions on the number of services that can be displayed on a single display screen.
FIG. 23 is a block diagram of an apparatus for providing multiple screens according to an exemplary embodiment of the present invention.
Referring to FIG. 23, an apparatus 900 for providing multiple screens includes a digital signal processing module 940, a service processing module 950, an output module 960, and a user/application interface module 965. Also, the apparatus 900 includes a broadcast signal reception module 910, a storage medium 920, and an external input module 930 as service sources, and includes a display screen 970, a storage medium 980, and an external output module 990 as service output media.
The term 'module', as used herein, means, but is not limited to, a software or hardware component, such as a Field Programmable Gate Array (FPGA) or Application Specific Integrated Circuit (ASIC), which performs certain tasks. A module may advantageously be configured to reside on the addressable storage medium and configured to execute on one or more processors. Thus, a module may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. The functionality provided for in the components and modules may be combined into fewer components and modules or further separated into additional components and modules. The digital signal processing module 940 receives various information of a service such as a multimedia content, e.g., video information, audio information, or data information, from the broadcast signal reception module 910, the storage medium 920, or the external input module 930. The broadcast signal reception module 910 receives a satellite, terrestrial, or cable broadcast signal and transmits the received broadcast signal, the storage medium 920 stores video information, audio information, or data information of a service, and the external input module 930 receives video information, audio information, or data information of a service from an external device such as a network interface module connected to a network.
The digital signal processing module 940 restores a plurality of services using received service components. The restored services include abstract or non-abstract services.
Here, 'a plurality of services' refers to two or more services transmitted by the broadcast signal reception module 910, or two or more services transmitted respectively by the broadcast signal reception module 910 and the storage medium 920.
The digital signal processing module 940 may restore services according to selection by a user or an application with the aid of the user/application interface module 965. In this case, the user or the application may select the connection between an arbitrary service and a screen.
The service processing module 950 produces a logical screen and a display screen to display a service restored by the digital signal processing module 940.
The output module 960 maps a plurality of logical screens produced by the service processing module 950 to the display screen. The mapping of the logical screens to the display screen may be conducted using a predefined method or a method set by the user with the aid of the user/application interface module 965.
A service restored by the digital signal processing module 940 may not be processed by the service processing module 950. Instead, a service restored by the digital signal processing module 940 may be directly mapped to a certain portion of a display screen produced by the output module 960. A display screen provided by the output module 960 may be displayed on the physical display device 970 or may be stored in the storage medium 980. Examples of the storage medium 980 include computer-readable floppy discs, hard discs, CD-ROMs, DVDs, DVD-ROMs, BDs (Blu-ray Discs), and semiconductor memories. Also, a display screen provided by the output module 960 may be transmitted to an external device connected to a network via the external output module 990.
For this, the output module 960 may include a plurality of output ports via which a display screen can be provided. In this case, a display screen can be provided via an output port set in advance as a default or an output port chosen by the user with the aid of the user/application interface module 965.
The user or the application can choose one of a plurality of services or restore desired services using the user/application interface module 965. Also, the user can choose one of a plurality of display screens using the user/application interface module 965. Since the modules illustrated in FIG. 23 are divided according to their functions, each module may be connected to the other modules as needed.
FIG. 24 is a flowchart illustrating a method of dynamically configuring multiple screens according to an exemplary embodiment of the present invention.
In general, video information, audio information, and data information constituting a multimedia content are transmitted in a predetermined format, for example, an MPEG stream format. In operation S1010, an apparatus for providing a service such as a multimedia content service receives video information, audio information, and data information and restores a service based on the video information, the audio information, and the data information. Here, the service restored in operation S1010 may be selected or previously determined by a user or an application. The user may use a menu displayed on the display device or a remote controller to select the connection between an arbitrary service and a screen. The application may select the connection using an API.
Further, the data information includes application information regarding an application program for a service, and this application information includes signal information indicating whether the application program can be executed on a PiP screen. Examples of the application information include an application information table (AIT) based on the MHP standard and an extended application information table (XAIT) based on the OCAP standard. The signal information may be added to the application information. Thereafter, in operation S1020, the restored service is set such that it can be displayed on a logical screen. In operation S1030, the logical screen is mapped to a display screen. In operation S1040, the display screen is provided to the user via a physical display device, a storage medium, or a network.
The restored service is illustrated in FIG. 24 as being displayed on a physical display device via a logical screen. However, the restored service may be directly displayed on a physical display device without passing through the logical screen.
FIG. 24 illustrates a method of mapping only one service to a display screen for simplicity. However, a plurality of services may be mapped to a display screen with or without passing through a plurality of logical screens. When a display screen is provided to the user in this manner, the user can perform a plurality of services.
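As a rough, non-normative sketch of the flow of FIG. 24, operations S1010 to S1040 could be arranged as follows; every type and method name here is assumed for illustration and is not part of the described interfaces.

    // Illustrative sketch of the flow of FIG. 24 (S1010 to S1040); names are assumed.
    public class MultiScreenFlow {
        interface Service {}
        interface LogicalScreen { void setService(Service s); }
        interface DisplayScreen { void map(LogicalScreen ls); }

        Service restoreService(byte[] mpegStream) { return new Service() {}; } // S1010: decode video/audio/data
        LogicalScreen createLogicalScreen() { return s -> {}; }
        DisplayScreen createDisplayScreen() { return ls -> {}; }
        void present(DisplayScreen ds) { /* S1040: display device, storage medium, or network */ }

        public void run(byte[] mpegStream) {
            Service service = restoreService(mpegStream);    // S1010
            LogicalScreen logical = createLogicalScreen();
            logical.setService(service);                      // S1020: set the service on a logical screen
            DisplayScreen display = createDisplayScreen();
            display.map(logical);                             // S1030: map the logical screen to the display screen
            present(display);                                 // S1040: provide the display screen to the user
        }
    }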
When the user or the application selects a PiP service, the apparatus 900 for providing multiple screens provides the PiP service in one of two modes. FIG. 25 illustrates a first mode of the two modes, and FIG. 26 illustrates a second mode of the two modes.
Referring to FIG. 25, in the first mode, only the video component of the PiP service selected on the main screen is provided, without creating a separate logical screen for the PiP service, that is, a PiP screen. In the first mode, no application related to the PiP service is executed, or the operation of an application related to the PiP service becomes inactive.
Referring to FIG. 26, in the second mode, a separate logical screen for the PiP service, that is, a PiP screen, is created, and the selected PiP service is provided on the created PiP screen. The PiP screen provided in the second mode may include a background video serving as a background screen, or a video component. Further, in the second mode, an application related to the PiP service can be executed. Whether or not an application related to the PiP service can be executed may be determined on the basis of the signal information described above. It is preferable that the first mode and the second mode not be executed at the same time.
The PiP service providing mode may be selected by input of the user or the application through the user/application interface module 965. When the user or the application selects the first mode, the digital signal processing module 940 restores only the video component of the selected PiP service. The restored video component is mapped to the main screen produced by the service processing module 950 and is then displayed on a display screen produced by the output module 960. When the user or the application selects the second mode, the digital signal processing module 940 restores the selected PiP service. The restored service is mapped to the PiP screen created by the service processing module 950 and is then displayed on the display screen produced by the output module 960.
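The mode selection just described can be sketched as follows; this is a simplified illustration under assumed names, not the actual module interfaces of the apparatus 900.

    // Illustrative sketch of the two PiP service providing modes; all names are assumed.
    enum PipMode { VIDEO_ONLY /* first mode */, SEPARATE_LOGICAL_SCREEN /* second mode */ }

    class PipServiceController {
        void providePipService(PipMode mode, String serviceId, boolean appAllowedOnPip) {
            switch (mode) {
                case VIDEO_ONLY:
                    // First mode: restore only the video component and map it onto
                    // the main screen; related applications stay inactive.
                    restoreVideoComponent(serviceId);
                    mapVideoToMainScreen(serviceId);
                    break;
                case SEPARATE_LOGICAL_SCREEN:
                    // Second mode: create a PiP logical screen, restore the whole service,
                    // and run its application only if the signal information
                    // (e.g., from the AIT/XAIT) allows execution on a PiP screen.
                    createPipScreen(serviceId);
                    restoreFullService(serviceId);
                    if (appAllowedOnPip) {
                        startBoundApplication(serviceId);
                    }
                    break;
            }
        }

        private void restoreVideoComponent(String id) {}
        private void mapVideoToMainScreen(String id) {}
        private void createPipScreen(String id) {}
        private void restoreFullService(String id) {}
        private void startBoundApplication(String id) {}
    }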
The user or the application can select an audio content of a specific one of a plurality of services provided on the logical screens through the user/application interface module 965 and enable the selected audio content to be independently output. This process is illustrated in FIG. 27.
First, the user or the application selects a specific service through the user/application interface module 965 (S1310).
Then, the digital signal processing module 940 extracts an audio content from the selected service, and the extracted audio content is mapped to a logical screen or a display screen produced by the service processing module 950 and is independently output on the corresponding screen by the output module 960 (S1320). Here, 'independent output' can be understood as a concept including, for example, that an audio content of a specific service is selected and transmitted through an output port mapped to the display screen displaying the specific service, either simultaneously with or to the exclusion of the audio contents of the other services provided on the same screen. 'Independent output' can also be understood to mean that the audio contents of a plurality of services are output, provided, or stored through different media, respectively. In other words, an audio content need not be provided together with the other components of the same service and may be independently provided by the user or the application. The output module 960 is selected by the user or the application and outputs the audio content through a predetermined external output module 990.
When a plurality of audio contents are selected, the individual audio contents may be independently output through the separate external output modules 990 at the same time.
The user or the application can select a desired audio content through the user/application interface module 965 independently of the other components in the same service.
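A minimal sketch of 'independent output' as described above might look like the following, where the audio content of each selected service is routed to its own output; the classes and method names are assumptions for illustration, not the interfaces of the output module 960.

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    // Illustrative sketch: routing audio contents of different services to
    // different output ports so they can be output independently.
    class AudioRouter {
        interface AudioOutput { void play(byte[] pcm); }

        private final Map<String, AudioOutput> routes = new ConcurrentHashMap<>();

        // Connect the audio content of a service to a particular output
        // (e.g., the display device's speakers or an external output module).
        void route(String serviceId, AudioOutput output) {
            routes.put(serviceId, output);
        }

        // Deliver decoded audio of one service without touching the other services.
        void deliver(String serviceId, byte[] pcm) {
            AudioOutput out = routes.get(serviceId);
            if (out != null) {
                out.play(pcm);
            }
        }
    }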
FIG. 28 is a diagram illustrating a software architecture for providing multiple screens according to an exemplary embodiment of the present invention. Referring to FIG. 28, a software architecture 1400 includes a device driver layer 1410, an application program interface (API) layer 1420, and an application layer 1430.
The device driver layer 1410 receives service components from various multimedia content sources and decodes the received service components. Examples of the received service components include video information, audio information, and data information.
The API layer 1420 generates a logical screen and a display screen and maps a service, the logical screen, and the display screen to one another.
The application layer 1430 provides a user interface so that a user can dynamically configure a logical screen which displays a service or transmits a user command to the API layer 1420 so that the API layer 1420 can execute the user command.
The user enables the device driver layer 1410, with the aid of the application layer 1430, to provide a display screen via a physical display device or to store the display screen in a storage medium. In addition, the user can enable the device driver layer 1410 to transmit a display screen to an external device via a network. For this, the device driver layer 1410 may include a plurality of output ports which can provide a display screen. Alternatively, the API layer 1420 may include the plurality of output ports.
In order to dynamically configure a plurality of logical screens on a display screen, the API layer 1420 may include a plurality of software modules, e.g., a multi-screen manager module 'MultiscreenManager', a multi-screen context module 'MultiscreenContext', a multi-screen context listener module 'MultiscreenContextListener', and a multi-screen context event module 'MultiscreenContextEvent', as illustrated in FIG. 29.
The multi-screen manager module 1510 manages the multi-screen context module 1530, searches for a desired screen, displays information specifying what devices are shared by screens, and registers or cancels the registration of the multi-screen context listener module 1550.
The multi-screen context module 1530 is an interface object associated with a screen object 1520 and determines whether the screen object 1520 is to become a logical screen or a display screen according to an interface operation performed by the multi-screen context module 1530. The various attributes illustrated in FIG. 18 may be set in the multi-screen context module 1530. The multi-screen context module 1530 can provide the functions 'SET', 'ADD', 'GET', and 'REMOVE' described above with reference to FIG. 18. When attribute information of the screen object 1520 is altered by the multi-screen context module 1530, the multi-screen context event module 1540 serves as an event class announcing that the attribute information of the screen object 1520 has been changed, and the multi-screen context listener module 1550 serves as a listener interface object which can be realized in a predetermined application class which attempts to receive an event prompted by the multi-screen context event module 1540.
An application 1560 is a module which is driven on the application layer 1430. The application 1560 allows the user to choose a desired service and to freely arrange a plurality of logical screens on a display screen.
In detail, the application 1560 transmits various commands which allow the user to dynamically configure and manage logical screens to the multi-screen manager module 1510, and the multi-screen manager module 1510 controls operations corresponding to the various commands to be executed through the multi-screen context module 1530.
The multi-screen context module 1530 is associated with the screen object 1520 and manages the attribute information of the screen object 1520 illustrated in FIG. 18. In order to manage the attribute information of the screen object 1520, the multi-screen context module 1530 may include a variety of functions or methods.
The multi-screen manager module 1510 can receive service components provided by various service sources from the device driver layer 1410 and can operate to display the received service components on a logical screen or a display screen. Such a function may be performed by a separate module not illustrated.
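For illustration only, the relationships among these modules can be sketched as the following Java interfaces; the method names are assumed and simplified relative to the attribute interfaces of FIG. 18, and they are not a published API.

    import java.util.EventListener;
    import java.util.EventObject;
    import java.util.List;

    // Illustrative sketch of the API layer modules of FIG. 29; names are assumed.
    interface MultiscreenContext {
        boolean isDisplayScreen();                      // corresponds to the 'Type' attribute
        void setDisplayScreen(Object displayScreen);    // associate with a display screen
        void addServiceContext(Object serviceContext);  // connect a service
        void setDisplayArea(float x, float y, float w, float h);
        void setVisibility(boolean visible);
    }

    class MultiscreenContextEvent extends EventObject {
        MultiscreenContextEvent(MultiscreenContext source) { super(source); }
    }

    interface MultiscreenContextListener extends EventListener {
        void contextChanged(MultiscreenContextEvent event); // attribute information changed
    }

    interface MultiscreenManager {
        List<MultiscreenContext> getScreens();                        // search for screens
        void addContextListener(MultiscreenContextListener listener);
        void removeContextListener(MultiscreenContextListener listener);
    }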
FIG. 30 is a flowchart illustrating a method of displaying, on a display screen, a plurality of services that are displayed on respective corresponding logical screens, performed by the modules illustrated in FIG. 29, according to an exemplary embodiment of the present invention. Referring to FIG. 30, in operation S1610, the multi-screen manager module 1510 produces a display screen and a number of logical screens corresponding to the number of services to be performed.
In operation S1620, the multi-screen manager module 1510 connects the logical screens to respective corresponding services received from the device driver layer 1410. The multi-screen manager module 1510 may call a method 'addServiceContext' for each of the logical screens by setting the service context objects of the received services as parameters. The method 'addServiceContext' connects a logical screen to a service and may be provided by the multi-screen context module 1530. In operation S1630, once the logical screens are connected to the respective services, the multi-screen manager module 1510 connects the logical screens to the display screen. At this time, the multi-screen manager module 1510 may call a method 'setDisplayScreen' for each of the logical screens by setting the display screen object to which the logical screens are connected as a parameter. The method 'setDisplayScreen' connects a logical screen to a display screen and may be provided by the multi-screen context module 1530.
The method 'setDisplayScreen' may be defined as 'public void setDisplayScreen(HScreen screen) throws SecurityException, IllegalStateException', and this method allows an instance 'HScreen' that is provided as a parameter to be associated with the current logical screen. In this case, the instance 'HScreen' is preferably a display screen. The parameter of the method 'setDisplayScreen(HScreen screen)' may have a value of 'NULL'. In this case, when the method 'setDisplayScreen(HScreen screen)' is executed without exception handling, the current logical screen is no longer associated with any display screen. The exceptions 'SecurityException' and 'IllegalStateException' may be thrown as exceptional operations of the method 'setDisplayScreen(HScreen screen)'.
The exception 'IllegalStateException' may be thrown when the current screen is not a logical screen, or when the portion of the display screen associated with the current logical screen cannot change due to the characteristics of the host platform.
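Putting operations S1610 to S1630 together, an application-side caller might proceed as in the following sketch; only the method names 'addServiceContext' and 'setDisplayScreen' are taken from the description above, while the manager, context, and service types are assumed stand-ins.

    import java.util.List;

    // Illustrative sketch of operations S1610 to S1630; stand-in types are assumed.
    class MultiscreenSetup {
        interface ScreenContext {
            void addServiceContext(Object serviceContext); // connect a service (S1620)
            void setDisplayScreen(ScreenContext display);  // connect to a display screen (S1630)
        }
        interface Manager {
            ScreenContext createDisplayScreen();
            ScreenContext createLogicalScreen();
        }

        void configure(Manager manager, List<Object> serviceContexts) {
            // S1610: one display screen plus one logical screen per service.
            ScreenContext display = manager.createDisplayScreen();
            for (Object serviceContext : serviceContexts) {
                ScreenContext logical = manager.createLogicalScreen();
                logical.addServiceContext(serviceContext); // S1620
                logical.setDisplayScreen(display);         // S1630
            }
        }
    }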
In operation S1640, the areas on the display screen to which the logical screens are to be respectively mapped are determined. At this time, a predetermined method provided by the multi-screen context module 1530 can be called to determine the areas on the display screen where the logical screens are to be displayed.
FIG. 31 is a flowchart illustrating the output of an audio content according to an embodiment of the present invention. The service processing module 950 may provide a plurality of services, and an audio content allocated to each of the services may be focused by a user or an application.
That is, a user or an application may input a command to designate a predetermined audio content as a focus target. Then, in operation S1710, the user/application interface module 965 receives the command input by the user or the application.
In order to designate a predetermined audio content as a focus target, a method 'assignAudioFocus' may be called. The method 'assignAudioFocus' is a method for enabling one of a plurality of audio contents to be focused, and may be provided by the multi-screen context module 1530.
Once the designation of an audio content as a focus target is completed, the output module 960 outputs the audio content that is designated as the focus target and is focused (S1730). More specifically, the output module 960 may examine which of a plurality of audio contents respectively allocated to a plurality of services is designated as the focus target (S1720) and then output the audio content designated as the focus target according to the result of the examination (S1730).
In order to examine an audio content designated as a focus target, a method 'getAudioFocus' may be called. The method 'getAudioFocus' is a method for examining an audio content currently being focused and returns an object having the same type as the class 'HScreen' of the package 'org.havi.ui' as the result of the examination. That is, the method 'getAudioFocus' returns a screen that represents a service including the audio content currently being focused, and the output module 960 outputs the audio content represented by the screen.
FIG. 32 is a flowchart illustrating the output of an audio content according to another embodiment of the present invention. The output module 960 may output a plurality of audio contents at the same time. More specifically, referring to FIG. 32, the output module 960 examines the attributes of each of a plurality of audio contents respectively allocated to a plurality of services (S1810) and outputs whichever of the audio contents is designated as an output target (S1820).
The output module 960 may output the audio contents designated as the output target along with an audio content currently being focused. For example, assume that audio contents A, B, C, and D are allocated to respective corresponding services, and that the audio content A is currently being focused. If the audio content B is set not to be able to be simultaneously output along with other audio contents and the audio contents C and D are set to be able to be simultaneously output along with other audio contents, the output module 960 may simultaneously output the audio content A along with the audio contents C and D.
The audio content B may be output when being focused by a user or an application.
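The focus-based selection of FIG. 31 and the mixing decision of FIG. 32 can be sketched together as follows; 'assignAudioFocus' and 'getAudioFocus' are the methods named above, while the surrounding class, fields, and other method names are assumptions for illustration.

    import java.util.LinkedHashMap;
    import java.util.Map;

    // Illustrative sketch of audio focus and simultaneous output; names other than
    // assignAudioFocus and getAudioFocus are assumed.
    class AudioFocusManager {
        static final class AudioSource {
            final String serviceId;
            boolean mixWithAudioFocus; // output attribute: may play alongside the focused content
            AudioSource(String serviceId, boolean mixWithAudioFocus) {
                this.serviceId = serviceId;
                this.mixWithAudioFocus = mixWithAudioFocus;
            }
        }

        private final Map<String, AudioSource> sources = new LinkedHashMap<>();
        private String focusedServiceId;

        void register(AudioSource source) { sources.put(source.serviceId, source); }

        // Corresponds to 'assignAudioFocus': designate one audio content as the focus target.
        void assignAudioFocus(String serviceId) { focusedServiceId = serviceId; }

        // Corresponds to 'getAudioFocus': examine which audio content is currently focused.
        String getAudioFocus() { return focusedServiceId; }

        // An audio content is output if it is focused, or if its output attribute
        // allows it to be mixed with the focused content (FIG. 32).
        boolean shouldOutput(String serviceId) {
            if (serviceId.equals(focusedServiceId)) return true;
            AudioSource source = sources.get(serviceId);
            return source != null && source.mixWithAudioFocus;
        }
    }

With audio contents A, B, C, and D registered as in the example above (B not mixable, C and D mixable) and A focused, 'shouldOutput' would return true for A, C, and D and false for B.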
That is, the output module 960 may output an audio content designated as an output target by referencing the output attributes of a plurality of audio contents respectively allocated to a plurality of services. Here, the output attributes of the audio contents may be set by a user or an application using the user/application interface module 965. In order to set the attributes of a plurality of audio contents, a method 'addAudioSources' may be called. The method 'addAudioSources' may set one or more audio contents so that they can be simultaneously output.
Parameters of the method 'addAudioSources' may include an array 'device' having the same type as the class 'HScreenDevice[]' of the package 'org.havi.ui' and a Boolean flag 'mixWithAudioFocus'. It may be determined whether a plurality of audio contents can be simultaneously output based on the value of the Boolean flag 'mixWithAudioFocus'.
That is, a user or an application may select at least one device to which an audio content is connected and set the audio content to be able to be simultaneously output to the selected device along with another audio content by using the method 'addAudioSources'. For example, if the Boolean flag 'mixWithAudioFocus' of the method 'addAudioSources' is set to 1, the audio content of the selected device may be simultaneously output along with an audio content currently being focused. On the other hand, if the Boolean flag 'mixWithAudioFocus' of the method 'addAudioSources' is set to 0, the audio content of the selected device may not be simultaneously output along with the audio content currently being focused. In this case, the audio content of the selected device may be output only when being focused.
A user or an application may cancel the simultaneous output setting of an audio content that is set to be able to be simultaneously output along with another audio content. For this, a method 'removeAudioSources' may be called. The method 'removeAudioSources' may include, as a parameter, an array 'device' which has the same type as the class 'HScreenDevice[]' of the package 'org.havi.ui'. When the method 'removeAudioSources' is called, the simultaneous output setting of the audio content of at least one device designated by the array 'device' of the method 'removeAudioSources' is canceled. The method 'removeAudioSources' may also be called without any parameter; in this case, the simultaneous output settings of the audio contents of all devices may be cancelled.
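A sketch of how 'addAudioSources' and 'removeAudioSources' might be exposed, extending the illustration above, is shown below; the 'HScreenDevice' stand-in and the settings class are assumptions, and only the method names and the 'mixWithAudioFocus' flag come from the description.

    import java.util.HashMap;
    import java.util.Map;

    // Illustrative sketch only; 'HScreenDevice' stands in for org.havi.ui.HScreenDevice.
    class HScreenDevice {
        final String name;
        HScreenDevice(String name) { this.name = name; }
    }

    class AudioSourceSettings {
        // Per-device flag: may this device's audio content be mixed with the focused content?
        private final Map<HScreenDevice, Boolean> simultaneous = new HashMap<>();

        // Corresponds to 'addAudioSources(HScreenDevice[] device, boolean mixWithAudioFocus)'.
        void addAudioSources(HScreenDevice[] device, boolean mixWithAudioFocus) {
            for (HScreenDevice d : device) {
                simultaneous.put(d, mixWithAudioFocus);
            }
        }

        // Corresponds to 'removeAudioSources(HScreenDevice[] device)': cancel the
        // simultaneous output setting of the designated devices.
        void removeAudioSources(HScreenDevice[] device) {
            for (HScreenDevice d : device) {
                simultaneous.remove(d);
            }
        }

        // Parameterless form: cancel the simultaneous output setting of all devices.
        void removeAudioSources() {
            simultaneous.clear();
        }

        boolean mayMixWithFocus(HScreenDevice device) {
            return simultaneous.getOrDefault(device, Boolean.FALSE);
        }
    }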
While the present invention has been particularly illustrated and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the following claims. Therefore, it is to be understood that the above-described embodiments have been provided only in a descriptive sense and will not be construed as placing any limitation on the scope of the invention.
Industrial Applicability
According to the present invention, it is possible to perform a plurality of services provided by various sources such as cable broadcasts, terrestrial broadcasts, various storage media, and external inputs, in various manners using a single physical display screen.

Claims

1. An apparatus for providing multiple screens, the apparatus comprising: a service processing module providing a plurality of services to which a plurality of first audio contents are respectively allocated; a user/application interface module receiving a command to designate one of the first audio contents as a second audio content, the second audio content being an audio content to be focused; and an output module outputting the first audio content that is designated as the second audio content and is focused in response to the received command.
2. The apparatus of claim 1, wherein the output module examines the first audio content designated as the second audio content and outputs the first audio content designated as the second audio content according to the result of the examination.
3. The apparatus of claim 1, wherein the output module simultaneously outputs a predetermined first audio content, other than the first audio content designated as the second audio content, along with the first audio content designated as the second audio content, with reference to attributes of the first audio contents.
4. The apparatus of claim 1, wherein one of the first audio contents is designated as the second audio content by a user or an application.
5. An apparatus for providing multiple screens, the apparatus comprising: a service processing module providing a plurality of services to which a plurality of first audio contents are respectively allocated; and an output module outputting one of the first audio contents designated as a second audio content with reference to output attributes of the first audio contents, the second audio content being an audio content to be output.
6. The apparatus of claim 5, wherein the output module outputs one of the first audio contents currently being focused.
7. The apparatus of claim 6, wherein the first audio content designated as the second audio content is simultaneously output along with an audio content currently being focused.
8. The apparatus of claim 5, further comprising a user/application interface module setting the output attributes of the first audio contents.
9. A method of providing multiple screens, the method comprising: providing a plurality of services to which a plurality of first audio contents are respectively allocated; receiving a command to designate one of the first audio contents as a second audio content, the second audio content being an audio content to be focused; and outputting the first audio content that is designated as the second audio content and is focused in response to the received command.
10. The method of claim 9, further comprising examining the first audio content designated as the second audio content.
11. The method of claim 9, wherein the outputting comprises simultaneously outputting a predetermined first audio content, other than the first audio content designated as the second audio content, along with the first audio content designated as the second audio content, with reference to attributes of the first audio contents.
12. The method of claim 9, wherein one of the first audio contents is designated as the second audio content by a user or an application.
13. A method of providing multiple screens, the method comprising: providing a plurality of services to which a plurality of first audio contents are respectively allocated; and outputting one of the first audio contents designated as a second audio content with reference to output attributes of the first audio contents, the second audio content being an audio content to be output.
14. The method of claim 13, wherein the outputting comprises outputting one of the first audio contents currently being focused.
15. The method of claim 14, wherein the outputting further comprises simultaneously outputting the first audio content designated as the second audio content along with an audio content currently being focused.
16. The method of claim 13, further comprising setting the output attributes of the first audio contents.
PCT/KR2007/001668 2006-04-06 2007-04-05 Apparatus and method for multiple screen WO2007114664A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
MX2008012862A MX2008012862A (en) 2006-04-06 2007-04-05 Apparatus and method for multiple screen.
CA002648467A CA2648467A1 (en) 2006-04-06 2007-04-05 Apparatus and method for multiple screen

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
US78957706P 2006-04-06 2006-04-06
US60/789,577 2006-04-06
US81209006P 2006-06-09 2006-06-09
US60/812,090 2006-06-09
US87047106P 2006-12-18 2006-12-18
US60/870,471 2006-12-18
US91889407P 2007-03-20 2007-03-20
US60/918,894 2007-03-20
KR10-2007-0033459 2007-04-04
KR1020070033459A KR20070100136A (en) 2006-04-06 2007-04-04 Apparatus and method for multiple screen

Publications (1)

Publication Number Publication Date
WO2007114664A1 true WO2007114664A1 (en) 2007-10-11

Family

ID=38563886

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2007/001668 WO2007114664A1 (en) 2006-04-06 2007-04-05 Apparatus and method for multiple screen

Country Status (1)

Country Link
WO (1) WO2007114664A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5347624A (en) * 1987-03-05 1994-09-13 Hitachi, Ltd. Method and apparatus for display control
US6917362B2 (en) * 2002-01-25 2005-07-12 Hewlett-Packard Development Company, L.P. System and method for managing context data in a single logical screen graphics environment
US20060026318A1 (en) * 2004-07-30 2006-02-02 Samsung Electronics Co., Ltd. Apparatus, medium, and method controlling audio/video output

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2012211393B2 (en) * 2011-10-07 2014-05-15 Fusion Holdings Limited Gaming systems, apparatus and method with dual game play

Similar Documents

Publication Publication Date Title
US8949894B2 (en) Apparatus for providing multiple screens and method of dynamically configuring multiple screens
US20080094510A1 (en) Apparatus for providing multiple screens and method of dynamically configuring multiple screens
US20080106487A1 (en) Apparatus for providing multiple screens and method of dynamically configuring multiple screens
US20080094511A1 (en) Apparatus for providing multiple screens and method of dynamically configuring multiple screens
EP1911270A1 (en) Apparatus for providing multiple screens and method of dynamically configuring multiple screens
WO2007018381A1 (en) Apparatus for providing multiple screens and method of dynamically configuring multiple screens
CA2648467A1 (en) Apparatus and method for multiple screen
US20080094415A1 (en) Method and apparatus for identifying application in multiscreen environment
US20080094512A1 (en) Apparatus for providing multiple screens and method of dynamically configuring multiple screens
WO2007114664A1 (en) Apparatus and method for multiple screen
US20080094508A1 (en) Apparatus for providing mutliple screens and method of dynamically configuring
EP1911275B1 (en) Apparatus for providing multiple screens and method of dynamically configuring multiple screens
WO2007018380A1 (en) Apparatus for providing multiple screens and method of dynamically configuring multiple screens
WO2007114669A1 (en) Apparatus and method for identifying an application in the multiple screens environment
EP1913769A1 (en) Apparatus for providing multiple screens and method of dynamically configuring multiple screens
WO2007018370A1 (en) Apparatus for providing multiple screens and method of dynamically configuring multiple screens
WO2007018385A1 (en) Apparatus for providing multiple screens and method of dynamically configuring multiple screens
WO2007114659A1 (en) Apparatus for providing multiple screens and method of dynamically configuring multiple screens
WO2007114666A1 (en) Apparatus for providing multiple screens and method of dynamically configuring multiple screens

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07745831

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2648467

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: MX/a/2008/012862

Country of ref document: MX

Ref document number: 200780012192.6

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07745831

Country of ref document: EP

Kind code of ref document: A1