CN110659010A - Picture-in-picture display method and display equipment - Google Patents

Picture-in-picture display method and display equipment

Info

Publication number
CN110659010A
Authority
CN
China
Prior art keywords
application
window
picture
display
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910893043.XA
Other languages
Chinese (zh)
Inventor
孙哲
刘月卿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Mobile Communications Technology Co Ltd
Original Assignee
Hisense Mobile Communications Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Mobile Communications Technology Co Ltd filed Critical Hisense Mobile Communications Technology Co Ltd
Priority to CN201910893043.XA
Publication of CN110659010A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/1423: Digital output to display device, controlling a plurality of local displays, e.g. CRT and flat panel display
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces

Abstract

This application discloses a picture-in-picture display method and a display device. The display device includes: a touch screen for receiving touch input from a user; a display screen for displaying a user interface; and a processor coupled with the display screen and configured to: in response to a user input initiating a picture-in-picture mode while a first application is displayed in a full-screen window, start a second application selected by the user, and display the second application in a window of a created virtual display screen, wherein the window displaying the second application is smaller than the window displaying the first application.

Description

Picture-in-picture display method and display equipment
Technical Field
The present disclosure relates to display technologies, and in particular, to a picture-in-picture display method and a display device.
Background
Display devices currently receive wide attention from users because they can present audio, video, pictures, and other content. As applications grow more numerous and diverse, users place increasing demands on the functionality of display devices in pursuit of a better user experience.
In particular, users expect display devices to support a multi-window mode. One typical multi-window mode is the picture-in-picture mode, in which one or more sub-windows are displayed within a main window shown in full screen. The main window and the sub-windows may display different content, so that a user can view the content in the main window and the content in the sub-windows simultaneously.
Disclosure of Invention
The embodiment of the application provides a picture-in-picture display method and display equipment.
In a first aspect, a display device is provided, comprising: a touch screen for receiving touch input from a user;
a display screen for displaying a user interface; and a controller coupled with the display screen and configured to:
in response to a user input initiating a picture-in-picture mode while a first application is displayed in a full-screen window, start a second application selected by the user, and display the second application in a window of a created virtual display screen, wherein the window displaying the second application is smaller than the window displaying the first application.
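The first-aspect flow can be modeled in a few lines. The sketch below is illustrative only: the class and attribute names, the screen size, and the 0.3 scale factor are assumptions for demonstration, not details from the patent.

```python
class DisplayDevice:
    """Toy model of the claimed picture-in-picture flow."""

    FULL_SCREEN = (1080, 1920)  # assumed portrait screen: (width, height)

    def __init__(self):
        self.windows = []  # list of (app_name, (width, height)), in z-order

    def show_full_screen(self, app_name):
        # The first application occupies the full-screen window.
        self.windows = [(app_name, self.FULL_SCREEN)]

    def start_picture_in_picture(self, second_app, scale=0.3):
        # Respond to the user input: start the selected second application
        # and display it in a window of a newly created virtual display,
        # smaller than the full-screen window of the first application.
        w, h = self.FULL_SCREEN
        pip_size = (int(w * scale), int(h * scale))
        self.windows.append((second_app, pip_size))
        return pip_size

device = DisplayDevice()
device.show_full_screen("video_app")
pip = device.start_picture_in_picture("chat_app")
```

After the call, both windows coexist and the second application's window is strictly smaller in each dimension, which is the condition the claim states.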
Optionally, the controller is further configured to: determine, according to a parameter value of the display interface defined by the second application, whether display in a window other than the full-screen window is supported; and if not, modify the parameter value so that such display is supported.
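This optional parameter check amounts to a simple override. On Android it roughly corresponds to forcing windowed display for an activity that does not declare support for it; the parameter name `supports_windowed` below is purely illustrative.

```python
def ensure_windowed_display(app_params):
    """If the application's declared display-interface parameters do not
    support display in a window other than full screen, return a copy
    with the parameter modified so that windowed display is supported.
    (The parameter name is an illustrative stand-in.)"""
    if not app_params.get("supports_windowed", False):
        return dict(app_params, supports_windowed=True)
    return app_params

params = {"name": "chat_app", "supports_windowed": False}
patched = ensure_windowed_display(params)
```

Returning a modified copy rather than mutating in place mirrors the idea that the override applies only to this picture-in-picture launch, not to the application's own declaration.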
Optionally, the controller is further configured to: creating the virtual display screen; and creating a display window for the virtual display screen. Optionally, the size of the display window is the same as the size of the virtual display screen.
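On Android, the created virtual display screen would correspond to a platform facility such as `DisplayManager.createVirtualDisplay`; the minimal model below only captures the optional claim that the display window and the virtual display screen have the same size (all names are illustrative).

```python
class VirtualDisplay:
    """Toy stand-in for a platform virtual display."""
    def __init__(self, width, height):
        self.size = (width, height)

def create_display_window(width, height):
    # Create the virtual display screen first, then create a display
    # window whose size matches it, so the second application renders
    # into the window at a 1:1 scale.
    virtual = VirtualDisplay(width, height)
    window = {"display": virtual, "size": virtual.size}
    return virtual, window

virtual, window = create_display_window(480, 320)
```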
Optionally, the user input is a voice input.
Optionally, the first application is a video-type application or a game-type application.
Optionally, the second application is a chat-type application or a shopping-type application.
In a second aspect, a picture-in-picture display method is provided, comprising: at a display device having a touch screen, a display screen, and a controller: in response to a user input initiating a picture-in-picture mode while a first application is displayed in a full-screen window, starting a second application selected by the user; and displaying the second application in a window of a created virtual display screen, wherein the window displaying the second application is smaller than the window displaying the first application.
Optionally, the method further comprises: determining, according to a parameter value of the display interface defined by the second application, whether display in a window other than the full-screen window is supported; and if not, modifying the parameter value so that such display is supported.
Optionally, the method further comprises: creating the virtual display screen; and creating a display window for the virtual display screen.
In a third aspect, there is provided a non-transitory storage medium readable by a computer, having stored thereon computer instructions which, when executed by a controller, implement the method according to any one of the second aspects above.
In the above embodiments of the present application, after receiving a user input for starting a picture-in-picture mode in a full-screen window displaying a first application, the display device starts a second application selected by the user and displays it in a window of a created virtual display screen. On one hand, because the window displaying the second application is smaller than the window displaying the first application, picture-in-picture display is realized. On the other hand, creating the virtual display screen as a carrier for the second application's window makes that window relatively independent of the first application's window, reducing its influence on, and dependence upon, the first application's window.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in describing the embodiments are briefly introduced below. The drawings described here show only some embodiments of the present application; those skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic diagram illustrating an operation scenario between a smart tv and a controller according to an embodiment;
fig. 2 is a block diagram schematically showing a hardware configuration of the smart tv 200 according to the embodiment;
fig. 3 schematically illustrates an operating system architecture of the smart tv 200 according to an embodiment;
fig. 4 is a diagram schematically illustrating a functional configuration of a smart tv according to an exemplary embodiment;
fig. 5 is a block diagram exemplarily showing a hardware configuration of a mobile terminal according to an embodiment;
fig. 6a schematically shows a picture-in-picture mode display of a smart tv according to an embodiment;
fig. 6b schematically shows a picture-in-picture mode display diagram of a mobile terminal according to an embodiment;
fig. 7 schematically illustrates a display flow diagram of a picture-in-picture according to an embodiment of the present application;
fig. 8a schematically illustrates a virtual display screen creation process under an Android system according to an embodiment;
fig. 8b is a schematic flowchart illustrating a picture-in-picture display implementation according to an open application under an Android system in an embodiment;
fig. 9a, 9b schematically show a window of a different application and a virtual display in a picture-in-picture mode according to an embodiment, respectively.
Detailed Description
To make the objects, technical solutions, and advantages of the exemplary embodiments of the present application clearer, the technical solutions in these exemplary embodiments are described below with reference to the drawings. The described exemplary embodiments are only some, not all, of the embodiments of the present application.
The concept to which the present application relates will be first explained below with reference to the drawings. It should be noted that the following descriptions of the concepts are only for the purpose of facilitating understanding of the contents of the present application, and do not represent limitations on the scope of the present application.
The term "module," as used in various embodiments of the present application, may refer to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "remote control" as used in the embodiments of the present application refers to a component of an electronic device (such as the display device disclosed in the present application) that can wirelessly control the electronic device, typically over a short distance. The component may connect to the electronic device using at least one of infrared, Radio Frequency (RF) signals, Bluetooth, or other communication methods, and may also include functional modules such as WiFi, wireless Universal Serial Bus (USB), Bluetooth, and motion sensors. For example, a hand-held touch remote controller replaces most of the physical hard keys of a conventional remote controller with a user interface on a touch screen.
The term "gesture" used in the embodiments of the present application refers to a user behavior used to express an intended idea, action, purpose or result through a change in hand shape or an action such as hand movement.
The term "hardware system" used in the embodiments of the present application may refer to a physical component having computing, controlling, storing, inputting and outputting functions, which is formed by a mechanical, optical, electrical and magnetic device such as an Integrated Circuit (IC), a Printed Circuit Board (PCB) and the like. In various embodiments of the present application, a hardware system may also be referred to as a motherboard (or chip).
The embodiment of the application provides a display device, which can display a plurality of windows in a picture-in-picture mode.
The picture-in-picture mode is a content presentation mode, such as displaying one or more sub-windows in a main window displayed in full screen, so that the user can view the content displayed in the main window and also view the content displayed in the sub-windows. The content displayed in the main window and the sub-window may be from different sources, such as the content displayed in the main window and the sub-window originating from different applications, or the content displayed in the main window and the sub-window originating from different video sources or channels.
Another form, also known as picture-outside-picture, positions the sub-window outside the main window. In industry usage, both cases, whether the sub-window is inside or outside the main window, are commonly referred to as picture-in-picture mode.
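The inside/outside distinction can be stated as a rectangle-containment check. The sketch below assumes rectangles given as (x, y, width, height); the function name and coordinate convention are illustrative, not from the patent.

```python
def classify_sub_window(main, sub):
    """Classify a sub-window relative to the main window.
    A sub-window fully contained in the main window is picture-in-picture;
    otherwise it is picture-outside-picture. Rectangles: (x, y, w, h)."""
    mx, my, mw, mh = main
    sx, sy, sw, sh = sub
    inside = (mx <= sx and my <= sy and
              sx + sw <= mx + mw and sy + sh <= my + mh)
    return "picture-in-picture" if inside else "picture-outside-picture"

main = (0, 0, 1920, 1080)  # assumed full-screen main window
```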
The display device in the embodiment of the present application refers to a device having a display function, and the display device may further carry an operating system, such as an Android system (Android), so as to implement richer functions. For example, the display device may specifically include a smart television, a mobile terminal, and the like. The mobile terminal can be a smart phone, a tablet computer or wearable equipment.
Taking the case where the display device is a smart tv as an example, fig. 1 shows a schematic diagram of an operation scenario between a smart tv and a remote controller according to an embodiment. As shown in fig. 1, a user may operate the smart tv 200 through the controller 100.
The controller 100 may communicate with the smart tv 200 through an infrared protocol communication, a bluetooth protocol communication, a ZigBee (ZigBee) protocol communication, or another short-distance communication method, and is configured to control the smart tv 200 in a wireless or other wired manner. The user may input a user command through a button on the remote controller, a voice input, a control panel input, etc., to control the smart tv 200. Such as: the user can input a corresponding control instruction through a volume up-down key, a channel control key, an up/down/left/right moving key, a voice input key, a menu key, a power on/off key and the like on the remote controller, so as to realize the function of controlling the smart television 200.
Alternatively, the controller may be replaced by an intelligent device, such as a mobile terminal, a tablet computer, a notebook computer, etc., which may communicate with the smart tv 200 through a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), or other networks, and implement control of the smart tv 200 through an application corresponding to the smart tv 200. For example, the smart tv 200 is controlled using an application running on the smart device. The application may provide various controls to the user through an intuitive User Interface (UI) on a screen associated with the smart device.
As shown in fig. 1, the smart tv 200 may also perform data communication with the server 300 through a plurality of communication methods. In various embodiments of the present application, the smart tv 200 may be allowed to be communicatively connected to the server 300 through a local area network, a wireless local area network, or other network. The server 300 may provide various contents and interactions to the smart tv 200.
Illustratively, the smart tv 200 receives software program updates, or accesses a remotely stored digital media library, by sending and receiving information and by Electronic Program Guide (EPG) interactions. The server 300 may be one group or multiple groups of servers, of one or more types. The server 300 also provides other web service content, such as video on demand and advertising services.
The smart tv 200 may use a liquid crystal display, an Organic Light Emitting Diode (OLED) display, or be a projection display device. The specific display device type, size, and resolution are not limited, and those skilled in the art will appreciate that the smart tv 200 may vary in performance and configuration as needed. In some embodiments, the display device may not have a broadcast-receiving television function.
The smart tv 200 may additionally provide an intelligent network tv function for a computer support function. Examples include: network televisions, smart televisions, Internet Protocol Televisions (IPTV), and the like.
Fig. 2 schematically shows a hardware configuration block diagram of a hardware system in the smart tv 200 according to an exemplary embodiment, which is described by taking a single hardware system architecture as an example.
It should be noted that fig. 2 is only an exemplary illustration of the architecture of the hardware system of the application, and does not represent a limitation of the application. In actual implementation, a single-piece system may contain more or less hardware or interfaces as desired.
As shown in fig. 2, the hardware system of the smart tv 200 may include N chips, and modules connected to the N chips through various interfaces.
The N-chip may include a tuner demodulator 220, a communicator 230, an external device interface 250, a controller 210, a memory 290, a user input interface, a video processor 260-1, an audio processor 260-2, a display 280, an audio output interface 272, and a power supply. The N-chip may also include more or fewer modules in other embodiments.
Optionally, a touch screen (not shown in the figures) may also be included. A touch screen (also called a touch panel) is an inductive display device capable of receiving input signals such as touch contacts. According to the operating principle and the medium used to transmit information, touch screens are classified into four types: resistive, capacitive, infrared, and surface acoustic wave.
The tuner demodulator 220 receives broadcast television signals in a wired or wireless manner and performs modulation/demodulation processing such as amplification, mixing, and resonance, in order to demodulate, from among the many wireless or wired broadcast television signals, the audio/video signal carried on the frequency of the television channel selected by the user, together with additional information (e.g., an EPG data signal). Depending on the broadcast system of the television signal, the signal source of the tuner demodulator 220 may be terrestrial broadcasting, cable broadcasting, satellite broadcasting, internet broadcasting, or the like; depending on the modulation type, the modulation mode of the signal may be digital or analog; and depending on the type of television signal received, the tuner demodulator 220 may demodulate analog and/or digital signals.
The tuner demodulator 220 is also operative to respond to the user-selected television channel frequency and the television signals carried thereby, as selected by the user and as controlled by the controller 210.
In other exemplary embodiments, the tuner/demodulator 220 may be in an external device, such as an external set-top box. In this way, the set-top box outputs television audio and video signals after modulation and demodulation, and the television audio and video signals are input into the smart television 200 through the external device interface 250.
The communicator 230 is a component for communicating with an external device or an external server according to various communication protocol types. For example: the communicator 230 may include a WiFi module 231, a bluetooth communication protocol module 232, a wired ethernet communication protocol module 233, and other network communication protocol modules such as an infrared communication protocol module or a near field communication protocol module.
The smart tv 200 may establish a connection of a control signal and a data signal with an external control device or a content providing device through the communicator 230. For example, the communicator 230 may receive a control signal of the remote controller 100 according to the control of the controller.
The external device interface 250 is a component for providing data transmission between the N-chip controller 210 and the a-chip and other external devices. The external device interface may be connected with an external apparatus such as a set-top box, a game device, a notebook computer, etc. in a wired and/or wireless manner, and may receive data such as a video signal (e.g., moving image), an audio signal (e.g., music), additional information (e.g., EPG), etc. of the external apparatus.
The external device interface 250 may include: a High Definition Multimedia Interface (HDMI) terminal 251, a Composite Video Blanking Sync (CVBS) terminal 252, an analog or digital component terminal 253, a Universal Serial Bus (USB) terminal 254, a red, green, blue (RGB) terminal (not shown), and the like. The number and type of external device interfaces are not limited by this application.
The controller 210 controls the operation of the smart tv 200 and responds to the user's operation by running various software control programs (such as an operating system and/or various application programs) stored on the memory 290.
As shown in FIG. 2, the controller 210 includes a read-only memory ROM 213, a random access memory RAM 214, a graphics processor 216, a CPU processor 212, communication interfaces (218-1, 218-2, …, 218-n), and a communication bus. The ROM 213, the RAM 214, the graphics processor 216, the CPU processor 212, and the communication interfaces 218 are connected via the bus.
The ROM 213 stores instructions for various system boots. When a power-on signal is received and the smart tv 200 starts up, the CPU processor 212 executes the system boot instructions in the ROM and copies the operating system stored in the memory 290 into the RAM 214 to begin running it. After the operating system finishes starting, the CPU processor 212 copies the various application programs in the memory 290 into the RAM 214 and then starts them.
The graphics processor 216 generates various graphics objects, such as icons, operation menus, and graphics displaying user input instructions. It comprises an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays objects according to their display attributes, and a renderer, which generates the various objects based on the arithmetic unit's output and displays the rendered result on the display 280.
A CPU processor 212 for executing operating system and application program instructions stored in memory 290. And executing various application programs, data and contents according to various interactive instructions received from the outside so as to finally display and play various audio and video contents.
In some exemplary embodiments, the CPU processor 212 may include a plurality of processors: a main processor and one or more sub-processors. The main processor performs some operations of the smart tv 200 in the pre-power-up mode and/or displays pictures in the normal mode. The one or more sub-processors perform operations in the standby mode and the like.
The communication interfaces may include a first interface 218-1 through an nth interface 218-n. These interfaces may be network interfaces that are connected to external devices via a network.
The controller 210 may control the overall operation of the smart tv 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 280, the controller 210 may perform an operation related to the object selected by the user command.
The object may be any selectable object, such as a hyperlink or an icon. Operations related to the selected object include, for example, displaying the page, document, or image linked to by a hyperlink, or running the program corresponding to an icon. The user command for selecting the UI object may be a command input through various input devices (e.g., a mouse, keyboard, or touch pad) connected to the smart tv 200, or a voice command corresponding to speech uttered by the user.
The memory 290 includes various software modules for driving and controlling the smart tv 200. Such as: various software modules stored in memory 290, including: the system comprises a basic module, a detection module, a communication module, a display control module, a browser module, various service modules and the like.
The basic module is a bottom-layer software module used for signal communication among hardware in the smart television 200 and sending processing and control signals to an upper-layer module. The detection module is a management module used for collecting various information from various sensors or user input interfaces, and performing digital-to-analog conversion and analysis management.
For example: the voice recognition module comprises a voice analysis module and a voice instruction database module. The display control module is a module for controlling the display 280 to display image content, and may be used to play information such as multimedia image content and UI interface. The communication module is used for carrying out control and data communication with external equipment. And the browser module is used for executing data communication between the browsing servers. The service module is a module for providing various services and various application programs.
Meanwhile, the memory 290 is also used to store visual effect maps and the like for receiving external data and user data, images of respective items in various user interfaces, and a focus object.
A user input interface for transmitting an input signal of a user to the controller 210 or transmitting a signal output from the controller to the user. For example, the control device (e.g., a mobile terminal or a remote controller) may send an input signal, such as a power switch signal, a channel selection signal, a volume adjustment signal, etc., input by a user to the user input interface, and then the input signal is forwarded to the controller by the user input interface; alternatively, the control device may receive an output signal such as audio, video, or data output from the user input interface via the controller, and display the received output signal or output the received output signal in audio or vibration form.
In some embodiments, a user may enter a user command on a Graphical User Interface (GUI) displayed on the display 280, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
The video processor 260-1 is configured to receive a video signal, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to a standard codec protocol of the input signal, so as to obtain a video signal that is directly displayed or played on the display 280.
Illustratively, the video processor 260-1 includes a demultiplexing module, a video decoding module, an image synthesizing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module demultiplexes the input audio/video data stream; for example, an input MPEG-2 format signal is demultiplexed into a video signal and an audio signal.
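A toy stand-in for this step: splitting an interleaved packet stream into elementary streams by packet type. Real MPEG-2 demultiplexing parses transport-stream packets by PID; the tagged tuples below are a deliberate simplification.

```python
def demultiplex(stream):
    """Split an interleaved audio/video packet stream into per-type
    elementary streams, preserving packet order within each stream."""
    elementary = {"video": [], "audio": []}
    for packet_type, payload in stream:
        elementary[packet_type].append(payload)
    return elementary

# Interleaved input, as a multiplexed stream would present it.
muxed = [("video", "V0"), ("audio", "A0"), ("video", "V1"), ("audio", "A1")]
streams = demultiplex(muxed)
```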
And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like.
The image synthesis module superimposes and mixes the GUI signal, generated by the graphics generator in response to user input, with the scaled video image, to generate an image signal for display.
The frame rate conversion module converts the frame rate of the input video, for example from 24 Hz, 25 Hz, 30 Hz, or 60 Hz to 60 Hz, 120 Hz, or 240 Hz, where the input frame rate may be related to the source video stream and the output frame rate to the refresh rate of the display. The conversion is commonly implemented by inserting frames.
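The frame-insertion approach can be illustrated by the simplest possible converter, which repeats input frames to fill the higher output rate. Real converters interpolate motion between frames; repetition keeps this sketch short and deterministic.

```python
def convert_frame_rate(frames, in_fps, out_fps):
    """Upconvert a frame sequence by frame repetition, the simplest
    form of frame insertion. Each output slot maps back to the
    nearest earlier input frame."""
    out = []
    for i in range(int(len(frames) * out_fps / in_fps)):
        out.append(frames[i * in_fps // out_fps])
    return out

# 30 Hz input upconverted to 60 Hz: every frame is shown twice.
sixty = convert_frame_rate(["f0", "f1"], in_fps=30, out_fps=60)
```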
And a display formatting module for converting the signal output by the frame rate conversion module into a signal conforming to a display format of a display, such as converting the format of the signal output by the frame rate conversion module to output an RGB data signal.
And a display 280 for receiving the image signal input from the video processor 260-1 and displaying the video content and image and the menu manipulation interface. The display 280 includes a display component for presenting a picture and a driving component for driving the display of an image. The video content may be displayed from the video in the broadcast signal received by the tuner/demodulator 220, or from the video content input from the communicator or the external device interface. And the display 280 simultaneously displays a user manipulation interface UI generated in the smart tv 200 and used for controlling the smart tv 200.
And, a driving component for driving the display according to the type of the display 280. Alternatively, in case the display 280 is a projection display, it may also comprise a projection device and a projection screen.
The audio processor 260-2 is configured to receive an audio signal, and perform decompression and decoding according to a standard codec protocol of the input signal, and perform audio data processing such as noise reduction, digital-to-analog conversion, and amplification processing to obtain an audio signal that can be played in the speaker 272.
The audio output interface 270 receives the audio signal output by the audio processor 260-2 under the control of the controller 210. The audio output interface may include a speaker 272, or an external sound output terminal 274 for output to an external sound-producing device, such as an external sound terminal or an earphone output terminal.
In other exemplary embodiments, video processor 260-1 may comprise one or more chip components. The audio processor 260-2 may also include one or more chips.
And, in other exemplary embodiments, the video processor 260-1 and the audio processor 260-2 may be separate chips or may be integrated in one or more chips with the controller 210.
The power supply provides power to the smart tv 200 from externally input power, under the control of the controller 210. The power supply may include a built-in power circuit installed inside the smart tv 200, or a power supply installed outside the smart tv 200, with a power interface on the smart tv 200 for connecting the external supply.
It should be noted that the hardware system architecture of the smart TV 200 shown in fig. 3 is described taking a single hardware system as an example; the embodiments of the present application may also be applied to display devices having multiple hardware systems.
An operating system architecture diagram of a display device, such as a smart television or a mobile terminal, according to an embodiment is illustrated in fig. 3.
As shown in fig. 3, taking the Android system as an example, the operating system architecture of the display device is divided into three levels; from the bottom up, they are the platform layer, the service support layer, and the application layer. The platform layer mainly comprises the Linux kernel and various hardware drivers. The service support layer mainly comprises middleware that supports and maintains the upper-layer services on top of the Linux kernel and hardware drivers; it provides a browser environment for user interface (UI) operation, starts a browser process after the display device boots, and renders UI pages through a graphics engine. The UI sits in the application layer and is responsible for presenting business functions and drawing graphics, implemented through Web technologies supported by the browser.
Fig. 4 is a diagram schematically illustrating a functional configuration of the smart tv according to an exemplary embodiment.
As shown in fig. 4, the memory 290 is used to store the operating system, application programs, content, user data, and the like; under the control of the controller 210, it supports the system operation of the smart TV 200 and the various operations responding to the user. The memory 290 may include volatile and/or nonvolatile memory.
The memory 290 is specifically configured to store the operating program that drives the controller 210 in the smart TV 200, the applications built into the smart TV 200, the applications downloaded by the user from external devices, the graphical user interfaces related to those applications, the objects related to the graphical user interfaces, user data information, and the internal data of the supported applications. The memory 290 stores system software such as the Operating System (OS) kernel, middleware, and applications, as well as input video data, audio data, and other user data.
The memory 290 also stores drivers and related data for the video processor 260-1, the audio processor 260-2, the display 280, the communication interface 230, the tuner/demodulator 220, the input/output interface, and the like.
In some embodiments, memory 290 may store software and/or programs, software programs for representing an Operating System (OS) including, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. For example, the kernel may control or manage system resources, or functions implemented by other programs (e.g., the middleware, APIs, or applications), and the kernel may provide interfaces to allow the middleware and APIs, or applications, to access the controller to implement controlling or managing system resources.
The memory 290 includes, for example, a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external instruction recognition module 2907 (including a voice recognition module 2907-2 and a key instruction recognition module 2907-3), a communication control module 2908, a light receiving module, a power control module 2910, an operating system 2911, other application programs 2912, a browser module, and the like. By running these modules, the controller 210 performs functions such as broadcast television signal receiving and demodulation, television channel selection control, volume selection control, image control, display control, audio control, external instruction recognition, communication control, optical signal receiving, power control, a software control platform supporting the various functions, and the browser function.
The external instruction recognition module 2907 can recognize different instructions. For example, the external instruction recognition module 2907 may include a voice recognition module 2907-2 that stores a voice database; when an external voice instruction is received through a voice receiving device, it is matched against the instructions in the voice database so as to control the display device accordingly. Similarly, a control device 100 such as a remote controller is connected to the smart TV 200, and the key instruction recognition module in the smart TV 200 performs instruction interaction with the control device 100.
Taking the case where the display device in the embodiment of the present application is a mobile terminal, fig. 5 shows a configuration block diagram of a mobile terminal 400 according to an exemplary embodiment. As shown in fig. 5, the mobile terminal 400 includes a controller 410, a communicator 430, a user input/output interface 440, a memory 490, and a power supply 480.
The controller 410 includes a processor 412, a RAM 413, a ROM 414, a communication interface, and a communication bus. The controller 410 controls the operation of the mobile terminal 400, the communication and coordination among its internal components, and the processing of external and internal data.
The input interface of the user input/output interface 440 includes at least one of a microphone 441, a touch pad 442, a sensor 443, keys 444, and the like. The user can input instructions through actions such as voice, touch, gestures, and key presses; the input interface converts the received analog signals into digital signals and the digital signals into corresponding instruction signals.
In some embodiments, the mobile terminal 400 includes at least one of a communicator 430 and an output interface. The communicator 430 is configured in the mobile terminal 400 and includes modules such as WiFi, Bluetooth, and NFC.
The memory 490 stores the various operating programs, data, and applications that drive and control the mobile terminal 400 under the control of the controller 410, and may store the various control signal instructions input by the user.
The power supply 480 provides operational power to the various elements of the mobile terminal 400 under the control of the controller 410, and may include a battery and associated control circuitry.
Optionally, a touch screen (not shown in the figures) may also be included.
The operating system architecture of the mobile terminal 400 is similar to that shown in fig. 3, and the functional configuration of the mobile terminal 400 is similar to that shown in fig. 4, and will not be repeated here.
Fig. 6a schematically shows a display of a picture-in-picture mode in a smart tv 200 according to an exemplary embodiment. As shown in fig. 6a, a plurality of windows, illustratively a first window 601a and a second window 602a, are displayed on the display screen. The first window 601a is a full screen display window, and the size of the second window 602a is smaller than that of the first window 601 a. The contents displayed in the first window 601a and the second window 602a are different, for example, a video frame is displayed in the first window 601a, and a user interface of a shopping application is displayed in the second window 602 a.
A schematic display of the picture-in-picture mode in the mobile terminal 400 according to an exemplary embodiment is exemplarily shown in fig. 6 b. As shown in fig. 6b, a plurality of windows, illustratively a first window 601b and a second window 602b, are displayed on the display screen. The first window 601b is a full screen display window, and the size of the second window 602b is smaller than that of the first window 601b. The contents displayed in the first window 601b and the second window 602b are different; for example, a game screen is displayed in the first window 601b, and a user interface of a social application, such as the video chat interface shown in the figure, is displayed in the second window 602b.
Fig. 7 schematically illustrates a picture-in-picture display flow chart provided by an embodiment of the present application, and as shown in the figure, the flow chart may include:
S701: a user input is received requesting that a picture-in-picture mode be turned on in a full-screen window of a displayed first application.
The user input can take various forms, depending on the human-computer interaction modes the display device provides. For example, the user input may be an instruction submitted by operating a remote controller; or a voice input, from which the display device obtains the corresponding instruction through voice recognition; or a gesture input, such as a gesture entered through a touch screen.
Optionally, based on the received user input, the display device may obtain the second application that the user requested to display in picture-in-picture. For example, the display device may recognize the name of the second application and an instruction keyword for opening the picture-in-picture mode from the user's voice input.
In further embodiments, based on the received user input, the display device may prompt the user to select the second application to be displayed in picture-in-picture. For example, the display device recognizes a keyword instructing it to start the picture-in-picture mode from the user's voice input, and then prompts the user to select the second application to be displayed in the picture-in-picture mode (the prompt may be given by voice or displayed on the screen of the display device).
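As a minimal sketch of the keyword-matching idea described above, the following plain Java class checks a recognized utterance for a picture-in-picture trigger phrase and a known application name. The trigger phrase, the app catalog, and all class and method names here are hypothetical illustrations; a real display device would use its platform's voice recognition stack rather than simple substring matching.

```java
import java.util.Optional;

// Illustrative sketch: extract a PiP command and target app name from a
// recognized voice string. Trigger phrase and app names are hypothetical.
public class VoiceCommandParser {
    private static final String TRIGGER = "picture-in-picture";
    private static final String[] KNOWN_APPS = {"chat", "shopping", "video"};

    // Returns the app named in the utterance if it also contains the PiP
    // trigger keyword; empty otherwise (the UI would then prompt the user
    // to pick the second application, as the embodiment describes).
    public static Optional<String> parse(String utterance) {
        String lower = utterance.toLowerCase();
        if (!lower.contains(TRIGGER)) {
            return Optional.empty();
        }
        for (String app : KNOWN_APPS) {
            if (lower.contains(app)) {
                return Optional.of(app);
            }
        }
        return Optional.empty(); // trigger present but no app named
    }

    public static void main(String[] args) {
        System.out.println(parse("open chat in picture-in-picture")); // Optional[chat]
        System.out.println(parse("turn up the volume"));              // Optional.empty
    }
}
```

When `parse` returns an empty result even though the trigger was present, the device would fall back to the prompting path described above.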
S702: in response to the user input, a picture-in-picture mode is turned on.
Specifically, S702 may include:
S7021: starting the second application selected by the user;
S7022: displaying the second application in a window of the created virtual display screen, where the window displaying the second application is smaller than the window displaying the first application.
In the above embodiment of the present application, after receiving a user input for starting a picture-in-picture mode in a full-screen window displaying a first application, the display device starts a second application selected by the user and displays it in a window of a created virtual display screen. On the one hand, since the window displaying the second application is smaller than the window displaying the first application, the picture-in-picture mode is realized. On the other hand, by creating the virtual display screen as the carrier of the second application's window, that window becomes relatively independent of the first application's window; its influence on and dependence upon the first application's window are reduced, which provides technical support for subsequent operations on the second application's window (such as moving or resizing it).
Optionally, S702 of the above flow may further include: creating the virtual display screen and creating a display window for the virtual display screen, so that the second application is displayed in the window of the virtual display screen.
The virtual display screen is invisible to the user and serves as the carrier of the window: the window of the second application is created for it. At the operating-system level, the virtual display screen may be a display object; taking the Android system as an example, the virtual display screen is a display object in the Android system, and a window can be created for that object.
The size of the virtual display screen is settable. In the embodiment of the present application, it can be specified in advance, or determined in real time according to the application scenario when the virtual display screen is created. The size of the virtual display screen may be the full-screen size or smaller than the full-screen size.
The size of the window created on the virtual display screen can also be set. In the embodiment of the present application, the size of the window may be predetermined, or may be determined in real time when the window is created according to an application scenario.
The size relationship between the virtual display screen and the window created thereon may include the following:
Case 1: when the virtual display screen is smaller than full screen, the size of the window created for the virtual display screen may be equal to or smaller than the size of the virtual display screen.
Case 2: when the virtual display screen is full screen, the size of the window created for the virtual display screen is smaller than the virtual display screen.
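The two cases above can be expressed as a simple size-constraint check. The following is a minimal sketch in plain Java; the class name, method signature, and pixel dimensions are all hypothetical and stand in for whatever bookkeeping the display device actually performs.

```java
// Hypothetical model of the size constraints between the physical screen,
// the virtual display screen, and the picture-in-picture sub-window.
public class PipSizePolicy {
    // Returns true if the virtual-display/window pair satisfies one of the
    // two cases described in the text:
    //   Case 1: virtual display smaller than full screen -> window may be
    //           equal to or smaller than the virtual display.
    //   Case 2: virtual display equal to full screen -> window must be
    //           strictly smaller than the virtual display.
    public static boolean isValid(int screenW, int screenH,
                                  int virtualW, int virtualH,
                                  int windowW, int windowH) {
        boolean virtualFullScreen = (virtualW == screenW && virtualH == screenH);
        boolean windowFitsVirtual = (windowW <= virtualW && windowH <= virtualH);
        if (!windowFitsVirtual) {
            return false; // a window can never exceed its carrier display
        }
        if (virtualFullScreen) {
            // Case 2: strictly smaller in at least one dimension.
            return windowW < virtualW || windowH < virtualH;
        }
        return true; // Case 1: equal or smaller is acceptable
    }

    public static void main(String[] args) {
        // Case 1: 960x540 virtual display on a 1920x1080 screen, equal-size window.
        System.out.println(isValid(1920, 1080, 960, 540, 960, 540));     // true
        // Case 2 violated: full-screen virtual display with a full-screen window.
        System.out.println(isValid(1920, 1080, 1920, 1080, 1920, 1080)); // false
    }
}
```

Either case keeps the sub-window smaller than the full-screen main window, which is what makes the result a picture-in-picture rather than a second full-screen surface.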
Optionally, the first application in this embodiment may be a video-type application, and a video picture is displayed in a window of the application. The first application may also be a game-like application, in which a game screen is displayed in a window. The types of the first application listed above are merely examples, and the embodiments of the present application do not limit this.
Optionally, the second application in the embodiment of the present application may be a social application (such as a chat application), and a window of the application displays a user interface of the social application. The second application may also be a shopping-type application having a window in which a user interface of the shopping application is displayed. The type of the second application listed above is merely an example, and the embodiment of the present application is not limited thereto.
Optionally, in this embodiment of the application, the first application and the second application may be the same type of application or different types of applications.
Depending on the above type of first application and the type of second application, several combinations are given below:
Combination 1: the first application is a video-type application and the second application is a social-type application. With this combination, the user can chat with friends through the social application while watching the video. Because the window of the video application is full screen and the window of the social application is small, the video picture watched by the user is not affected too much.
Combination 2: the first application is a video-type application and the second application is a shopping-type application. With this combination, the user can shop through the shopping application while watching the video. Because the window of the video application is full screen and the window of the shopping application is small, the video picture watched by the user is not affected too much.
Combination 3: the first application is a game-type application and the second application is a social-type application. With this combination, the user can chat with friends through the social application while playing the game. Because the window of the game application is full screen and the window of the social application is small, the user's game play is not affected too much.
Combination 4: the first application is a game-type application and the second application is a shopping-type application. With this combination, the user can shop through the shopping application while playing the game. Because the window of the game application is full screen and the window of the shopping application is small, the user's game play is not affected too much.
For an application that declares it does not support the picture-in-picture mode, the embodiment of the present application can make it support that mode by modifying the application's interface parameters. Thus, picture-in-picture display can be realized both for applications that support the mode and for applications that declare they do not.
Specifically, the flow shown in fig. 7 may further include: judging, according to the parameter value of a display interface defined by the second application, whether the second application supports being displayed in a window other than a full-screen window, that is, whether it supports being displayed in the small picture-in-picture window; if not, modifying the parameter value so that display in a window other than the full-screen window is supported.
Taking the Android system as an example, when an application defines its user interface, it can declare that it does not support multiple windows by setting the value of the resizable parameter to false, i.e., that it supports neither split-screen mode nor picture-in-picture mode; a value of true indicates that multiple windows are supported. In the conventional flow, if the value of the resizable parameter among an application's user interface parameters is false, the application cannot be displayed in the picture-in-picture widget (sub-window) and can only be displayed full screen. In the embodiment of the present application, if the value of the resizable parameter of the second application is false, it may be modified to true so that the second application supports the picture-in-picture mode, that is, the user interface of the second application may be displayed in the picture-in-picture sub-window.
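The parameter override described above can be sketched as a small transformation over an application's UI parameters. The sketch below models the parameters as a plain string map rather than a real Android manifest; the class name, map representation, and the bare key `"resizable"` are illustrative assumptions (in the actual Android SDK the related manifest attribute is `android:resizeableActivity`).

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the resizable-parameter override: the app's UI
// parameters are modeled as a string map instead of a parsed manifest.
public class ResizableOverride {
    // Returns a copy of the UI parameters in which "resizable" is forced
    // to "true", so the app can be shown in a non-full-screen (PiP) window.
    public static Map<String, String> ensureResizable(Map<String, String> uiParams) {
        Map<String, String> patched = new HashMap<>(uiParams);
        if (!"true".equals(patched.get("resizable"))) {
            patched.put("resizable", "true"); // false (or absent) -> overridden
        }
        return patched;
    }

    public static void main(String[] args) {
        Map<String, String> params = new HashMap<>();
        params.put("resizable", "false");
        System.out.println(ensureResizable(params).get("resizable")); // true
    }
}
```

Working on a copy keeps the app's declared parameters untouched; only the effective value used when creating the window is overridden.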
In order to more clearly illustrate the above embodiments of the present application, the following describes embodiments of the present application with an Android system as an example.
Fig. 8a illustrates the creation process of a display object (i.e., a virtual display screen), which may be triggered by a user input initiating the picture-in-picture mode in a full-screen window currently displaying the first application. In this process, a container Activity is first created as the container for the window (801); an ActivityView component is then created in the container Activity (802). The ActivityView is a UI control and is visible to the user.
During the creation of the ActivityView component, a SurfaceView component is first created (8021). In the Android system, the SurfaceView is a special view component with an independent drawing surface; its UI can be drawn in an independent thread without occupying main-thread resources. A display object (i.e., a virtual display screen) is then created (8022). This display object, i.e., the virtual display screen, serves as the carrier for the window of the second application, i.e., the sub-window in picture-in-picture mode.
After the ActivityView is created successfully, a callback is invoked (803).
In the embodiment of the present application, an interface such as getTargetDisplayId() is added to the ActivityView component to obtain the identifier of the display object. When the container Activity receives the ActivityView creation-success callback, it calls this interface to obtain the identifier (displayID) of the display object and records the displayID (804).
Along with the displayID, the recording may also include the package name of the application (805), i.e., the application that needs to be presented in the picture-in-picture sub-window. Optionally, the recording may be done as attributes, that is, the displayID of the display object and the application package name are recorded as system attributes.
The creation of the SurfaceView component and of the display object (i.e., the virtual display screen) is not controlled by the container Activity; in the application start-up stage, whether the started application is the one that needs to be displayed in the window created on the display object can be judged from the recorded displayID and application package name.
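The record-then-look-up step above can be sketched as follows. A simple static holder stands in for the system-attribute store the embodiment mentions (Android's `SystemProperties` is not part of the public SDK); the class name, the display id value, and the package names are hypothetical.

```java
import java.util.Optional;

// Hypothetical sketch: record the virtual display's id together with the
// package name of the app destined for the PiP sub-window, then consult
// that record at application start. A static holder stands in for the
// system attributes used in the embodiment.
public class PipTargetRecord {
    private static volatile String recordedPackage = null;
    private static volatile int recordedDisplayId = -1;

    // Called after the ActivityView creation callback yields the displayID (804/805).
    public static void record(int displayId, String packageName) {
        recordedDisplayId = displayId;
        recordedPackage = packageName;
    }

    // Consulted in the app start-up stage: does this package belong on the
    // virtual display rather than the default (full-screen) display?
    public static Optional<Integer> targetDisplayFor(String packageName) {
        if (packageName != null && packageName.equals(recordedPackage)) {
            return Optional.of(recordedDisplayId);
        }
        return Optional.empty();
    }

    public static void main(String[] args) {
        record(2, "com.example.chat"); // hypothetical display id and package
        System.out.println(targetDisplayFor("com.example.chat").orElse(-1));  // 2
        System.out.println(targetDisplayFor("com.example.video").orElse(-1)); // -1
    }
}
```

An empty result means the starting application is an ordinary one and gets the default full-screen display.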
Fig. 8b shows the start-up procedure of the second application in picture-in-picture mode.
As shown in fig. 8b, in the start-up phase of the second application, the resizable parameter value among the second application's user interface parameters is first obtained (810). If the value is true, a window for the second application is created on the display object corresponding to the recorded displayID, so that the second application is displayed in that window (812). The window of the second application is the picture-in-picture sub-window; its size is smaller than the window of the first application, which is the picture-in-picture main window.
If the parameter value is false, it is modified to true (811), and the window of the second application is then created on the display object corresponding to the recorded displayID, displaying the second application in that window (812).
Fig. 9a illustrates one relationship between the window of the first application, the window of the second application, and the display object. As shown, the window of the second application is created on a display object (drawn as a dashed box to indicate that the object is not visible); the size of the display object is smaller than that of the first application's window, and the size of the second application's window is substantially the same as that of the display object. In this way, although the window of the second application is created within the window of the first application, it is based on a display object that is independent of the first application's window, and is therefore relatively independent of that window.
Fig. 9b illustrates another relationship between the window of the first application, the window of the second application, and the display object. As shown, the window of the second application is created on a display object (drawn as a dashed box to indicate that the object is not visible); the size of the display object is substantially the same as that of the first application's window, and the size of the second application's window is smaller than that of the display object. In this way, although the window of the second application is created within the window of the first application, it is based on a display object that is independent of the first application's window, and is therefore relatively independent of that window.
Further embodiments of the present application also provide a non-transitory computer-readable storage medium having stored thereon computer instructions which, when executed by a processor, implement a method as described above in one or more of the embodiments in combination.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced, and such modifications or substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. A display device, comprising:
a touch screen for receiving a touch input of a user;
a display screen for displaying a user interface;
a controller coupled with the display screen and configured to:
in response to a user input to initiate a picture-in-picture mode in a full screen window displaying a first application;
starting a second application selected by a user;
displaying the second application in a window of the created virtual display screen;
wherein the window displaying the second application is smaller than the window displaying the first application.
2. The display device of claim 1, wherein the controller is further configured to:
judging whether the window except the full-screen window is supported to be displayed or not according to the parameter value of the display interface defined by the second application;
and if not, modifying the parameter value to support the window displayed except the full-screen window.
3. The display device of claim 1, wherein the controller is further configured to:
creating the virtual display screen;
and creating a display window for the virtual display screen.
4. The display device of claim 1, wherein the user input is a voice input.
5. The display device of claim 1, wherein the first application is a video-type application or a game-type application.
6. The display device of claim 1, wherein the second application is a chat-type application or a shopping-type application.
7. A method for displaying a picture-in-picture, comprising:
at a display device having a touchscreen, a display screen, and a controller:
in response to a user input to initiate a picture-in-picture mode in a full screen window displaying a first application;
starting a second application selected by a user;
displaying the second application in a window of the created virtual display screen;
wherein the window displaying the second application is smaller than the window displaying the first application.
8. The method of claim 7, wherein the method further comprises:
judging whether the window except the full-screen window is supported to be displayed according to the parameter value of the display interface defined by the second application;
and if not, modifying the parameter value to support the window displayed except the full-screen window.
9. The method of claim 8, wherein the method further comprises:
creating the virtual display screen;
and creating a display window for the virtual display screen.
10. A computer-readable non-transitory storage medium having stored thereon computer instructions which, when executed by a controller, implement the method of any one of claims 7-9.
CN201910893043.XA 2019-09-20 2019-09-20 Picture-in-picture display method and display equipment Pending CN110659010A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910893043.XA CN110659010A (en) 2019-09-20 2019-09-20 Picture-in-picture display method and display equipment


Publications (1)

Publication Number Publication Date
CN110659010A true CN110659010A (en) 2020-01-07

Family

ID=69038249

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910893043.XA Pending CN110659010A (en) 2019-09-20 2019-09-20 Picture-in-picture display method and display equipment

Country Status (1)

Country Link
CN (1) CN110659010A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111258699A (en) * 2020-01-21 2020-06-09 青岛海信移动通信技术股份有限公司 Page display method and communication terminal
CN111367456A (en) * 2020-02-28 2020-07-03 青岛海信移动通信技术股份有限公司 Communication terminal and display method in multi-window mode
CN111913621A (en) * 2020-07-29 2020-11-10 海信视像科技股份有限公司 Screen interface interactive display method and display equipment
CN111913622A (en) * 2020-07-29 2020-11-10 海信视像科技股份有限公司 Screen interface interactive display method and display equipment
CN112764619A (en) * 2021-01-22 2021-05-07 联想(北京)有限公司 Window control method and electronic equipment
CN113296662A (en) * 2021-04-14 2021-08-24 惠州市德赛西威汽车电子股份有限公司 Method for realizing multi-screen virtual display by single physical screen and storage medium
CN113938633A (en) * 2020-06-29 2022-01-14 聚好看科技股份有限公司 Video call processing method and display device
CN114721752A (en) * 2020-12-18 2022-07-08 青岛海信移动通信技术股份有限公司 Mobile terminal and display method of application interface thereof

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103581754A (en) * 2012-07-20 2014-02-12 腾讯科技(深圳)有限公司 Data display method and intelligent equipment
CN103986961A (en) * 2014-05-26 2014-08-13 惠州华阳通用电子有限公司 Method and device for achieving picture in picture based on QNX vehicle-mounted system
CN108319491A (en) * 2010-10-19 2018-07-24 苹果公司 Working space in managing user interface
CN108664300A (en) * 2018-04-03 2018-10-16 青岛海信移动通信技术股份有限公司 A kind of application interface display methods under picture-in-picture mode and device
CN109358941A (en) * 2018-11-01 2019-02-19 联想(北京)有限公司 A kind of control method and electronic equipment



Similar Documents

Publication Publication Date Title
CN109618206B (en) Method and display device for presenting user interface
CN110659010A (en) Picture-in-picture display method and display equipment
CN111277884A (en) Video playing method and device
CN111836109A (en) Display device, server and method for automatically updating column frame
WO2021031623A1 (en) Display apparatus, file sharing method, and server
US11425466B2 (en) Data transmission method and device
CN111770370A (en) Display device, server and media asset recommendation method
CN111479145A (en) Display device and television program pushing method
CN112165641A (en) Display device
CN112380420A (en) Searching method and display device
CN111954059A (en) Screen saver display method and display device
CN111757024A (en) Method for controlling intelligent image mode switching and display equipment
CN111176603A (en) Image display method for display equipment and display equipment
CN112073787B (en) Display device and home page display method
CN112087671B (en) Display method and display equipment for control prompt information of input method control
WO2021184575A1 (en) Display device and display method
CN112203154A (en) Display device
CN112017415A (en) Recommendation method of virtual remote controller, display device and mobile terminal
CN110572519A (en) Method for intercepting caller identification interface and display equipment
CN111988646B (en) User interface display method and display device of application program
CN111259639B (en) Self-adaptive adjustment method of table and display equipment
CN111479146B (en) Display apparatus and display method
CN114501158B (en) Display device, external sound equipment and audio output method of external sound equipment
CN112261463A (en) Display device and program recommendation method
CN115185392A (en) Display device, image processing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200107