CN111935530B - Display equipment - Google Patents
- Publication number
- CN111935530B CN111935530B CN202010758465.9A CN202010758465A CN111935530B CN 111935530 B CN111935530 B CN 111935530B CN 202010758465 A CN202010758465 A CN 202010758465A CN 111935530 B CN111935530 B CN 111935530B
- Authority
- CN
- China
- Prior art keywords
- floating layer
- display
- instruction
- layer window
- application
- Prior art date
- Legal status: Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0486—Drag-and-drop
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
- G08C17/02—Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C23/00—Non-electrical signal transmission systems, e.g. optical systems
- G08C23/04—Non-electrical signal transmission systems, e.g. optical systems using light waves, e.g. infrared
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42204—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
- H04N21/42206—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
- H04N21/42221—Transmission circuitry, e.g. infrared [IR] or radio frequency [RF]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
- H04N21/4316—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/443—OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
- H04N21/4438—Window management, e.g. event handling following interaction with the user interface
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Computer Networks & Wireless Communication (AREA)
- Software Systems (AREA)
- Business, Economics & Management (AREA)
- Marketing (AREA)
- Controls And Circuits For Display Device (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The display device provided by the embodiments of the present application comprises: a display, a user interface, a controller and a remote controller. The user interface is configured to receive an instruction input by the user through the remote controller and transmit the instruction to the controller. The controller is configured to, in response to receiving a start instruction, create a first floating layer window corresponding to the currently pulled-up application, the floating layer window being used to load the application page of the currently pulled-up application, and to control the first floating layer window based on a received interaction instruction. Because the controller can control the first floating layer window based on the received interaction instruction, so that the first floating layer window realizes the corresponding function, the display device provided by this application enables interaction between the remote controller and the first floating layer window.
Description
Technical Field
The present application relates to the technical field of social televisions, and in particular to a display device.
Background
Currently, display devices are receiving extensive attention from users because they can provide playback content such as audio, video and pictures. With the development of big data and artificial intelligence, users' functional demands on display devices keep increasing. For example, a user may want multiple video chat pictures to be presented while a playback picture is being displayed; or, in a game scene, the real pictures of the participants may need to be displayed in real time. In such application scenarios, the display device is required to create multiple windows and to display different pictures in different windows.
Existing display devices adopt the Android system, and Android natively supports the touch interaction mode of smartphones. However, a television screen cannot be touched, and existing display devices only support remote control operation, while the remote controller of an existing display device cannot interact with a window presented on the display.
Disclosure of Invention
The present application provides a display device to solve the problems of existing display devices.
An embodiment of the present application provides a display device, comprising: a display, configured to display an application page; a user interface, configured to receive instructions input by a user, the instructions comprising a start instruction and an interaction instruction; and a controller configured to: in response to receiving the start instruction, create a first floating layer window corresponding to the currently pulled-up application, the floating layer window being used to load the application page of the currently pulled-up application; and control the first floating layer window based on the received interaction instruction.
The display device provided by the embodiments of the present application comprises: a display, a user interface, a controller and a remote controller. The user interface is configured to receive an instruction input by the user through the remote controller and transmit the instruction to the controller. The controller is configured to, in response to receiving a start instruction, create a first floating layer window corresponding to the currently pulled-up application, the floating layer window being used to load the application page of the currently pulled-up application, and to control the first floating layer window based on a received interaction instruction. Because the controller can control the first floating layer window based on the received interaction instruction, so that the first floating layer window realizes the corresponding function, the display device provided by this application enables interaction between the remote controller and the first floating layer window.
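For illustration only, the following minimal sketch, which is not part of the original disclosure, shows one way such a first floating layer window could be created on an Android-based display device when the start instruction is received. The class name FloatLayerController and the concrete size, position and flag values are assumptions; the WindowManager calls themselves are standard Android framework APIs.

```java
import android.content.Context;
import android.graphics.PixelFormat;
import android.view.Gravity;
import android.view.View;
import android.view.WindowManager;

// Illustrative helper (names and values assumed): creates the "first floating
// layer window" above the current page when a start instruction is received.
public class FloatLayerController {
    private final WindowManager windowManager;
    private View floatLayerView;

    public FloatLayerController(Context context) {
        this.windowManager =
                (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
    }

    // Corresponds to step S102 described later: create the window and load into
    // it the application page of the currently pulled-up application.
    public void createFirstFloatLayerWindow(View appPageView) {
        WindowManager.LayoutParams params = new WindowManager.LayoutParams();
        params.type = WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY; // assumed window type
        params.format = PixelFormat.TRANSLUCENT;
        params.flags = WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL;
        params.gravity = Gravity.TOP | Gravity.END; // e.g. the upper-right corner shown in FIG. 8
        params.width = 640;   // illustrative size only
        params.height = 360;

        floatLayerView = appPageView;
        windowManager.addView(floatLayerView, params);
    }
}
```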
Drawings
In order to more clearly illustrate the embodiments of the present application or the implementations in the related art, the drawings required for describing the embodiments or the related art are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and that other drawings may be obtained from these drawings by those of ordinary skill in the art.
A schematic diagram of an operational scenario between a display device and a control apparatus according to some embodiments is exemplarily provided in fig. 1;
a hardware configuration block diagram of a display device 200 according to some embodiments is exemplarily provided in fig. 2;
a hardware configuration block diagram of the control device 100 according to some embodiments is exemplarily provided in fig. 3;
a schematic diagram of the software configuration in the display device 200 according to some embodiments is exemplarily provided in fig. 4;
an icon control page display schematic of an application in a display device 200 according to some embodiments is provided in fig. 5 by way of example;
a flowchart of interactions between components of a display device according to some embodiments is provided schematically in fig. 6;
FIG. 7 is a schematic diagram of a display showing pages in a multi-screen display mode according to one possible embodiment;
FIG. 8 is a schematic diagram of a display showing pages in a multi-screen display mode according to one possible embodiment;
FIG. 9 is a schematic diagram of a remote control provided in accordance with a possible embodiment;
FIG. 10 is a schematic diagram of a display page of a display provided in accordance with one possible embodiment;
FIG. 11 is a diagram showing a change in a page displayed on a display before and after receiving a start instruction according to one possible embodiment;
FIG. 12 is a schematic diagram showing a change of a display page in a process of controlling a first floating layer window to be enlarged according to one possible embodiment;
FIG. 13 is a schematic diagram illustrating a change of a display page in a process of controlling a first floating layer window to be scaled down according to one possible embodiment;
FIG. 14 is a schematic diagram of a page presented by a display when a controller enters a drag mode according to one possible embodiment;
FIG. 15 is a diagram showing the change of a page displayed by the display in the process of controlling the movement of the first floating layer window according to one possible embodiment;
FIG. 16 is a diagram showing the change of a page displayed by a display in controlling the movement of a first floating layer window according to one possible embodiment;
FIG. 17 is a schematic diagram of the page presented by the display before and after the creation of a second floating layer window according to one possible embodiment;
FIG. 18 is a schematic diagram of a display presentation page provided in accordance with a possible embodiment;
FIG. 19 is a schematic diagram of a change of the page presented by the display according to one possible embodiment;
FIG. 20 is a schematic diagram of a change of the page presented by the display according to one possible embodiment.
Detailed Description
For the purposes of making the objectives, embodiments and advantages of the present application clearer, the exemplary embodiments of the present application will be described below clearly and completely with reference to the accompanying drawings. It is apparent that the described exemplary embodiments are only some, but not all, of the embodiments of the present application.
Based on the exemplary embodiments described herein, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the scope of the appended claims. Furthermore, while the disclosure is presented in the context of one or more exemplary embodiments, it should be appreciated that individual aspects of the disclosure may separately constitute a complete embodiment.
It should be noted that the brief description of the terms in the present application is only for convenience in understanding the embodiments described below, and is not intended to limit the embodiments of the present application. Unless otherwise indicated, these terms should be construed in their ordinary and customary meaning.
The terms "first", "second", "third" and the like in the description, the claims and the above drawings are used for distinguishing between similar or analogous objects or entities, and are not necessarily intended to describe a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances, such that the embodiments of the application can, for example, be implemented in orders other than those illustrated or described herein.
Furthermore, the terms "comprise" and "have," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to those elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" as used in this application refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the function associated with that element.
The term "remote control" as used in this application refers to a component of an electronic device (such as a display device as disclosed in this application) that can typically be controlled wirelessly over a relatively short distance. Typically, the electronic device is connected to the electronic device using infrared and/or Radio Frequency (RF) signals and/or bluetooth, and may also include functional modules such as WiFi, wireless USB, bluetooth, motion sensors, etc. For example: the hand-held touch remote controller replaces most of the physical built-in hard keys in a general remote control device with a touch screen user interface.
The term "gesture" as used herein refers to a user behavior by which a user expresses an intended idea, action, purpose, and/or result through a change in hand shape or movement of a hand, etc.
A schematic diagram of an operation scenario between the display device and the control apparatus according to an embodiment is exemplarily provided in fig. 1. As provided in fig. 1, a user may operate the display apparatus 200 through the mobile terminal 300 and the control device 100.
In some embodiments, the control apparatus 100 may be a remote controller, and the communication between the remote controller and the display device includes infrared protocol communication, Bluetooth protocol communication and other short-range communication modes; the display device 200 is controlled in a wireless or other wired manner. The user may control the display device 200 by inputting user instructions through keys on the remote controller, voice input, control panel input, etc. For example, the user can input corresponding control instructions through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key and power key on the remote controller to control the functions of the display device 200.
In some embodiments, mobile terminals, tablet computers, notebook computers and other smart devices may also be used to control the display device 200. For example, the display device 200 is controlled using an application running on the smart device. Through configuration, the application may provide the user with various controls in an intuitive user interface (UI) on a screen associated with the smart device.
In some embodiments, the mobile terminal 300 may install a software application with the display device 200, implement connection communication through a network communication protocol, and achieve the purpose of one-to-one control operation and data communication. Such as: it is possible to implement a control command protocol established between the mobile terminal 300 and the display device 200, synchronize a remote control keyboard to the mobile terminal 300, and implement a function of controlling the display device 200 by controlling a user interface on the mobile terminal 300. The audio/video content displayed on the mobile terminal 300 can also be transmitted to the display device 200, so as to realize the synchronous display function.
As also provided in fig. 1, the display device 200 is also in data communication with the server 400 via a variety of communication means. The display device 200 may be permitted to make communication connections via a Local Area Network (LAN), a Wireless Local Area Network (WLAN), and other networks. The server 400 may provide various contents and interactions to the display device 200. By way of example, display device 200 receives software program updates, or accesses a remotely stored digital media library by sending and receiving information, as well as Electronic Program Guide (EPG) interactions. The server 400 may be a cluster, or may be multiple clusters, and may include one or more types of servers. Other web service content such as video on demand and advertising services are provided through the server 400.
The display device 200 may be a liquid crystal display, an OLED display, a projection display device. The particular display device type, size, resolution, etc. are not limited, and those skilled in the art will appreciate that the display device 200 may be modified in performance and configuration as desired.
In addition to the broadcast receiving television function, the display apparatus 200 may additionally provide a smart network television function with computer support, including but not limited to network television, smart television, Internet Protocol Television (IPTV), and the like.
A hardware configuration block diagram of the display device 200 in accordance with an exemplary embodiment is exemplarily provided in fig. 2.
In some embodiments, at least one of the controller 250, the modem 210, the communicator 220, the detector 230, the input/output interface 255, the display 275, the audio output interface 285, the memory 260, the power supply 290, the user interface 265, and the external device interface 240 is included in the display apparatus 200.
In some embodiments, the display 275 is configured to receive image signals from the first processor output, and to display video content and images and components of the menu manipulation interface.
In some embodiments, display 275 includes a display screen assembly for presenting pictures, and a drive assembly for driving the display of images.
In some embodiments, the displayed video content may come from broadcast television content, or from various broadcast signals that may be received via a wired or wireless communication protocol. Alternatively, various image contents sent by a network server and received via a network communication protocol may be displayed.
In some embodiments, the display 275 is used to present a user-manipulated UI interface generated in the display device 200 and used to control the display device 200.
In some embodiments, depending on the type of display 275, a drive assembly for driving the display is also included.
In some embodiments, display 275 is a projection display and may further include a projection device and a projection screen.
In some embodiments, communicator 220 is a component for communicating with external devices or external servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi chip, a bluetooth communication protocol chip, a wired ethernet communication protocol chip, or other network communication protocol chip or a near field communication protocol chip, and an infrared receiver.
In some embodiments, the display device 200 may establish control signal and data signal transmission and reception between the communicator 220 and the external control device 100 or the content providing device.
In some embodiments, the user interface 265 may be used to receive infrared control signals from the control device 100 (e.g., an infrared remote control, etc.).
In some embodiments, the detector 230 is used by the display device 200 to collect signals from the external environment or to interact with the external environment.
In some embodiments, the detector 230 includes an optical receiver, i.e. a sensor for collecting the intensity of ambient light, so that display parameters and the like can be adaptively changed according to the collected ambient light.
In some embodiments, the detector 230 may further include an image collector, such as a camera, a video camera, etc., which may be used to collect external environmental scenes, collect attributes of a user or interact with a user, adaptively change display parameters, and recognize a user gesture to realize an interaction function with the user.
In some embodiments, the detector 230 may also include a temperature sensor or the like, such as by sensing ambient temperature.
In some embodiments, the display device 200 may adaptively adjust the display color temperature of the image. The display device 200 may be adjusted to display a colder color temperature shade of the image, such as when the temperature is higher, or the display device 200 may be adjusted to display a warmer color shade of the image when the temperature is lower.
In some embodiments, the detector 230 may also include a sound collector or the like, such as a microphone, which may be used to receive the user's voice, for example a voice signal containing a control instruction for the user to control the display apparatus 200, or to collect environmental sounds for recognizing the type of environmental scene, so that the display apparatus 200 can adapt to the environmental noise.
In some embodiments, as shown in fig. 2, the input/output interface 255 is configured to enable data transfer between the controller 250 and external other devices or other controllers 250. Such as receiving video signal data and audio signal data of an external device, command instruction data, or the like.
In some embodiments, external device interface 240 may include, but is not limited to, the following: any one or more interfaces of a high definition multimedia interface HDMI interface, an analog or data high definition component input interface, a composite video input interface, a USB input interface, an RGB port, and the like can be used. The plurality of interfaces may form a composite input/output interface.
In some embodiments, as shown in fig. 2, the modem 210 is configured to receive the broadcast television signal by a wired or wireless receiving manner, and may perform modulation and demodulation processes such as amplification, mixing, and resonance, and demodulate the audio/video signal from a plurality of wireless or wired broadcast television signals, where the audio/video signal may include a television audio/video signal carried in a television channel frequency selected by a user, and an EPG data signal.
In some embodiments, the frequency point demodulated by the modem 210 is controlled by the controller 250, and the controller 250 may send a control signal according to the user selection, so that the modem responds to the television signal frequency selected by the user and modulates and demodulates the television signal carried by the frequency.
In some embodiments, the broadcast television signal may be classified into a terrestrial broadcast signal, a cable broadcast signal, a satellite broadcast signal, an internet broadcast signal, or the like according to a broadcasting system of the television signal. Or may be differentiated into digital modulation signals, analog modulation signals, etc., depending on the type of modulation. Or it may be classified into digital signals, analog signals, etc. according to the kind of signals.
In some embodiments, the controller 250 and the modem 210 may be located in separate devices, i.e., the modem 210 may also be located in an external device to the main device in which the controller 250 is located, such as an external set-top box or the like. In this way, the set-top box outputs the television audio and video signals modulated and demodulated by the received broadcast television signals to the main body equipment, and the main body equipment receives the audio and video signals through the first input/output interface.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored on the memory. The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command to select to display a UI object on the display 275, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink or an icon. Operations related to the selected object, such as: operations to connect to a hyperlink page, document, image, etc., or operations to execute a program corresponding to an icon are displayed. The user command for selecting the UI object may be an input command through various input means (e.g., mouse, keyboard, touch pad, etc.) connected to the display device 200 or a voice command corresponding to a voice uttered by the user.
As shown in fig. 2, the controller 250 includes at least one of a random access memory 251 (Random Access Memory, RAM), a read-only memory 252 (Read-Only Memory, ROM), a video processor 270, an audio processor 280, other processors 253 (e.g., a graphics processor (Graphics Processing Unit, GPU)), a central processing unit 254 (Central Processing Unit, CPU), a communication interface (Communication Interface), and a communication bus 256 (Bus) which connects the respective components.
In some embodiments, RAM 251 is used to store temporary data for the operating system or other on-the-fly programs, and in some embodiments ROM 252 is used to store various system boot instructions.
In some embodiments, the ROM 252 is used to store a Basic Input Output System (BIOS), which comprises a driver program and a boot operating system and is used for completing the power-on self-test of the system, the initialization of each functional module in the system, and the basic input/output functions of the system.
In some embodiments, upon receiving a power-on signal, the display device 200 starts up, and the CPU runs the system boot instructions in the ROM 252 and copies the temporary data of the operating system stored in the memory into the RAM 251 in order to start or run the operating system. After the operating system is started, the CPU copies the temporary data of the various application programs in the memory into the RAM 251, so as to facilitate starting or running the various application programs.
In some embodiments, CPU processor 254 is used to execute operating system and application program instructions stored in memory. And executing various application programs, data and contents according to various interactive instructions received from the outside, so as to finally display and play various audio and video contents.
In some exemplary embodiments, the CPU processor 254 may comprise a plurality of processors. The plurality of processors may include one main processor and one or more sub-processors. The main processor is used to perform some operations of the display apparatus 200 in the pre-power-up mode and/or to display pictures in the normal mode. The one or more sub-processors are used for operations in a standby mode or the like.
In some embodiments, the graphics processor 253 is configured to generate various graphical objects, such as icons, operation menus, and graphics displayed for user input instructions. The graphics processor comprises an arithmetic unit, which performs operations on the various interaction instructions input by the user and displays various objects according to their display attributes, and a renderer, which renders the various objects obtained by the arithmetic unit; the rendered objects are then displayed on the display.
In some embodiments, the video processor 270 is configured to receive an external video signal and perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion and image synthesis according to the standard codec protocol of the input signal, so as to obtain a signal that can be displayed or played directly on the display device 200.
In some embodiments, video processor 270 includes a demultiplexing module, a video decoding module, an image compositing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is used for demultiplexing the input audio/video data stream, such as the input MPEG-2, and demultiplexes the input audio/video data stream into video signals, audio signals and the like.
And the video decoding module is used for processing the demultiplexed video signals, including decoding, scaling and the like.
The image synthesis module, for example an image synthesizer, is used for superimposing and mixing the GUI signal input by the user or generated by the graphics generator with the scaled video image, so as to generate an image signal for display.
The frame rate conversion module is configured to convert the input video frame rate, for example, converting the 60Hz frame rate into the 120Hz frame rate or the 240Hz frame rate, and the common format is implemented in an inserting frame manner.
The display formatting module is used for converting the received video signal after frame rate conversion into a video output signal conforming to the display format, for example outputting RGB data signals.
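As a trivial, hedged illustration of the frame insertion mentioned for the frame rate conversion module above (not taken from the patent), the sketch below simply duplicates every frame to turn a 60 Hz sequence into a 120 Hz one; a real frame rate conversion module would typically interpolate motion between frames instead.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative only: the simplest possible form of frame insertion, duplicating
// every input frame so a 60 Hz sequence becomes a 120 Hz sequence.
public final class FrameRateDoubler {

    public static <F> List<F> doubleFrameRate(List<F> inputFrames) {
        List<F> output = new ArrayList<>(inputFrames.size() * 2);
        for (F frame : inputFrames) {
            output.add(frame);
            output.add(frame); // inserted frame (a plain copy in this trivial sketch)
        }
        return output;
    }

    private FrameRateDoubler() { }
}
```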
In some embodiments, the graphics processor 253 may be integrated with the video processor, or may be configured separately. In the integrated configuration, the processing of graphics signals output to the display is performed together; in the separate configuration, different functions are performed respectively, for example a GPU + FRC (Frame Rate Conversion) architecture.
In some embodiments, the audio processor 280 is configured to receive an external audio signal, decompress and decode the audio signal according to a standard codec protocol of an input signal, and perform noise reduction, digital-to-analog conversion, and amplification processing, so as to obtain a sound signal that can be played in a speaker.
In some embodiments, video processor 270 may include one or more chips. The audio processor may also comprise one or more chips.
In some embodiments, video processor 270 and audio processor 280 may be separate chips or may be integrated together with the controller in one or more chips.
In some embodiments, the audio output, under the control of the controller 250, receives the sound signal output by the audio processor 280, for example through the speaker 286. In addition to the speaker carried by the display device 200 itself, the sound can also be output to a sound generating device of an external device through an external sound output terminal, such as an external sound interface or an earphone interface. The communication interface may also include a near field communication module, for example a Bluetooth module for outputting sound through a Bluetooth speaker.
The power supply 290 supplies power to the display device 200 from an external power source under the control of the controller 250. The power supply 290 may include a built-in power circuit installed inside the display device 200, or may be an external power supply installed on the display device 200, with a power interface on the display device 200 for connecting the external power supply.
The user interface 265 is used to receive an input signal from a user and then transmit the received user input signal to the controller 250. The user input signal may be a remote control signal received through an infrared receiver, and various user control signals may be received through a network communication module.
In some embodiments, a user inputs a user command through the control apparatus 100 or the mobile terminal 300, the user input interface is then responsive to the user input through the controller 250, and the display device 200 is then responsive to the user input.
In some embodiments, a user may input a user command through a Graphical User Interface (GUI) displayed on the display 275, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
In some embodiments, a "user interface" is a media interface for interaction and exchange of information between an application or operating system and a user that enables conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form of the user interface is a graphical user interface (Graphic User Interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
The memory 260 includes memory storing various software modules for driving the display device 200. Such as: various software modules stored in the first memory, including: at least one of a base module, a detection module, a communication module, a display control module, a browser module, various service modules, and the like.
The base module is a bottom software module for signal communication between the various hardware in the display device 200 and for sending processing and control signals to the upper modules. The detection module is used for collecting various information from various sensors or user input interfaces and carrying out digital-to-analog conversion and analysis management.
For example, the voice recognition module includes a voice analysis module and a voice instruction database module. The display control module is used for controlling the display to display the image content, and can be used for playing the multimedia image content, the UI interface and other information. And the communication module is used for carrying out control and data communication with external equipment. And the browser module is used for executing data communication between the browsing servers. And the service module is used for providing various services and various application programs. Meanwhile, the memory 260 also stores received external data and user data, images of various items in various user interfaces, visual effect maps of focus objects, and the like.
Fig. 3 exemplarily provides a block diagram of a configuration of the control apparatus 100 in accordance with an exemplary embodiment. As shown in fig. 3, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface, a memory, and a power supply.
The control device 100 is configured to control the display device 200: it receives operation instructions input by the user and converts the operation instructions into instructions that the display device 200 can recognize and respond to, enabling interaction between the user and the display device 200. For example, the user operates the channel up/down keys on the control apparatus 100, and the display apparatus 200 responds to the channel up/down operation.
In some embodiments, the control device 100 may be a smart device. Such as: the control apparatus 100 may install various applications for controlling the display apparatus 200 according to user's needs.
In some embodiments, as shown in fig. 1, the mobile terminal 300 or another intelligent electronic device may function similarly to the control device 100 after installing an application that operates the display device 200. For example, the user may implement the functions of the physical keys of the control device 100 through the various function keys or virtual buttons of a graphical user interface provided on the mobile terminal 300 or other intelligent electronic device.
The controller 110 includes a processor 112 and RAM 113 and ROM 114, a communication interface 130, and a communication bus. The controller is used to control the operation and operation of the control device 100, as well as the communication collaboration among the internal components and the external and internal data processing functions.
The communication interface 130 enables communication of control signals and data signals with the display device 200 under the control of the controller 110. Such as: the received user input signal is transmitted to the display device 200. The communication interface 130 may include at least one of a WiFi chip 131, a bluetooth module 132, an NFC module 133, and other near field communication modules.
A user input/output interface 140, wherein the input interface includes at least one of a microphone 141, a touchpad 142, a sensor 143, keys 144, and other input interfaces. Such as: the user can implement a user instruction input function through actions such as voice, touch, gesture, press, and the like, and the input interface converts a received analog signal into a digital signal and converts the digital signal into a corresponding instruction signal, and sends the corresponding instruction signal to the display device 200.
The output interface includes an interface that transmits the received user instruction to the display device 200. In some embodiments, an infrared interface or a radio frequency interface may be used. For example, when the infrared signal interface is used, the user input instruction needs to be converted into an infrared control signal according to an infrared control protocol, which is then sent to the display device 200 through the infrared sending module. As another example, when the radio frequency signal interface is used, the user input instruction is converted into a digital signal, which is modulated according to a radio frequency control signal modulation protocol and then transmitted to the display device 200 through the radio frequency transmission terminal.
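A hedged sketch of the infrared path just described is given below for illustration only. It uses Android's ConsumerIrManager on the control-device side; the 38 kHz carrier and the mark/space pattern passed in by the caller are placeholders, not any real remote-control protocol.

```java
import android.content.Context;
import android.hardware.ConsumerIrManager;

// Illustrative only: converts a user key instruction into an infrared burst on a
// control device that has an IR emitter. The carrier frequency and pattern are
// made-up placeholders, not an actual remote-control protocol.
public class IrCommandSender {
    private final ConsumerIrManager irManager;

    public IrCommandSender(Context context) {
        this.irManager =
                (ConsumerIrManager) context.getSystemService(Context.CONSUMER_IR_SERVICE);
    }

    // markSpacePattern: alternating on/off durations in microseconds (assumed values).
    public void sendKey(int[] markSpacePattern) {
        if (irManager != null && irManager.hasIrEmitter()) {
            irManager.transmit(38000, markSpacePattern);
        }
    }
}
```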
In some embodiments, the control device 100 includes at least one of a communication interface 130 and an input-output interface 140. The control device 100 is provided with a communication interface 130 such as: the WiFi, bluetooth, NFC, etc. modules may send the user input instruction to the display device 200 through a WiFi protocol, or a bluetooth protocol, or an NFC protocol code.
A memory 190 is used for storing various operation programs, data and applications that drive and control the control device 100, under the control of the controller 110. The memory 190 may store various control signal instructions input by the user.
A power supply 180 for providing operating power support for the various elements of the control device 100 under the control of the controller. May be a battery and associated control circuitry.
In some embodiments, the system may include a kernel (Kernel), a command parser (shell), a file system and applications. The kernel, the shell and the file system together form the basic operating system structure that allows users to manage files, run programs and use the system. After power-up, the kernel is started, the kernel space is activated, hardware is abstracted, hardware parameters are initialized, and virtual memory, the scheduler, signals and inter-process communication (IPC) are operated and maintained. After the kernel is started, the shell and the user application programs are loaded. An application program is compiled into machine code after being started, forming a process.
Referring to FIG. 4, in some embodiments, the system is divided into four layers, from top to bottom: an application layer (referred to as the "application layer"), an application framework layer (Application Framework layer, referred to as the "framework layer"), an Android Runtime and system library layer (referred to as the "system runtime layer"), and a kernel layer.
In some embodiments, at least one application program is running in the application program layer, and these application programs may be a Window (Window) program of an operating system, a system setting program, a clock program, a camera application, and the like; and may be an application program developed by a third party developer, such as a hi-see program, a K-song program, a magic mirror program, etc. In particular implementations, the application packages in the application layer are not limited to the above examples, and may actually include other application packages, which are not limited in this embodiment of the present application.
The framework layer provides an application programming interface (API) and a programming framework for the application programs of the application layer. The application framework layer includes a number of predefined functions and acts as a processing center that decides how the applications in the application layer act. Through the API interface, an application program can access the resources in the system and obtain the services of the system during execution. As shown in fig. 4, in the embodiment of the present application, the application framework layer includes a manager (Manager), a Content Provider (Content Provider), and the like, where the manager includes at least one of the following modules: an Activity Manager (Activity Manager), used to interact with all activities running in the system; a Location Manager (Location Manager), used to provide system services or applications with access to the system location services; a Package Manager (Package Manager), used for retrieving various information about the application packages currently installed on the device; a Notification Manager (Notification Manager), used for controlling the display and clearing of notification messages; and a Window Manager (Window Manager), used to manage icons, windows, toolbars, wallpaper and desktop components on the user interface.
In some embodiments, the activity manager is to: the lifecycle of each application program is managed, as well as the usual navigation rollback functions, such as controlling the exit of the application program (including switching the currently displayed user interface in the display window to the system desktop), opening, backing (including switching the currently displayed user interface in the display window to the previous user interface of the currently displayed user interface), etc.
In some embodiments, the window manager is configured to manage all window procedures, such as obtaining a display screen size, determining whether there is a status bar, locking the screen, intercepting the screen, controlling display window changes (e.g., scaling the display window down, dithering, distorting, etc.), and so on.
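A minimal sketch, assuming the floating layer window was added through the Android WindowManager, of how such a window could be scaled or moved by updating its layout parameters. The helper class name, and the idea of driving it from amplification, reduction or drag instructions, are assumptions for illustration.

```java
import android.view.View;
import android.view.WindowManager;

// Illustrative helper (names assumed): resizes or repositions an existing
// floating layer window by updating its layout parameters.
public class FloatLayerWindowUpdater {
    private final WindowManager windowManager;

    public FloatLayerWindowUpdater(WindowManager windowManager) {
        this.windowManager = windowManager;
    }

    // Amplification / reduction instruction: scale width and height by "factor".
    public void scale(View window, WindowManager.LayoutParams params, float factor) {
        params.width = Math.round(params.width * factor);
        params.height = Math.round(params.height * factor);
        windowManager.updateViewLayout(window, params);
    }

    // Drag mode: move the window by (dx, dy) pixels.
    public void move(View window, WindowManager.LayoutParams params, int dx, int dy) {
        params.x += dx;
        params.y += dy;
        windowManager.updateViewLayout(window, params);
    }
}
```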
In some embodiments, the system runtime layer provides support for the upper layer, the framework layer, and when the framework layer is in use, the android operating system runs the C/C++ libraries contained in the system runtime layer to implement the functions to be implemented by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 4, the kernel layer contains at least one of the following drivers: audio drive, display drive, bluetooth drive, camera drive, WIFI drive, USB drive, HDMI drive, sensor drive (e.g., fingerprint sensor, temperature sensor, touch sensor, pressure sensor, etc.), and the like.
In some embodiments, the kernel layer further includes a power driver module for power management.
In some embodiments, the software programs and/or modules corresponding to the software architecture in fig. 4 are stored in the first memory or the second memory shown in fig. 2 or fig. 3.
In some embodiments, taking the magic mirror application (a photographing application) as an example, when the remote control receiving device receives an input operation of the remote control, a corresponding hardware interrupt is sent to the kernel layer. The kernel layer processes the input operation into an original input event (including the value of the input operation, the timestamp of the input operation, etc.), and the original input event is stored at the kernel layer. The application framework layer obtains the original input event from the kernel layer, identifies the control corresponding to the input event according to the current position of the focus, and treats the input operation as a confirmation operation. The control corresponding to the confirmation operation is the control of the magic mirror application icon; the magic mirror application calls an interface of the application framework layer to start the magic mirror application, and further starts the camera driver by calling the kernel layer, so that a still image or video is captured through the camera.
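For illustration only, the sketch below shows an application-layer counterpart of this flow: an activity treats the remote controller's OK key on the focused icon control as a confirmation and starts a camera capture, standing in for the magic mirror application. The activity name and the use of the generic image-capture intent are assumptions, not the patent's implementation.

```java
import android.app.Activity;
import android.content.Intent;
import android.provider.MediaStore;
import android.view.KeyEvent;

// Illustrative only: maps the remote controller's OK/center key, received while an
// icon control has focus, to a confirmation that starts a camera capture.
public class LauncherActivity extends Activity {

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        if (keyCode == KeyEvent.KEYCODE_DPAD_CENTER
                || keyCode == KeyEvent.KEYCODE_ENTER) {
            // Stands in for launching the photographing ("magic mirror") application.
            startActivity(new Intent(MediaStore.ACTION_IMAGE_CAPTURE));
            return true;
        }
        return super.onKeyDown(keyCode, event);
    }
}
```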
In some embodiments, for a display device with a touch function, taking a split-screen operation as an example, the display device receives an input operation (such as a split-screen operation) performed by the user on the display screen, and the kernel layer may generate a corresponding input event according to the input operation and report the event to the application framework layer. The activity manager of the application framework layer sets the window mode (e.g., multi-window mode) and the window position and size corresponding to the input operation. The window manager of the application framework layer draws a window according to the settings of the activity manager, then sends the drawn window data to the display driver of the kernel layer, and the display driver displays the application interfaces corresponding to the window data in different display areas of the display screen.
In some embodiments, as shown in fig. 5, the application layer contains at least one icon control that the application can display in the display, such as: a live television application icon control, a video on demand application icon control, a media center application icon control, an application center icon control, a game application icon control, and the like.
In some embodiments, the live television application may provide live television via different signal sources. For example, a live television application may provide television signals using inputs from cable television, radio broadcast, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on the display device 200.
In some embodiments, the video-on-demand application may provide video from different storage sources. Unlike a live television application, video on demand plays video from storage: for example, from cloud storage on the server side, or from a local hard disk that contains stored video programs.
In some embodiments, the media center application may provide various multimedia content playing applications. For example, a media center may be a different service than live television or video on demand, and a user may access various images or audio through a media center application.
In some embodiments, an application center may be provided to store various applications. An application may be a game or some other application that is associated with a computer system or another device but can be run on a smart television. The application center may obtain these applications from different sources, store them in local storage, and run them on the display device 200.
Existing display devices adopt the Android system, which is built around the touch interaction supported by smartphones. A television screen, however, cannot be touched, and a conventional display device only supports remote control operation, so the interaction logic between large and small windows remains unsolved. Accordingly, there is an urgent need for a display device that can implement interaction logic between large and small windows.
In order to solve the above technical problems, the embodiments of the present application provide a display device, and the functional connection relationships of the components of the display device may refer to the above embodiments, which are not described herein.
The interaction flow between the components of the display device may refer to fig. 6, in which the user interface is configured to execute step S101 to receive an instruction input by the user.
The instructions involved in this embodiment include a start instruction and an interaction instruction. The start instruction is used to control the display device to enter a multi-screen (multi-window) display mode, in which the display may present multiple pages. For example, fig. 7 is a schematic diagram of a display showing pages in the multi-screen display mode according to one possible embodiment. The application scenario in fig. 7 is a video call: two call pages are displayed, one being the video page of the home terminal and the other the video page of the opposite terminal. Fig. 8 is a schematic diagram of a display showing pages in the multi-screen display mode according to another possible embodiment. The application scenario in fig. 8 is chatting while watching: the display shows a video page, and a floating layer window created in the upper right corner of the video page shows the video page of the opposite terminal. These two examples are only illustrative; the display form of the pages can be set as required in practical applications and is not limited here.
The controller is configured to execute step S102: in response to receiving the start instruction, create a first floating layer window corresponding to the currently pulled-up application, the floating layer window being used to load the current application page.
The application page in this embodiment can be, but is not limited to, a video picture, a search page, a home page, a call page, and the like; any page that can be displayed on the display in practical applications can be called an application page.
The correspondence between the start instruction and remote controller keys is bound in advance in practical applications. For example, a multi-screen display start key may be added to an existing remote controller: referring to fig. 9, a schematic diagram of a remote controller according to one possible embodiment, the remote controller is provided with a multi-screen display start key 101, and each time the user touches this key the remote controller sends a start instruction to the controller. Alternatively, the start instruction may be bound in advance to a sequence of remote controller keys, and the remote controller issues the start instruction when the user touches the bound key sequence. In one possible embodiment, the bound keys are (right, down, right, down), and the remote controller sends the start instruction to the controller only if the user touches (right, down, right, down) consecutively within a preset time. This binding method prevents the start instruction from being issued by accidental operation; a minimal detector sketch is given below. These are only example bindings between the start instruction and keys; the binding can be set according to user habits in practical applications and is not limited here.
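A minimal Kotlin sketch of how such a bound key sequence might be detected follows; the key enumeration, the detector class, and the two-second time window are assumptions made for illustration and not part of the claimed device.

```kotlin
// Hypothetical key codes; the real bindings are configurable as described above.
enum class Key { RIGHT, DOWN, LEFT, UP, OTHER }

class StartInstructionDetector(
    private val pattern: List<Key> = listOf(Key.RIGHT, Key.DOWN, Key.RIGHT, Key.DOWN),
    private val windowMs: Long = 2_000          // assumed "preset time" for the whole sequence
) {
    private val history = ArrayDeque<Pair<Key, Long>>()

    /** Returns true when the bound key sequence completes within the time window. */
    fun onKey(key: Key, nowMs: Long): Boolean {
        history.addLast(key to nowMs)
        while (history.size > pattern.size) history.removeFirst()
        if (history.size < pattern.size) return false
        val inWindow = nowMs - history.first().second <= windowMs
        val matches = history.map { it.first } == pattern
        return inWindow && matches
    }
}

fun main() {
    val detector = StartInstructionDetector()
    val keys = listOf(Key.RIGHT, Key.DOWN, Key.RIGHT, Key.DOWN)
    keys.forEachIndexed { i, k ->
        if (detector.onKey(k, i * 300L)) println("start instruction issued")
    }
}
```

Requiring the whole sequence inside one short window is what keeps accidental single presses from triggering the mode.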
In this embodiment, the controller creates the first floating layer window corresponding to the currently pulled-up application in response to receiving the start instruction, and the number of first floating layer windows is determined by that application. For example, if the currently pulled-up application is a video call application, the controller creates two first floating layer windows upon receiving the start instruction; the resulting page can be seen in fig. 7, where one floating layer window shows the video page of the home terminal and the other shows the video page of the opposite terminal. If the currently pulled-up application is a video application, the controller creates one first floating layer window upon receiving the start instruction; referring to fig. 10, the single first floating layer window displays the video frame.
In this embodiment, the position of the first floating layer window may be determined as required. FIG. 11 is a diagram of the change in the page presented by the display before and after receiving the start instruction according to one possible embodiment. As shown in fig. 11, the display first presents the video frame full screen (page 11). When the controller receives the start instruction, it creates the first floating layer window and loads the video picture of the currently playing video into it. The position of the first floating layer window is not limited: for example, it may be located at the center of the display, as in page 12A of fig. 11, or at the upper left corner of the display, as also illustrated in fig. 11.
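The placement choices above can be summarized in a short sketch. The Rect and FloatingLayerWindow types and the half-display initial size are assumptions introduced for illustration; only the two placements (centered and upper-left) come from the description.

```kotlin
// Simple geometry types; the initial size is illustrative, not taken from the patent.
data class Rect(val x: Int, val y: Int, val w: Int, val h: Int)

data class FloatingLayerWindow(var bounds: Rect, val content: String)

// Creates a first floating layer window either centered on the display or anchored
// to the upper-left corner, the two example placements described above.
fun createFirstFloatingWindow(
    displayW: Int, displayH: Int,
    content: String,
    centered: Boolean,
    scale: Double = 0.5            // assumed initial size: half of each display dimension
): FloatingLayerWindow {
    val w = (displayW * scale).toInt()
    val h = (displayH * scale).toInt()
    val (x, y) = if (centered) (displayW - w) / 2 to (displayH - h) / 2 else 0 to 0
    return FloatingLayerWindow(Rect(x, y, w, h), content)
}

fun main() {
    println(createFirstFloatingWindow(1920, 1080, "current video", centered = true))
    println(createFirstFloatingWindow(1920, 1080, "current video", centered = false))
}
```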
Step S103: the controller controls the first floating layer window based on the received interaction instruction so that the first floating layer window implements the corresponding function.
In this embodiment, the first floating layer window may implement a corresponding function based on control of the interaction instruction.
In practice, each key of the remote controller is bound to a corresponding control instruction in advance. For example, when the key 104 in fig. 9 is touched, the remote controller sends a volume-increase instruction to the controller, and the controller increases the gain parameter of the speaker upon receiving it. Without changing the original structure of the remote controller, the embodiments of the present application provide the following implementations to extend the interaction between the remote controller and the display device:
the scaling control process for the first floating layer window may be:
in response to receiving a zoom instruction, the controller enters the zoom mode and then responds to zoom-in, zoom-out, and return instructions.
In practice, the correspondence between the zoom instruction and a remote controller key is bound in advance. In one possible embodiment, the zoom instruction is bound to the key 102 in fig. 9 and triggered by a long press: when the user presses the key 102 for a long time, the remote controller sends the zoom instruction to the controller, and the controller enters the zoom mode after receiving it. The duration of the long press is not limited by this application and may be, for example, 3 s or more.
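One plausible way to detect such a long press is sketched below; the 3 s threshold is the example value mentioned above, and the class and its API are illustrative assumptions rather than the device's actual software.

```kotlin
// Tracks how long a key is held; a hold of at least minHoldMs (3 s here, one possible
// threshold) switches the controller into the zoom mode.
class LongPressModeTrigger(private val minHoldMs: Long = 3_000) {
    private var downAtMs: Long? = null
    var inZoomMode = false
        private set

    fun onKeyDown(nowMs: Long) { downAtMs = nowMs }

    fun onKeyUp(nowMs: Long) {
        val started = downAtMs ?: return
        if (nowMs - started >= minHoldMs) inZoomMode = true
        downAtMs = null
    }
}

fun main() {
    val trigger = LongPressModeTrigger()
    trigger.onKeyDown(0)
    trigger.onKeyUp(3_500)       // held for 3.5 s, long enough to enter zoom mode
    println("zoom mode: ${trigger.inZoomMode}")
}
```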
After entering the zoom mode, the controller only responds to interaction instructions related to zooming. For example, in one possible embodiment, if the user touches the key 3 while the controller is in the zoom mode, the controller does not respond to the instruction bound to key 3 (displaying the television program of channel 3), because it is in the zoom mode.
When the controller enters the zoom mode, a first prompt message is shown on the display, prompting the user which key corresponds to the zoom-in instruction and which key corresponds to the zoom-out instruction. Specifically, fig. 12 is a schematic diagram illustrating the change of the display page while the first floating layer window is being enlarged according to one possible embodiment. As shown in page 22 of fig. 12, the prompt consists of two parts: one part indicates the key bound to the zoom-in instruction and the other part indicates the key bound to the zoom-out instruction.
The user can touch the corresponding key according to the prompt of the first prompt information.
Each time the user touches the key bound to the zoom-in instruction, the remote controller sends a zoom-in instruction to the controller, and the controller, in response, enlarges the first floating layer window.
This application does not limit the position of the enlarged first floating layer window. In one possible embodiment, a vertex of the first floating layer window is used as a fixed point, for example vertex A of page 22 in fig. 12; the controller keeps vertex A fixed during each enlargement, and the enlarged window can be seen in page 23A of fig. 12. In another possible embodiment, the enlarged first floating layer window is always kept at the center of the display, as in page 23B of fig. 12. In either case the aspect ratio of the enlarged first floating layer window equals the aspect ratio before enlargement.
Fig. 13 is a schematic diagram illustrating the change of the display page while the first floating layer window is being shrunk according to one possible embodiment. When the controller enters the zoom mode, it shows the first prompt message on the display; see page 32 in fig. 13, where one part of the prompt indicates the key bound to the zoom-in instruction and the other part indicates the key bound to the zoom-out instruction. The user touches the key bound to the zoom-out instruction according to the prompt. Each time the user touches this key, the remote controller sends a zoom-out instruction to the controller, and the controller shrinks the first floating layer window in response. Once the first floating layer window has been reduced to a preset minimum size, the controller responds to any further zoom-out instruction by showing a second prompt message in the first floating layer window, informing the user that in the current state the window can only be enlarged. See page 33 in fig. 13, where the second prompt message 13 only shows the key bound to the zoom-in instruction. A sketch of this resize logic is given below.
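The enlarge and shrink behaviour described above (aspect ratio preserved, fixed-vertex or centered enlargement, and a minimum size below which only enlargement is offered) can be sketched as follows. The scale factor, the minimum width, and the Rect type are assumptions made for illustration.

```kotlin
data class Rect(val x: Int, val y: Int, val w: Int, val h: Int)

const val MIN_W = 320            // assumed preset minimum width
const val STEP = 1.25            // assumed per-press scale factor

// Enlarge keeping the top-left vertex fixed (the "vertex A" strategy) or keeping the
// window centered on the display; the aspect ratio is preserved in both cases.
fun enlarge(r: Rect, displayW: Int, displayH: Int, keepVertex: Boolean): Rect {
    val w = (r.w * STEP).toInt().coerceAtMost(displayW)
    val h = (w * r.h) / r.w                          // preserve the aspect ratio
    return if (keepVertex) Rect(r.x, r.y, w, h)
    else Rect((displayW - w) / 2, (displayH - h) / 2, w, h)
}

// Shrink with the same aspect ratio; at the preset minimum size the window is left
// unchanged and a "can only enlarge" prompt would be shown instead.
fun shrink(r: Rect, onMinReached: () -> Unit): Rect {
    if (r.w <= MIN_W) { onMinReached(); return r }
    val w = (r.w / STEP).toInt().coerceAtLeast(MIN_W)
    val h = (w * r.h) / r.w
    return Rect(r.x, r.y, w, h)
}

fun main() {
    var win = Rect(100, 100, 640, 360)
    win = enlarge(win, 1920, 1080, keepVertex = true)
    println(win)
    repeat(6) { win = shrink(win) { println("second prompt: only enlargement is available") } }
    println(win)
}
```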
In response to receiving the return instruction, the zoom mode is exited.
The moving control process for the first floating layer window can be as follows:
in response to receiving a drag instruction, the controller enters a drag mode and then responds to movement instructions and the return instruction.
In one possible embodiment, the drag instruction is bound to the key 103 and triggered by a long press: when the user presses the key 103 for a long time, the remote controller sends the drag instruction to the controller, and the controller enters the drag mode after receiving it. The duration of the long press is not limited by this application and may be, for example, 3 s or more.
After entering the drag mode, the controller only responds to interaction instructions related to movement. For example, in one possible embodiment, if the user touches the key 3 while the controller is in the drag mode, the controller does not respond to the instruction bound to key 3 (playing the television program of channel 3), because it is in the drag mode.
When the controller enters the drag mode, a third prompt message is shown on the display, prompting the user which keys correspond to the movement instructions. Specifically, fig. 14 is a schematic diagram of the page shown by the display when the controller enters the drag mode according to one possible embodiment; the controller shows the third prompt message 2 on the display. In the embodiment of fig. 14, the four keys bound to the movement instructions are the up, down, left, and right keys.
In the drag mode, each time the user touches a key bound to a movement instruction, the controller moves the first floating layer window a preset distance in the corresponding direction on the display. Fig. 15 is a schematic diagram illustrating the change of the display page while the first floating layer window moves according to one possible embodiment. In a specific control process, the user touches the drag key, the remote controller sends the drag instruction to the controller, and the controller shows the third prompt message on the display (page 41 in fig. 15). The user touches the right key, the remote controller sends a move-right instruction, and the controller moves the first floating layer window to the right (page 42 in fig. 15); the user touches the down key and the window moves down (page 43 in fig. 15); the user touches the left key and the window moves left; the user touches the up key and the window moves up (page 44 in fig. 15).
In response to receiving the return instruction, the drag mode is exited.
For convenience and smoothness of operation, a default movement interval is set, for example 5 px or 10 px. The user may be allowed to change this interval in the settings, within a reasonable range defined by the device. This movement interval is referred to herein as step_size; a sketch of the per-press movement with clamping to the display bounds follows.
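A minimal sketch of the per-press movement, assuming a configurable step_size and clamping the window so it stays fully on the display, is shown below; the concrete values are illustrative.

```kotlin
data class Rect(val x: Int, val y: Int, val w: Int, val h: Int)
enum class Direction { UP, DOWN, LEFT, RIGHT }

// step_size is the per-press movement interval discussed above; 10 px is one of the
// example defaults, and a user-configured value would be clamped to a sane range.
fun move(r: Rect, dir: Direction, displayW: Int, displayH: Int, stepSize: Int = 10): Rect {
    val (dx, dy) = when (dir) {
        Direction.UP -> 0 to -stepSize
        Direction.DOWN -> 0 to stepSize
        Direction.LEFT -> -stepSize to 0
        Direction.RIGHT -> stepSize to 0
    }
    // Keep the floating layer window fully inside the display.
    val x = (r.x + dx).coerceIn(0, displayW - r.w)
    val y = (r.y + dy).coerceIn(0, displayH - r.h)
    return Rect(x, y, r.w, r.h)
}

fun main() {
    var win = Rect(0, 0, 640, 360)
    listOf(Direction.RIGHT, Direction.DOWN, Direction.LEFT, Direction.UP).forEach {
        win = move(win, it, 1920, 1080)
        println("$it -> $win")
    }
}
```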
In some application scenarios, the user needs to display all pulled-up applications on the display; for such scenarios, the interaction flow between the components of the display device provided by the embodiments of the present application may refer to fig. 16.
The user interface is configured to perform step S201 to receive an instruction input by a user;
the controller is configured to execute step S202, in response to receiving the start instruction, to create a first floating layer window corresponding to the current pull-up application, the floating layer window being used to load the current application page;
the process of creating the first floating layer window is the same as in the above embodiments and is not repeated here.
In step S203, the controller controls the display to show the first floating layer window;
in step S204, in response to receiving a preview instruction, the controller creates a plurality of second floating layer windows;
to display all the pulled-up applications on the display, the user touches the key bound to the preview instruction, and the remote controller sends the preview instruction to the controller. The controller creates the second floating layer windows in response to receiving the preview instruction.
In step S205, the controller controls the display to show the second floating layer windows.
In this application, the preview instruction may be bound to the key 104 and triggered by a long press: when the user presses the key 104 for a long time, the remote controller sends the preview instruction to the controller, which enters the preview mode and creates the second floating layer windows.
The number of second floating layer windows depends on the number of applications pulled up on the display device. For example, fig. 17 is a schematic diagram of the pages shown before and after creating the second floating layer windows according to one possible embodiment. The display device pulls up application 1, application 2, application 3, and application 4 in sequence. While the display shows the application page of application 4, the user touches the key bound to the start instruction, the remote controller sends the start instruction to the controller, and the controller enters the multi-screen display mode and creates a first floating layer window for the application page of application 4 (page 51 in fig. 17). When the user wants to preview all previously pulled-up applications, the user touches the key bound to the preview instruction, and the remote controller sends the preview instruction to the controller. In response, the controller creates second floating layer windows for the three historically pulled-up applications: the application pages of application 1, application 2, and application 3. The position of the second floating layer windows is not limited in this application. In one arrangement (page 52A in fig. 17), the display shows four floating layer windows (the first floating layer window and the three second floating layer windows), each of fixed size, about 1/9 of the display, arranged in the order in which the floating layer windows were created (application 4, application 1, application 2, application 3). In another arrangement (page 52B in fig. 17), the window size is not fixed but determined by the number of windows; here each window occupies about 1/4 of the display, again arranged in creation order (application 4, application 1, application 2, application 3). A layout sketch follows.
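The two arrangements above can be sketched as a simple layout computation. The grid shapes (3x3 for the fixed roughly 1/9 size, 2x2 for four windows at roughly 1/4) and the data types are assumptions for illustration; the ordering by floating layer window creation time follows the description.

```kotlin
import kotlin.math.ceil
import kotlin.math.sqrt

data class Rect(val x: Int, val y: Int, val w: Int, val h: Int)
data class AppWindow(val appName: String, val createdAt: Long)

// Lays the first and second floating layer windows out in a grid, ordered by the creation
// time of the floating layer windows. With fixedSize = true each window takes roughly 1/9
// of the display (a 3x3 grid); otherwise the cell size follows the window count.
fun previewLayout(windows: List<AppWindow>, displayW: Int, displayH: Int,
                  fixedSize: Boolean): Map<String, Rect> {
    val ordered = windows.sortedBy { it.createdAt }
    val columns = if (fixedSize) 3 else ceil(sqrt(ordered.size.toDouble())).toInt()
    val cellW = displayW / columns
    val cellH = displayH / columns
    return ordered.mapIndexed { i, w ->
        val col = i % columns
        val row = i / columns
        w.appName to Rect(col * cellW, row * cellH, cellW, cellH)
    }.toMap()
}

fun main() {
    // The first floating layer window (application 4) was created first, then the
    // second floating layer windows for applications 1-3, matching the order above.
    val windows = listOf(
        AppWindow("Application4", createdAt = 1),
        AppWindow("Application1", createdAt = 2),
        AppWindow("Application2", createdAt = 3),
        AppWindow("Application3", createdAt = 4)
    )
    println(previewLayout(windows, 1920, 1080, fixedSize = true))
    println(previewLayout(windows, 1920, 1080, fixedSize = false))
}
```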
Further, the controller may be configured to receive an instruction to move the focus, the focus being controllable to move; when the focus moves to a target floating layer window, the display is controlled to show the target floating layer window in a first display manner and the other floating layer windows in a second display manner, the first display manner being different from the second display manner, where the floating layer windows include the first floating layer window and the second floating layer windows.
Specifically, referring to fig. 18, a schematic diagram of a display page according to one possible embodiment: when the focus moves to the target floating layer window, the controller controls the display to show the target floating layer window in the first display manner and the other floating layer windows in the second display manner. This application does not restrict the two display manners; any pair of display manners that distinguishes the focused window from the others can be used in the scheme provided by the embodiments of this application. For example, in page 6A of fig. 18 the frame of the target floating layer window is shown bolded and the other frames are not; in page 6B of fig. 18 the target floating layer window is highlighted and the frames of the other floating layer windows are shown in gray scale.
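A small sketch of applying the two display manners, assuming a bold border for the focused window and a dimmed, thin border for the rest, follows; the concrete style values are illustrative assumptions.

```kotlin
data class FloatingWindowStyle(val appName: String, val borderWidthPx: Int, val dimmed: Boolean)

// Applies the "first display manner" (bold border, not dimmed) to the window under focus
// and the "second display manner" (thin border, dimmed / gray scale) to all other windows.
fun applyFocusStyles(windowNames: List<String>, focusedIndex: Int): List<FloatingWindowStyle> =
    windowNames.mapIndexed { i, name ->
        if (i == focusedIndex) FloatingWindowStyle(name, borderWidthPx = 6, dimmed = false)
        else FloatingWindowStyle(name, borderWidthPx = 1, dimmed = true)
    }

fun main() {
    applyFocusStyles(
        listOf("Application4", "Application1", "Application2", "Application3"),
        focusedIndex = 2
    ).forEach(::println)
}
```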
While the controller is in the preview mode, if it receives a full-screen instruction, it enlarges the target floating layer window to the size of the display; the display priority of the target floating layer window is higher than that of the other floating layer windows, so from the user's point of view only the application page of the target floating layer window is visible. When the controller then receives a return instruction, the target floating layer window is restored to its size before enlargement. Referring to fig. 19, a schematic diagram of the change of the displayed page according to one possible embodiment: when the focus moves to the target floating layer window, the controller shows it in the first display manner (page 71 in fig. 19); on receiving the full-screen instruction, the controller enlarges the target floating layer window to the size of the display (page 72 in fig. 19); and on receiving the return instruction, the target floating layer window is restored to its size before enlargement (page 73 in fig. 19).
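One way to realize this save-and-restore behaviour is sketched below; the FullScreenController class and its API are illustrative assumptions, not the claimed implementation.

```kotlin
data class Rect(val x: Int, val y: Int, val w: Int, val h: Int)

class FullScreenController(private val displayW: Int, private val displayH: Int) {
    private var savedBounds: Rect? = null

    /** Full-screen instruction: remember the current bounds and expand to the display size. */
    fun enterFullScreen(current: Rect): Rect {
        savedBounds = current
        return Rect(0, 0, displayW, displayH)   // also shown above the other floating windows
    }

    /** Return instruction: restore the size the window had before expansion. */
    fun exitFullScreen(): Rect? = savedBounds.also { savedBounds = null }
}

fun main() {
    val ctrl = FullScreenController(1920, 1080)
    val before = Rect(640, 360, 640, 360)
    println("full screen: ${ctrl.enterFullScreen(before)}")
    println("restored: ${ctrl.exitFullScreen()}")
}
```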
With the controller in preview mode, the controller is further configured to:
in response to receiving a delete instruction, control the target floating layer window to show delete information prompting the user to close the application corresponding to the target floating layer window; and in response to receiving a confirmation instruction, delete the target floating layer window and close the corresponding application. Referring to fig. 20, a schematic diagram of the change of the displayed page according to one possible embodiment: when the focus moves to the target floating layer window, the controller shows it in the first display manner (page 81 in fig. 20); on receiving the delete instruction, the controller controls the target floating layer window to show the delete information (page 82 in fig. 20); and on receiving the confirmation instruction, the controller deletes the target floating layer window (page 83 in fig. 20).
In the loop mode, that is, in the big-and-small-window mode, several windows such as video playback, QQ Music, and WeChat are displayed at the same time. In this mode, one of the four color keys of the remote controller can be temporarily set as a loop switch key: pressing the key once switches to the next window, and the selected window is raised to the top layer of the interface display if it is blocked by other windows. A sketch of this switching is given below.
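A sketch of such cyclic switching, assuming the window stack is kept as a simple z-ordered list (last element on top), follows; the class and its API are illustrative assumptions.

```kotlin
// Windows are cycled in a fixed order; the selected window is moved to the end of the
// z-order list so it sits on the top layer of the interface display.
class WindowCycler(initial: List<String>) {
    private val cycleOrder = initial.toList()          // fixed cyclic order
    private val zOrder = initial.toMutableList()       // last element is the top layer
    private var index = 0

    /** Pressing the loop key once: select the next window and raise it to the top. */
    fun switchToNext(): String {
        index = (index + 1) % cycleOrder.size
        val name = cycleOrder[index]
        zOrder.remove(name)
        zOrder.add(name)                                // no longer blocked by other windows
        return name
    }

    fun topWindow(): String = zOrder.last()
}

fun main() {
    val cycler = WindowCycler(listOf("video player", "QQ Music", "WeChat"))
    repeat(4) { println("switched to: ${cycler.switchToNext()}, top: ${cycler.topWindow()}") }
}
```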
While in the loop mode, the controller may further enter the zoom mode. In that case, in response to receiving a zoom-in instruction, the controller enlarges the target floating layer window, the aspect ratio after enlargement being equal to that before enlargement; or, in response to receiving a zoom-out instruction, the controller shrinks the target floating layer window, the aspect ratio after shrinking being equal to that before shrinking;
the process of enlarging or shrinking the target floating layer window is the same as that of the first floating layer window in the above embodiments and is not repeated here.
When the target floating layer window has been reduced to the preset minimum size, the controller responds to a further zoom-out instruction by showing the second prompt message in the target floating layer window, informing the user that in the current state the window can only be enlarged.
For a specific implementation process, reference may be made to the above embodiments, which are not described in detail herein.
In response to receiving a drag instruction, the controller enters the drag mode and responds to movement instructions and the return instruction.
While in the loop mode, the controller may likewise further enter the drag mode or the zoom mode; it exits the zoom mode in response to receiving a return instruction, and when entering the zoom mode it controls the target floating layer window to show the first prompt message, which indicates the keys bound to the zoom-in and zoom-out instructions.
For the specific implementation process, reference may be made to the above embodiments, which is not repeated here.
In a specific implementation, the present invention further provides a computer storage medium, where the computer storage medium may store a program, and the program, when executed, may include some or all of the steps of the method embodiments provided by the present invention. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
It will be apparent to those skilled in the art that the techniques of the embodiments of the present invention may be implemented in software plus a necessary general-purpose hardware platform. Based on this understanding, the technical solutions in the embodiments of the present invention, essentially or in the parts contributing to the prior art, may be embodied in the form of a software product stored in a storage medium such as a ROM/RAM, a magnetic disk, or an optical disk, including several instructions that cause a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods of the embodiments, or of some parts of the embodiments, of the present invention.
Finally, it should be noted that the above embodiments are only intended to illustrate, not to limit, the technical solutions of the present application. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments can still be modified, or some or all of their technical features can be replaced by equivalents, and such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.
Claims (8)
1. A display device, characterized by comprising:
the display is used for displaying the application page;
the user interface is used for receiving the instruction output by the remote controller;
a controller configured to:
in response to receiving a command to enter a multi-screen presentation mode, entering a multi-screen presentation mode,
creating a first floating layer window corresponding to the current pull-up application, wherein the floating layer window is used for loading application pages of the previous pull-up application, and the number of the first floating layer windows corresponds to the current pull-up application; binding the corresponding relation between the instruction and a plurality of remote controller keys in advance;
controlling the first floating layer window based on the received interaction instruction so that the first floating layer window realizes corresponding functions;
In the multi-screen display mode, responding to receiving a preview instruction, entering a preview mode, creating a plurality of second floating layer windows, and arranging the floating layer windows corresponding to each application according to the sequence of the creation time of the floating layer windows; each second floating layer window is used for loading a historical application page, and the historical application page comprises an application page of the application which is pulled up by the display; controlling the display to display the first floating layer window and the second floating layer window; wherein the number of second floating layer windows depends on the number of display pull-up applications;
in preview mode, receiving an instruction to move a focus, the focus configured to be controlled to move; when the focus moves to a target floating layer window, controlling the display to display the target floating layer window in a first display mode, and displaying other floating layer windows in a second display mode, wherein the first display mode is different from the second display mode, and the floating layer window comprises a first floating layer window and a second floating layer window;
in the preview mode, in response to receiving a full screen instruction, entering a full screen mode, and controlling the size of the target floating window to be enlarged to the size of the display; the display priority of the target floating layer window is greater than the display priority of the rest floating layer windows;
In full screen mode, in response to receiving a return instruction, the target floating layer window is reduced to a pre-expansion size.
2. The display device of claim 1, wherein in response to receiving a zoom instruction, the controller enters a zoom mode, the controller responding accordingly to a zoom-in instruction, a zoom-out instruction, and a return instruction.
3. The display device of claim 2, wherein when the controller is in the zoom mode, the controller is further configured to:
in response to receiving an amplifying instruction, controlling the target floating layer window to be amplified, wherein the aspect ratio of the amplified target floating layer window is equal to the aspect ratio of the target floating layer window before amplification;
in response to receiving a shrinking instruction, controlling the target floating layer window to shrink, wherein the aspect ratio of the target floating layer window after shrinking is equal to the aspect ratio of the target floating layer window before shrinking;
in response to receiving a return instruction, the zoom mode is exited.
4. A display device according to claim 2 or 3, wherein the controller is further configured to:
when the zoom mode is entered, the target floating layer window is controlled to display first prompt information, wherein the first prompt information is used for prompting a user to zoom in a key corresponding to the instruction and zoom out the key corresponding to the instruction.
5. A display device according to claim 2 or 3, wherein the controller is further configured to:
and under the condition that the size of the target floating layer window is reduced to a preset size, responding to a reduction instruction, and controlling the target floating layer window to display second prompt information, wherein the second prompt information is used for prompting a user to only enlarge the target floating layer window in the current environment.
6. The display device of claim 1, wherein the controller is further configured to:
and responding to the received drag command, entering a drag mode, and responding to the received moving command and the received return command by the controller.
7. The display device of claim 6, wherein the controller is further configured to:
responding to a received moving instruction, and controlling the target floating layer window to move towards the direction indicated by the moving instruction;
in response to receiving the return instruction, the zoom mode is exited.
8. The display device of claim 1, wherein the controller is further configured to:
in response to receiving a deleting instruction, controlling the target floating layer window to display deleting information, wherein the deleting information is used for prompting a user to close an application corresponding to the target floating layer window;
And deleting the target floating layer window in response to receiving the confirmation instruction, and closing the application corresponding to the target floating layer window.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010758465.9A CN111935530B (en) | 2020-07-31 | 2020-07-31 | Display equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111935530A CN111935530A (en) | 2020-11-13 |
CN111935530B true CN111935530B (en) | 2023-05-09 |
Family
ID=73315946
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010758465.9A Active CN111935530B (en) | 2020-07-31 | 2020-07-31 | Display equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111935530B (en) |