CN115993919A - Display device and terminal device

Info

Publication number: CN115993919A
Authority: CN (China)
Prior art keywords: application scene, display, display device, control interface, control
Legal status: Pending
Application number: CN202210210654.1A
Other languages: Chinese (zh)
Inventors: 王光强, 杨绍栋, 刘文静, 杨明
Current Assignee: Juhaokan Technology Co Ltd
Original Assignee: Juhaokan Technology Co Ltd
Application filed by Juhaokan Technology Co Ltd
Priority to PCT/CN2022/121223 (published as WO2023065976A1)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817: Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41: Structure of client; Structure of client peripherals
    • H04N21/422: Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431: Generation of visual interfaces for content selection or interaction; Content or additional data rendering

Abstract

The application provides a display device and a terminal device. Through the terminal device, a user can input an application scene control instruction for controlling the user interface in the current application scene of the display device, and the terminal device sends the instruction to the display device. The display device detects the type of the application scene to which its user interface belongs and sends that application scene type to the terminal device. The terminal device may then generate and display an application scene control interface according to the application scene type, so as to control the user interface of the display device in that application scene. When the display device is in different application scenes, the terminal device can display different control interfaces, and thus different control functions, according to the different application scene types, so that the user can better control the user interface in each application scene and the user experience is improved.

Description

Display device and terminal device
The present application claims priority to Chinese patent application No. 202111214613.1, entitled "Display device, terminal device, and display device control method," filed with the Chinese Patent Office on October 19, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the technical field of display devices, and in particular to a display device and a terminal device.
Background
A display device is a terminal device capable of outputting a specific display picture, such as a smart television, a mobile terminal, a smart advertising screen, or a projector. With the rapid development of display devices, their functions have become increasingly rich and their performance increasingly powerful: they can realize bidirectional human-computer interaction and integrate functions such as audio and video, entertainment, and data, thereby satisfying users' diversified and personalized requirements.
With the widespread use of display devices, demands on them keep increasing, and users can control a display device not only with a remote controller but also with a terminal device such as a mobile phone.
However, when the display device is controlled through the terminal device, the functions the terminal device can implement are generally the same as those of the remote controller; the terminal device merely substitutes for the remote controller. The functions it can implement are therefore limited, resulting in a poor user experience.
Disclosure of Invention
The present application provides a display device and a terminal device, which solve the problem in the related art that the limited functions of the terminal device lead to a poor user experience.
In a first aspect, the present application provides a terminal device, including a display unit, a communication unit, and a processor. The communication unit is configured to make a communication connection with the display device; the processor is configured to perform the steps of:
in response to an application scene control instruction input by a user, sending the application scene control instruction to the display device so that the display device feeds back an application scene type, the application scene type being the type of the application scene to which the user interface displayed in the display device belongs;
receiving the application scene type sent by the display device;
generating an application scene control interface according to the application scene type and controlling the display unit to display the application scene control interface, the application scene control interface being used for controlling the user interface in the application scene; different application scene types correspond to different application scene control interfaces.
In a second aspect, the present application provides a display device including a display, a communicator, and a controller. The display is configured to display a user interface; the communicator is configured to make a communication connection with the terminal device; the controller is configured to perform the steps of:
detecting that the display device is in communication connection with the terminal device, and, in response to an application scene control instruction sent by the terminal device, detecting an application scene type, the application scene type being the type of the application scene to which the user interface belongs;
sending the application scene type to the terminal device, so that the terminal device generates and displays an application scene control interface according to the application scene type, the application scene control interface being used for controlling the user interface in the application scene.
According to the technical solutions above, the present application provides a display device and a terminal device. Through the terminal device, a user can input an application scene control instruction for controlling the user interface in the current application scene of the display device, and the terminal device sends the instruction to the display device. The display device detects the type of the application scene to which its user interface belongs and sends that application scene type to the terminal device. The terminal device may then generate and display an application scene control interface according to the application scene type, so as to control the user interface of the display device in that application scene. When the display device is in different application scenes, the terminal device can display different control interfaces, and thus different control functions, according to the different application scene types, so that the user can better control the user interface in each application scene and the user experience is improved.
Drawings
In order to more clearly illustrate the technical solutions of the present application, the drawings that are needed in the embodiments will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
FIG. 1 illustrates a usage scenario of a display device according to some embodiments;
FIG. 2 shows a hardware configuration block diagram of the control apparatus 100 according to some embodiments;
FIG. 3 illustrates a hardware configuration block diagram of a display device 200 according to some embodiments;
FIG. 4 illustrates a software configuration diagram in a display device 200 according to some embodiments;
FIG. 5 illustrates a schematic diagram of a user interface in some embodiments;
FIG. 6 illustrates a schematic diagram of displaying communication mode confirmation information in a display in some embodiments;
FIG. 7 is a schematic diagram of an authentication mode set in a display device in some embodiments;
FIG. 8 is a flow diagram illustrating communication interactions between a terminal device and a display device in some embodiments;
FIG. 9 illustrates a schematic diagram of a terminal interface of a terminal device in some embodiments;
FIG. 10a illustrates a schematic diagram of an initial control interface of a terminal device in some embodiments;
FIG. 10b illustrates a schematic diagram of an initial control interface of a terminal device in some embodiments;
FIG. 10c illustrates a schematic diagram of an initial control interface of a terminal device in some embodiments;
FIG. 11 illustrates a schematic diagram of an initial control interface of a terminal device in some embodiments;
FIG. 12 illustrates a flow diagram of interactions of a display device and a terminal device in some embodiments;
FIG. 13a illustrates a schematic diagram of a media recommendation scenario in some embodiments;
FIG. 13b illustrates a schematic diagram of a media search scenario in some embodiments;
FIG. 14 illustrates a component interaction diagram in a display device in some embodiments;
FIG. 15 is a schematic diagram of a terminal device displaying an application scenario control interface in some embodiments;
FIG. 16 is a schematic diagram of a terminal device displaying an application scenario control interface in some embodiments;
FIG. 17a is a schematic diagram of a display displaying a progress bar for a media asset resource in some embodiments;
FIG. 17b illustrates a schematic diagram of a display displaying a pause flag in some embodiments;
FIG. 17c is a schematic diagram of a double speed hint in some embodiments;
FIG. 18 is a schematic diagram of a terminal device displaying an application scenario control interface in some embodiments;
FIG. 19 illustrates a schematic diagram of a display device in a picture scene in some embodiments;
FIG. 20 is a schematic diagram of a terminal device displaying an application scenario control interface in some embodiments;
FIG. 21a is a schematic diagram of a display displaying picture switching information in some embodiments;
FIG. 21b is a schematic diagram illustrating a magnified view of a display in some embodiments.
Detailed Description
For purposes of clarity and ease of implementation of the present application, exemplary implementations of the present application are described clearly and completely below with reference to the accompanying drawings, in which exemplary implementations of the present application are illustrated. It is apparent that the described exemplary implementations are only some, rather than all, of the examples of the present application.
Based on the exemplary embodiments described herein, all other embodiments that may be obtained by one of ordinary skill in the art without making any inventive effort are within the scope of the claims appended hereto. Furthermore, while the disclosure is presented in the context of an exemplary embodiment or embodiments, it should be appreciated that the various aspects of the disclosure may, separately, comprise a complete embodiment. It should be noted that the brief description of the terms in the present application is only for convenience in understanding the embodiments described below, and is not intended to limit the embodiments of the present application. Unless otherwise indicated, these terms should be construed in their ordinary and customary meaning.
The terms "first," second, "" third and the like in the description and in the claims and in the above-described figures are used for distinguishing between similar or similar objects or entities and not necessarily for limiting a particular order or sequence, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements explicitly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware or/and software code that is capable of performing the function associated with that element.
The display device to which the embodiments of the present application are applied includes, but is not limited to, a device having data transceiving and processing functions together with an image display function and/or a sound output function, for example a television, a smart refrigerator, a laser projection device, a monitor, an electronic whiteboard (electronic bulletin board), an electronic desktop (electronic table), and the like.
Fig. 1 is a schematic diagram of a usage scenario of a display device according to an embodiment. As shown in fig. 1, the display device 200 is also in data communication with a server 400, and a user can operate the display device 200 through the smart device 300 or the control apparatus 100.
In some embodiments, the control apparatus 100 may be a remote controller, and the communication between the remote controller and the display device includes at least one of infrared protocol communication or bluetooth protocol communication, and other short-range communication modes, and the display device 200 is controlled by a wireless or wired mode. The user may control the display apparatus 200 by inputting a user instruction through at least one of a key on a remote controller, a voice input, a control panel input, and the like.
In some embodiments, the smart device 300 may include any one of a mobile terminal, tablet, computer, notebook, AR/VR device, etc.
In some embodiments, the smart device 300 may also be used to control the display device 200. For example, the display device 200 is controlled using a camera application running on a smart device.
In some embodiments, the smart device 300 and the display device may also be used for communication of data.
In some embodiments, the display device 200 may also perform control in a manner other than the control apparatus 100 and the smart device 300, for example, the voice command control of the user may be directly received through a module configured inside the display device 200 device for acquiring voice commands, or the voice command control of the user may be received through a voice control apparatus configured outside the display device 200 device.
In some embodiments, the display device 200 is also in data communication with a server 400. The display device 200 may be allowed to make communication connections via a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display device 200. The server 400 may be one cluster or multiple clusters, and may include one or more types of servers. The number of display devices may also be one or more; for example, multiple display devices may be present simultaneously in one scene.
In some embodiments, software steps performed by one step execution body may migrate on demand to be performed on another step execution body in data communication therewith. For example, software steps executed by the server may migrate to be executed on demand on a display device in data communication therewith, and vice versa.
Fig. 2 shows a block diagram of a configuration of the control apparatus 100 according to some embodiments. As shown in fig. 2, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive an input operation instruction of a user and convert the operation instruction into an instruction recognizable and responsive to the display device 200, and function as an interaction between the user and the display device 200. The communication interface 130 is configured to communicate with the outside, and includes at least one of a WIFI chip, a bluetooth module, NFC, or an alternative module. The user input/output interface 140 includes at least one of a microphone, a touch pad, a sensor, keys, or an alternative module.
Fig. 3 illustrates a hardware configuration block diagram of a display device 200 according to some embodiments. As shown in fig. 3, the display apparatus 200 includes at least one of a modem 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface 280. The controller includes a central processor, a video processor, an audio processor, a graphic processor, a RAM, a ROM, and first to nth interfaces for input/output. The display 260 may be at least one of a liquid crystal display, an OLED display, a touch display, and a projection display, and may also be a projection device and a projection screen. The modem 210 receives broadcast television signals through a wired or wireless reception manner, and demodulates audio and video signals, such as EPG data signals, from a plurality of wireless or wired broadcast television signals. The detector 230 is used to collect signals of the external environment or interaction with the outside. The controller 250 and the modem 210 may be located in separate devices, i.e., the modem 210 may also be located in an external device to the main device in which the controller 250 is located, such as an external set-top box.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored on the memory. The controller 250 controls the overall operation of the display apparatus 200. The user may input a user command through a Graphical User Interface (GUI) displayed on the display 260, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
In some embodiments, a "user interface" is a media interface for interaction and exchange of information between an application or operating system and a user that enables conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form of the user interface is a graphical user interface (Graphic User Interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include at least one of a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
Fig. 4 shows a schematic software configuration of the display device 200 according to some embodiments. As shown in Fig. 4, the system is divided into four layers, namely an application layer, an application framework layer, an Android runtime and system library layer, and a kernel layer. The kernel layer contains at least one of the following drivers: an audio driver, a display driver, a Bluetooth driver, a camera driver, a WiFi driver, a USB driver, an HDMI driver, a sensor driver (e.g., fingerprint sensor, temperature sensor, pressure sensor, etc.), a power supply driver, and the like.
With the widespread use of display devices, demands on them keep increasing, and users can control a display device not only with a remote controller but also with a terminal device such as a mobile phone. However, when the display device is controlled through the terminal device, the functions the terminal device can implement are generally the same as those of the remote controller; the terminal device merely substitutes for the remote controller. The functions it can implement are therefore limited, resulting in a poor user experience.
The present application provides a display device and a terminal device. The display device includes a display, a communicator, and a controller. The display is used for displaying a user interface. The user interface may be a specific target image, for example various media assets obtained from a network signal source, including videos, pictures, and the like; it may also be a UI interface of the display device.
The communicator is used for communication connection with the terminal device, and may also be communicatively connected with a server.
In some embodiments, the controller may control the display to display the user interface when the user powers on the display device. FIG. 5 illustrates a schematic diagram of a user interface in some embodiments. The user interface includes a first navigation bar 500, a second navigation bar 510, a function bar 520, and a content display area 530, the function bar 520 including a plurality of function controls such as "watch record", "my favorite", and "my application", among others. The content displayed in the content display area 530 changes as the selected controls in the first navigation bar 500 and the second navigation bar 510 change. Taking the application panel page as an example, a user can trigger entry into the corresponding application panel by clicking the "My application" control, thereby inputting a display instruction for the application panel page. It should be noted that the user may also input the selection operation of the function control in other manners to trigger entry into the application panel, for example by using a voice control function, a search function, or the like.
In some embodiments, the user may use the terminal device and the display device to make a communication connection, thereby enabling information interaction between the terminal device and the display device. The terminal device may be a smart phone, a tablet computer, etc.
The terminal device may include a display unit, a communication unit, and a processor. The display unit is used for displaying pictures and interfaces. The communication unit is used for communication connection with the display device. The terminal device may send a communication connection request to the display device so that the terminal device and the display device establish a communication connection. Once connected, they can interact: for example, the terminal device can acquire data from the display device, such as media asset data or the data currently displayed by the display device, and it can upload data such as its own media resources to the display device for playing. The user can also send control instructions to the display device through the terminal device so as to control the display device to realize corresponding functions.
In some embodiments, the display device may be provided with a communication mode. When the communication mode is off, the display device does not accept communication connection requests sent by terminal devices: it does not allow any terminal device to establish a communication connection and does not perform information interaction with any terminal device. When the communication mode is on, the display device accepts communication connection requests sent by terminal devices, allowing a terminal device and the display device to be communicatively connected and to interact. The user may input a communication mode on command to the display device, and when the controller receives this command it may control the display device to enter the communication mode.
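Purely as an illustrative sketch (not taken from the patent text), the gating behavior described above can be expressed as follows in Kotlin; the class and member names are assumptions introduced only for this example:

```kotlin
// Illustrative sketch only: a communication-mode flag gating connection requests.
// Class and property names are assumptions, not taken from the patent.
data class ConnectionRequest(val terminalId: String)

class CommunicationModeGate(var communicationModeOn: Boolean = false) {
    // Returns true if the display device accepts the request and may establish a connection.
    fun onConnectionRequest(request: ConnectionRequest): Boolean {
        // When the communication mode is off, requests from any terminal device are ignored.
        if (!communicationModeOn) return false
        // When the communication mode is on, the request may proceed (e.g. to authentication).
        return true
    }
}

fun main() {
    val gate = CommunicationModeGate()
    println(gate.onConnectionRequest(ConnectionRequest("phone-1"))) // false: mode is off
    gate.communicationModeOn = true // user inputs a communication mode on command
    println(gate.onConnectionRequest(ConnectionRequest("phone-1"))) // true: mode is on
}
```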
In some embodiments, the user may send a communication mode instruction to the display device by operating a designated key of the remote control. In actual application, the correspondence between the communication mode instruction and a remote controller key is bound in advance. For example, a communication mode key is set on the remote controller; when the user touches this key, the remote controller sends a communication mode instruction to the controller, and the controller controls the display device to enter the communication mode. When the user touches the key again, the controller may control the display device to exit the communication mode.
In some embodiments, the communication mode command may be bound in advance to a combination of several remote controller keys, and the remote controller sends the communication mode command only when the user touches all the bound keys. In a feasible embodiment, the bound keys are the direction keys (left, down, left, down) in sequence; that is, when the user touches these keys in that order within a preset time, the remote controller sends the communication mode instruction to the controller. This binding method prevents the communication mode instruction from being sent because of a user's accidental operation. The embodiments of the present application only give several example bindings between communication mode instructions and keys; in actual application the binding relationship can be set according to the user's habits, and the present application does not limit this.
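As an illustrative sketch of the key-sequence binding above, the following Kotlin snippet accepts the bound keys (left, down, left, down) only when they are pressed in order within a time window; the class name and the window length are assumptions chosen for this example:

```kotlin
// Illustrative sketch only: detecting a bound key sequence within a preset time window.
enum class Key { LEFT, RIGHT, UP, DOWN }

class KeySequenceDetector(
    private val boundSequence: List<Key> = listOf(Key.LEFT, Key.DOWN, Key.LEFT, Key.DOWN),
    private val windowMillis: Long = 3_000 // assumed preset time
) {
    private val presses = ArrayDeque<Pair<Key, Long>>()

    // Returns true when the bound sequence has just been completed within the window.
    fun onKeyPress(key: Key, timestampMillis: Long): Boolean {
        presses.addLast(key to timestampMillis)
        // Drop presses that fall outside the time window.
        while (presses.isNotEmpty() && timestampMillis - presses.first().second > windowMillis) {
            presses.removeFirst()
        }
        // Keep only as many presses as the bound sequence is long.
        while (presses.size > boundSequence.size) presses.removeFirst()
        return presses.size == boundSequence.size && presses.map { it.first } == boundSequence
    }
}

fun main() {
    val detector = KeySequenceDetector()
    var fired = false
    listOf(Key.LEFT, Key.DOWN, Key.LEFT, Key.DOWN).forEachIndexed { i, key ->
        fired = detector.onKeyPress(key, timestampMillis = i * 500L)
    }
    println(fired) // true: sequence completed in time, the communication mode instruction is sent
}
```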
In some embodiments, the user may send a communication mode instruction to the display device by voice input, using a sound collector of the display device such as a microphone, to control the display device to enter the communication mode. The display device can be provided with an intelligent voice system, which recognizes the user's voice so as to extract the instruction content input by the user. The user may input a preset wake-up word through the microphone to activate the intelligent voice system so that the controller can respond to instructions input by the user, and may then input a communication mode instruction within a certain period so that the display device enters the communication mode. For example, the user may say the wake-up word (for example, "XX classmate") to activate the intelligent voice system, and then say a command for entering the communication mode, thereby sending the communication mode instruction to the display device.
In some embodiments, the user may also send a communication mode instruction to the display device through a preset gesture. The display device may detect the user's behavior through an image collector such as a camera. When the user makes the preset gesture, the user is considered to have sent a communication mode instruction to the display device. For example, it may be set that when the user is detected drawing a "V" sign, the user is determined to have input a communication mode instruction. The user can also send a communication mode instruction through a preset action; for example, it may be set that when the user is detected lifting the left foot and the right hand at the same time, the user is determined to have input a communication mode instruction.
In some embodiments, the communication mode instruction may also be sent to the display device when the user controls the display device using a smart device, such as a mobile phone. In actual application, a control can be provided in the mobile phone through which the user can choose whether to enter the communication mode; the mobile phone then sends a communication mode instruction to the controller, and the controller controls the display device to enter the communication mode.
In some embodiments, the user may issue a continuous click command to the mobile phone when controlling the display device with it. A continuous click command means that, within a preset period, the number of times the user clicks the same area of the mobile phone touch screen exceeds a preset threshold; for example, clicking a certain area of the touch screen 3 times within 1 s is regarded as a continuous click command. After receiving the continuous click command, the mobile phone can send a communication mode instruction to the display device so that the controller controls the display device to enter the communication mode.
In some embodiments, when the user controls the display device using the mobile phone, it may also be set to: when detecting that the touch pressure value of a user on a certain area of the touch screen of the mobile phone exceeds a preset pressure threshold, the mobile phone can send a communication mode instruction to the display device.
A communication mode option may also be set in the UI interface of the display device, and when the user clicks the option, the display device may be controlled to enter or exit the communication mode.
In some embodiments, to prevent the user from triggering the communication mode by mistake, when the controller receives the communication mode instruction, the controller may control the display to display communication mode confirmation information, so that the user can confirm a second time whether to control the display device to enter the communication mode. Fig. 6 illustrates a schematic diagram of displaying communication mode confirmation information in a display in some embodiments.
In some embodiments, when the display device is in the communication mode, an authentication mode may be further set in consideration of security. When the identity authentication mode is closed, the display device does not perform identity authentication on the terminal device, and the terminal device can be directly in communication connection with the display device. That is, when the terminal device transmits a communication connection request to the display device, the display device may not verify the communication connection request, thereby directly performing communication connection with the terminal device.
When the identity authentication mode is started, the display device performs identity authentication on the terminal device. That is, when the display device receives the communication connection request sent by the terminal device, the communication connection request is verified, and when the verification is passed, the communication connection request is allowed, so that communication connection is performed with the terminal device.
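The verification step can be sketched as follows; the concrete check (a shared PIN) and all names are assumptions chosen only for illustration, since the embodiments do not fix a particular verification method:

```kotlin
// Illustrative sketch only: verifying a connection request when identity authentication is on.
data class ConnectRequest(val terminalId: String, val pin: String? = null)

class AuthenticationGate(
    var authenticationModeOn: Boolean,
    private val expectedPin: String // assumed verification secret
) {
    fun allowConnection(request: ConnectRequest): Boolean =
        if (!authenticationModeOn) {
            true                       // authentication off: connect directly, no verification
        } else {
            request.pin == expectedPin // authentication on: verify the request before connecting
        }
}

fun main() {
    val gate = AuthenticationGate(authenticationModeOn = true, expectedPin = "123456")
    println(gate.allowConnection(ConnectRequest("phone-1")))                 // false: no PIN supplied
    println(gate.allowConnection(ConnectRequest("phone-1", pin = "123456"))) // true: verification passed
}
```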
Fig. 7 is a schematic diagram of setting an authentication mode in a display device in some embodiments. Wherein, when the user selects to turn on the communication mode, the user can further select to turn on the authentication mode or turn off the authentication mode.
In some embodiments, the terminal device may be in communicative interaction with the display device when the display device is in a communication mode. The terminal device may send a communication connection request to the display device, and when the display device receives the communication connection request, may make a communication connection with the terminal device.
Fig. 8 illustrates a flow diagram of communication interactions between a terminal device and a display device in some embodiments.
In some embodiments, the user may establish a communication connection between the terminal device and the display device by binding the terminal device with the display device. The user can directly search all display devices in the local area network by using the terminal device, and select one target display device to be in communication connection with the terminal device. Fig. 9 shows a schematic diagram of a terminal interface of a terminal device in some embodiments. The terminal equipment can be provided with a connection equipment control, and a user can instruct the terminal equipment to scan connectable display equipment in the local area network by clicking the connection equipment control, and trigger to enter a corresponding equipment list page. When a device list page is displayed, the user can select a display device to be connected in the page.
Upon detecting that the user selects a certain display device, the terminal device may send a communication connection request to the display device. After receiving the communication connection request, the display device may perform binding association with the terminal device, that is, establish communication connection.
In some embodiments, the user may also log in his own account in the terminal device. After the user uses the terminal equipment to be bound and associated with a certain display equipment, the account number of the user is also bound and associated with the display equipment.
In some embodiments, after the terminal device sends a communication connection request to the display device, the display device may first detect its device type. Specifically, the device types of the display device may include: television, refrigerator, air conditioner, etc. After determining its own device type, the display device may send the device type to the terminal device. Meanwhile, the display device can establish communication connection with the terminal device.
After receiving the device type sent by the display device, the terminal device may display a control interface for controlling the display device corresponding to the device type, so that the user may control the display device by using the control interface. The terminal device may display different initial control interfaces for different device types.
Specifically, the terminal device may generate an initial control interface according to the device type. The initial control interface may be a UI interface for controlling the display device of the device type. Wherein different device types correspond to different initial control interfaces, i.e. each device type of display device has its own initial control interface. The initial control interface may include a number of functionality controls for controlling the display device to implement a number of functions. Further, the terminal device may display the initial control interface in the display unit. The user may directly control the display device using the initial control interface.
In some embodiments, the initial control interface may be used to control some basic functions of the display device. The functions that can be controlled are different for different device types of display devices, which have different initial control interfaces. For example, for television, volume, homepage switching, etc. can be controlled; for air conditioning, temperature and air conditioning mode may be controlled, and so on.
Fig. 10a shows a schematic diagram of an initial control interface of a terminal device in some embodiments, where the device type corresponding to the initial control interface is a television. Specifically, the control interface includes 11 operation controls: a power control 1001, a search control 1002, a keyboard toggle control 1003, a program source control 1004, a volume control 1005, a return control 1006, a main interface control 1007, a menu control 1008, a signal source control 1009, an intelligent assistant control 1010, and a favorites control 1011. The power control 1001 is used for controlling on and off of the display device. Search control 1002 is used to search for media assets, etc. The keyboard switch control 1003 is used for switching the displayed keyboard, the keyboard includes a numeric keyboard and a directional keyboard, and after the user clicks the control, the terminal device can switch the numeric keyboard to the directional keyboard. The program source control 1004 is a numeric keypad, and is used for adjusting the currently played program source of the display device, so that a user can press a specific numeric key to switch to a corresponding television program, and can also switch the television programs back and forth according to the sequence of the program sources. Volume control 1005 is used to adjust the volume of the display device. The return control 1006 is used to return to the last operation of the user. The main interface control 1007 is used to return to the main interface of the display device. Menu control 1008 is used to jump to a menu interface of the display device. The signal source control 1009 is configured to display a signal source list of the display device, including signal sources such as HDMI, USB, ATV, for selection by a user. The smart assistant control 1010 is used to turn on a smart assistant of the display device, which may implement functions such as image recognition. The favorites control 1011 is used to control the display device to display a user's favorites interface, which may include media assets collected by the user, and the like.
Fig. 10b is a schematic diagram of an initial control interface of a terminal device in some embodiments, where a device type corresponding to the initial control interface is a television. For the initial control interface shown in fig. 10a, after the user touches the keyboard switch control 1003, the terminal device switches to display the initial control interface shown in fig. 10 b. Also included in the initial control interface shown in fig. 10b is a focus control 1012. The focus control 1012 is a directional keypad with which a user can adjust the focus position in the display device and confirm. Under the current interface, when the user clicks the keyboard switch control 1003, the terminal device may switch the directional keyboard to the numeric keyboard.
Fig. 10c is a schematic diagram of an initial control interface of a terminal device in some embodiments, where the device type corresponding to the initial control interface is an air conditioner. Specifically, the control interface includes a display area 1031 and 6 operation controls: a temperature adjustment control 1032, a power control 1033, a cooling control 1034, a heating control 1035, a wind speed control 1036, and a timing control 1037. The display area 1031 is used for displaying working conditions of the air conditioner, for example temperature, wind speed, and timing status. The power control 1033 is used for switching the air conditioner on and off. The cooling control 1034 is used to make the air conditioner perform a cooling operation. The heating control 1035 is used to make the air conditioner perform a heating operation. The wind speed control 1036 is used to adjust the wind speed when the air conditioner is operating. The timing control 1037 is used to set timers for the air conditioner.
When the initial control interface is displayed, the terminal device can replace the remote controller to send a control instruction to the display device, so that a user can control the display device through the terminal device. Specifically, the user can input a corresponding control instruction by clicking or touching a control in the initial control interface. Further, the terminal device may send a control instruction input by the user to the display device. After receiving the corresponding control instruction, the display device can analyze the control instruction, so as to determine the corresponding function of the control instruction and realize the function.
In some embodiments, a control interface template library may be preset in the terminal device. The control interface template library may store a plurality of initial control interfaces, each of which may correspond to a device type for controlling a display device of the corresponding device type. After receiving the device type sent by the display device, the terminal device may directly search the control interface template library for an initial control interface corresponding to the device type. Further, the terminal device may display the initial control interface in the display unit, and the user may control the display device using the initial control interface.
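A minimal sketch of such a template library might look as follows; the device types, control names, and lookup API are assumptions based on the examples above:

```kotlin
// Illustrative sketch only: a preset control-interface template library on the terminal
// device, keyed by device type.
enum class DeviceType { TELEVISION, REFRIGERATOR, AIR_CONDITIONER }

data class InitialControlInterface(val deviceType: DeviceType, val controls: List<String>)

object ControlInterfaceTemplateLibrary {
    private val templates = mapOf(
        DeviceType.TELEVISION to InitialControlInterface(
            DeviceType.TELEVISION,
            listOf("power", "search", "keyboard switch", "program source", "volume", "return",
                   "main interface", "menu", "signal source", "intelligent assistant", "favorites")
        ),
        DeviceType.AIR_CONDITIONER to InitialControlInterface(
            DeviceType.AIR_CONDITIONER,
            listOf("temperature adjustment", "power", "cooling", "heating", "wind speed", "timing")
        )
    )

    // Look up the initial control interface for the device type reported by the display device.
    fun lookup(deviceType: DeviceType): InitialControlInterface? = templates[deviceType]
}

fun main() {
    // After receiving the device type, the terminal device searches the template library
    // and displays the matching initial control interface.
    println(ControlInterfaceTemplateLibrary.lookup(DeviceType.TELEVISION)?.controls)
}
```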
In some embodiments, the user may utilize the initial control interface to control some functions of the display device, which may be basic functions of the display device, which may satisfy some basic requirements of the user when using the display device.
However, the display device may have a plurality of application scenarios, which may be different modes set in the control system or the application according to the resource type. The display device may provide different kinds and numbers of application scenarios depending on the different operating system forms. For example, the application scenario may include: media asset scenes, picture scenes, screen projection scenes, etc.
Under different application scenarios, the display device may recommend different resources to the user. For example, in a media asset scene, the display device may display various media asset resources. In a picture scene, the display device may present pictures for the user to view. In a screen projection scene, the display device may display the media assets provided by the projecting device.
When the display device is in different application scenarios, the functions that the display device can implement may be different. For example, in a media asset scenario, the progress of the media asset resource and the current volume of the display device may be adjusted, etc. In the picture scene, the picture can be switched or scaled, etc. For each application scenario, the display device will have some specific functionality. To enhance the user experience, the user is enabled to control these specific functions with the terminal device, which may further display controls controlling these specific functions. However, since each application scenario corresponds to some specific functions, it is obviously impractical to display all of them in the initial control interface. At this time, the terminal device may be switched from the initial control interface to a control interface that controls functions of the display device in the current application scenario.
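The correspondence between application scene types and their specific functions can be sketched as a simple mapping; the scene and function names follow the examples in this description, while the data structure itself is an assumption:

```kotlin
// Illustrative sketch only: each application scene type exposes its own specific functions,
// which an application scene control interface would present.
enum class ApplicationSceneType { MEDIA_ASSET, PICTURE, SCREEN_PROJECTION }

val sceneFunctions: Map<ApplicationSceneType, List<String>> = mapOf(
    ApplicationSceneType.MEDIA_ASSET to listOf("adjust playback progress", "adjust volume"),
    ApplicationSceneType.PICTURE to listOf("switch picture", "zoom picture"),
    ApplicationSceneType.SCREEN_PROJECTION to listOf("control the projected media assets")
)

fun controlsForScene(scene: ApplicationSceneType): List<String> =
    sceneFunctions[scene] ?: emptyList()

fun main() {
    println(controlsForScene(ApplicationSceneType.PICTURE)) // [switch picture, zoom picture]
}
```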
In some embodiments, an application scene control may be set in the initial control interface of the terminal device. Fig. 11 shows a schematic diagram of an initial control interface of a terminal device in some embodiments, in which the initial control interface includes an application scene control 1101. The user may click or touch the application scene control; for example, the user may press and hold the control and perform a swipe operation. When the terminal device detects the user's touch operation on the application scene control, the user can be considered to have input an application scene control instruction, that is, the user wants to control some specific functions of the current application scene of the display device.
In some embodiments, the application scene control is an indication control showing that a scene interface can be invoked from the current initial control interface.
In some embodiments, different initial control interfaces correspond to different controlled devices. After the terminal device switches to an initial control interface, it determines, according to the identifier of the controlled device and a preset relationship, whether the device corresponding to the current initial control interface is configured with a scene interface: the scene control is displayed if such a scene interface exists, and is not displayed if it does not. The preset relationship refers to a mapping between a device identifier and whether the device corresponding to that identifier is configured with a scene interface.
In some embodiments, the initial control interface of the same controlled device may invoke different scene interfaces depending on the scene of the interface currently displayed by the display device.
Further, the terminal device may display a control interface corresponding to the current application scene of the display device, so that the user can control the display device. Specifically, after detecting that the user has input the application scene control instruction, the terminal device may send the application scene control instruction to the display device (for example through the communication unit), so that the display device feeds back the type of the application scene to which the current user interface belongs. In the embodiments of the present application, the application scene type is used to indicate the type of the application scene to which the user interface belongs. Further, the terminal device may display an application scene control interface according to the application scene type, so that the user can control the display device. In some cases, the scene control instruction may be an instruction generated in response to an operation of selecting the scene control, or an instruction generated after a detector detects an input preset gesture.
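The terminal-device side of this flow can be sketched as follows; the DisplayDeviceConnection interface and the scene-type strings are assumptions standing in for the communication unit and the real protocol:

```kotlin
// Illustrative sketch only: terminal device sends the application scene control instruction,
// receives the application scene type, and generates the matching control interface.
interface DisplayDeviceConnection {
    // Sends the application scene control instruction and returns the fed-back scene type.
    fun requestApplicationSceneType(): String
}

class TerminalController(private val connection: DisplayDeviceConnection) {
    // Called when the user touches the application scene control on the initial control interface.
    fun onApplicationSceneControlInstruction(): String {
        val sceneType = connection.requestApplicationSceneType()   // display device feeds back the type
        return generateApplicationSceneControlInterface(sceneType) // different type, different interface
    }

    private fun generateApplicationSceneControlInterface(sceneType: String): String =
        when (sceneType) {
            "media_asset_playing" -> "interface with progress and volume controls"
            "picture"             -> "interface with picture switching and zoom controls"
            else                  -> "initial control interface (no scene-specific controls)"
        }
}

fun main() {
    val controller = TerminalController(object : DisplayDeviceConnection {
        override fun requestApplicationSceneType() = "picture"
    })
    println(controller.onApplicationSceneControlInstruction())
}
```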
Compared with sending the scene control instruction as soon as the initial interface is displayed, sending it only after the user's input operation is detected can reduce the operating load on the terminal device. While the initial interface is displayed, the user may operate the television through another control device such as a remote controller, or operate the display device through the initial interface of the terminal device, so the application scene of the display device at the moment the initial interface is displayed may not be the scene at the moment the user actually needs to query. Keeping the two synchronized would require periodic detection, which consumes more resources; therefore, sending the instruction only after the user's preset operation is detected helps reduce the operating load on the device.
In some embodiments, the terminal device may store, in advance, all application scenes corresponding to each device type. It should be noted that display devices of some types may have only one application scene, while display devices of other types may have multiple application scenes. Therefore, after the display device sends its device type to the terminal device, the terminal device can determine whether a display device of this type has multiple application scenes. If the display device has only one application scene, the initial control interface displayed by the terminal device need not contain the application scene control: the display device has no application scenes to switch between, and it can be completely controlled through the current initial control interface.
If the display device of this type has a plurality of application scenes, an application scene control may be included in the initial control interface displayed by the terminal device. Further, after the user touches the application scene control, the terminal device may display an application scene control interface, so that the user can better control the display device.
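A minimal sketch of this decision might look as follows; the per-device-type scene lists are assumptions used only for illustration:

```kotlin
// Illustrative sketch only: show the application scene control on the initial control
// interface only when the device type supports more than one application scene.
val applicationScenesByDeviceType: Map<String, List<String>> = mapOf(
    "television"      to listOf("media_asset", "picture", "screen_projection"),
    "air_conditioner" to listOf("default")
)

fun shouldShowApplicationSceneControl(deviceType: String): Boolean =
    (applicationScenesByDeviceType[deviceType] ?: emptyList()).size > 1

fun main() {
    println(shouldShowApplicationSceneControl("television"))      // true: several scenes, show the control
    println(shouldShowApplicationSceneControl("air_conditioner")) // false: single scene, control not needed
}
```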
Fig. 12 illustrates a flow chart of interactions of a display device and a terminal device in some embodiments.
In some embodiments, after receiving an application scene control instruction sent by the terminal device, the display device may first detect whether it has established a communication connection with the terminal device, that is, whether it is bound to the terminal device, so as to ensure the safety and accuracy of the information interaction. This is because the terminal device can switch its initial interface to operate different devices, including the display device, and switches between different short-range control modes accordingly: for example, it is in the short-range control mode for the display device when the initial interface of the display device is displayed, and in the short-range control mode for the air conditioner when the initial interface of the air conditioner is displayed. In some embodiments the short-range control mode may be control through infrared or NFC communication. Therefore, after the application scene control instruction is sent through the short-range control mode corresponding to the display device, the display device also establishes with the terminal device a second communication mode capable of more complex data transmission, and the second communication mode is different from the short-range control mode. In some embodiments, the second communication mode may be Bluetooth. In some embodiments, the short-range control mode may be infrared control, where different controlled devices correspond to different infrared encoding schemes; the terminal device accordingly needs to be provided with two different signal transceiving channels.
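One non-limiting way to picture the two channels is an infrared (or NFC) path reserved for short control codes and a Bluetooth path for richer data, as in the sketch below; the interfaces shown are assumptions for this illustration only.

    // Illustrative two-channel terminal: short-range IR codes plus a Bluetooth data link.
    interface InfraredChannel { fun sendCode(code: Int) }        // device-specific IR encoding
    interface BluetoothChannel {
        fun isBound(): Boolean                                   // bound to this display device?
        fun receiveJson(): String                                // scene type / scene data reply
    }

    class DualChannelRemote(private val ir: InfraredChannel, private val bt: BluetoothChannel) {
        fun requestSceneInfo(sceneControlCode: Int): String? {
            ir.sendCode(sceneControlCode)                        // short-range control instruction
            if (!bt.isBound()) return null                       // no second communication mode yet
            return bt.receiveJson()                              // richer reply over the second mode
        }
    }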
When the communication connection between the display device and the terminal device is confirmed, the display device can detect the current application scene and thereby determine the current application scene type. In the embodiments of the present application, the application scene of the display device is the application scene to which the user interface shown in the display belongs. Further, the terminal device may display the corresponding application scene control interface according to the application scene type of the display device.
In some embodiments, the connection between the terminal device and the display device may use the second communication mode throughout. In that case the scene instruction can also be sent through the second communication mode, and the display device can skip the subsequent steps of detecting and establishing the communication connection, which improves the response speed of the operation.
When the display device is in different application scenes, the specific functions it supports differ, the control instructions exchanged between the display device and the terminal device differ, and therefore the application scene control interfaces also differ. For example, when the display device is in the media asset scene, the scene may be further classified into a media asset recommendation scene, a media asset playing scene, and a media asset search scene.
The media asset recommendation scene may be a scene in which the display device shows a media asset recommendation page. The page may contain several recommended media asset options, and the user can select a target media asset from it. Specifically, the user can control the focus cursor to move to a target media asset option through the control device or the terminal device. Further, the display device may play the target media asset. FIG. 13a illustrates a schematic diagram of a media asset recommendation scene in some embodiments. Specifically, when the user controls the display device to start a media playback application, for example an APP such as 'Juhaokan', the controller may control the display to show the media asset recommendation page corresponding to that APP.
In some embodiments, the display device may determine its current scene type from an attribute of the currently displayed interface, or from the identifier of the currently running application, together with a preset mapping relationship. The preset relationship maps the attribute of the currently displayed interface, or the identifier of the currently running application, to a scene type.
In some embodiments, the attribute of the currently displayed interface may be marked in advance, or may be determined in real time by image recognition.
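On the display device side, the scene-type lookup can be as simple as a map keyed by the running application's identifier or a pre-marked interface attribute. The sketch below is illustrative only; the identifiers are hypothetical.

    // Hypothetical mapping from running-application identifier (or interface attribute) to scene type.
    val sceneByAppId: Map<String, String> = mapOf(
        "com.example.player.recommend" to "media-recommend",
        "com.example.player.playback"  to "media-play",
        "com.example.player.search"    to "media-search",
        "com.example.gallery"          to "picture"
    )

    fun detectSceneType(currentAppId: String): String =
        sceneByAppId[currentAppId] ?: "unknown"     // fall back when no mapping is configured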
The media asset playing scene may be a scene in which the display device is playing a media asset. When the display device is in this scene, the user can watch the media asset and can also adjust the playing progress of the media asset, the current volume of the display device, and so on.
The media asset search scene may be a scene in which the display device displays a media asset search page. FIG. 13b illustrates a schematic diagram of a media search scenario in some embodiments. When the display device is in the media searching scene, an input method keyboard can be displayed in the user interface, and a user can input media names and the like through the input method keyboard, so that the display device is controlled to realize the media searching function.
In order to enable the user to better control the user interface under the current application scene type, the application scene control interface displayed by the terminal device may also include some information related to the current user interface. In the embodiments of the present application, the term application scene data refers to the data corresponding to the user interface under the current application scene type of the display device. The terminal device can generate an application scene control interface according to the application scene type and the application scene data, so that the user can better control the user interface.
In some embodiments, after receiving the application scene control instruction sent by the terminal device, the display device may directly send the application scene type and the application scene data to the terminal device, so that the terminal device generates an application scene control interface.
Specifically, the display device may detect a type of an application scenario to which the user interface belongs, and acquire data corresponding to the application scenario. Further, the display device may send the data to the terminal device, and the terminal device may display an application scenario control interface according to the data, and the user may control the user interface in the current application scenario of the display device using the application scenario control interface.
In some embodiments, when detecting the current application scene, the display device may first determine the application corresponding to the application scene to which the user interface belongs, that is, the application the display device is running that produces the user interface. After determining the application, the controller may send an application scene information acquisition instruction to it. After receiving the instruction, the application can feed back application scene information to the controller, where the application scene information includes the application scene type and the application scene data corresponding to the current application scene.
FIG. 14 illustrates a component interaction diagram in a display device in some embodiments. For example, the display device is running the 'Juhaokan' APP and showing a media asset in the display. At this time, the APP stores the current application scene information of the display device, including the application scene type and the application scene data, where the application scene data is the data related to the currently displayed media asset. After receiving the application scene control instruction sent by the terminal device, the controller may directly send an application scene information acquisition instruction to the APP. The APP can then determine the application scene information; specifically, it can feed back the stored application scene information to the controller, and the display device then sends the application scene information to the terminal device.
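The controller-to-application exchange of FIG. 14 could, purely as an illustration, look like the following; the AppSceneInfo structure and the callback names are assumptions and are not defined by the embodiments.

    // Hypothetical scene-info structure kept up to date by the running application.
    data class AppSceneInfo(val sceneType: String, val sceneData: Map<String, String>)

    interface RunningApp {
        fun onSceneInfoRequest(): AppSceneInfo       // the APP feeds back its stored scene info
    }

    class DisplayController(
        private val runningApp: RunningApp,
        private val sendToTerminal: (AppSceneInfo) -> Unit
    ) {
        // Invoked when the application scene control instruction arrives from the terminal device.
        fun onSceneControlInstruction() {
            val info = runningApp.onSceneInfoRequest()   // ask the APP that owns the user interface
            sendToTerminal(info)                         // forward scene type and scene data
        }
    }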
In some embodiments, after receiving the application scene control instruction sent by the terminal device, the display device may first detect an application scene type and send the application scene type to the terminal device.
After confirming the application scene type of the display device, the terminal device may send a request for acquiring application scene data to the display device. Further, the display device may acquire the application scenario data, and send the application scenario data to the terminal device. The terminal device may generate an application scene control interface according to the application scene type and the application scene data.
It should be noted that, for the display device, the corresponding application scene data can be acquired regardless of what the current application scene type is. Therefore, after receiving the application scene control instruction sent by the terminal device, the display device can either send the application scene type and the application scene data to the terminal device together, or send the application scene type first and send the application scene data after receiving a request for it from the terminal device. Neither scheme is restricted by the application scene type.
In some embodiments, the application scene data of the display device may also be available from an operation server of the display device. In that case, the terminal device may acquire the application scene data of the display device through the server.
However, not all application scene data corresponding to every application scene type can be acquired from the server. For example, when the display device is in the media asset playing scene and a media asset is being played, some specific playback data, such as the playing progress, is not available from the server. For another example, when the display device is in the media asset search scene, the text the user has entered on the display device is not available from the server. Therefore, when the display device is in the media asset playing scene or the media asset search scene, the terminal device cannot acquire the application scene data through the server.
When the display device is in the media asset recommendation scene, the media asset recommendation page can be shown in the display, for example listing media asset resources of various media asset types. It should be noted that the media asset recommendation page of the display device is built from media asset recommendation data obtained from the server; that is, the data in the media asset recommendation page can be obtained from the server, and the server can send the same media asset recommendation data it sent to the display device to the terminal device.
Therefore, after receiving the application scene control instruction sent by the terminal device, the display device can detect the application scene type and send it to the terminal device. The terminal device then checks the application scene type. If the application scene type is detected to be the media asset recommendation scene, a request for acquiring the application scene data can be sent to the server. The server may send the application scene data corresponding to the media asset recommendation scene, that is, the media asset recommendation data corresponding to the media asset recommendation page of the display device, to the terminal device. The terminal device may then generate an application scene control interface according to the application scene type and the application scene data.
In some embodiments, the terminal device may generate the application scene control interface from a preset control interface template library. Specifically, a control interface template and its correspondence to an application scene may be set in advance for each application scene; that is, each application scene has its own control interface template. All control interface templates, together with the correspondence between each template and its application scene, can be stored in the control interface template library. The corresponding target control interface template can then be obtained from the library according to the application scene type, and an application scene control interface can be generated from the target control interface template and the application scene data, as sketched below.
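A non-limiting sketch of such a template library follows; the type names and the representation of display areas as plain strings are assumptions made for this illustration.

    // Hypothetical template library: each application scene type maps to one interface template.
    data class ControlInterfaceTemplate(val sceneType: String, val displayAreas: List<String>)
    data class ControlInterface(val template: ControlInterfaceTemplate,
                                val areaData: Map<String, String>)

    class TemplateLibrary(private val templates: Map<String, ControlInterfaceTemplate>) {
        fun templateFor(sceneType: String): ControlInterfaceTemplate? = templates[sceneType]
    }

    fun buildSceneControlInterface(
        library: TemplateLibrary,
        sceneType: String,
        sceneData: Map<String, String>
    ): ControlInterface? {
        val template = library.templateFor(sceneType) ?: return null
        // Keep only the data items that have a matching display area in the template.
        val areaData = sceneData.filterKeys { it in template.displayAreas }
        return ControlInterface(template, areaData)
    }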
In some embodiments, when the display device is in the media asset recommendation scene, the display device may directly send the current application scene type and application scene data to the terminal device. The application scenario data may be media asset recommendation data, that is, related information of media asset resources included in the current media asset recommendation page. After receiving the application scene type and the application scene data sent by the display device, the terminal device can determine that the current application scene of the display device is a media resource recommendation scene, and further, can generate an application scene control interface according to the application scene type and the application scene data.
FIG. 15 is a schematic diagram of a terminal device displaying an application scene control interface whose application scene type is the media asset recommendation scene, in some embodiments. At this time, the terminal device may show some of the media asset information from the media asset recommendation page of the display device. The user may operate this control interface, for example to select a particular media asset. After detecting the user's operation, the terminal device can determine the target media asset selected by the user and send it to the display device; the display device may then play the target media asset. The application scene control interface is also provided with an interface return control 1501 for switching the control interface displayed on the terminal device. When it is detected that the user clicks or otherwise touches the interface return control 1501, for example presses the control and swipes downward, the terminal device switches back to the initial control interface: it no longer displays the application scene control interface but updates the display to the initial control interface, so that the user can input control instructions there.
In some embodiments, the controller may send the application scene type and the application scene data to the terminal device when it is detected that the display device enters the media playback scene. After receiving the application scene type and the application scene data, the terminal equipment can determine that the current application scene of the display equipment is a media asset playing scene, and display an application scene control interface for controlling the media asset playing scene. The application scene data may include media asset information of a media asset resource currently played by the display device.
The media asset information may include: the media asset name, the total media asset duration and the media asset playing progress; for example, the total duration is 1 hour 50 minutes 20 seconds and the current playing progress is 1 hour 20 minutes 20 seconds. The media asset information may also include the media asset selection set and the media asset definition. The media asset selection set may include the total number of episodes of the media asset and which episode of the series is currently playing; the media asset definition refers to all definitions supported by the media asset and the current definition, for example, the media asset may support four definitions, namely blu-ray, ultra HD, HD and SD, with ultra HD being the definition currently playing. The media asset information may also include the media asset play rate, i.e. the play rates supported by the media asset, the current play rate, and so on. Specifically, the controller may obtain the media asset information from the application corresponding to the user interface.
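The media asset information listed above could, for illustration, be carried in a small structure such as the following; the field names are assumptions, not part of the embodiments.

    // Illustrative container for the media asset information fed back by the display device.
    data class MediaAssetInfo(
        val name: String,                        // media asset name
        val totalDurationSec: Int,               // e.g. 1 h 50 min 20 s = 6620 seconds
        val playProgressSec: Int,                // e.g. 1 h 20 min 20 s = 4820 seconds
        val totalEpisodes: Int,                  // media asset selection set
        val currentEpisode: Int,
        val supportedDefinitions: List<String>,  // e.g. blu-ray, ultra HD, HD, SD
        val currentDefinition: String,
        val supportedRates: List<Double>,        // e.g. 0.8, 1.0, 1.25, 1.5, 2.0
        val currentRate: Double
    )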
After the display device sends the application scene type and the application scene data to the terminal device, the terminal device can generate an application scene control interface, and specifically, the terminal device can generate the application scene control interface by using a preset control interface template library.
FIG. 16 is a schematic diagram of a terminal device displaying an application scene control interface in some embodiments. This application scene control interface can control the media asset being played by the display device. Specifically, the display area of the application scene control interface may contain an interface return control 1501, a mute control 1601, a media asset playing progress area 1610, a media asset selection area 1620, a media asset definition area 1630, a play control area 1640, a media asset play rate area 1650 and a media asset name area 1660.
The interface return control 1501 is used to switch the terminal device back to the initial control interface. When the user clicks the mute control 1601, the terminal device sends a mute instruction to the display device, thereby muting it. The media asset playing progress area 1610 may be a progress bar for adjusting the playing progress of the media asset. The user can check the total duration and the current playing progress in this area, and can touch it, for example by pressing the playing progress control 1602 and moving it horizontally; the terminal device then determines the playing progress corresponding to the moved playing progress control 1602 and sends it to the display device, and the controller adjusts the media asset accordingly. After adjusting the playing progress, the controller may further control the display to show the current progress bar to prompt the user, as shown in FIG. 17a. After a preset period, for example 5 seconds, the controller controls the display to hide the progress bar again, so that only the media asset is shown and the user's viewing is not affected, improving the user experience.
The media asset selection area 1620 is used to switch episodes of the media asset: the user can view the current episode and the adjacent episodes, and can click another episode to make the display device play it. The media asset definition area 1630 is used to switch the definition of the media asset: the user can view all definitions supported by the media asset and the one currently playing, and can click another definition to make the display device play the media asset in that definition. The media asset play rate area 1650 is used to switch the playing speed: the user can select one of 0.8x, 1.0x, 1.25x, 1.5x or 2.0x speed and make the display device play the media asset at that speed.
Through the play control area 1640 the user can further control the play position, volume and so on of the media asset. Specifically, the play control area 1640 may include a first area, a second area, and a third area. The first area is used to move the media asset back by a preset time T1; for example, when the user clicks the first area, the terminal device may send a rewind instruction to the display device that rewinds the media asset by 15 s. The second area is used to move the media asset forward by a preset time T2; for example, when the user clicks the second area, the terminal device may send a fast-forward instruction to the display device that fast-forwards the media asset by 15 s. The times T1 and T2 may be the same or different; the embodiments of the present application are not limited in this respect.
The third area can control the playing state, speed and volume of the media asset. When the user clicks the third area, the terminal device may send a pause instruction to the display device, and on receiving it the controller pauses playback of the media asset; the controller may also control the display to show a pause mark, as shown in FIG. 17b. When the user clicks the third area again, the terminal device may send a resume instruction, and on receiving it the controller continues playing the media asset. When the user keeps touching the third area for a preset period, for example presses it for 1 s, the terminal device may send a double-speed playing instruction that makes the display device play the media asset at a preset speed, for example 2.0x. While the display device plays at this speed, the controller may also control the display to show a speed prompt, for example "2.0x speed"; FIG. 17c shows a schematic diagram of the speed prompt in some embodiments. When the user stops touching the third area, the terminal device sends a restore instruction so that the display device plays the media asset at the normal 1.0x speed. When an upward or downward swipe is detected in the third area, the terminal device may send a volume-up or volume-down instruction respectively, so as to adjust the current playing volume of the display device.
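Purely as an illustration, the mapping from touch events in the play control area to control instructions could be written as below; the event names and instruction strings are assumptions made for this sketch.

    // Illustrative mapping from touch events in the play control area to control instructions.
    sealed interface TouchEvent
    object TapFirstArea : TouchEvent                       // rewind region
    object TapSecondArea : TouchEvent                      // fast-forward region
    object TapThirdArea : TouchEvent                       // pause / resume
    data class LongPressThirdArea(val seconds: Int) : TouchEvent
    object SwipeUpThirdArea : TouchEvent
    object SwipeDownThirdArea : TouchEvent

    fun instructionFor(event: TouchEvent, isPaused: Boolean): String = when (event) {
        TapFirstArea          -> "rewind:15s"
        TapSecondArea         -> "fast-forward:15s"
        TapThirdArea          -> if (isPaused) "resume" else "pause"
        is LongPressThirdArea -> if (event.seconds >= 1) "play-rate:2.0" else "none"
        SwipeUpThirdArea      -> "volume:up"
        SwipeDownThirdArea    -> "volume:down"
    }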
The terminal device can show control instruction prompt information in the third area, prompting the user about the control instructions supported by the current control interface. The media asset name area may show the name of the media asset. The terminal device can adjust the display areas in the target control interface template according to the application scene data, so that the data shown in each display area is the application scene data.
In some embodiments, the display device may send the application scene type and the application scene data to the terminal device when the display device is in a media asset search scene. After receiving the application scene type and the application scene data, the terminal device can determine that the current application scene of the display device is a media resource searching scene, and further, the terminal device can display an application scene control interface for controlling the media resource searching scene.
In some embodiments, the application scene data may include the historical search records of the display device, the search text the user has entered in the text search box, and so on. After receiving the application scene type and the application scene data, the terminal device can generate an application scene control interface, specifically by using the preset control interface template library. FIG. 18 illustrates a schematic diagram of a terminal device displaying an application scene control interface in some embodiments. An interface return control 1501 is provided in the display area of the application scene control interface for switching back to the initial control interface. A search area 1810, a history search area 1820 and an input method keyboard 1830 are also provided in the display area. The user can enter the content to be searched in the search area and can type characters directly on the input method keyboard, which improves the user experience.
The terminal device can adjust the display area in the target control interface template according to the application scene data. Specifically, the search text that the user has entered in the display device may be caused to be displayed in the search area, and the history search data in the display device may be caused to be displayed in the history search area.
In some embodiments, the display device may present a picture when the display device is in a picture scene. Fig. 19 shows a schematic diagram of a display device in a picture scene in some embodiments. The controller may control the display to display thumbnail images of the pictures, and when a user selects a certain thumbnail image, the controller may control the display to display the target picture full screen.
In some embodiments, the controller may send the application scene type and the application scene data to the terminal device when it is detected that the display device is in a picture scene. After receiving the application scene type and the application scene data, the terminal device can determine that the current application scene of the display device is a picture scene, and further, the terminal device can display an application scene control interface for controlling the picture scene. The application scene data may include picture information of all pictures that the display device may currently display.
In some embodiments, after receiving the application scene type and the application scene data, the terminal device may generate an application scene control interface, and specifically may generate the application scene control interface by using a preset control interface template library. The application scene control interface can control the pictures displayed by the display device, such as zooming operation, switching operation and the like.
FIG. 20 is a schematic diagram of a terminal device displaying an application scene control interface in some embodiments. The control interface includes a thumbnail area 2010 and a control area 2020. The thumbnail area may contain several thumbnails, including the target picture currently displayed by the display device and the pictures adjacent to it. The user can slide left and right over the thumbnail area to select another picture; the terminal device sends the selected picture to the display device and the controller controls the display to show it full screen, realizing the picture-switching function. When the picture displayed by the display device is switched, the controller may also control the display to show picture-switching information, which may include several thumbnails covering the currently displayed target picture and the pictures adjacent to it, as shown in FIG. 21a, to prompt the user about the current pictures.
Through the control area the user can zoom and otherwise operate on the picture displayed by the display device. When the user issues a continuous-click command on the terminal device, it is treated as a picture zoom-in command. A continuous-click command means that, within a preset period, the number of times the user taps the same area of the phone's touch screen exceeds a preset threshold; for example, tapping the same area three times within 1 s counts as a continuous click. After receiving the continuous-click command, the terminal device sends a picture zoom-in command to the display device so that the display device shows the picture enlarged. The controller may control the display to show the zoom state, as shown in FIG. 21b: the zoom state can be placed on top of the enlarged picture shown in the display, and it contains a thumbnail of the target picture with a local zoom frame inside it, where the image inside the local zoom frame is the enlarged part currently shown by the display.
While the display device shows the enlarged picture, the user can touch the control area, for example press it and slide, to move the local zoom frame. For example, when the user slides left in the control area, the terminal device sends a control instruction to the display device to move the local zoom frame left relative to the thumbnail of the target picture; the controller then determines the content inside the moved local zoom frame and controls the display to show that content full screen. While the display device shows the enlarged picture, the user may also tap the control area a preset number of times, for example once, whereupon the terminal device sends a cancel-zoom instruction to the display device. On receiving it, the controller cancels the enlarged display and controls the display to show the target picture full screen.
In some embodiments, when the user touches the control area with two fingers and moves them toward the edges of the area, this may be treated as a picture zoom-in instruction, and the terminal device may send it to the display device so that the picture is shown enlarged. When the user touches the control area with two fingers and moves them toward the center of the area, this may be treated as a picture zoom-out instruction, and the terminal device may send it to the display device so that the picture is shown reduced.
In some embodiments, the terminal device may determine whether the display device is currently showing the target picture enlarged. If it is, pressing and sliding in the control area moves the local zoom frame. If it is not, pressing and sliding in the control area makes the terminal device send a picture-switching instruction to the display device: for example, sliding left instructs the display device to show the previous picture and sliding right instructs it to show the next picture. Control instruction prompt information may be shown in the control area, as in FIG. 20, to prompt the user about the control instructions supported by the current control interface. A minimal sketch of this decision logic follows.
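The sketch below illustrates the picture-scene logic just described (continuous click to zoom, slide to pan when zoomed and to switch pictures otherwise); the thresholds and instruction strings are assumptions for this non-limiting example.

    // Illustrative picture-scene control logic on the terminal device.
    class PictureSceneController(private val send: (String) -> Unit) {
        private var targetZoomed = false
        private val tapTimes = ArrayDeque<Long>()

        // Continuous click: three taps on the same area within 1 second -> zoom in;
        // a single tap while zoomed -> cancel the zoom.
        fun onControlAreaTap(nowMs: Long) {
            tapTimes.addLast(nowMs)
            while (tapTimes.isNotEmpty() && nowMs - tapTimes.first() > 1_000) tapTimes.removeFirst()
            if (!targetZoomed && tapTimes.size >= 3) { send("picture:zoom-in"); targetZoomed = true }
            else if (targetZoomed && tapTimes.size == 1) { send("picture:zoom-cancel"); targetZoomed = false }
        }

        // Press-and-slide: move the local zoom frame when zoomed, otherwise switch pictures.
        fun onControlAreaSlide(towardLeft: Boolean) {
            val cmd = when {
                targetZoomed && towardLeft -> "zoom-frame:left"
                targetZoomed               -> "zoom-frame:right"
                towardLeft                 -> "picture:previous"
                else                       -> "picture:next"
            }
            send(cmd)
        }
    }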
In some embodiments, while the terminal device displays the application scene control interface, the user may control the user interface of the display device through that interface, but may also control it with a remote controller or by touching the display device directly.
When the user changes the application scene of the display device in one of these other ways, the application scene control interface currently displayed by the terminal device no longer matches the current application scene of the display device, so the user cannot control the display device with that interface.
At this time, the user may first touch the interface return control so that the terminal device displays the initial control interface, and then touch the application scene control in the initial control interface so that the terminal device regenerates and displays the application scene control interface, thereby updating the application scene control interface shown on the terminal device. The new application scene control interface matches the application scene of the display device at that moment, so the display device can be controlled again.
In some embodiments, requiring the user to update the application scene control interface manually whenever the application scene of the display device changes would give a poor user experience. Instead, the display device can detect in real time whether its application scene, i.e. the application scene to which the user interface belongs, has changed. When a change is detected, the display device can acquire the new application scene information, that is, detect the changed application scene type and acquire the changed application scene data.
Further, the display device may actively send the changed application scene type and the changed application scene data to the terminal device. After receiving the information, the terminal device can update and display the application scene control interface, so that the user can continuously control the display device by using the terminal device.
Specifically, after receiving the changed application scene type and the changed application scene data sent by the display device, the terminal device may obtain the control interface template corresponding to the changed application scene type from the preset control interface template library and generate a new application scene control interface from that template and the changed application scene data. Further, the terminal device may make the display unit update and display the new application scene control interface, and the user can then control the display device with it.
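A non-limiting sketch of the terminal-side handler for such a pushed scene change follows; the function types are assumptions introduced only for this example.

    // Illustrative handler on the terminal device for a scene change pushed by the display device.
    class SceneChangeHandler(
        private val templateFor: (String) -> String?,                    // scene type -> template id
        private val render: (template: String, data: Map<String, String>) -> Unit
    ) {
        // Called with the changed application scene type and the changed application scene data.
        fun onSceneChanged(newSceneType: String, newSceneData: Map<String, String>) {
            val template = templateFor(newSceneType) ?: return           // no template -> keep current UI
            render(template, newSceneData)                               // update and display the new interface
        }
    }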
In some embodiments, the terminal device may display the application scene control interface shown in FIG. 16 when the display device is in the media asset playing scene. Because this interface includes the media asset playing progress area, the playing progress shown there must be kept consistent with the playing progress on the display device. The display device could feed the playing progress back to the terminal device continuously to keep the two in step, but that would cause excessive information interaction between the display device and the terminal device and waste resources.
Therefore, after synchronizing the playing progress with the display device once, the terminal device can keep time by itself. For example, after generating the application scene control interface, the terminal device may send a request for the playing progress to the display device so that the display device feeds back the current progress, achieving a one-time synchronization. After that, the display device and the terminal device each count their own playing progress independently, with no need for real-time synchronization, greatly reducing the waste of resources.
In some embodiments, when the display device is in the media asset playing scene and the playing state of the media asset changes, the playing progress on the display device and the terminal device falls out of synchronization. For example, when the user pauses, replays or fast-forwards the media asset with the remote controller, or buffering occurs due to network conditions, the playing progress on the display device clearly no longer matches the progress counted on the terminal device, and the progress on the terminal device must be corrected.
Specifically, the display device may detect the playing state of the media asset in the media asset playing scene in real time. When a change in the playing state is detected, the display device obtains the playing progress after the change and determines the new playing state, then sends both to the terminal device. On receiving this information, the terminal device adjusts the application scene control interface, in particular the media asset playing progress area, so that the playing progress on the terminal device and the display device is synchronized again.
For example, when the media asset on the display device is paused, the display device sends the playing progress at the moment of pausing and the paused state to the terminal device; the terminal device keeps the progress at that value and pauses its own counting, waiting for the next message from the display device.
When the media asset on the display device is fast-forwarded, rewound or otherwise changed, the display device sends the playing progress at the moment of the change and the current play rate to the terminal device; the terminal device then counts on from that progress at the same play rate, so that the two devices stay synchronized.
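The synchronize-once-then-count-locally behaviour described above could be sketched as follows; the class and field names are assumptions for this illustration.

    // Illustrative progress tracker on the terminal device: synchronize once, count locally,
    // and re-synchronize only when the display device reports a changed play state.
    class ProgressTracker {
        private var baseProgressSec = 0.0       // progress reported at the last synchronization
        private var baseTimeMs = 0L             // local time of that synchronization
        private var rate = 1.0                  // current play rate (0.0 while paused)

        fun syncOnce(progressSec: Double, nowMs: Long, playRate: Double = 1.0) {
            baseProgressSec = progressSec; baseTimeMs = nowMs; rate = playRate
        }

        // Pause, fast-forward, rewind or buffering arrives as a target progress plus a new rate.
        fun onPlayStateChanged(targetProgressSec: Double, nowMs: Long, newRate: Double) =
            syncOnce(targetProgressSec, nowMs, newRate)

        // Progress shown in the media asset playing progress area, computed without polling the TV.
        fun currentProgressSec(nowMs: Long): Double =
            baseProgressSec + (nowMs - baseTimeMs) / 1000.0 * rate
    }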
The same and similar parts of the embodiments in this specification are referred to each other, and are not described herein.
It will be apparent to those skilled in the art that the techniques of embodiments of the present invention may be implemented in software plus a necessary general purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention may be embodied essentially or in parts contributing to the prior art in the form of a software product, which may be stored in a storage medium, such as a ROM/RAM, a magnetic disk, an optical disk, etc., including several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method of the embodiments or some parts of the embodiments of the present invention.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments can still be modified, or some or all of their technical features can be replaced by equivalents, and such modifications and substitutions do not make the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

1. A terminal device, comprising:
a display unit;
a communication unit configured to make communication connection with the display device;
A processor configured to:
responding to an application scene control instruction input by a user, and sending the application scene control instruction to display equipment so as to enable the display equipment to feed back an application scene type; the application scene type is the type of the application scene to which the user interface displayed in the display device belongs;
receiving an application scene type sent by the display equipment;
generating an application scene control interface according to the application scene type and controlling the display unit to display the application scene control interface, wherein the application scene control interface is used for controlling a user interface in the application scene; wherein different application scene types correspondingly generate different application scene control interfaces.
2. The terminal device of claim 1, wherein the processor is further configured to:
before performing the step of sending the application scene control instruction to the display device,
sending a communication connection instruction to a display device so that the display device feeds back the device type;
generating an initial control interface according to the equipment type and controlling the display unit to display the initial control interface, wherein the initial control interface is used for controlling display equipment corresponding to the equipment type.
3. The terminal device of claim 2, wherein the initial control interface includes an application scene control; the processor is further configured to:
before performing the step of sending the application scene control instruction to the display device,
and if a touch operation by the user on the application scene control is detected, determining that an application scene control instruction has been input by the user.
4. The terminal device of claim 1, wherein the processor is further configured to:
before performing the step of generating an application scene control interface according to said application scene type,
receiving application scene data sent by display equipment, wherein the application scene data is data corresponding to the user interface under the application scene type;
the processor is further configured to: in performing the step of generating an application scene control interface according to said application scene type,
and generating an application scene control interface according to the application scene type and the application scene data.
5. The terminal device according to claim 1, wherein the application scenario type comprises: a media asset recommendation scene, a media asset search scene and a media asset play scene;
The processor is further configured to: in performing the step of generating an application scene control interface according to said application scene type,
if the application scene type is detected to be the media asset recommendation scene, sending a request for acquiring application scene data to the display equipment or the server; the application scene data are data corresponding to the user interface under the application scene type;
receiving application scene data sent by the display equipment or the server;
generating an application scene control interface according to the application scene type and the application scene data;
if the application scene type is detected to be a media resource searching scene or a media resource playing scene, sending a request for acquiring application scene data to the display equipment;
and receiving the application scene data sent by the display equipment, and generating an application scene control interface according to the application scene type and the application scene data.
6. The terminal device of claim 5, wherein the processor is further configured to:
in performing the step of generating an application scene control interface from said application scene type and said application scene data,
acquiring a target control interface template corresponding to the application scene type from a preset control interface template library; the preset control interface template library stores a plurality of control interface templates and the application scene type corresponding to each control interface template;
and generating an application scene control interface according to the target control interface template and the application scene data.
7. The terminal device of claim 6, wherein the processor is further configured to:
responding to the sent target application scene type and the target application scene data when the application scene to which the user interface belongs changes by the display equipment, and acquiring a second control interface template corresponding to the target application scene type from the preset control interface template library; the target application scene type is the type of the changed application scene, and the target application scene data is the data corresponding to the changed application scene;
generating a target application scene control interface according to the second control interface template and the target application scene data;
and controlling the display unit to update and display the target application scene control interface.
8. The terminal device of claim 6, wherein the application scene type is a media asset playing scene; the application scene data comprises a media resource name, a media resource selection set, a media resource playing progress, media resource definition and media resource playing speed;
the processor is further configured to: in performing the step of generating an application scenario control interface from said target control interface template and said application scenario data,
adjusting a display area in the target control interface template according to the application scene data so that the data in the display area is the application scene data; the display area includes: a media asset name area, a media asset selection area, a media asset playing progress area, a media asset definition area and a media asset playing speed area;
responding to the sent target playing progress and the changed playing state of the display equipment when the playing state of the media asset in the media asset playing scene is changed, and adjusting the media asset playing progress area; and the target playing progress is the playing progress after the playing state of the media asset is changed.
9. The terminal device of claim 6, wherein the application scenario type is a media search scenario; the application scene data comprises search text in a text search box of the display device and historical search data;
The processor is further configured to: in performing the step of generating an application scenario control interface from said target control interface template and said application scenario data,
adjusting a display area in the target control interface template according to the application scene data so as to display the search text in a search area and display the historical search data in a historical search area; the display area includes a search area and a history search area.
10. A display device, characterized by comprising:
a display configured to display a user interface;
a communicator configured to make communication connection with the terminal device;
a controller configured to:
in response to an application scene control instruction sent by the terminal device, detecting that the display device has established a communication connection with the terminal device, and detecting an application scene type; the application scene type is the type of the application scene to which the user interface belongs;
and sending the application scene type to the terminal device so that the terminal device generates and displays an application scene control interface according to the application scene type, wherein the application scene control interface is used for controlling the user interface in the application scene.
CN202210210654.1A 2021-10-19 2022-03-04 Display device and terminal device Pending CN115993919A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/121223 WO2023065976A1 (en) 2021-10-19 2022-09-26 Terminal device, display device and display method thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111214613 2021-10-19
CN2021112146131 2021-10-19

Publications (1)

Publication Number Publication Date
CN115993919A true CN115993919A (en) 2023-04-21

Family

ID=85994212

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210210654.1A Pending CN115993919A (en) 2021-10-19 2022-03-04 Display device and terminal device

Country Status (2)

Country Link
CN (1) CN115993919A (en)
WO (1) WO2023065976A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103139507B (en) * 2011-12-05 2016-04-13 联想(北京)有限公司 A kind ofly control the method for remote control equipment, a kind of electronic equipment and a kind of remote control equipment
CN103116336B (en) * 2013-01-14 2015-04-29 从兴技术有限公司 Method and device for automatic management of controlled device through intelligent home control terminal
CN103888806B (en) * 2014-04-22 2017-02-15 张志远 Method for controlling interactive smart television
CN104958898A (en) * 2014-08-13 2015-10-07 腾讯科技(深圳)有限公司 Method, apparatus and system for controlling video games
CN105093949A (en) * 2015-07-13 2015-11-25 小米科技有限责任公司 Method and apparatus for controlling device

Also Published As

Publication number Publication date
WO2023065976A1 (en) 2023-04-27

Similar Documents

Publication Publication Date Title
WO2020244266A1 (en) Remote control method for smart television, mobile terminal, and smart television
CN111277884B (en) Video playing method and device
WO2021159723A1 (en) Classic episode highlight display method and display device
KR20140021408A (en) Portable terminal apparatus and method of operating thereof
WO2018120768A1 (en) Remote control method and terminal
CN111770370A (en) Display device, server and media asset recommendation method
CN111901656B (en) Media data searching method, display equipment and server
CN112653906A (en) Video hotspot playing method on display device and display device
CN112188249A (en) Electronic specification-based playing method and display device
CN113453057B (en) Display device and playing progress control method
JP2015130661A (en) Display device, mobile device, system including the same and connection control method thereof
CN111586463B (en) Display device
CN113542899A (en) Information display method, display device and server
CN113784186B (en) Terminal device, server, and communication control method
CN113542900B (en) Media information display method and display equipment
CN115993919A (en) Display device and terminal device
CN111914565A (en) Electronic equipment and user statement processing method
CN115514998B (en) Display equipment and network media resource switching method
WO2023130965A1 (en) Display device, and audio and video data playing method
KR102303286B1 (en) Terminal device and operating method thereof
CN113573115B (en) Method for determining search characters and display device
CN117806577A (en) Display apparatus and display apparatus control method
WO2021218111A1 (en) Method for determining search character and display device
WO2021218096A1 (en) Method for adjusting order of channel controls, and display device
CN114924648A (en) Display device, terminal device and gesture interaction method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination