CN114339372A - Display device and control method - Google Patents


Info

Publication number: CN114339372A
Application number: CN202210000465.1A
Authority: CN (China)
Prior art keywords: display, user interface, setting, item, user
Legal status: Granted (Active)
Other languages: Chinese (zh)
Other versions: CN114339372B
Inventors: 崔文华, 周维栋, 张以通
Original and current assignee: Hisense Visual Technology Co Ltd

Classifications

  • Controls And Circuits For Display Device (AREA)

Abstract

The application provides a display device and a control method. The display device comprises a first display and a second display, the first display being used for presenting a first user interface and the second display for presenting a second user interface. The display device has an ambience mode function: the user can input a corresponding instruction on the first user interface to control the display device to turn the ambience mode on or off. After the ambience mode is turned on, the second display plays a video specified by the user to decorate the first display, thereby enhancing the atmosphere and making the display device more interesting to use, which benefits the user experience.

Description

Display device and control method
Technical Field
The embodiment of the application relates to the technical field of display, in particular to a display device and a control method.
Background
With the continuous development of communication technology, terminal devices such as computers, smartphones, and display devices have become increasingly popular. As the performance of terminal-device operating systems keeps improving, and in order to provide rich display content, an Android terminal device may be configured with a dual system and dual screens. The dual system comprises a first controller and a second controller, each configured with independent application programs to run; the first controller and the second controller are connected, communicate, and supply power through several interfaces of different types. The dual screens comprise a first display and a second display: the first display serves as a small screen for presenting social features and information prompts to the user, and the second display serves as a large screen for showing the display content of the corresponding application.
At present, the second display can be configured through the first display, enabling control of the second display's basic functions, such as brightness, the status bar indicator, and message reminders. However, as users' expectations for the application experience grow, so do their demands on the entertainment capabilities of terminal devices. How to develop more entertainment functions for terminal devices, meet users' many entertainment needs, and bring a better use experience has therefore long been a problem to be solved in the field.
Disclosure of Invention
The exemplary embodiments of the application provide a display device and a control method, so as to address the lack of entertainment functions in existing display devices and improve the experience of users operating the display device.
In one aspect, the present application provides a display device, including:
the first display is used for presenting a first user interface, and a setting page is displayed in the first user interface;
the second display is used for presenting a second user interface, and a setting page and/or a playing picture are displayed in the second user interface;
a controller configured to:
presenting the first user interface on the first display and the second user interface on the second display;
receiving a menu item display instruction input by a user on the first user interface, wherein the menu item display instruction instructs display of a setting menu of the second display;
in response to the menu item display instruction, displaying the setting menu of the second display on the first user interface, wherein the setting menu comprises an item for setting an ambience mode switch state and an item for setting a video to be played by the second display when the ambience mode is turned on, and the ambience mode refers to playing, on the second user interface, the video associated with the item for setting the video;
turning the ambience mode on or off in response to an operation on the item for setting the ambience mode switch state.
On the other hand, the application also provides a control method of the display device, which comprises the following steps:
presenting a first user interface on a first display and a second user interface on a second display;
receiving a menu item display instruction input by a user on the first user interface, wherein the menu item display instruction instructs display of a setting menu of the second display;
in response to the menu item display instruction, displaying the setting menu of the second display on the first user interface, wherein the setting menu comprises an item for setting an ambience mode switch state and an item for setting a video to be played by the second display when the ambience mode is turned on, and the ambience mode refers to playing, on the second user interface, the video associated with the item for setting the video;
turning the ambience mode on or off in response to an operation on the item for setting the ambience mode switch state.
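The controller steps recited above (present the two user interfaces, receive the menu item display instruction, show the setting menu, toggle the ambience mode) can be sketched as a small state machine. The following is an illustrative model only, not the patented implementation; all class and method names (`DisplayController`, `toggle_ambience_mode`, etc.) are hypothetical.

```python
class DisplayController:
    """Hypothetical sketch of the dual-display controller described above."""

    def __init__(self):
        self.ambience_mode_on = False    # switch state of the ambience mode
        self.ambience_video = None       # video to play on the second display
        self.second_display_menu_visible = False

    def show_second_display_menu(self):
        # Respond to the menu item display instruction from the first user
        # interface: present the second display's setting menu on it.
        self.second_display_menu_visible = True

    def set_ambience_video(self, video_id):
        # Item for setting the video played when the ambience mode is on.
        self.ambience_video = video_id

    def toggle_ambience_mode(self):
        # Item for setting the ambience mode switch state.
        self.ambience_mode_on = not self.ambience_mode_on
        return self.second_display_content()

    def second_display_content(self):
        # With the mode on, the second display plays the associated video;
        # otherwise it shows its usual auxiliary information.
        if self.ambience_mode_on and self.ambience_video:
            return ("video", self.ambience_video)
        return ("info", "time/weather/reminders")
```

In this sketch, toggling the mode immediately determines what the second display shows, mirroring the claim's "turning the ambience mode on or off in response to an operation on the item."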
According to the above technical solutions, the display device comprises a first display and a second display, the first display presenting a first user interface and the second display presenting a second user interface. The display device has an ambience mode function: the user can input a corresponding instruction on the first user interface to control the display device to turn the ambience mode on or off. After the ambience mode is turned on, the second display plays the target video to decorate the first display, thereby enhancing the atmosphere, making the display device more interesting to use, and benefiting the user experience.
Drawings
To more clearly illustrate the embodiments of the present application or the implementations in the related art, the drawings required for describing the embodiments or the related art are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to an exemplary embodiment;
Fig. 2 is a block diagram of the configuration of a control apparatus according to an exemplary embodiment;
Fig. 3 is a schematic diagram of the hardware configuration of a hardware system in a display device according to an exemplary embodiment;
Fig. 4 is a schematic diagram of the connection of a power board to its loads;
Fig. 5 is a block diagram of the hardware architecture of the display device shown in Fig. 3;
Fig. 6 is a block diagram of the configuration of a software system in a display device according to an exemplary embodiment;
Fig. 7a is a diagram of a user interface in a display device according to an exemplary embodiment;
Fig. 7b is a diagram of a user interface in a display device according to an exemplary embodiment;
Fig. 8 is a diagram of a menu interface in a display device according to an exemplary embodiment;
Fig. 9 is a diagram of an AI setting interface in a display device according to an exemplary embodiment;
Fig. 10 is a diagram of a sub-screen setting interface in a display device according to an exemplary embodiment;
Fig. 11 is a diagram of a sub-screen setting interface in a display device according to an exemplary embodiment;
Fig. 12 is a diagram of a user interface in a display device according to an exemplary embodiment;
Fig. 13 is a diagram of a user interface in a display device according to an exemplary embodiment;
Fig. 14 is a diagram of a user interface in a display device according to an exemplary embodiment;
Fig. 15 is a diagram of a user interface in a display device according to an exemplary embodiment;
Fig. 16 is a diagram of a user interface in a display device according to an exemplary embodiment;
Fig. 17 is a diagram of a user interface in a display device according to an exemplary embodiment;
Fig. 18 is a diagram of a user interface in a display device according to an exemplary embodiment;
Fig. 19 is a diagram of a user interface in a display device according to an exemplary embodiment;
Fig. 20 is a diagram of a user interface in a display device according to an exemplary embodiment;
Fig. 21 is a diagram of the flow by which a display device registers an ambience mode service according to an exemplary embodiment;
Fig. 22 is a schematic flow diagram of a display device invoking an ambience mode service according to an exemplary embodiment;
Fig. 23 is a flowchart of a control method of a display device according to an exemplary embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely with reference to the accompanying drawings in the embodiments of the present application. It is apparent that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments that can be derived by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
The present application is applicable not only to a display device having a dual-system, dual-display structure as shown in Figs. 1 to 7, that is, a display device having a first controller (first hardware system), a second controller (second hardware system), a first display, and a second display, but also to a display device with a non-dual system, that is, one having a single controller (hardware system) or more than two controllers (hardware systems).
In a specific implementation manner of the present application, a display device with dual systems is used to describe the technical solution of the present application. The structure, function, implementation, and the like of the display device having the dual system hardware structure will be described in detail first.
The concept to which the present application relates will be first explained below with reference to the drawings. It should be noted that the following descriptions of the concepts are only for the purpose of facilitating understanding of the contents of the present application, and do not represent limitations on the scope of the present application.
The term "module," as used in various embodiments of the present application, may refer to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "remote control" as used in the embodiments of the present application refers to a component of an electronic device (such as the display device disclosed in the present application) that can wirelessly control the electronic device, typically over a short distance. The component is typically connected to the electronic device using infrared and/or Radio Frequency (RF) signals and/or Bluetooth, and may also include functional modules such as WiFi, wireless USB, Bluetooth, and motion sensors. For example, a hand-held touch remote controller replaces most of the physical built-in hard keys of a common remote control device with a user interface on a touch screen.
The term "gesture" as used in the embodiments of the present application refers to a user behavior used to express an intended idea, action, purpose, or result through a change in hand shape or an action such as hand movement.
The term "hardware system" used in the embodiments of the present application may refer to a physical component having computing, controlling, storing, inputting and outputting functions, which is formed by a mechanical, optical, electrical and magnetic device such as an Integrated Circuit (IC), a Printed Circuit Board (PCB) and the like. In various embodiments of the present application, a hardware system may also be referred to as a motherboard (or chip).
Fig. 1 is a schematic diagram illustrating an operation scenario between a display device and a control apparatus. As shown in fig. 1, a user may operate the display apparatus 200 through the control device 100.
The control device 100 may be a remote controller 100A, which can communicate with the display device 200 through infrared protocol communication, Bluetooth protocol communication, ZigBee protocol communication, or other short-range communication, and is used to control the display device 200 wirelessly or by other wired means. The user may input user instructions through keys on the remote controller 100A, voice input, control panel input, and the like, to control the display device 200. For example, the user can input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power on/off key, etc. on the remote controller 100A to control the functions of the display device 200.
The control device 100 may also be an intelligent device, such as a mobile terminal 100B, a tablet computer, a notebook computer, and the like, which may communicate with the display device 200 through a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), or other networks, and implement control of the display device 200 through an application program corresponding to the display device 200. For example, the display device 200 is controlled using an application program running on the smart device. The application may provide various controls to the User through an intuitive User Interface (UI) on a screen associated with the smart device.
As shown in fig. 1, the display apparatus 200 may also perform data communication with the server 300 through various communication means. In various embodiments of the present application, the display device 200 may be allowed to be in a wired or wireless communication connection with the server 300 via a local area network, a wireless local area network, or other network. The server 300 may provide various contents and interactions to the display apparatus 200.
The display device 200 includes a first display 201 and a second display 202, wherein the first display 201 and the second display 202 are independent from each other, and the first display 201 and the second display 202 are controlled by different hardware systems respectively. The first display 201 and the second display 202 may be used to display different screen contents. For example, the first display 201 may be used for screen display of conventional television programs, and the second display 202 may be used for screen display of auxiliary information such as notification type messages, voice assistants, and the like.
Alternatively, the content displayed by the first display 201 and the content displayed by the second display 202 may be independent of each other and not affected by each other. For example, when the first display 201 plays a television program, the second display 202 may display information such as time, weather, temperature, reminder messages, and the like, which are not related to the television program.
Optionally, there may also be an association between the content displayed by the first display 201 and the content displayed by the second display 202. For example, when the first display 201 plays a main screen of a video chat, the second display 202 may display information such as an avatar, a chat duration, and the like of a user currently accessing the video chat.
Optionally, part or all of the content displayed by the second display 202 may be adjusted to be displayed by the first display 201. For example, the information such as time, weather, temperature, and reminder messages displayed on the second display 202 may be adjusted to be displayed on the first display 201, while other information remains displayed on the second display 202.
In addition, the first display 201 displays the multi-party interactive picture while displaying the traditional television program picture, and the multi-party interactive picture does not block the traditional television program picture. The display mode of the traditional television program picture and the multi-party interactive picture is not limited by the application. For example, the position and the size of the traditional television program picture and the multi-party interactive picture can be set according to the priority of the traditional television program picture and the multi-party interactive picture.
Taking the case in which the traditional television program picture has a higher priority than the multi-party interactive picture, the area of the traditional television program picture is larger than that of the multi-party interactive picture, and the multi-party interactive picture may be positioned at one side of the traditional television program picture or arranged in a floating manner at one corner of the traditional television program picture.
The display device 200, in one aspect, may be a liquid crystal display, an OLED (Organic Light-Emitting Diode) display, or a projection display device; in another aspect, it may be a smart television, or a display together with a set-top box. The specific type, size, and resolution of the display device 200 are not limited, and those skilled in the art will appreciate that the display device 200 may vary in capability and configuration as desired.
Fig. 2 is a block diagram illustrating the configuration of the control device 100. As shown in fig. 2, the control device 100 includes a controller 110, a communicator 130, a user input/output interface 140, a memory 190, and a power supply 180.
The control apparatus 100 is configured to control the display device 200, and to receive an input operation instruction from a user, and convert the operation instruction into an instruction recognizable and responsive by the display device 200, and to mediate interaction between the user and the display device 200. Such as: the user operates the channel up/down key on the control device 100, and the display device 200 responds to the channel up/down operation.
Fig. 3 is a schematic diagram illustrating a hardware configuration of a hardware system in the display device 200. For convenience of explanation, the display device 200 in fig. 3 is illustrated by taking a liquid crystal display as an example. As shown in fig. 3, the display device 200 includes: the display panel comprises a first panel 11, a first backlight assembly 21, a main board 31, an interactive board 32, a first display driving board 33, a second panel 12, a second backlight assembly 22, a second display driving board 34, a power board 4, a key board 35, a first rear shell 51, a second rear shell 52 and a base 6.
The first panel 11 is used to present the picture of the first display 201 to the user. The first backlight assembly 21 is disposed under the first panel 11 and is generally an optical assembly that provides a light source of sufficient, uniformly distributed brightness so that the first panel 11 can display images normally. The first backlight assembly 21 further includes a first back plate (not shown). The main board 31, the interactive board 32, the first display driving board 33, and the power board 4 are disposed on the first back plate, on which some convex hull structures are typically formed by stamping, and are fixed on the convex hulls by screws or hooks. The main board 31, the interactive board 32, the first display driving board 33, and the power board 4 may be disposed together on one board or on separate boards. The first rear case 51 covers the first panel 11 to hide components of the display device 200 such as the first backlight assembly 21, the main board 31, the interactive board 32, the first display driving board 33, and the power board 4, achieving an aesthetic effect.
The first display driving board 33 mainly functions to: perform multi-level backlight partition control, which varies with the image content, according to the PWM signal and the Local Dimming signal transmitted by the first controller on the main board 31; and, after a handshake is established with the first controller on the main board 31, receive the VbyOne display signal transmitted by the first controller and convert it into an LVDS signal, thereby realizing image display on the first display 201.
The base 6 is used for supporting the display device 200, and it should be noted that the drawings only show one type of base design, and those skilled in the art can design different types of bases according to the product requirements.
The second panel 12 is used to present the picture of the second display 202 to the user. The second backlight assembly 22 is disposed under the second panel 12 and is generally an optical assembly that provides a light source of sufficient, uniformly distributed brightness so that the second panel 12 can display images normally. The second backlight assembly 22 further includes a second back plate (not shown). The second display driving board 34 is disposed on the second back plate, on which some convex hull structures are typically formed by stamping, and is fixed on the convex hulls by screws or hooks. The second rear case 52 covers the second panel 12 to hide components of the display device 200 such as the second backlight assembly 22, the adapter driving board (not shown), the second display driving board 34, and the key board 35, achieving an aesthetic effect.
Optionally, fig. 3 further includes a key sheet 35, where the key sheet 35 may be disposed on the first back plate or the second back plate, which is not limited in this application.
In addition, the display device 200 further includes a sound reproducing device (not shown), such as an audio module including an I2S interface, a power amplifier (AMP), a speaker, and the like, for reproducing sound. Usually, the sound components can output sound on at least two channels; to achieve a panoramic surround effect, multiple sound components must be arranged to output sound on multiple channels, which is not described in detail here.
It should be noted that the display device 200 may also adopt an OLED display screen, in which case the boards included in the display device 200 change accordingly; this is not described in detail here.
Fig. 4 schematically illustrates the connection between the power board and its loads. As shown in Fig. 4, the power board 4 includes an input terminal IN and output terminals OUT (a first output terminal OUT1, a second output terminal OUT2, a third output terminal OUT3, a fourth output terminal OUT4, and a fifth output terminal OUT5 are shown in the figure). The input terminal IN is connected to the mains power, and the output terminals OUT are connected to loads: the first output terminal OUT1 to a light-emitting element (such as a light bar or a self-emitting device), the second output terminal OUT2 to the audio components, the third output terminal OUT3 to the main board 31, the fourth output terminal OUT4 to the first display driving board 33, and the fifth output terminal OUT5 to the first backlight assembly 21. The power board 4 needs to convert the AC mains power into the DC power required by each load, and the DC power usually comes in different specifications; for example, 18 V is required for the audio components, and 12 V/18 V for the panel.
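The output-terminal wiring just described can be summarized as a small lookup table. The sketch below only records what the text states: loads for all five outputs, and a DC voltage only where one is given (18 V for the audio components); `None` marks specifications the description leaves open.

```python
# Sketch of the power board's output terminals and their loads (cf. Fig. 4).
# Voltages are recorded only where the description states them.
POWER_OUTPUTS = {
    "OUT1": {"load": "light-emitting element (light bar / self-emitting device)", "volts": None},
    "OUT2": {"load": "audio components", "volts": 18},
    "OUT3": {"load": "main board 31", "volts": None},
    "OUT4": {"load": "first display driving board 33", "volts": None},
    "OUT5": {"load": "first backlight assembly 21", "volts": None},
}

def load_of(terminal):
    """Return the load connected to a given output terminal."""
    return POWER_OUTPUTS[terminal]["load"]
```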
For ease of description, one hardware system in the dual hardware system architecture will hereinafter be referred to as the first hardware system or first controller, and the other as the second hardware system or second controller. The first controller comprises its various processors and interfaces, and the second controller comprises its various processors and interfaces. The first controller and the second controller may each have a relatively independent operating system installed, and the two operating systems may communicate with each other through a communication protocol; for example, the framework layer of the first controller's operating system and the framework layer of the second controller's operating system can communicate to transmit commands and data, so that there are two independent but interrelated subsystems in the display device 200.
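The framework-layer exchange between the two operating systems can be modeled as simple command-and-data message passing. This is an illustrative sketch under assumed names (`FrameworkLayer`, a JSON message format); the patent does not specify the actual protocol.

```python
import json


class FrameworkLayer:
    """Hypothetical framework layer of one controller's operating system."""

    def __init__(self, name):
        self.name = name
        self.peer = None     # the other controller's framework layer
        self.received = []   # commands and data received from the peer

    def connect(self, peer):
        # The two framework layers agree on a channel for commands and data.
        self.peer, peer.peer = peer, self

    def send(self, command, data=None):
        # Serialize a command plus its data and deliver it to the peer,
        # standing in for the real inter-chip interface.
        message = json.dumps({"from": self.name, "cmd": command, "data": data})
        self.peer.received.append(json.loads(message))


first = FrameworkLayer("first_controller")
second = FrameworkLayer("second_controller")
first.connect(second)
first.send("SHOW_IMAGE", {"display": 2, "source": "ambience_video"})
```

The point of the sketch is only the topology: two independent subsystems whose framework layers exchange commands and data, matching the description above.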
The dual hardware system architecture of the present application is further described below with reference to fig. 5. It should be noted that fig. 5 is only an exemplary illustration of the dual hardware system architecture of the present application, and does not represent a limitation of the present application. In actual practice, both hardware systems may contain more or less hardware or interfaces as desired.
Fig. 5 is a block diagram illustrating a hardware architecture of the display apparatus 200 shown in fig. 3. As shown in fig. 5, the hardware system of the display apparatus 200 includes a first controller 210 and a second controller 310, and a module connected to the first controller 210 or the second controller 310 through various interfaces.
Among them, the first controller 210 may be disposed on the main board 31 shown in Fig. 3. Optionally, the first controller 210 mainly implements traditional television functions (for example, an external set-top box may be connected). The second controller 310 may be disposed on the second display driving board 34 shown in Fig. 3. Optionally, the second controller 310 may be used to receive instructions sent by the first controller 210 and control the second display 380 to display a corresponding image.
The modules connected to the first controller 210 may include a tuning demodulator 220, a communicator 230, an external device interface 250, a memory 290, a user input interface 260-3, a video processor 260-1, an audio processor 260-2, a first display 280 (i.e., the first display 201 in fig. 1), an audio output interface 270, and a power supply module 240.
In other embodiments, more or fewer modules may be connected to the first controller 210.
In other embodiments, the first controller includes any of the modules described above.
The communicator 230 is a component for communicating with an external device or an external server according to various communication protocol types. For example: the communicator 230 may include a WIFI module 231, a bluetooth module 232, a wired ethernet module 233, and other network communication protocol modules such as an infrared communication protocol module or a near field communication protocol module (not shown).
The external device interface 250 is a component that provides data transmission between the first controller 210 and other external devices. The external device interface 250 may be connected with an external apparatus such as a set-top box, a game device, a notebook computer, etc. in a wired/wireless manner, and may receive data such as a video signal (e.g., moving image), an audio signal (e.g., music), additional information (e.g., EPG), etc. of the external apparatus.
The first controller 210 controls the operation of the display apparatus 200 and responds to the operation of the user by running various software control programs (e.g., an operating system and/or various application programs) stored on the memory 290.
As shown in Fig. 5, the first controller 210 includes a read-only memory ROM 213, a random-access memory RAM 214, a graphics processor 216, a CPU processor 212, a communication interface 218 (a first interface 218-1, a second interface 218-2, ..., an Nth interface 218-N), and a communication bus. The ROM 213, the RAM 214, the graphics processor 216, the CPU processor 212, and the communication interface 218 are connected via the communication bus.
The first controller 210 may control operations of the display device 200 in relation to the first display 280. For example: in response to receiving a user command for selecting a UI object to be displayed on the first display 280, the first controller 210 may perform an operation related to the object selected by the user command.
The first controller 210 may control operations of the display device 200 in relation to the second display 380. For example: in response to receiving a user command for selecting a UI object to be displayed on the second display 380, the first controller 210 may perform an operation related to the object selected by the user command.
Wherein the object may be any one of selectable objects, such as a hyperlink or an icon. Operations related to the selected object, such as: displaying an operation connected to a hyperlink page, document, image, or the like, or performing an operation of a program corresponding to an icon. The user command for selecting the UI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch pad, etc.) connected to the display apparatus 200 or a voice command corresponding to a voice spoken by the user.
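The object-selection handling just described amounts to dispatching on the type of the selected UI object: a hyperlink opens the linked page, document, or image, while an icon launches its corresponding program. A minimal sketch (function and field names are hypothetical):

```python
def handle_selection(ui_object):
    """Dispatch a selection command to the operation matching the object type.

    `ui_object` is a dict with 'type' and 'target' keys; the tuples returned
    here are placeholders for opening a hyperlinked page or launching the
    program corresponding to an icon.
    """
    if ui_object["type"] == "hyperlink":
        return ("open_page", ui_object["target"])
    if ui_object["type"] == "icon":
        return ("launch_program", ui_object["target"])
    raise ValueError("not a selectable object")
```

The selection command itself could come from any of the input means mentioned above (mouse, keyboard, touch pad, or a voice command); the dispatcher is agnostic to its origin.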
A user input interface 260-3 for transmitting an input signal of a user to the first controller 210 or transmitting a signal output from the first controller 210 to the user. For example, the control device (e.g., a mobile terminal or a remote controller) may transmit an input signal, such as a power switch signal, a channel selection signal, a volume adjustment signal, etc., input by the user to the user input interface, and then the input signal is forwarded to the first controller 210 through the user input interface 260-3; alternatively, the control device may receive an output signal such as audio, video or data processed by the first controller 210 and output from the user input interface 260-3, and display or output the received output signal in audio or vibration form.
In some embodiments, the user may input a user command on a Graphical User Interface (GUI) displayed on the first display 280, and the user input interface 260-3 receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input a user command by inputting a specific sound or gesture, and the user input interface 260-3 receives the user input command by recognizing the sound or gesture through the sensor.
The video processor 260-1 is configured to receive a video signal, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to a standard codec protocol of the input signal, so as to obtain a video signal directly displayed or played on the first display 280.
Illustratively, the video processor 260-1 includes a demultiplexing module, a video decoding module, an image synthesizing module, a frame rate conversion module, a display formatting module, and the like (not shown in the figure).
A first display 280 for receiving the image signal from the video processor 260-1 and displaying the video content and image and the menu manipulation interface. The first display 280 includes a display component for presenting a picture and a driving component for driving an image display. The video content to be displayed may be from the video in the broadcast signal received by the tuner/demodulator 220, or may be from the video content input from the communicator or the external device interface. The first display 280 simultaneously displays a user manipulation interface UI generated in the display apparatus 200 and used to control the display apparatus 200.
And a driving component for driving the display, depending on the type of the first display 280. Alternatively, in the case that the first display 280 is a projection display, a projection device and a projection screen may also be included.
Similar to the first controller 210, as shown in fig. 5, the modules connected to the second controller 310 may include a communicator 330, a detector 340, a memory 390, a second display 380 (i.e., the second display 202 in fig. 1), a video processor 360, and an external device interface 350. A user input interface, an audio processor, an audio output interface (not shown) may also be included in some embodiments. In some embodiments, there may also be a power supply module (not shown) that independently powers the second controller 310.
In some embodiments, the second controller 310 may include any one or more of the modules described above.
The second controller 310 may control operations of the display device 200 in relation to the second display 380. For example: in response to receiving a user command for selecting a UI object to be displayed on the second display 380, the second controller 310 may perform an operation related to the object selected by the user command.
The second controller 310 may control operations of the display device 200 in relation to the first display 280. For example: in response to receiving a user command for selecting a UI object to be displayed on the first display 280, the second controller 310 may perform an operation related to the object selected by the user command.
The graphics processor 316 of the second controller 310 and the graphics processor 216 of the first controller 210 are both capable of generating various graphics objects. They differ in which controller serves the active application: if application 1 is installed in the second controller 310 and application 2 is installed in the first controller 210, then when the user inputs an instruction within application 1 at the interface of application 1, the graphics object is generated by the graphics processor 316 of the second controller 310; when the user inputs an instruction within application 2 at the interface of application 2, the graphics object is generated by the graphics processor 216 of the first controller 210.
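The routing rule above, namely that the graphics object is generated by the graphics processor of whichever controller hosts the application being operated, can be sketched as a simple mapping. The dictionary keys and labels are illustrative descriptions, not API names from the device.

```python
# Hedged sketch of the graphics-object routing rule: each application is
# hosted by one controller, and the hosting controller's graphics
# processor generates the graphics objects for that application.
# All names below are illustrative labels.

APP_HOST = {
    "application 1": "second controller 310",
    "application 2": "first controller 210",
}

PROCESSOR = {
    "second controller 310": "graphics processor 316",
    "first controller 210": "graphics processor 216",
}

def graphics_processor_for(app):
    """Return which graphics processor renders for the given application."""
    return PROCESSOR[APP_HOST[app]]

print(graphics_processor_for("application 1"))  # graphics processor 316
```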
Fig. 6 exemplarily shows a configuration block diagram of a software system in the display device 200.
With respect to the first controller 210, as shown in FIG. 6, the operating system 2911, which includes executing operating software for handling various basic system services and for performing hardware related tasks, acts as an intermediary between applications and hardware components for performing data processing.
In some embodiments, portions of the operating system kernel may contain a series of software to manage the display device hardware resources and provide services to other programs or software code.
In other embodiments, portions of the operating system kernel may include one or more device drivers, which may be a set of software code in the operating system that assists in operating or controlling the devices or hardware associated with the display device. The drivers may contain code that operates the video, audio, and/or other multimedia components. Examples include a display, a camera, Flash, WiFi, and audio drivers.
The accessibility module 2911-1 is configured to modify or access the application program to achieve accessibility and operability of the application program for displaying content.
A communication module 2911-2 for connection to other peripherals via associated communication interfaces and a communication network.
The user interface module 2911-3 is configured to provide an object for displaying a user interface, so that each application program can access the object, and user operability can be achieved.
Control applications 2911-4 for controlling process management, including runtime applications and the like.
The event transmission system 2914 may be implemented within the operating system 2911 or within the application programs 2912. In some embodiments, it is implemented partly within the operating system 2911 and concurrently within the application programs 2912, for listening for various user input events, and will execute one or more sets of predefined operations in response to the recognition of various types of events or sub-events. The event monitoring module 2914-1 is configured to monitor events or sub-events input through the user input interface. The event identification module 2914-2 is configured to apply the event definitions of the various user input interfaces, identify the various events or sub-events, and transmit them to the processes that execute their corresponding one or more sets of handlers.
The event or sub-event refers to an input detected by one or more sensors in the display device 200, or an input from an external control device (e.g., the control apparatus 100), such as: various sub-events input through voice, gesture sub-events input through gesture recognition, and remote-control key command inputs from the control device. Illustratively, the one or more sub-events from the remote control take various forms, including but not limited to one or a combination of pressing the up/down/left/right keys, pressing the OK key, holding a key, and the like, as well as non-physical key operations such as move, hold, and release.
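A minimal sketch of this monitoring/identification/dispatch pipeline is given below. The class, key codes, and event names are assumptions chosen for illustration; they are not the actual modules 2914-1/2914-2.

```python
# Hedged sketch of the event transmission system: raw inputs are
# monitored, matched against predefined event definitions, and the
# handlers registered for the identified event are executed.

class EventTransmissionSystem:
    def __init__(self):
        self._handlers = {}  # event name -> list of handler callables

    def register(self, event_name, handler):
        """Predefine an operation to run in response to an event."""
        self._handlers.setdefault(event_name, []).append(handler)

    def identify(self, raw_input):
        """Map a raw input (e.g., a key code) to an event name, or None."""
        key_events = {"KEY_UP": "move_up", "KEY_DOWN": "move_down",
                      "KEY_OK": "confirm"}
        return key_events.get(raw_input)

    def dispatch(self, raw_input):
        """Monitor -> identify -> run the corresponding handlers."""
        event = self.identify(raw_input)
        results = [handler() for handler in self._handlers.get(event, [])]
        return event, results

ets = EventTransmissionSystem()
ets.register("confirm", lambda: "item activated")
print(ets.dispatch("KEY_OK"))  # ('confirm', ['item activated'])
```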
The interface layout management module 2913, directly or indirectly receiving the input events or sub-events from the event transmission system 2914, monitors the input events or sub-events, and updates the layout of the user interface, including but not limited to the position of each control or sub-control in the interface, and the size, position, and level of the container, which are related to the layout of the interface.
The event transmission system 2914 monitors user input, and for each predefined event or sub-event it hears, directly or indirectly provides the control that identifies the event or sub-event to the interface layout management module 2913.
The interface layout management module 2913 is configured to monitor the state of the user interface (including the position and/or size of the view partitions, items, focus or cursor objects, the change process, and the like). According to the event or sub-event, it may modify the size, position, hierarchy, and other layout aspects of the view display area, and/or adjust or modify the size, position, number, type, content, and other layout aspects of the various items in the view display area. In some embodiments, modifying and adjusting the layout includes displaying or not displaying the view sections, or the content of the items in the view sections, on the screen.
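The layout-adjustment behavior just described can be sketched as a manager that holds the current layout state and mutates it per event. All field names and event strings below are hypothetical, chosen only to illustrate the idea.

```python
# Hedged sketch of an interface layout manager: it holds position, size,
# and visibility state for a view display area and its items, and applies
# modifications in response to an event or sub-event.

class InterfaceLayoutManager:
    def __init__(self):
        self.state = {
            "view_area": {"x": 0, "y": 0, "w": 1920, "h": 1080,
                          "visible": True},
            "items": [{"name": "item1", "level": 0, "visible": True}],
        }

    def on_event(self, event):
        """Adjust size/position/visibility according to the event."""
        if event == "hide_view_area":
            self.state["view_area"]["visible"] = False
            for item in self.state["items"]:
                item["visible"] = False
        elif event == "shrink_view_area":
            area = self.state["view_area"]
            area["w"], area["h"] = area["w"] // 2, area["h"] // 2
        return self.state

mgr = InterfaceLayoutManager()
mgr.on_event("shrink_view_area")
print(mgr.state["view_area"]["w"], mgr.state["view_area"]["h"])  # 960 540
```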
And a user input interface for transmitting an input signal of a user to the controller or transmitting a signal output from the controller to the user. For example, the control device (e.g., a mobile terminal or a remote controller) may send an input signal, such as a power switch signal, a channel selection signal, a volume adjustment signal, etc., input by a user to the user input interface, and then the input signal is forwarded to the controller by the user input interface; alternatively, the control device may receive an output signal such as audio, video, or data output from the user input interface via the controller, and display the received output signal or output the received output signal in audio or vibration form.
In some embodiments, a user may input a user command on a Graphical User Interface (GUI) displayed on the display 200, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
Since the operating system 3911 of the second controller 310 is similar to the operating system 2911 of the first controller 210 in function, reference may be made to the operating system 2911 for details, which are not repeated herein.
Since the second controller 310 and the first controller 210 may have independent operating systems installed therein, there are two independent but interrelated subsystems in the display apparatus 200. For example, an Android operating system and various APPs may be installed independently on both the second controller 310 and the first controller 210, so that each can realize certain functions on its own, and the second controller 310 and the first controller 210 can also cooperate to realize a function together.
Fig. 7a illustrates a schematic diagram of a user interface in the display device 200. As shown in fig. 7a, the user interface includes a first view display area 2011 and a second view display area 2021. The first view display area 2011 and the second view display area 2021 have substantially the same function, and only the first view display area 2011 is described in detail below. Illustratively, the first view display area 2011 includes layouts of one or more different items. A selector indicating that an item is selected is also included in the user interface, and the position of the selector is movable by user input to change the selection of different items.
A "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables the conversion of the internal form of information to a form acceptable to the user. A common presentation form of a user interface is a Graphical User Interface (GUI), which refers to a user interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
The "item" is displayed in a view display area of the user interface in the display device 200 to represent a visual object of corresponding content such as an icon, a thumbnail, a video clip, and the like. For example: the items may represent movies, image content or video clips of a television show, audio content of music, applications, or other user access content history information.
Further, the item may represent an interface or a collection of interfaces on which the display device 200 is connected to an external device, or may represent a name of an external device connected to the display device, or the like. Such as: a signal source input Interface set, or a High Definition Multimedia Interface (HDMI), a USB Interface, a PC terminal Interface, and the like.
It should be noted that: the view display area may present Video chat project content or application layer project content (e.g., web page Video, Video On Demand (VOD) presentations, application screens, etc.).
A "selector" is used to indicate where any item has been selected, such as a cursor or a focus object. Positioning the selection information input according to an icon or menu position touched by the user in the display 200 may cause movement of a focus object displayed in the display device 200 to select a control item, one or more of which may be selected or controlled.
The focus object refers to an object that moves between items according to user input. Illustratively, the position of the focus object may be implemented or identified by drawing a thick line along the item edge. In other embodiments, the form of the focus is not limited to this example; it may be a tangible or intangible form recognizable by the user, such as a cursor or a 3D deformation of the item, or it may change the border lines, size, color, transparency, and outline of the focused item and/or the font of its text or image.
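A minimal sketch of a focus object moving between items under directional input, clamped at the list edges, is given below; the item names are placeholders, not identifiers from the patent.

```python
# Hedged sketch of a selector (focus object) moving across the items of a
# view display area in response to directional user input.

class Selector:
    def __init__(self, items):
        self.items = items
        self.index = 0  # index of the currently focused item

    def move(self, direction):
        """Move focus left/right, clamping at the ends of the item list."""
        if direction == "right" and self.index < len(self.items) - 1:
            self.index += 1
        elif direction == "left" and self.index > 0:
            self.index -= 1
        return self.focused()

    def focused(self):
        return self.items[self.index]

sel = Selector(["item91", "item92", "item93"])
sel.move("right")
print(sel.focused())  # item92
```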
Referring to the embodiment shown in fig. 1-7 a, the display apparatus includes a first display 280, a first controller 210, a second display 380, and a second controller 310. As shown in fig. 1, the first display and the second display have different display sizes, the first display with a larger size plays a main display role, which is also called a main screen, and the second display with a smaller size plays an auxiliary display role, which is also called an auxiliary screen.
The first controller 210 controls the operation of the display device 200 and responds to user operations associated with the first display 280 by running various software control programs (e.g., an operating system and/or various application programs) stored on the memory 290. For example, control presents a user interface on the first display 280, the user interface including a number of UI objects thereon; in response to a received user command for a UI object on the user interface, the first controller 210 may perform an operation related to the object selected by the user command.
The second controller 310 controls the operation of the display device 200 and responds to user operations associated with the second display 380 by running various software control programs stored on the memory 390 (e.g., with installed third party applications, etc.), as well as interacting with the first controller 210. For example, controlling the presentation of a user interface including a number of UI objects on the user interface on the second display 380, the second controller 310 may perform an operation related to an object selected by a user command in response to the received user command for the UI object on the user interface.
For ease of illustration, the user interface presented on the first display 280 is referred to as a first user interface and the user interface presented on the second display 380 is referred to as a second user interface. Alternatively, the first user interface and the second user interface may display home pages of operating systems run by the first controller and the second controller, respectively.
Fig. 7b exemplarily shows a play screen provided by the display device, and the play screen is composed of a first user interface presented on the first display and a second user interface presented on the second display, as shown in fig. 7 b. One or more of interface elements, video frames, picture frames, etc. may be displayed in the first user interface and/or the second user interface, such as the home page displayed on the first display in fig. 7 b.
Fig. 7b to 20 show schematically an interaction process for causing the first display to present a GUI by operating the control means.
When the first display displays the first user interface shown in fig. 7b, the user can input a menu item display instruction by operating the control device 100 (e.g., the remote controller 100A) to instruct the display apparatus to present a setting menu of the second display on the first display. When the first controller receives the menu item display instruction input by the user, the first controller, in response to the instruction, controls presentation of the setting menu of the second display on the first display. The setting menu has an item for turning the atmosphere mode of the second display on or off, the atmosphere mode causing the second display to play an atmosphere video to decorate the first display, wherein the atmosphere video may be a default video preset in the system or a video designated by the user in the setting menu of the second display presented on the first display.
A user interface interaction process in which the display device presents the setting menu of the second display is explained below with reference to fig. 8 to 10.
First, on the interface shown in fig. 7b, a user may input a user instruction indicating a menu of the display system by activating a "menu" key on the control device, the first controller may present a user interface as shown in fig. 8 on the first display in response to the user instruction, a menu option page 90 is displayed in the user interface, the menu option page 90 contains a plurality of items 91-97, and a selector 98 is displayed indicating that any one of the items is selected, and the user may move the position of the selector 98 in the user interface by operating the input of the control device to change the selection of a different item.
Further, in fig. 8, when the user operates the control device to input a user instruction instructing the selector 98 to select the item 96, the first controller may, in response to the user instruction, present on the first display a user interface as shown in fig. 9, in which a main setting menu page ("AI setting") 96 is displayed, the main setting menu page 96 corresponding to the detailed menu options of the item 96, i.e., the items 961-967. Thereafter, in fig. 9, when the user operates the control device to input a user instruction instructing the selector 98 to select the item 964, the first controller may, in response to the user instruction, present on the first display a user interface as shown in fig. 10, in which a setting menu ("sub-screen setting") 964 of the second display is displayed, containing the detailed menu options corresponding to the item 964, i.e., the items 9641-9646.
In some embodiments, the parameter or status of the corresponding item is displayed next to the item on the setting menu. For example, referring to fig. 11, the corresponding brightness parameter 52 is displayed next to the sub-screen brightness item, a time parameter of 30 seconds for entering the do-not-disturb mode is displayed next to the do-not-disturb item, and the on or off status is displayed next to the atmosphere mode item.
As described above, on the setting menu of the second display presented on the first display, there is an item for setting the on-off state of the atmosphere mode. When the first display displays the setting menu as shown in fig. 11, the user can, by operating the control device, operate the item for setting the on-off state of the atmosphere mode on the setting menu; when the first controller receives the user's operation on the item, the second display turns the atmosphere mode on or off in response to the operation, according to the setting rule corresponding to the operated item.
Illustratively, in response to the user's operation on the atmosphere mode item, after the atmosphere mode is started, a user interface as shown in fig. 12 is presented on the first display, displaying the items "GIF animation A", "GIF animation B", "GIF animation C", and "GIF animation D" for adjusting the atmosphere mode. When the focus cursor is moved to an item for adjusting the atmosphere mode, a user interface as shown in fig. 13 is presented on the first display, in which a preview window is displayed and the corresponding preview video is played. For example, when the cursor is moved to the item corresponding to "GIF animation A", the user interface as shown in fig. 13 is presented on the first display, a preview window is displayed in the user interface, and part or all of the content of "GIF animation A" is played in the preview window, so that the user can quickly find the target GIF by previewing. When the user selects the item corresponding to "GIF animation A", the first controller sends a control instruction to the second controller in response to the selection operation input by the user, and the second controller, after receiving the control instruction, controls the second display to enter the atmosphere mode and plays "GIF animation A" on the second display, as shown in fig. 14.
It should be noted that, when the second display enters the atmosphere mode, in order not to affect the playing effect of the second display, the original function items of the second display may be hidden, or the transparency of the original function items of the second display may be adjusted to be greater than the transparency of the played video. For example, a transparency of 100% is completely transparent and a transparency of 0% is completely opaque; the interface presented by the second display for playing the target video has a transparency of 0%, and in order not to affect the playing effect, the transparency of the original function items of the second display may be set to be greater than 0%.
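Under the transparency convention in this passage (100% fully transparent, 0% fully opaque), one reading is that any overlaid function item must be more transparent than the video interface. A hedged sketch of such a clamping rule follows; the specific clamping policy and function name are assumptions, not the patent's implementation.

```python
# Hedged sketch of the transparency rule: transparency is a percentage
# where 100% is fully transparent and 0% is fully opaque. To avoid
# obscuring the atmosphere video (drawn at 0% transparency), an overlaid
# function item is clamped to a transparency strictly greater than the
# video's.

def effective_transparency(item_transparency, video_transparency=0):
    """Clamp an overlay item's transparency above the video's."""
    floor = min(video_transparency + 1, 100)  # must exceed the video's
    return max(item_transparency, floor)

print(effective_transparency(0))   # 1  (a fully opaque item is raised)
print(effective_transparency(60))  # 60 (already more transparent)
```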
In some embodiments, after the display device is powered on, the first controller may automatically detect the on-off state of the item corresponding to the atmosphere mode in response to the power-on of the device, where that on-off state may be the state the item was in before the display device was last powered off, or may be the default state of the system. If it detects that the item corresponding to the atmosphere mode is in the on state, it sends an instruction to the second controller to instruct the second controller to enter the atmosphere mode and play a default video, where the default video may be the video played by the second display before the display device was last turned off, or a video set in advance by the user.
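The power-on flow just described, namely reading the persisted atmosphere-mode switch state, falling back to a system default, and instructing the second controller to play the persisted or default video when the mode is on, can be sketched as follows. The storage keys and command names are hypothetical.

```python
# Hedged sketch of the power-on check: restore the atmosphere-mode state
# (last state before power-off, else system default), and if it is on,
# produce the instruction the first controller would send to the second
# controller. All keys and command names are illustrative.

def on_power_on(persisted, defaults):
    """Return the instruction for the second controller, or None."""
    state = persisted.get("atmosphere_mode", defaults["atmosphere_mode"])
    if state == "on":
        video = persisted.get("last_video", defaults["video"])
        return {"command": "enter_atmosphere_mode", "video": video}
    return None  # atmosphere mode stays off

defaults = {"atmosphere_mode": "off", "video": "GIF animation A"}
print(on_power_on({"atmosphere_mode": "on"}, defaults))
# {'command': 'enter_atmosphere_mode', 'video': 'GIF animation A'}
print(on_power_on({}, defaults))  # None
```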
In some embodiments, in response to a selection operation on an item for adjusting the atmosphere mode ("GIF animation A"), the first controller presents a user interface as shown in fig. 15 on the first display, on which an interface for setting the parameters of "GIF animation A" is displayed.

For example, the interface may display a control for adjusting the playing speed; in response to a received operation on that control, the second display may be controlled to play "GIF animation A" at a multiplied speed. For example, if 1.5-times speed is selected, the second display will play "GIF animation A" at 1.5 times its normal rate.

Illustratively, the interface may also display a control for adjusting the playback type, by which the second display may be controlled to cyclically play "GIF animation A".
For example, the page may further display a control for adjusting the play item. In response to an input selection operation on this control, the first controller controls the first display to present a user interface as shown in fig. 16, which includes several video items, for example, "GIF animation A", "GIF animation B", "GIF animation C", and "GIF animation D". If the currently playing item is "GIF animation A", the item corresponding to "GIF animation A" is in the selected state; the user may select any one of the items corresponding to "GIF animation B", "GIF animation C", and "GIF animation D" and cancel the selected state of the item corresponding to "GIF animation A", so as to switch the video played by the second display. Alternatively, the user may leave the item corresponding to "GIF animation A" selected and at the same time perform a selection operation on any one or more of the items corresponding to "GIF animation B", "GIF animation C", and "GIF animation D", to generate a playlist. For example, while the item corresponding to "GIF animation A" is in the selected state, the user performs a selection operation on the item corresponding to "GIF animation C", so that the items corresponding to "GIF animation A" and "GIF animation C" are both selected; after the user clicks the confirmation control, the first controller sends an instruction to the second controller, so that the second controller controls the second display to play "GIF animation A" and "GIF animation C" according to the selected play type.
It should be noted that the video items in the playlist may be arranged according to their order in the list: for example, if "GIF animation A" is arranged before "GIF animation C", the second display plays "GIF animation A" first and then "GIF animation C" according to the playlist. The video items in the playlist may also be arranged according to the time of the input selection operations: for example, if "GIF animation C" is selected first and "GIF animation A" afterwards, the second display plays "GIF animation C" first and then "GIF animation A" according to the playlist.
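The two orderings can be sketched as follows; the item names follow the examples in the text, while the function and parameter names are assumptions introduced for illustration.

```python
# Hedged sketch of the two playlist orderings: either the items' order in
# the interface list, or the order in which the user selected them.

ITEMS = ["GIF animation A", "GIF animation B",
         "GIF animation C", "GIF animation D"]

def build_playlist(selections, order="list"):
    """selections: item names in the order the user picked them."""
    if order == "list":          # arrange by position in the interface
        return sorted(selections, key=ITEMS.index)
    if order == "selection":     # arrange by time of selection
        return list(selections)
    raise ValueError(f"unknown ordering: {order}")

picked = ["GIF animation C", "GIF animation A"]
print(build_playlist(picked, "list"))
# ['GIF animation A', 'GIF animation C']
print(build_playlist(picked, "selection"))
# ['GIF animation C', 'GIF animation A']
```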
In some embodiments, on the setting menu of the second display presented on the first display, there is an item for adjusting the backlight brightness of the second display, for example, the item "sub-screen brightness" shown in fig. 17. When the user performs a selection operation on the item "sub-screen brightness" shown in fig. 17, the first controller, in response to the input operation, presents a user interface as shown in fig. 18 on the first display, in which a brightness adjustment bar is displayed. On the user interface shown in fig. 18, the user can trigger the direction keys on the control device to turn the brightness value down or up.
In a specific implementation, the brightness value corresponds to the PWM value in equal proportion. The first controller may convert the brightness value input by the user into a corresponding PWM signal and transmit it over the communication bus (I2C), writing it through the bus to the GPIO interface on the second display driving board 33, so that the second display driving board 33 controls the backlight brightness of the second backlight assembly 22 according to the input PWM signal.
It can be seen from the above example that the user can modify the brightness value of the second display by interacting with the item "sub-screen brightness" and the brightness adjustment bar. Based on the user's operation of the item "sub-screen brightness" and the brightness adjustment bar, the first controller can receive the brightness value input by the user and send a brightness adjustment instruction to the second controller, so as to adjust the backlight brightness of the second display to the brightness value expected by the user, thereby regulating the brightness of the video played by the second display.
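The proportional brightness-to-PWM conversion of this example can be sketched as below. The 8-bit PWM resolution and the mocked bus writer are assumptions for illustration; in the device, the value would be written over I2C to the GPIO interface of the second display driving board 33.

```python
# Hedged sketch of the equal-proportion brightness-to-PWM mapping: a
# user-facing brightness value (0-100) is scaled linearly to a PWM duty
# value, which a bus writer (mocked here) would transmit to the second
# display driving board.

PWM_MAX = 255  # assumed 8-bit PWM resolution

def brightness_to_pwm(brightness):
    """Linearly map a 0-100 brightness value to a 0-255 PWM duty."""
    if not 0 <= brightness <= 100:
        raise ValueError("brightness out of range")
    return round(brightness * PWM_MAX / 100)

def adjust_backlight(brightness, bus_write):
    """Convert and hand the PWM value to the (mocked) I2C bus writer."""
    pwm = brightness_to_pwm(brightness)
    bus_write(pwm)
    return pwm

sent = []
print(adjust_backlight(52, sent.append))  # 133
```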
In some embodiments, as further shown in fig. 9, the main setting menu page ("AI setting") further includes an item "sound mode". The sound mode refers to a state in which the first display is off-screen, the second display is on-screen, and the first controller and the second controller remain running, so that the display device can still respond normally to control instructions and retains the voice function; that is, the display device in the sound mode serves as a smart speaker.
In some embodiments, if the item "atmosphere mode" is in the off state and the user selects the item "sound mode", the first controller controls the first display to turn off its screen in response to the operation input by the user, as shown in fig. 19; at this time, the second display may still display a screen for auxiliary information such as notification messages and the voice assistant. The user can also input commands by voice to control the first controller and the second controller to execute corresponding operations. For example, the user can input a voice command to turn off the sound mode; after receiving the command, the first controller controls the first display to turn its screen back on, so that the sound mode is turned off. Alternatively, the user can input a voice command to start the atmosphere mode; after receiving the command, the first controller controls the display device to enter the atmosphere mode and sends an instruction to the second controller, and the second controller, after receiving the instruction, controls the second display to play the video so as to enhance the user's sense of atmosphere when using the display device.
In some embodiments, when the sound mode and the atmosphere mode of the display device are simultaneously in an on state, a picture of auxiliary information such as a notification message and a voice assistant displayed on the second display will be hidden, a play setting interface will be superimposed on the top layer of the user interface displayed on the second display, the play setting interface includes a control for switching a play audio, and a user can operate the control for playing the audio, adjust the progress of the currently played audio, or directly switch the currently played audio to a target audio.
Illustratively, referring to fig. 20, the user interface diagram presented by the second display when the sound mode and the atmosphere mode of the display device are simultaneously in the on state is shown. As shown in fig. 20, the playing setting interface is superimposed on the upper layer of the interface for playing the atmosphere video, and the playing setting interface is a transparent interface so as not to affect the playing effect of the atmosphere video. The playing setting interface comprises a playing control, and the transparency of the playing control is greater than or equal to that of the interface for playing the atmosphere video.
More specifically, one or more buttons on the control device may be bound to the item for setting the transparency of the play control, so that the transparency of the play control can be adjusted via those buttons to partially or completely hide the control.
In some embodiments, the user may operate the play control to adjust the progress of the audio being played; for example, a long press on the play control fast-forwards/rewinds the played audio, and a quick tap on the play control switches the currently played audio to the target audio.
In some embodiments, if the user adjusts the transparency of the play control to 100%, i.e., the play control is completely hidden, the control can still receive operations input by the user when the user touches its position, and the progress of the played audio is adjusted accordingly.
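The behavior described in these embodiments can be sketched as follows. This is a hypothetical Python sketch, not the patent's actual implementation: the class name, gesture threshold, and seek amount are assumptions chosen for illustration. The key point it demonstrates is that hit testing is independent of transparency, so a fully hidden control still receives touches.

```python
# Hypothetical sketch of the play control described above. The hit test
# ignores the control's transparency, so an invisible (alpha = 1.0)
# control still receives and dispatches touch input. All names and
# thresholds are illustrative assumptions, not the patent's API.

LONG_PRESS_MS = 500  # assumed threshold separating a tap from a long press

class PlayControl:
    def __init__(self, x, y, w, h):
        self.rect = (x, y, w, h)
        self.alpha = 0.0          # 0.0 = fully visible, 1.0 = fully hidden
        self.position_ms = 0      # playback progress of the current audio
        self.track_index = 0

    def set_transparency(self, percent):
        # Bound to buttons on the control device in the embodiment above.
        self.alpha = max(0.0, min(1.0, percent / 100.0))

    def hit(self, tx, ty):
        # Transparency does NOT affect hit testing.
        x, y, w, h = self.rect
        return x <= tx < x + w and y <= ty < y + h

    def on_touch(self, tx, ty, press_duration_ms, playlist_len):
        if not self.hit(tx, ty):
            return None
        if press_duration_ms >= LONG_PRESS_MS:
            self.position_ms += 30_000      # long press: fast-forward 30 s
            return "seek"
        # quick tap: switch to the next audio in the list
        self.track_index = (self.track_index + 1) % playlist_len
        return "switch"
```

Even after `set_transparency(100)` hides the control entirely, `on_touch` on a point inside its rectangle still seeks or switches tracks, matching the 100%-transparency case above.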
In some embodiments, when the sound mode and the atmosphere mode of the display device are both in the on state, the user may also input commands by voice to control the first controller and the second controller to perform corresponding operations. For example, the user may say "fast forward 30 s", and after receiving the command, the first controller issues an instruction to the music-playing item to advance the progress of the currently played audio by 30 s.
The above UI is described taking a display device with a dual-system, dual-display configuration as an example. Other types of display devices, for example a single-system, dual-display device, have UIs that are basically similar with respect to the atmosphere mode and the sound mode, and they are not listed here. The UI provided in the present application is merely exemplary and is subject to the practical application and design.
In some embodiments, turning the atmosphere mode/sound mode on and off, and the resulting changes of the UIs on the first display and the second display, are implemented in the software architecture of the display device as follows.
In some embodiments, at least a first application and a second application are stored in the memory of the display device, where the first application may be the large screen setting application APP1 shown in fig. 22 and the second application may be the small screen home application APP2 shown in figs. 21-22. The execution bodies running APP1 and APP2 may be two controllers, i.e., APP1 runs on the first controller and APP2 runs on the second controller; alternatively, the execution body may be a single controller, i.e., one controller runs both APP1 and APP2.
In some embodiments, when the execution bodies running the large screen setting application APP1 and the small screen home application APP2 are two controllers:
the first controller may control the operation of the display device and respond to user operations associated with the first display by running the large screen setting application APP1 stored in the memory. For example, it controls presentation of a first user interface on the first display, the first user interface including a number of UI objects; in response to a received user command for a UI object on the first user interface, the first controller may perform the operation related to the object selected by that command.
The second controller may control the operation of the display device and respond to user operations associated with the second display by running the small screen home application APP2 stored in the memory and interacting with the first controller. For example, it controls presentation of a second user interface on the second display, the second user interface including a number of UI objects; in response to a received user command for a UI object on the second user interface, the second controller may perform the operation related to the object selected by that command.
In some embodiments, when the first controller runs the large screen setting application APP1 stored in the memory, the first display may display a setting menu for the second display, in which the user can set various parameter data of the second display. The second controller then refreshes the user interface of the second display by running the small screen home application APP2 stored in the memory, so that the second display presents the corresponding user interface.
Referring to fig. 21, a flowchart of registering the atmosphere mode service for the display device is shown. Various functions and data processing of the display device may be performed by the first controller running APP1 stored in the memory and by the second controller running APP2 stored in the memory. In some embodiments, in order to give the display device the function corresponding to the atmosphere mode item, APP2 needs to be run by the second controller in advance, so that APP2 calls a software development kit through the Framework via a preset interface (a JNI interface) and registers the services related to the atmosphere mode item in the software development kit. The software development kit may be HiRPCSDK. An SDK is a set of development tools used when building application software for a specific software package, software framework, hardware platform, operating system, and the like, and in a broad sense includes a set of related documents, examples, and tools that assist in developing a certain type of software. HiRPCSDK includes a ServiceMap (service correspondence table); after HiRPCSDK is called by the application APP2, a service for the on-off state of the atmosphere mode may be created in the ServiceMap, completing the registration of that service. Table 1 shows the registered ServiceMap:
table 1:

Service    Fd
S1         /
S2         /
Here, Service denotes a service item, and Fd denotes the parameter corresponding to that service item. For example, S1 is the registered service for the on-off state of the atmosphere mode; its parameter may initially be left unrecorded in the table and filled in after the user sets the on-off state of the atmosphere mode in the menu setting interface presented while APP1 is running. Alternatively, a switch state of the atmosphere mode may be preset; for example, if the atmosphere mode is preset to the on state and "0/1" logic represents the off/on state of the atmosphere mode, the parameter Fd corresponding to S1 may be set to "1", so that the atmosphere mode is on by default when the display device starts up.
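The registration step above can be sketched as follows. The real HiRPCSDK API is not disclosed in this description, so this Python sketch is hypothetical: the method names `register`, `update`, and `get` are illustrative assumptions, while the service keys S1/S2, the "/" placeholder, and the "0/1" logic are taken from table 1 and the surrounding text.

```python
# Hypothetical sketch of the ServiceMap registration described above.
# HiRPCSDK's real API is not shown in the patent; all method names here
# are assumptions for illustration only.

class ServiceMap:
    """Service correspondence table: service item -> parameter Fd."""
    def __init__(self):
        self._table = {}

    def register(self, service, fd="/"):
        # "/" marks a parameter not yet recorded, as in table 1.
        self._table[service] = fd

    def update(self, service, fd):
        # Only previously registered services may be updated.
        if service not in self._table:
            raise KeyError(f"service {service!r} was never registered")
        self._table[service] = fd

    def get(self, service):
        return self._table[service]

# APP2 registers the atmosphere-mode services when it starts:
service_map = ServiceMap()
service_map.register("S1")   # on-off state of the atmosphere mode
service_map.register("S2")   # e.g. a second atmosphere-mode service

# Optionally preset the atmosphere mode to on ("0/1" logic):
service_map.update("S1", "1")
```

With the preset applied, `service_map.get("S1")` returns `"1"`, i.e., the atmosphere mode defaults to on at startup, as described above.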
The above embodiment exemplarily shows only one service registered in HiRPCSDK (the service for the on-off state of the atmosphere mode). In practical use, a plurality of services may be registered in HiRPCSDK, i.e., generated in the ServiceMap; for example, any one or more of an atmosphere-mode video playback speed selection service, a brightness adjustment service, and a video type selection service may be registered.
In some embodiments, after the services are registered in HiRPCSDK, that is, after the corresponding ServiceMap has been established, the user may set various parameter data of the second display in the setting menu displayed by the first display while APP1 is running. The data is transmitted to HiRPCSDK, which calls the ServiceMap that APP2 registered for the atmosphere mode item and records the received data in it, so that the services correspond one-to-one with their Fd values. According to the updated ServiceMap, the second controller refreshes the user interface of the second display by running APP2, so that the second display presents the corresponding user interface.
In some embodiments, after APP2 has registered the services corresponding to the atmosphere mode item in HiRPCSDK, the atmosphere mode item may be set by performing the steps shown in fig. 22, so that the second display presents the corresponding user interface. Take as an example that the user turns on the atmosphere mode from the first display and the second display presents the atmosphere mode effect. The first controller runs APP1 and refreshes the first user interface, so that the setting menu of the second display is shown on the first user interface. The user may set the on-off state of the atmosphere mode in this setting menu; after receiving and storing the input state, APP1 may process the data, for example representing the state with "0/1" logic, where "0" means the atmosphere mode is off and "1" means it is on. APP1 sends the processed data to HiRPCSDK, which calls the ServiceMap that APP2 registered for the atmosphere mode item, fills the received data into it, and updates the registered ServiceMap, as shown in table 2:
table 2:

Service    Fd
S1         1
S2         N
S1 may be the registered service for the on-off state of the atmosphere mode; its parameter Fd being 1 indicates that the atmosphere mode is on. S2 may be any other registered service, with N as its corresponding parameter. For example, S2 may be a registered atmosphere-video service, and its N may be the play path of the atmosphere video, i.e., the video played by the second display when the atmosphere mode is on.
In some embodiments, after the ServiceMap is updated, the second controller may run APP2 to refresh the user interface displayed by the second display according to the services registered in the ServiceMap and their corresponding parameters, so that the second display plays or stops the atmosphere video.
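Under the same caveat as the sketch above (the real HiRPCSDK API is not disclosed), the end-to-end flow from APP1's setting menu to APP2's refresh might look like the following. The function names and the video path are illustrative assumptions; the "0/1" encoding and the S1/S2 keys come from table 2.

```python
# Hypothetical end-to-end sketch: APP1 encodes the user's choice with
# "0/1" logic, HiRPCSDK fills the ServiceMap, and APP2 decides from the
# updated map whether to play the atmosphere video. All names and the
# example path are illustrative assumptions.

def app1_process_setting(atmosphere_on: bool) -> str:
    # "0" = atmosphere mode off, "1" = atmosphere mode on.
    return "1" if atmosphere_on else "0"

def hirpcsdk_update(service_map: dict, data: dict) -> dict:
    # Fill the received data into the ServiceMap registered by APP2.
    for service, fd in data.items():
        service_map[service] = fd
    return service_map

def app2_refresh(service_map: dict) -> str:
    # Refresh the second display's UI according to the updated map.
    if service_map.get("S1") == "1":
        path = service_map.get("S2", "<no video path>")
        return f"play atmosphere video from {path}"
    return "stop atmosphere video"

# User turns the atmosphere mode on in APP1's setting menu:
service_map = {"S1": "/", "S2": "/videos/fireplace.mp4"}  # assumed path
fd = app1_process_setting(True)
hirpcsdk_update(service_map, {"S1": fd})
print(app2_refresh(service_map))  # plays the atmosphere video
```

Setting the state back to "0" and refreshing again would make APP2 stop the atmosphere video, mirroring the off branch described above.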
In some embodiments, when the execution subject running the large screen setting application APP1 and the small screen home application APP2 is one controller:
the controller may control the operation of the display device and respond to user operations associated with the first display by running a large screen setting application APP1 stored in memory. For example, control presents a user interface on the first display, the user interface including a number of UI objects thereon; in response to a received user command to a UI object on the user interface, the controller may perform an operation related to the object selected by the user command.
The controller may also control the operation of the display device and respond to user operations associated with the second display by running a small screen home application APP2 stored on the memory. For example, control presents a user interface on the second display, the user interface including a number of UI objects, and in response to receiving a user command for a UI object on the user interface, the controller may perform an operation related to the object selected by the user command.
In some embodiments, when the controller runs the large screen setting application APP1 stored in the memory, the first display may display a setting menu for the second display, in which the user can set various parameter data of the second display. The controller then refreshes the user interface of the second display by running the small screen home application APP2 stored in the memory, so that the second display presents the corresponding user interface.
In some embodiments, when the execution subject running the large screen setting application APP1 and the small screen home application APP2 is one controller, the step of registering and/or starting the ambience mode service on the display device may refer to the above case when the execution subjects running the large screen setting application APP1 and the small screen home application APP2 are two controllers, which will not be described herein.
According to the above embodiments, an embodiment of the present application further provides a method for controlling a display device, where an execution subject of the method is a controller in the display device, and the method includes the steps shown in fig. 23:
S110: a first user interface is presented on a first display and a second user interface is presented on a second display.
In some embodiments, the first user interface is configured to display a setting page and/or a first playing page, and the second user interface is configured to display a second playing page and/or an information pushing page.
The first playing page may be the page displayed after the first display starts a first application; the first application may be an application for playing media assets, and after it is started the corresponding media assets may be played in the first playing page. The setting page includes a main-screen setting page and an auxiliary-screen setting page. The user can perform operations on the main-screen setting page, and the first controller, in response, controls the first display to perform the corresponding operation; the user can also perform operations on the auxiliary-screen setting page, and the first controller, in response, sends a corresponding command to the second controller, which then controls the second display to perform the corresponding operation. The information push page is used to display push information, which specifically includes: weather information, history records of the application's login account (such as video watching records), various types of update information (such as episode updates and order progress provided by an application), various types of recommendation information (such as commodity recommendations and new film/program recommendations), real-time/trending information, various types of help information (such as how to shut down or how to use the voice function), and the like.
S120: and receiving a menu item display instruction input by a user on the first user interface, wherein the menu item display instruction indicates that a setting menu of the second display is displayed.
S130: and responding to the menu item display instruction, displaying a setting menu of the second display on the first user interface, wherein the setting menu comprises an item for setting the on-off state of an atmosphere mode and an item for setting the video played by the second display when the atmosphere mode is started, and the atmosphere mode refers to the video associated with the item for playing the video on the second user interface.
S140: turning on or off the ambience mode in response to an operation of the item for setting an ambience mode switch state.
In some embodiments, in response to the power-on of the display device, it is detected whether the atmosphere mode is turned on, and if the atmosphere mode is turned on, a default video is played on the second user interface, where the default video is a video associated with any one or more items for setting a video played by the second display.
In some embodiments, when the atmosphere mode is turned off, the item for setting the video played by the second display is in a non-selectable state, and when the atmosphere mode is turned on, the item is in a selectable state. If the atmosphere mode is turned on, a target video is played on the second user interface in response to a selection operation on a target item, where the target item is one of the items for setting the video played by the second display, and the target video is the video associated with the target item.
In some embodiments, the playing a target video on the second user interface in response to the selection of the target item further comprises: and responding to the selected operation of the target item, and circularly playing the target video on the second user interface.
In some embodiments, the playing a target video on the second user interface in response to the selection of the target item further comprises: in response to the selection operation of a plurality of target items, generating a play list, wherein the play list comprises all the selected target items; and playing the video associated with the target item in the play list on the second user interface.
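The two playback behaviors described above (loop playback of a single selected item, and sequential playback of a play list built from multiple selections) can be sketched roughly as follows. This is an illustrative Python sketch; the function names and the video names are assumptions, not the patent's implementation.

```python
# Hypothetical sketch of the playlist behavior described above: a single
# selected target item loops, while multiple selected items form a play
# list played in order. All names are illustrative assumptions.

from itertools import cycle, islice

def build_playlist(selected_items, videos_by_item):
    """Map each selected target item to its associated video."""
    return [videos_by_item[item] for item in selected_items]

def playback_order(playlist, n):
    """First n videos played: a single video loops, a list plays in order."""
    if len(playlist) == 1:
        return list(islice(cycle(playlist), n))  # loop the single target video
    return playlist[:n]

videos = {"forest": "forest.mp4", "ocean": "ocean.mp4", "fire": "fire.mp4"}

# Single selection: the target video plays in a loop.
print(playback_order(build_playlist(["ocean"], videos), 3))
# Multiple selections: the play list contains all selected items, in order.
print(playback_order(build_playlist(["forest", "fire"], videos), 2))
```

The single-item call yields the same video repeated, while the multi-item call walks the play list in selection order, matching the two embodiments above.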
In some embodiments, the sound mode is turned on or off in response to the user's operation on the control for setting the sound mode, where the sound mode (speaker mode) refers to a mode in which the controller is in a running state while the first display has its screen off. If the atmosphere mode and the sound mode are turned on at the same time, a play setting interface is superimposed on the second user interface; the play setting interface includes a control for switching the played audio, and the currently played audio is switched to the target audio in response to the user's operation on that control.
In some embodiments, the playback setting interface is a transparent interface, the transparency of the playback setting interface being greater than the transparency of the first user interface.
In some embodiments, the control for switching playing audio is a transparent control, and the transparency of the control for switching playing audio is less than or equal to the transparency of the playing setting interface.
In some embodiments, the setting menu further comprises an item for adjusting the brightness of the second display; in response to the user's operation on this item, the brightness of the second display is set and the second display is adjusted to the target brightness.
According to the above technical solutions, the display device comprises a first display and a second display, the first display being used to present a first user interface and the second display a second user interface. The display device has an atmosphere mode function: the user can turn the atmosphere mode on or off by inputting a corresponding instruction on the first user interface, and once the atmosphere mode is on, the second display shows the target video to decorate the first display, heightening the atmosphere and making the display device more interesting to use, which benefits user experience.
In a specific implementation, the present invention further provides a computer storage medium that may store a program; when executed, the program may perform some or all of the steps in the embodiments of the display device and of the control method of the display device provided by the present invention. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented as software plus a required general purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention may be essentially or partially implemented in the form of a software product, which may be stored in a storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments or some parts of the embodiments.
The same and similar parts in the various embodiments in this specification may be referred to each other. In particular, for the display device and the control method embodiment of the display device, since the embodiments are basically similar to the display device embodiment, the description is relatively simple, and for the relevant points, reference may be made to the description in the display device embodiment.
The above-described embodiments of the present invention should not be construed as limiting the scope of the present invention.

Claims (10)

1. A display device, comprising:
the first display is used for presenting a first user interface, and a setting page is displayed in the first user interface;
the second display is used for presenting a second user interface, and a setting page and/or a playing picture are displayed in the second user interface;
a controller configured to:
presenting the first user interface on the first display and the second user interface on the second display;
receiving a menu item display instruction input by a user on the first user interface, wherein the menu item display instruction indicates that a setting menu of the second display is displayed;
responding to the menu item display instruction, displaying a setting menu of the second display on the first user interface, wherein the setting menu comprises an item for setting an atmosphere mode switch state and an item for setting a video played by the second display when the atmosphere mode is turned on, and the atmosphere mode refers to playing, on the second user interface, the video associated with that item;
turning on or off the atmosphere mode in response to an operation of the item for setting an atmosphere mode switch state.
2. The display device of claim 1, wherein the setup menu comprises a number of the items for setting up the video played by the second display, and wherein the controller is further configured to:
detecting whether the atmosphere mode is turned on in response to the power-on of the display device;
and if the atmosphere mode is started, playing a default video on the second user interface, wherein the default video is any one or more videos related to the items for setting the video played by the second display.
3. The display device according to claim 2, wherein the item for setting the video played by the second display is in a non-selectable state when the atmosphere mode is turned off, and the item for setting the video played by the second display is in a selectable state when the atmosphere mode is turned on;
if the atmosphere mode is turned on, the controller is further configured to:
and responding to the selected operation of a target item, playing a target video on the second user interface, wherein the target item is one of a plurality of items for setting the video played by the second display, and the target video is a video associated with the target item.
4. The display device according to claim 3, wherein the playing a target video on the second user interface in response to the selection of the target item further comprises:
and responding to the selected operation of the target item, and circularly playing the target video on the second user interface.
5. The display device according to claim 3, wherein the playing a target video on the second user interface in response to the selection of the target item further comprises:
in response to the selection operation of a plurality of target items, generating a play list, wherein the play list comprises all the selected target items;
and playing the video associated with the target item in the play list on the second user interface.
6. The display device of claim 1, wherein the setup menu further comprises controls for setting up a sound mode, the controller further configured to:
responding to the operation of a user on the control for setting the sound mode, and turning on or off the sound mode, wherein the sound mode refers to a mode in which the controller is in a running state and the first display is in a screen-off state;
if the atmosphere mode and the sound mode are started simultaneously, a playing setting interface is superposed on the second user interface, and the playing setting interface comprises a control for switching playing audio;
and responding to the operation of the user on the control for switching the playing audio, and switching the currently played audio to the target audio.
7. The display device of claim 6, wherein the play setting interface is a transparent interface, and wherein a transparency of the play setting interface is greater than a transparency of the first user interface.
8. The display device according to claim 7, wherein the control for switching to play audio is a transparent control, and the transparency of the control for switching to play audio is less than or equal to the transparency of the play setting interface.
9. The display device of claim 2, further comprising an item on the setup menu for adjusting the brightness of the second display, the controller further configured to:
setting the brightness of the second display and adjusting the second display to a target brightness in response to the operation of the item for adjusting the brightness of the second display by the user.
10. A control method of a display device, characterized by comprising:
presenting a first user interface on a first display and a second user interface on a second display;
receiving a menu item display instruction input by a user on the first user interface, wherein the menu item display instruction indicates that a setting menu of the second display is displayed;
responding to the menu item display instruction, displaying a setting menu of the second display on the first user interface, wherein the setting menu comprises an item for setting an atmosphere mode switch state and an item for setting a video played by the second display when the atmosphere mode is turned on, and the atmosphere mode refers to playing, on the second user interface, the video associated with that item;
turning on or off the atmosphere mode in response to an operation of the item for setting an atmosphere mode switch state.
CN202210000465.1A 2022-01-04 2022-01-04 Display equipment and control method Active CN114339372B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210000465.1A CN114339372B (en) 2022-01-04 2022-01-04 Display equipment and control method


Publications (2)

Publication Number Publication Date
CN114339372A true CN114339372A (en) 2022-04-12
CN114339372B CN114339372B (en) 2024-05-28

Family

ID=81022173

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210000465.1A Active CN114339372B (en) 2022-01-04 2022-01-04 Display equipment and control method

Country Status (1)

Country Link
CN (1) CN114339372B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20060111777A (en) * 2005-04-25 2006-10-30 엘지전자 주식회사 Television for automatically creating a seeing and hearing atmosphere of image and its method
CN111479370A (en) * 2020-03-31 2020-07-31 Oppo广东移动通信有限公司 Electronic equipment, control method of atmosphere lamp of electronic equipment and computer storage medium
US20200344508A1 (en) * 2019-04-23 2020-10-29 At&T Intellectual Property I, L.P. Dynamic video background responsive to environmental cues
CN112791388A (en) * 2021-01-22 2021-05-14 网易(杭州)网络有限公司 Information control method and device and electronic equipment
CN113220176A (en) * 2021-04-13 2021-08-06 Oppo广东移动通信有限公司 Display method and device based on widget, electronic equipment and readable storage medium
WO2021223074A1 (en) * 2020-05-06 2021-11-11 海信视像科技股份有限公司 Display device and interaction control method


Also Published As

Publication number Publication date
CN114339372B (en) 2024-05-28

Similar Documents

Publication Publication Date Title
CN111526415B (en) Double-screen display equipment and HDMI switching method thereof
CN111491190B (en) Dual-system camera switching control method and display equipment
CN111510788B (en) Display method and display device for double-screen double-system screen switching animation
CN111526402A (en) Method for searching video resources through voice of multi-screen display equipment and display equipment
CN111464840B (en) Display device and method for adjusting screen brightness of display device
CN112788422A (en) Display device
CN113115083A (en) Display apparatus and display method
CN112788423A (en) Display device and display method of menu interface
CN112788378B (en) Display device and content display method
CN113141528B (en) Display device, boot animation playing method and storage medium
CN112784137A (en) Display device, display method and computing device
WO2021223074A1 (en) Display device and interaction control method
CN113497884B (en) Dual-system camera switching control method and display equipment
CN113365124B (en) Display device and display method
CN112788381B (en) Display apparatus and display method
CN112788387B (en) Display apparatus, method and storage medium
WO2021088308A1 (en) Display device and music recommendation method
CN114339372B (en) Display equipment and control method
CN113015023A (en) Method and device for controlling video in HTML5 webpage
CN112788375A (en) Display device, display method and computing device
CN112927653A (en) Display device and backlight brightness control method
CN111970547B (en) Display device
CN113453079B (en) Control method for returning double-system-size double-screen application and display equipment
CN112788380B (en) Display device and display method
CN113630633B (en) Display device and interaction control method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant