WO2021031629A1 - Display device and control device key multiplexing method - Google Patents

Display device and control device key multiplexing method

Info

Publication number
WO2021031629A1
Authority
WO
WIPO (PCT)
Prior art keywords
interface
display device
button
display
key
Prior art date
Application number
PCT/CN2020/090468
Other languages
English (en)
French (fr)
Inventor
王大勇
于文钦
朱铄
丁国耀
Original Assignee
海信视像科技股份有限公司 (Hisense Visual Technology Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 海信视像科技股份有限公司 (Hisense Visual Technology Co., Ltd.)
Publication of WO2021031629A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/4104 Peripherals receiving signals from specially adapted client devices
    • H04N21/4122 Peripherals receiving signals from specially adapted client devices: additional display device, e.g. video projector
    • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206 User interfaces specially adapted for controlling a client device through a remote control device, characterized by hardware details
    • H04N21/42212 Specific keyboard arrangements
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788 Supplemental services for communicating with other users, e.g. chatting
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 Monomedia components thereof
    • H04N21/8166 Monomedia components involving executable data, e.g. software
    • H04N21/8173 End-user applications, e.g. Web browser, game
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632 Graphical user interfaces [GUI] for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters

Definitions

  • This application relates to the technical field of display devices, and in particular to a method for multiplexing keys of a display device and a control device.
  • display devices can provide users with playback content such as audio, video, and pictures, and have therefore received widespread attention.
  • FIG. 1 exemplarily shows an interaction scene between the control device 100 and the display device 200.
  • a user can operate the display device 200 through the control device 100.
  • the control device 100 may be a control device 100A, which can communicate with the display device 200 through infrared protocol communication, Bluetooth protocol communication, ZigBee protocol communication, or other short-distance communication methods, and controls the display device 200 wirelessly or in a wired manner.
  • the user can control the display device 200 by inputting user instructions through buttons on the control device, voice input, etc.
  • the user can control the functions of the display device 200 by inputting corresponding control instructions through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, and power on/off key on the control device.
  • the control device has a limited number of buttons due to its small size and simple appearance. For example, as shown in Figure 8, the control device only has "volume +", "volume -", "up", "down", "OK", "voice", "back", "home", and "on/off" buttons.
  • the limited number of buttons on the control device and the inherent functions of each button make it difficult to achieve flexible and diverse control of the display device.
  • the present application provides a method for multiplexing keys of a display device and a remote control.
  • this application provides a display device, including:
  • the camera is configured to: collect image data;
  • the display is configured to: present a user interface and/or an image interface
  • the controller is configured to:
  • this application provides a display device, including:
  • the camera is configured to: collect image data;
  • the display is configured to: present a user interface
  • the controller is configured to:
  • this application provides a display device, including:
  • the camera is configured to: collect image data;
  • the display is configured to: present a user interface
  • the controller is configured to:
  • the function of adjusting the focal length of the camera is executed.
  • this application provides a display device, including:
  • the camera is configured to: collect image data;
  • the display is configured to: present a user interface and/or an image interface
  • the controller is configured to:
  • the controller executes a function related to video playback
  • the controller executes a function of zooming in or zooming out the image data.
  • this application provides a display device, including:
  • the camera is configured to: collect image data;
  • the display is configured to: present a user interface and/or an image interface
  • the controller is configured to:
  • the controller executes a function related to video playback
  • the controller executes a function of adjusting the focal length of the camera.
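The focus-dependent behavior summarized in these aspects, where the same volume keys either control playback volume or adjust the collected image data, can be pictured as a small dispatcher. Everything below (interface names, the state dictionary, the handler) is a hypothetical illustration, not the application's actual implementation.

```python
# Hypothetical sketch: dispatch "volume" keys by which interface has focus.
# When the image interface (camera preview) has focus, "volume +/-" zooms
# the image; otherwise the keys keep their native volume function.

def handle_key(key, focused_interface, state):
    """Route a remote-control key according to the focused interface."""
    if focused_interface == "image_interface":
        if key == "volume+":
            state["zoom"] = min(state["zoom"] + 1, 10)   # zoom in
        elif key == "volume-":
            state["zoom"] = max(state["zoom"] - 1, 1)    # zoom out
    else:  # e.g. a video playback interface: native volume behavior
        if key == "volume+":
            state["volume"] = min(state["volume"] + 1, 100)
        elif key == "volume-":
            state["volume"] = max(state["volume"] - 1, 0)
    return state

state = {"zoom": 1, "volume": 50}
handle_key("volume+", "image_interface", state)   # zooms, volume unchanged
handle_key("volume-", "video_interface", state)   # lowers volume, zoom unchanged
```

The same key thus carries two meanings without any change to the remote control hardware, which is the point of the multiplexing scheme.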
  • this application provides a display device, including:
  • Display used to present the user interface
  • the controller is configured to:
  • the key multiplexing strategy is updated according to the keys multiplexed on the second interface.
  • the state of a multiplexed button is the multiplexed state, and the state of an unmultiplexed button is the native state;
  • this application also provides a remote control key multiplexing method, the method including:
  • the key multiplexing strategy is updated according to the keys multiplexed on the second interface.
  • the state of a multiplexed button is the multiplexed state, and the state of an unmultiplexed button is the native state;
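The strategy update described here can be pictured as a per-interface table: entering a second interface swaps in the keys that interface multiplexes, and every key absent from the table stays in its native state. The table below is a minimal sketch; the interface names and function labels are invented for illustration.

```python
# Hypothetical key-multiplexing strategy table. A key listed for the
# current interface is in the "multiplexed" state; any other key keeps
# its "native" state and native function.

MULTIPLEX_STRATEGY = {
    "photo_interface":  {"volume+": "zoom_in", "volume-": "zoom_out"},
    "record_interface": {"ok": "start_stop_recording"},
}

def key_state(interface, key):
    """Return 'multiplexed' or 'native' for a key on the given interface."""
    return "multiplexed" if key in MULTIPLEX_STRATEGY.get(interface, {}) else "native"

def resolve(interface, key):
    """Resolve a key press to its multiplexed function, or its native one."""
    return MULTIPLEX_STRATEGY.get(interface, {}).get(key, f"native:{key}")
```

Switching interfaces needs no change to the keys themselves: the lookup against the current interface's entry is what "updates" the strategy.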
  • Fig. 1 exemplarily shows a schematic diagram of an operation scene between a display device and a control device
  • FIG. 2 exemplarily shows a hardware configuration block diagram of the control device 100
  • FIG. 3 exemplarily shows a hardware configuration block diagram of the display device 200
  • FIG. 4 exemplarily shows a block diagram of the hardware architecture of the display device 200 according to FIG. 3;
  • FIG. 5 exemplarily shows a schematic diagram of the functional configuration of the display device 200
  • Fig. 6a exemplarily shows a schematic diagram of the software configuration in the display device 200
  • FIG. 6b exemplarily shows a configuration diagram of an application program in the display device 200
  • FIG. 7 exemplarily shows a schematic diagram of the user interface of the display device 200
  • Fig. 8 is a schematic diagram showing keys of a remote control according to an exemplary embodiment of the application.
  • Fig. 9 is a flowchart of a method for multiplexing keys of a remote control of a display device according to an exemplary embodiment of the present application.
  • Figure 10a is the application interface that the user enters after opening the "photograph application" in the application center;
  • Fig. 10b is another user interface shown in this application according to an exemplary embodiment
  • Figs. 11a-11f exemplarily show scenarios of multiplexing the "volume +" or "volume -" buttons on the remote control in camera-related applications according to an exemplary embodiment of this application;
  • Fig. 12 is a flowchart of a method for multiplexing remote control keys according to an exemplary embodiment of this application.
  • the display device provided in this application may be a display device with a multi-chip architecture, such as the display device with a dual-chip (dual hardware system) architecture shown in FIGS. 3 to 5 of this application, or a display device with a non-dual-chip architecture; this application does not limit this.
  • various external device interfaces are usually provided on the display device to facilitate the connection of different peripheral devices or cables to realize corresponding functions.
  • when a high-definition camera is connected to an interface of the display device, if the hardware system of the display device does not have a hardware interface capable of receiving the source data of the high-pixel camera, the data captured by the camera cannot be presented on the screen of the display device.
  • module refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware or/and software code that can perform functions related to the element.
  • remote control refers to a component of an electronic device (such as the display device disclosed in this application), which can generally control the electronic device wirelessly within a short distance.
  • the remote control uses infrared and/or radio frequency (RF) signals and/or Bluetooth to connect to the electronic device, and may also include functional modules such as WiFi, wireless USB, Bluetooth, and motion sensors.
  • a handheld touch remote control uses a user interface in a touch screen to replace most of the physical built-in hard keys in general remote control devices.
  • the term "gesture" used in this application refers to a user's behavior of expressing an expected idea, action, goal, and/or result through a change of hand shape or a hand movement.
  • Fig. 1 exemplarily shows a schematic diagram of an operation scenario between a display device and a control device according to an embodiment. As shown in FIG. 1, the user can operate the display device 200 by controlling the device 100.
  • the control device 100 may be a remote controller 100A, which can communicate with the display device 200 through infrared protocol communication, Bluetooth protocol communication, ZigBee protocol communication, or other short-distance communication methods, and controls the display device 200 wirelessly or in a wired manner.
  • the user can control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, etc.
  • the user can control the functions of the display device 200 by inputting corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input keys, menu keys, and power on/off buttons on the remote control.
  • the control device 100 may also be a smart device, such as a mobile terminal 100B, a tablet computer, a computer, or a notebook computer, which may communicate with the display device 200 through a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), or other networks, and realize control of the display device 200 through an application program corresponding to the display device 200.
  • the controls may include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
  • both the mobile terminal 100B and the display device 200 can be installed with software applications, so that the connection and communication between the two can be realized through a network communication protocol, thereby realizing one-to-one control operation and data communication.
  • the mobile terminal 100B can establish a control command protocol with the display device 200, synchronize the remote control keyboard to the mobile terminal 100B, and control the display device 200 through the user interface of the mobile terminal 100B; alternatively, the mobile terminal 100B can transmit the audio and video content displayed on its screen to the display device 200 to realize a synchronous display function.
  • the display device 200 can also communicate with the server 300 through multiple communication methods.
  • the display device 200 may be allowed to communicate with the server 300 via a local area network, a wireless local area network, or other networks.
  • the server 300 may provide various contents and interactions to the display device 200.
  • the display device 200 transmits and receives information, interacts with an electronic program guide (EPG), receives software program updates, or accesses a remotely stored digital media library.
  • the server 300 may be a group or multiple groups, and may be one or more types of servers.
  • the server 300 provides other network service content such as video on demand and advertising services.
  • the display device 200 may be a liquid crystal display, an OLED (Organic Light Emitting Diode) display, a projection display device, or a smart TV.
  • the specific display device type, size, resolution, etc. are not limited, and those skilled in the art can understand that the display device 200 can make some changes in performance and configuration as required.
  • the display device 200 may additionally provide a smart network TV function with computer support, for example, Internet TV, smart TV, Internet Protocol TV (IPTV), and so on.
  • the display device may be connected or provided with a camera, which is used to present the picture captured by the camera on the display interface of the display device or other display devices to realize interactive chats between users.
  • the picture captured by the camera can be displayed on the display device in full screen, half screen, or in any optional area.
  • in some embodiments, the camera is connected to the rear shell of the display through a connecting plate and is fixedly installed in the upper middle of the rear shell; as an installable method, it can be fixedly installed at any position of the rear shell, as long as its image capture area is not blocked by the rear shell, for example, by having the image capture area face the same direction as the display device.
  • alternatively, the camera can be connected to the rear shell of the display through a connecting plate or other conceivable connectors.
  • in some embodiments, the connector is equipped with a lifting motor. When the user wants to use the camera, or an application needs to use the camera, the camera can be raised above the display; when the camera is not needed, it can be retracted behind the rear shell to protect it from damage.
  • the camera used in this application may have 16 million pixels to achieve the purpose of ultra-high-definition display. In actual use, a camera with a resolution higher or lower than 16 million pixels may also be used.
  • the content displayed in different application scenarios of the display device can be merged in many different ways, so as to achieve functions that cannot be achieved by traditional display devices.
  • the user can video chat with at least one other user while watching a video program.
  • the presentation of the video program can be used as the background picture, and the video chat window is displayed on the background picture.
  • At least one video chat is performed across terminals.
  • the user can video chat with at least one other user while entering the education application for learning.
  • students can realize remote interaction with teachers while learning content in educational applications; this function can be vividly called "learning and chatting".
  • for another example, a user can conduct a video chat with other players while playing a game. When a player enters a game application to participate in a game, remote interaction with other players can be realized; this function can be vividly called "watching and playing".
  • the game scene is integrated with the video picture, and the portrait in the video picture is cut out and displayed on the game picture to improve user experience.
  • for example, in somatosensory games such as ball games, boxing games, running games, and dancing games, human body postures and movements are acquired through the camera by means of body detection and tracking and the detection of human skeleton key point data, and are then integrated with animations in the game to realize games in sports, dance, and similar scenes.
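As a toy illustration of how detected skeleton key points might be mapped to a game animation (real body detection and game integration are far more involved), assuming key points arrive as named (x, y) image coordinates with y increasing downward:

```python
# Toy sketch: classify a body pose from skeleton key points so a game
# can trigger the matching animation. Coordinates are (x, y) in image
# space (y grows downward); names and thresholds are hypothetical.

def classify_pose(keypoints):
    """Return a coarse pose label from a dict of named key points."""
    head = keypoints["head"]
    lw, rw = keypoints["left_wrist"], keypoints["right_wrist"]
    # Both wrists above the head (smaller y): arms raised, e.g. a jump.
    if lw[1] < head[1] and rw[1] < head[1]:
        return "arms_raised"
    # One wrist far out to the side: e.g. a punch in a boxing game.
    if abs(lw[0] - head[0]) > 100 or abs(rw[0] - head[0]) > 100:
        return "punch"
    return "idle"
```

A game loop would call such a classifier on each frame of key-point data and play the animation associated with the returned label.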
  • the user can interact with at least one other user in video and voice in the K song application.
  • multiple users can jointly complete the recording of a song.
  • the user can turn on the camera locally to obtain pictures and videos; this function can be vividly called "mirror".
  • Fig. 2 exemplarily shows a configuration block diagram of the control device 100 according to an exemplary embodiment.
  • the control device 100 includes a controller 110, a communicator 130, a user input/output interface 140, a memory 190, and a power supply 180.
  • the control device 100 is configured to control the display device 200: it can receive the user's input operation instructions and convert the operation instructions into instructions that the display device 200 can recognize and respond to, serving as an interactive intermediary between the user and the display device 200.
  • for example, the user operates the channel up/down keys on the control device 100, and the display device 200 responds to the channel up/down operations.
  • control device 100 may be a smart device.
  • control device 100 can install various applications for controlling the display device 200 according to user requirements.
  • the mobile terminal 100B or other smart electronic devices can perform similar functions to the control device 100 after installing an application for controlling the display device 200.
  • by installing applications, the user can obtain various function keys or virtual buttons of the graphical user interface provided on the mobile terminal 100B or other smart electronic devices, so as to realize the functions of the physical keys of the control device 100.
  • the controller 110 includes a processor 112, RAM 113 and ROM 114, a communication interface, and a communication bus.
  • the controller 110 is used to control the running and operation of the control device 100, the communication and cooperation among internal components, and external and internal data processing functions.
  • the communicator 130 realizes communication of control signals and data signals with the display device 200 under the control of the controller 110. For example, the received user input signal is sent to the display device 200.
  • the communicator 130 may include at least one of communication modules such as a WIFI module 131, a Bluetooth module 132, and an NFC module 133.
  • the user input/output interface 140, wherein the input interface includes at least one of input interfaces such as a microphone 141, a touch panel 142, a sensor 143, and a button 144.
  • the user can implement the user instruction input function through voice, touch, gesture, pressing and other actions.
  • the input interface converts the received analog signal into a digital signal and the digital signal into a corresponding instruction signal, which is sent to the display device 200.
  • the output interface includes an interface for sending the received user instruction to the display device 200.
  • the output interface may be an infrared interface or a radio frequency interface.
  • for an infrared signal interface, the user input instruction needs to be converted into an infrared control signal according to the infrared control protocol and sent to the display device 200 via the infrared sending module.
  • for a radio frequency signal interface, the user input instruction needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then sent to the display device 200 by the radio frequency transmitting terminal.
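To make the infrared path concrete, here is a simplified sketch in the spirit of the widely used NEC infrared protocol, packing an 8-bit address and command together with their bitwise inverses so the receiver can detect corruption. Carrier modulation (typically 38 kHz) and pulse timing are deliberately omitted, and this is an illustration rather than the protocol actually used by the control device 100.

```python
# Simplified NEC-style IR frame: address, ~address, command, ~command.
# The inverted bytes let the receiver detect corrupted frames; pulse
# timing and carrier modulation are omitted in this sketch.

def nec_frame(address, command):
    """Pack an 8-bit address and command into the 4-byte NEC frame."""
    if not (0 <= address <= 0xFF and 0 <= command <= 0xFF):
        raise ValueError("address and command must be 8-bit values")
    return bytes([address, address ^ 0xFF, command, command ^ 0xFF])

def nec_decode(frame):
    """Recover (address, command) from a frame, validating the inverses."""
    addr, addr_inv, cmd, cmd_inv = frame
    if addr ^ addr_inv != 0xFF or cmd ^ cmd_inv != 0xFF:
        raise ValueError("corrupted frame")
    return addr, cmd
```

A remote would map each physical key to a command byte, build such a frame, and modulate it onto the IR carrier for the display device's receiver to demodulate and decode.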
  • control device 100 includes at least one of a communicator 130 and an output interface.
  • the control device 100 is equipped with a communicator 130, such as WIFI, Bluetooth, NFC and other modules, which can encode user input commands through the WIFI protocol, or the Bluetooth protocol, or the NFC protocol, and send them to the display device 200.
  • the memory 190 is used to store various operating programs, data and applications for driving and controlling the control device 100 under the control of the controller 110.
  • the memory 190 can store various control signal instructions input by the user.
  • the power supply 180 is used to provide operating power support for each element of the control device 100 under the control of the controller 110. It may be a battery and a related control circuit.
  • FIGS. 3 to 5 show block diagrams of the hardware configuration of a display device 200 adopting a dual-chip hardware system. The structural relationship of the hardware system can be shown in Figure 3.
  • for convenience of description, one hardware system in the dual hardware system architecture is referred to as the first hardware system, the A system, or the A chip, and the other hardware system is referred to as the second hardware system, the N system, or the N chip.
  • the A chip includes the controller and various interfaces of the A chip
  • the N chip includes the controller and various interfaces of the N chip.
  • An independent operating system may be installed in the A chip and the N chip, so that there are two independent but related subsystems in the display device 200.
  • the A chip may also be referred to as the first chip, and the functions it performs may be equivalent to or included in those of the first controller; the N chip may also be referred to as the second chip, and the functions it performs may be equivalent to or included in those of the second controller.
  • the A chip and the N chip can realize connection, communication and power supply through multiple different types of interfaces.
  • the interface type of the interface between the A chip and the N chip may include general-purpose input/output (GPIO), USB interface, HDMI interface, UART interface, etc.
  • One or more of these interfaces can be used between the A chip and the N chip for communication or power transmission.
  • the N chip can be powered by an external power source, and the A chip can be powered by the N chip instead of the external power source.
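Purely as a hypothetical sketch (the application does not specify a wire format), communication between the A chip and the N chip over, say, the UART interface could use simple framed messages with a start byte, a length, and a checksum:

```python
# Hypothetical framing for commands between the A chip and the N chip
# over a UART link: [0xAA][len][payload...][checksum]. The checksum is
# the low byte of the sum of the length and payload bytes.

START = 0xAA

def frame(payload: bytes) -> bytes:
    """Wrap a payload in a start byte, length, and checksum."""
    if len(payload) > 0xFF:
        raise ValueError("payload too long for one frame")
    body = bytes([len(payload)]) + payload
    checksum = sum(body) & 0xFF
    return bytes([START]) + body + bytes([checksum])

def unframe(data: bytes) -> bytes:
    """Validate a frame and return its payload."""
    if data[0] != START:
        raise ValueError("bad start byte")
    length = data[1]
    payload = data[2:2 + length]
    if sum(data[1:2 + length]) & 0xFF != data[2 + length]:
        raise ValueError("checksum mismatch")
    return payload
```

The start byte lets the receiving chip resynchronize on the byte stream, and the checksum catches corruption on the link; real inter-chip protocols typically add sequence numbers and acknowledgements on top.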
  • the A chip may also include interfaces for connecting other devices or components, such as the MIPI interface for connecting to a camera (Camera) shown in FIG. 3, a Bluetooth interface, etc.
  • the N chip may also include a V-by-One interface for connecting to the display screen's timing controller (TCON); an I2S interface for connecting a power amplifier (AMP) and a speaker; and an IR/Key interface, a USB interface, a WiFi interface, a Bluetooth interface, an HDMI interface, a Tuner interface, and so on.
  • FIG. 4 is only an exemplary illustration of the dual hardware system architecture of the present application and does not limit the present application. In practical applications, both hardware systems can contain more or fewer hardware components or interfaces as required.
  • FIG. 4 exemplarily shows a hardware architecture block diagram of the display device 200 according to FIG. 3.
  • the hardware system of the display device 200 may include an A chip and an N chip, and modules connected to the A chip or the N chip through various interfaces.
• the N chip may include a tuner and demodulator 220, a communicator 230, an external device interface 250, a controller 210, a memory 290, a user input interface, a video processor 260-1, an audio processor 260-2, a display 280, an audio output interface 270, and a power supply. In other embodiments, the N chip may also include more or fewer modules.
• the tuner and demodulator 220 is used to perform modulation and demodulation processing such as amplifying, mixing, and resonating broadcast television signals received through wired or wireless methods, thereby demodulating, from among multiple wireless or cable broadcast television signals, the audio and video signals carried in the frequency of the TV channel selected by the user, as well as additional information (such as EPG data signals).
• the signal path of the tuner and demodulator 220 can be of many kinds, such as terrestrial broadcasting, cable broadcasting, satellite broadcasting, or Internet broadcasting; according to different modulation types, the signal modulation method may be digital modulation or analog modulation; and according to different types of received television signals, the tuner and demodulator 220 may demodulate analog signals and/or digital signals.
• the tuner and demodulator 220 is also used, according to the user's selection and under the control of the controller 210, to respond to the TV channel frequency selected by the user and demodulate the TV signal carried by that frequency.
  • the tuner demodulator 220 may also be in an external device, such as an external set-top box.
  • the set-top box outputs TV audio and video signals through modulation and demodulation, and inputs them to the display device 200 through the external device interface 250.
  • the communicator 230 is a component for communicating with external devices or external servers according to various communication protocol types.
  • the communicator 230 may include a WIFI module 231, a Bluetooth communication protocol module 232, a wired Ethernet communication protocol module 233, and an infrared communication protocol module and other network communication protocol modules or near field communication protocol modules.
  • the display device 200 may establish a control signal and a data signal connection with an external control device or content providing device through the communicator 230.
  • the communicator may receive the control signal of the remote controller 100A according to the control of the controller.
  • the external device interface 250 is a component that provides data transmission between the N chip controller 210 and the A chip and other external devices.
• the external device interface can be connected to external devices such as set-top boxes, game devices, and notebook computers in a wired/wireless manner, and can receive data from external devices such as video signals (such as moving images), audio signals (such as music), and additional information (such as EPG).
• the external device interface 250 may include any one or more of: a high-definition multimedia interface (HDMI) terminal 251, a composite video blanking synchronization (CVBS) terminal 252, an analog or digital component terminal 253, a universal serial bus (USB) terminal 254, and a red, green, and blue (RGB) terminal (not shown in the figure).
  • the controller 210 controls the work of the display device 200 and responds to user operations by running various software control programs (such as an operating system and/or various application programs) stored on the memory 290.
• the controller 210 includes a random access memory RAM 214, a read-only memory ROM 213, a graphics processor 216, a CPU processor 212, a communication interface 218, and a communication bus.
  • RAM 214 and ROM 213, graphics processor 216, CPU processor 212, and communication interface 218 are connected by a bus.
• the graphics processor 216 is used to generate various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It includes an arithmetic unit, which performs operations by receiving the various interactive commands input by the user and displays various objects according to their display attributes; and it includes a renderer, which generates the various objects obtained by the arithmetic unit and displays the rendering result on the display 280.
• the CPU processor 212 is configured to execute operating system and application program instructions stored in the memory 290, and to execute various applications, data, and content according to the various interactive instructions received from the outside, so as to finally display and play various audio and video content.
  • the CPU processor 212 may include multiple processors.
  • the multiple processors may include one main processor and multiple or one sub-processors.
  • the main processor is used to perform some operations of the display device 200 in the pre-power-on mode, and/or to display images in the normal mode.
  • the communication interface may include the first interface 218-1 to the nth interface 218-n. These interfaces may be network interfaces connected to external devices via a network.
  • the controller 210 may control the overall operation of the display device 200. For example, in response to receiving a user command for selecting a UI object to be displayed on the display 280, the controller 210 may perform an operation related to the object selected by the user command.
  • the object may be any one of the selectable objects, such as a hyperlink or an icon.
• Operations related to the selected object are, for example: operations for displaying the page, document, or image connected to a hyperlink, or operations corresponding to the icon.
  • the user command for selecting the UI object may be a command input through various input devices (for example, a mouse, a keyboard, a touch pad, etc.) connected to the display device 200 or a voice command corresponding to the voice spoken by the user.
• the memory 290 stores various software modules for driving and controlling the display device 200.
  • various software modules stored in the memory 290 include: a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules.
  • the basic module is the underlying software module used for signal communication between various hardware in the display device 200 and sending processing and control signals to the upper module.
  • the detection module is a management module used to collect various information from various sensors or user input interfaces, and perform digital-to-analog conversion and analysis management.
  • the voice recognition module includes a voice analysis module and a voice command database module.
  • the display control module is a module for controlling the display 280 to display image content, and can be used to play information such as multimedia image content and UI interfaces.
  • the communication module is a module used for control and data communication with external devices.
  • the browser module is a module used to perform data communication between browsing servers.
  • the service module is a module used to provide various services and various applications.
• the memory 290 is also used to store received external data and user data, images of various items in various user interfaces, and visual effect diagrams of focus objects.
  • the user input interface is used to send a user's input signal to the controller 210, or to transmit a signal output from the controller to the user.
  • the control device (such as a mobile terminal or a remote control) may send input signals input by the user, such as a power switch signal, a channel selection signal, and a volume adjustment signal, to the user input interface, and then the user input interface forwards the input signal to the controller;
  • the control device may receive output signals such as audio, video, or data output from the user input interface processed by the controller, and display the received output signal or output the received output signal as audio or vibration.
  • the user may input a user command through a graphical user interface (GUI) displayed on the display 280, and the user input interface receives the user input command through the graphical user interface (GUI).
  • the user can input a user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
• the video processor 260-1 is used to receive video signals and, according to the standard codec protocol of the input signal, perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis, to obtain a video signal that can be displayed or played directly on the display 280.
  • the video processor 260-1 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like.
  • the demultiplexing module is used to demultiplex the input audio and video data stream. For example, if MPEG-2 is input, the demultiplexing module will demultiplex into a video signal and an audio signal.
  • the video decoding module is used to process the demultiplexed video signal, including decoding and scaling.
• An image synthesis module, such as an image synthesizer, is used to superimpose and mix the GUI signal, generated by the graphics generator based on user input or by the system itself, with the scaled video image, to generate an image signal for display.
• A frame rate conversion module, used to convert the frame rate of the input video, such as converting input 24Hz, 25Hz, 30Hz, or 60Hz video to a frame rate of 60Hz, 120Hz, or 240Hz, where the input frame rate may be related to the source video stream and the output frame rate may be related to the refresh rate of the display. The conversion is usually achieved in a frame-insertion manner.
• the display formatting module is used to change the signal output by the frame rate conversion module into a signal conforming to the format of a display device, for example, format-converting the signal output by the frame rate conversion module to output an RGB data signal.
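• The frame-insertion idea described above can be sketched as follows. This is an illustrative simplification, not the patent's actual conversion algorithm: each output frame simply reuses the input frame covering the same instant, which for 24Hz-to-60Hz conversion reproduces the familiar 3:2 repetition cadence.

```python
def convert_frame_rate(frames, in_hz, out_hz):
    """Frame insertion by repetition: each output frame at rate out_hz
    reuses the input frame (at rate in_hz) covering the same instant."""
    n_out = len(frames) * out_hz // in_hz          # same time span, new rate
    return [frames[j * in_hz // out_hz] for j in range(n_out)]

# 24Hz -> 60Hz yields the 3:2 repetition cadence
cadence = convert_frame_rate([0, 1, 2, 3], 24, 60)  # → [0,0,0,1,1,2,2,2,3,3]
```

A real frame rate conversion module would interpolate new intermediate frames rather than duplicate existing ones; the index arithmetic above only shows how input and output timelines align.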
  • the display 280 is used to receive the image signal input from the video processor 260-1, display video content and images, and a menu control interface.
  • the display 280 includes a display component for presenting a picture and a driving component for driving image display.
  • the displayed video content can be from the video in the broadcast signal received by the tuner and demodulator 220, or from the video content input by the communicator or the external device interface.
• the display 280 simultaneously displays a user manipulation interface UI generated in the display device 200 and used to control the display device 200.
• the display 280 also includes a driving component for driving the display.
• if the display 280 is a projection display, it may also include a projection device and a projection screen.
• the audio processor 260-2 is used to receive audio signals and, according to the standard codec protocol of the input signal, perform decompression and decoding, as well as audio data processing such as noise reduction, digital-to-analog conversion, and amplification, to obtain an audio signal that can be played in the speaker 272.
  • the audio output interface 270 is used to receive the audio signal output by the audio processor 260-2 under the control of the controller 210.
• the audio output interface may include a speaker 272, or may output to an external audio output terminal 274 of a sound-generating device of an external apparatus, such as an external audio terminal or a headphone output terminal.
  • the video processor 260-1 may include one or more chips.
  • the audio processor 260-2 may also include one or more chips.
  • the video processor 260-1 and the audio processor 260-2 may be separate chips, or they may be integrated with the controller 210 in one or more chips.
  • the power supply is used to provide power supply support for the display device 200 with power input from an external power supply under the control of the controller 210.
  • the power supply may include a built-in power supply circuit installed inside the display device 200, or may be a power supply installed outside the display device 200, such as a power interface that provides an external power supply in the display device 200.
  • the A chip may include a controller 310, a communicator 330, a detector 340, and a memory 390. In some embodiments, it may also include a user input interface, a video processor, an audio processor, a display, and an audio output interface. In some embodiments, there may also be a power supply that independently powers the A chip.
  • the communicator 330 is a component for communicating with external devices or external servers according to various communication protocol types.
  • the communicator 330 may include a WIFI module 331, a Bluetooth communication protocol module 332, a wired Ethernet communication protocol module 333, and an infrared communication protocol module and other network communication protocol modules or near field communication protocol modules.
  • the communicator 330 of the A chip and the communicator 230 of the N chip also interact with each other.
  • the WiFi module 231 of the N chip is used to connect to an external network and generate network communication with an external server and the like.
• the WiFi module 331 of the A chip is used to connect to the WiFi module 231 of the N chip, without a direct connection to the external network or the like. Therefore, for the user, a display device as in the above embodiment presents only one WiFi account to the outside.
• the detector 340 is a component used by the A chip of the display device to collect signals from the external environment or to interact with the outside.
• the detector 340 may include a light receiver 342, a sensor used to collect the intensity of ambient light, so that display parameters can be changed adaptively according to the collected ambient light, etc.; it may also include an image collector 341, such as a camera, which can be used to collect external environmental scenes, to collect user attributes, and to capture gestures for interacting with the user, so as to adaptively change display parameters and to recognize user gestures to realize interaction with the user.
  • the external device interface 350 provides components for data transmission between the controller 310 and the N chip or other external devices.
  • the external device interface can be connected to external devices such as set-top boxes, game devices, notebook computers, etc., in a wired/wireless manner.
  • the controller 310 controls the work of the display device 200 and responds to user operations by running various software control programs (such as installed third-party applications, etc.) stored on the memory 390 and interacting with the N chip.
  • the controller 310 includes a read-only memory ROM 313, a random access memory RAM 314, a graphics processor 316, a CPU processor 312, a communication interface 318, and a communication bus.
  • the ROM 313 and the RAM 314, the graphics processor 316, the CPU processor 312, and the communication interface 318 are connected by a bus.
• the CPU processor 312 runs the system startup instructions in the ROM 313 and copies the operating system stored in the memory 390 to the RAM 314 in order to run the operating system. After the operating system is started, the CPU processor 312 copies the various application programs in the memory 390 to the RAM 314, and then starts running the various application programs.
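• The startup flow just described (copy the OS image into RAM, start it, then copy and launch each application) can be caricatured in a few lines of Python. The dict-based "memory" and "RAM" and the program names are purely illustrative assumptions:

```python
def boot(memory):
    """Mimic the described startup: copy the OS from persistent storage
    into RAM and start it, then copy and launch each application."""
    ram = {"os": memory["os"]}            # OS image copied into RAM first
    started = ["os"]                      # operating system starts first
    for name, image in memory["apps"].items():
        ram[name] = image                 # application copied into RAM
        started.append(name)              # then launched
    return ram, started
```

The ordering matters: applications are only copied and launched after the operating system is up, matching the two-phase sequence in the paragraph above.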
• the CPU processor 312 is used to execute the operating system and application instructions stored in the memory 390, to communicate with the N chip and exchange signals, data, instructions, etc. with it, and to execute various applications, data, and content according to the various interactive instructions received from external inputs, so as to finally display and play various audio and video content.
  • the communication interface may include the first interface 318-1 to the nth interface 318-n. These interfaces may be network interfaces connected to external devices via a network, or network interfaces connected to the N chip via a network.
• the controller 310 may control the overall operation of the display device 200. For example, in response to receiving a user command for selecting a UI object to be displayed on the display 280, the controller 310 may perform an operation related to the object selected by the user command.
• the graphics processor 316 is used to generate various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It includes an arithmetic unit, which performs operations by receiving the various interactive commands input by the user and displays various objects according to their display attributes; and it includes a renderer, which generates the various objects obtained by the arithmetic unit and displays the rendering result on the display 280.
• Both the graphics processor 316 of the A chip and the graphics processor 216 of the N chip can generate various graphics objects. The difference is that, if application 1 is installed on the A chip and application 2 is installed on the N chip, then when the user is in the interface of application 1 and inputs instructions in application 1, the graphics processor 316 of the A chip generates the graphics objects; and when the user is in the interface of application 2 and inputs instructions in application 2, the graphics processor 216 of the N chip generates the graphics objects.
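• The routing rule above (whichever chip hosts the foreground application generates its graphics objects) can be sketched as a small lookup. The mapping and function names are hypothetical, introduced only to illustrate the dispatch:

```python
# Hypothetical mapping of which chip hosts which application
APP_CHIP = {"application1": "A", "application2": "N"}

def gpu_for(app):
    """Route graphics-object generation to graphics processor 316
    (A chip) or 216 (N chip) based on the foreground application."""
    return 316 if APP_CHIP.get(app) == "A" else 216
```

In other words, the selection is decided per foreground application, not per user or per display: moving focus from application 1 to application 2 switches which chip's graphics processor does the rendering.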
  • Fig. 5 is a schematic diagram of a functional configuration of a display device exemplarily shown according to some embodiments of the application.
• the memory 390 of the A chip and the memory 290 of the N chip are respectively used to store the operating system, application programs, content, user data, etc., and, under the control of the controller 310 of the A chip and the controller 210 of the N chip, drive the system operation of the display device 200 and respond to various operations of the user.
  • the memory 390 of the A chip and the memory 290 of the N chip may include volatile and/or nonvolatile memory.
• the memory 290 is specifically used to store the operating program that drives the controller 210 in the display device 200, to store the various application programs built into the display device 200 and the various application programs downloaded by the user from external devices, as well as data related to the application programs.
  • the memory 290 is used to store system software such as an operating system (OS) kernel, middleware, and applications, and to store input video data and audio data, and other user data.
  • the memory 290 is specifically used to store driver programs and related data such as the video processor 260-1 and the audio processor 260-2, the display 280, the communication interface 230, the tuner and demodulator 220, and the input/output interface.
  • the memory 290 may store software and/or programs.
• the software or programs used to implement an operating system (OS) include, for example, a kernel, middleware, an application programming interface (API), and/or application programs.
• the kernel may control or manage system resources, or functions implemented by other programs (such as the middleware, API, or application programs), and the kernel may provide interfaces to allow the middleware, APIs, or applications to access the controller, in order to achieve control or management of system resources.
• the memory 290 includes a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external command recognition module 2907, a communication control module 2908, a power control module 2910, an operating system 2911, other application programs 2912, an interface layout management module 2913, an event transmission system 2914, a browser module, and so on.
• the controller 210 executes various software programs in the memory 290 to implement functions such as: a broadcast and television signal reception and demodulation function, a TV channel selection control function, a volume selection control function, an image control function, a display control function, an audio control function, an external command recognition function, a communication control function, an optical signal receiving function, a power control function, a software control platform supporting these various functions, and a browser function.
• the memory 390 stores various software modules for driving and controlling the display device 200.
  • various software modules stored in the memory 390 include: a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules. Since the functions of the memory 390 and the memory 290 are relatively similar, please refer to the memory 290 for related parts, and will not be repeated here.
  • the memory 390 includes an image control module 3904, an audio control module 2906, an external command recognition module 3907, a communication control module 3908, an optical receiving module 3909, an operating system 3911, and other application programs 3912, a browser module, and so on.
• the controller 310 executes various software programs in the memory 390 to implement functions such as: an image control function, a display control function, an audio control function, an external command recognition function, a communication control function, a light signal receiving function, a power control function, a software control platform supporting these various functions, and a browser function.
  • the external command recognition module 2907 of the N chip and the external command recognition module 3907 of the A chip can recognize different commands.
  • the external command recognition module 3907 of the A chip may include a graphic recognition module 3907-1.
• the graphic recognition module 3907-1 stores a graphics database; when the camera receives an external graphic command, it is matched against the instructions in the graphics database in order to control the display device.
• since the voice receiving device and the remote controller are connected to the N chip, the external command recognition module 2907 of the N chip may include a voice recognition module 2907-2.
• the voice recognition module 2907-2 stores a voice database, and the external voice commands received by the voice receiving device are matched against the commands in the voice database in order to control the display device.
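• The database lookup described above can be illustrated with a toy matcher. The entries and function name are assumptions; a real voice recognition module would perform speech-to-text and fuzzy matching rather than an exact dictionary lookup:

```python
# Hypothetical voice database entries; real contents are device-specific
VOICE_DB = {"turn up the volume": "VOL_UP", "mute": "MUTE"}

def recognize(speech_text):
    """Match received speech text against the voice database and return
    the corresponding control command, or None when nothing matches."""
    return VOICE_DB.get(speech_text.strip().lower())
```

Commands that find a match control the display device; unmatched input is simply ignored.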
  • a control device 100 such as a remote controller is connected to the N chip, and the key command recognition module interacts with the control device 100.
• at the software level, the controller of the display device runs an operating system, and the built-in applications may be the same as those in the above-mentioned dual-chip architecture display device; all of the above interfaces may also be provided.
  • Fig. 6a exemplarily shows a configuration block diagram of the software system in the display device 200 in some embodiments.
  • the operating system 2911 includes operating software for processing various basic system services and for implementing hardware-related tasks, acting as a medium for data processing between application programs and hardware components.
  • part of the operating system kernel may include a series of software to manage the hardware resources of the display device and provide services for other programs or software codes.
  • part of the operating system kernel may include one or more device drivers, and the device drivers may be a set of software codes in the operating system to help operate or control devices or hardware associated with the display device.
• the driver may contain code to manipulate video, audio, and/or other multimedia components. Examples include displays, cameras, Flash, WiFi, and audio drivers.
  • the operating system 2911 may specifically include: an accessibility module 2911-1, a communication module 2911-2, a user interface module 2911-3, and a control application 2911-4.
  • the operating system 2911 may further include a camera scheduling module 2911-5, a camera driving module 2911-6, and a camera switch module 2911-7.
  • the accessibility module 2911-1 is used to modify or access the application program, so as to realize the accessibility of the application program and the operability of its display content.
  • the communication module 2911-2 is used to connect to other peripherals via related communication interfaces and communication networks.
  • the user interface module 2911-3 is used to provide objects that display the user interface for access by various applications, and can realize user operability.
  • the control application 2911-4 is used to control process management and switch foreground applications, including runtime applications.
  • the camera scheduling module 2911-5 is used to control the camera to turn on or off, and to raise or lower the camera.
  • the camera driving module 2911-6 which is used to drive the motor mechanically connected with the camera to raise or lower the camera under the control of the camera scheduling module 2911-5;
• the camera switch module 2911-7 is used to turn on the camera under the control of the camera scheduling module 2911-5, that is, to make it enter the on state, or to turn off the camera, that is, to make it enter the off state.
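• The cooperation of the three camera modules above (the scheduling module commanding the driving module, which moves the motor, and the switch module, which changes the power state) can be modeled as a toy state machine. All names and the raise-before-power-on ordering are illustrative assumptions:

```python
class CameraScheduler:
    """Toy model of camera scheduling: the driving module raises or
    lowers the camera, the switch module turns it on or off."""
    def __init__(self):
        self.raised = False   # motor position (driving module 2911-6)
        self.on = False       # power state (switch module 2911-7)

    def open_camera(self):
        self.raised = True    # raise the camera first
        self.on = True        # then enter the on state

    def close_camera(self):
        self.on = False       # enter the off state first
        self.raised = False   # then lower the camera
```

The point of the split is that mechanical movement and power state are controlled by separate modules, both sequenced by the scheduling module.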
• the event transmission system 2914 may be implemented in the operating system 2911 or in the application program 2912. In some embodiments, it is implemented partly in the operating system 2911 and partly in the application program 2912, for monitoring various user input events, responding to the recognition results of various events or sub-events, and implementing one or more sets of pre-defined operation procedures.
  • the event transmission system 2914 may include an event monitoring module 2914-1 and an event identification module 2914-2.
  • the event monitoring module 2914-1 is used to monitor input events or sub-events of the user input interface.
• the event recognition module 2914-2 is used to define various events for the various user input interfaces, recognize the various events or sub-events, and transmit them to the process that executes the corresponding one or more groups of handlers.
  • the event or sub-event refers to the input detected by one or more sensors in the display device 200 and the input of an external control device (such as the control device 100, etc.).
• Examples include various sub-events of voice input, gesture input sub-events of gesture recognition, and sub-events of remote control key command input from the control device. Key command sub-events take multiple forms, including but not limited to one or a combination of pressing the up/down/left/right keys and the confirm key, as well as operations on non-physical keys, such as moving, pressing, and releasing.
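• The monitor-recognize-dispatch flow of the event transmission system can be sketched as a minimal handler registry. The function names and the event string "key_confirm" are assumptions for illustration only:

```python
# Minimal monitor -> recognize -> dispatch sketch of the event system
handlers = {}

def register(event, fn):
    """Pre-define an operation procedure for a recognized event."""
    handlers.setdefault(event, []).append(fn)

def dispatch(event, payload=None):
    """Recognize a monitored input event (or sub-event) and run every
    handler registered for it, returning their results."""
    return [fn(payload) for fn in handlers.get(event, [])]
```

A recognized event such as a confirm-key press is routed to its pre-defined procedures; events with no registered handler simply produce no operation.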
• the interface layout management module 2913, which directly or indirectly receives the various user input events or sub-events monitored by the event transmission system 2914, is used to update the layout of the user interface, including but not limited to the position of each control or sub-control in the interface, and the size, position, level, etc. of the containers, along with other operations related to the interface layout.
  • the application layer of the display device includes various applications that can be executed on the display device 200.
  • the application layer 2912 of the N chip may include, but is not limited to, one or more applications, such as video-on-demand applications, application centers, and game applications.
• the application layer 3912 of the A chip may include, but is not limited to, one or more applications, such as a live TV application, a media center application, and so on. It should be noted that which application programs are contained on the A chip and the N chip is determined by the operating system and other designs; this application does not specifically limit or divide the application programs contained on the A chip and the N chip.
  • Live TV applications can provide live TV through different sources.
  • a live TV application may use input from cable TV, wireless broadcasting, satellite services, or other types of live TV services to provide TV signals.
  • the live TV application can display the video of the live TV signal on the display device 200.
  • Video-on-demand applications can provide videos from different storage sources. Unlike live TV applications, VOD provides video display from certain storage sources. For example, the video on demand can come from the server side of cloud storage, and from the local hard disk storage that contains the stored video programs.
  • Media center applications can provide various multimedia content playback applications.
  • the media center can provide services that are different from live TV or video on demand, and users can access various images or audio through the media center application.
  • Application center can provide storage of various applications.
  • An application can be a game, an application, or some other application that is related to a computer system or other device but can run on a display device.
  • the application center can obtain these applications from different sources, store them in the local storage, and then run on the display device 200.
  • the A chip and the N chip may be respectively installed with independent operating systems, there are two independent but related sub-systems in the display device 200.
  • both the A chip and the N chip can be independently installed with Android and various APPs, so that each chip can realize a certain function, and the A chip and the N chip can cooperate to realize a certain function.
  • FIG. 7 exemplarily shows a schematic diagram of a user interface in the display device 200 according to an exemplary embodiment.
• the user interface includes multiple view display areas, for example, a first view display area 201 and a play screen 202, where the play screen includes a layout of one or more different items.
  • the user interface also includes a selector indicating that the item is selected, and the position of the selector can be moved through user input to change the selection of different items.
  • multiple view display areas can present display screens of different levels.
  • the first view display area can present video chat item content
  • the second view display area can present application layer item content (eg, webpage video, VOD display, application program screen, etc.).
• the presentation of different view display areas has priority differences, and the display priority differs between view display areas with different priorities.
  • the priority of the system layer is higher than that of the application layer.
  • the screen display in the view display area of the system layer is not blocked; and the application layer is enabled according to the user's choice
  • the size and position of the view display area change the size and position of the view display area of the system layer will not be affected.
  • the same level of display screen can also be presented.
  • the selector can switch between the display area of the first view and the display area of the second view, and when the size and position of the display area of the first view change, the second view The size and position of the display area can be changed at any time.
  • any area in FIG. 7 may display a picture obtained by a camera.
  • Items refer to visual objects displayed in the view display areas of the user interface of the display device 200 to represent corresponding content, such as icons, thumbnails, and video clips.
  • For example, an item can represent the image content or video clip of a movie or TV series, the audio content of music, an application program, or other user access history information.
  • "Items" may be displayed as image thumbnails.
  • For example, when the item is a movie or TV series, the item can be displayed as a poster of the movie or TV series; if the item is music, a poster of the music album can be displayed.
  • When the item is an application, it can be displayed as the icon of the application, or as a screenshot of the content of the application captured when the application was most recently executed.
  • When the item is a user's access history, it can be displayed as a screenshot of the content during the most recent execution.
  • "Items" can also be displayed as video clips.
  • For example, an item may be a video clip of a trailer of a TV program or TV series.
  • An item may also indicate an interface or a set of interfaces displayed by connecting the display device 200 with an external device, or the name of an external device connected to the display device, or the like,
  • such as a signal source input interface collection, or an HDMI interface, a USB interface, a PC terminal interface, etc.
  • A "selector" is used to indicate that an item has been selected, such as a cursor or a focus object.
  • For example, the cursor can be moved on the display device 200 to select or control items.
  • Movement of the focus object displayed in the display device 200 can be used to select or control items, and one or more items can be selected or controlled.
  • For example, the user can use the arrow keys on the control device 100 to control the movement of the focus object between items to select and control items.
  • Fig. 8 is a schematic diagram showing the keys of the remote control according to an exemplary embodiment of the present application.
  • The remote control only has "Volume +", "Volume -", "Up", "Down", "OK", "Voice", "Return", "Home", and "On/Off" buttons, where each button corresponds to an inherent control function, that is, its native function.
  • For example, the native function of "Volume +" is to increase the volume,
  • and the native function of "Up" is to move the focus upward.
  • In response to the foreground interface being successfully switched from the first interface to the second interface, the controller of the display device acquires the interface attributes of the second interface.
  • The interface attributes are used to characterize whether the interface multiplexes remote control buttons.
  • When the interface attributes of the second interface characterize that the second interface multiplexes remote control buttons, the key multiplexing strategy is updated according to the keys multiplexed in the second interface; in the key multiplexing strategy, the button state of a multiplexed button is the multiplexed state, and the button state of an unmultiplexed button is the native state. In response to receiving a button input event of the remote control, the button state corresponding to the button input event is determined according to the updated button multiplexing strategy. In response to determining that the button input event corresponds to the multiplexed state, the application can be instructed to perform the multiplexing function corresponding to the button input event.
  • For example, a customized broadcast containing the key input event is sent, where the customized broadcast is used to instruct the application to perform the multiplexing function corresponding to the key input event.
  • In this way, the display device can multiplex any key on the remote control according to its application state or interface.
  • The key values sent by the remote control and received by the display device do not distinguish between applications and interfaces; rather, after the display device receives them, it parses them into different commands depending on the current application or interface.
  • In some embodiments, the display device multiplexes the volume key.
  • The display device presents a non-camera-related user interface in a non-camera-related application, such as a video playback interface.
  • The user triggers the volume key, such as "Volume +" or "Volume -", to realize the function of increasing (or decreasing) the volume of the whole machine.
  • The display presents the camera-related user interface, for example presenting the image data collected by the camera.
  • The user triggers the same volume key to zoom in (or zoom out) the camera focal length, so as to present on the display second image data matching the camera focal length; or to enlarge (or reduce) the displayed image while the camera focal length remains unchanged.
  • After the multiplexing of the volume key is released, triggering the volume key again realizes the function of increasing (or decreasing) the volume of the whole machine.
  • Non-camera-related applications include video playback related applications and music playback related applications, such as on-demand applications, live broadcast applications, etc.
  • Camera-related applications include the above-mentioned mirror application and screen recording application.
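The interface-dependent interpretation of the same key value described above can be sketched as follows. This is a minimal illustration only: the interface names, the `handle_volume_key` helper, and the volume/zoom step sizes are assumptions, not part of the patent's implementation.

```python
# Minimal sketch: the same remote-control key value is parsed into different
# commands depending on the interface currently in the foreground.
# Interface names and step sizes are illustrative assumptions.

def handle_volume_key(foreground_interface, key, state):
    """Dispatch 'volume+'/'volume-' according to the foreground interface."""
    step = 1 if key == "volume+" else -1
    if foreground_interface in ("video_playback", "music_playback"):
        # Native function: adjust the volume of the whole machine.
        state["volume"] = min(100, max(0, state["volume"] + 5 * step))
        return "volume", state["volume"]
    elif foreground_interface == "camera_preview":
        # Multiplexed function: adjust the camera focal length (zoom).
        state["zoom"] = round(min(4.0, max(0.5, state["zoom"] + 0.5 * step)), 1)
        return "zoom", state["zoom"]
    return "ignored", None

state = {"volume": 50, "zoom": 1.0}
print(handle_volume_key("video_playback", "volume+", state))  # volume rises
print(handle_volume_key("camera_preview", "volume-", state))  # zoom shrinks
```

Note that the key value itself carries no interface information; all branching happens on the receiving side, matching the description above.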
  • In the camera-related interface, the image is displayed in full screen, as shown in Figure 11a.
  • The volume key corresponds to the adjustment of the camera focal length:
  • "Volume -" corresponds to a decrease in the camera focal length,
  • and "Volume +" corresponds to an increase in the camera focal length.
  • When the focal length is decreased, the screen shown in Figure 11b is presented.
  • The focal length of the camera becomes smaller and the range of the acquired image data becomes larger.
  • A larger range of image data than before is displayed on the display, and a screen size adjustment bar appears on the right side of the screen, prompting the user that the current focal length is 0.5×.
  • In another scenario, the picture in the camera-related interface is displayed in full screen, presenting the picture of Figure 11a.
  • Here the volume key corresponds to the zoom adjustment of the preview screen:
  • "Volume -" corresponds to zooming out the preview picture,
  • and "Volume +" corresponds to zooming in the preview picture.
  • When zoomed out, the image is reduced to 0.5 times the original full-screen image, and a screen size adjustment bar appears on the right side of the screen, prompting the user that the image is currently reduced to 0.5×.
  • When the user presses "Volume +" to "1.0×", the picture returns to the full-screen display state.
  • When the user presses "Volume +" to "2.0×", the image becomes twice as large as the original image, only a part of the image appears on the display, and the screen size adjustment bar on the right side of the screen reminds the user that the picture is currently zoomed in by 2 times.
  • In addition, an enlarged-area selection bar is presented on the display, and the partial image can be locked and browsed through the up, down, left, and right direction keys of the remote control, as shown in Figure 11f.
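The discrete zoom factors of Figures 11b-11f (0.5×, 1.0×, 2.0×) can be modeled as a stepper over a preset list. The factor list and the clamping behavior at both ends are assumptions for illustration; the patent does not specify the exact set of levels.

```python
# Illustrative sketch of the discrete zoom factors shown in Figs. 11b-11f
# (0.5x, 1.0x, 2.0x). The factor list and clamping behavior are assumptions.

ZOOM_LEVELS = [0.5, 1.0, 2.0]

def step_zoom(current, key):
    """Move one step through the preset zoom levels; clamp at the ends."""
    i = ZOOM_LEVELS.index(current)
    if key == "volume+":
        i = min(i + 1, len(ZOOM_LEVELS) - 1)
    elif key == "volume-":
        i = max(i - 1, 0)
    return ZOOM_LEVELS[i]

assert step_zoom(0.5, "volume+") == 1.0   # back to full-screen display
assert step_zoom(1.0, "volume+") == 2.0   # 2x: only part of the image fits
assert step_zoom(0.5, "volume-") == 0.5   # already at the minimum
```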
  • In the above embodiments, the button multiplexing strategy is updated in real time according to whether the foreground interface needs to multiplex buttons and which buttons are multiplexed.
  • Centralized control of forwarding key input events can not only realize the multiplexing of any key on the remote control under any application interface, but also ensure the reliability of multiplexed keys.
  • An embodiment of the present application also provides a remote control button multiplexing method.
  • The controller of the display device is configured to perform some or all of the steps of the method. Through this method, the multiplexing of a specific remote control button can be realized in a predetermined scene.
  • The method has high reliability, so that the remote control can control the display device flexibly and diversely.
  • Fig. 9 is a flowchart of a method for multiplexing remote control keys according to an exemplary embodiment of the present application. As shown in Figure 9, the method may include:
  • Step 901: In response to the foreground interface being successfully switched from the first interface to the second interface, acquire the interface attributes of the second interface, where the interface attributes are used to characterize whether the interface multiplexes remote control keys.
  • On the display of the display device, the user interface is presented.
  • The presented user interface may include only the application interface of one application,
  • in which case the application interface is displayed in full screen on the display screen; or it may include the application interfaces of multiple applications,
  • in which case the application interface of at least one application is displayed in a window smaller than the display screen size.
  • The interface that can obtain the focus of the remote control signal is the foreground interface,
  • and the application to which the foreground interface belongs is the foreground application.
  • Fig. 10a is an application interface entered by the user after opening the "photographing application" in the application center.
  • In this case, the user interface of the display device includes only one application interface; this application interface can obtain the focus of the remote control signal and is the foreground interface.
  • FIG. 10b is another user interface according to an exemplary embodiment of this application, which includes two application interfaces.
  • Interface A is displayed on interface B as a window, and interface B is displayed on the display as a background interface.
  • Interface A can obtain the focus of the remote control signal and is therefore the foreground interface.
  • The application interface displayed in the foreground can be switched. For example, referring to Figure 10b, when the user moves the focus position from interface A to interface B through the remote control, the foreground interface switches from interface A to interface B. For another example, still referring to Figure 10b, when the user selects a function icon in interface A through the remote control and enters another interface C (not shown in the figure) of the application to which interface A belongs, the foreground interface is switched from interface A to interface C.
  • The system monitors whether the foreground interface is successfully switched.
  • The switching process of the foreground interface includes: the application sends a request to the system to exit the first interface from the foreground; after the system receives the request, it sends a response message to the application; when the application receives the response message, it exits the first interface from the foreground.
  • Then the application sends a request to the system to display the second interface in the foreground; after the system receives the request, it sends a response message to the application; when the application receives the response message, the second interface is displayed in the foreground.
  • When the system detects that the second interface is successfully displayed in the foreground, it is determined that the foreground interface has been successfully switched from the first interface to the second interface.
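The request/response handshake above can be sketched as a toy state machine: the application asks the system to remove the first interface and then to display the second one, and only when the second interface is successfully displayed is the switch considered complete. The class and method names are illustrative, not from the patent.

```python
# Toy model of the foreground-interface switching handshake described above.
# Class and method names are invented for illustration.

class ForegroundManager:
    def __init__(self, foreground):
        self.foreground = foreground
        self.switch_complete = False

    def request_exit(self, interface):
        """App asks the system to exit `interface` from the foreground."""
        if self.foreground == interface:
            self.foreground = None        # response: interface leaves foreground
            return True
        return False

    def request_display(self, interface):
        """App asks the system to display `interface` in the foreground."""
        if self.foreground is None:
            self.foreground = interface   # response: interface shown in foreground
            self.switch_complete = True   # system detects the successful switch
            return True
        return False

mgr = ForegroundManager("interface_A")
mgr.request_exit("interface_A")
mgr.request_display("interface_C")
print(mgr.foreground, mgr.switch_complete)  # interface_C True
```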
  • The first interface and the second interface may be two different interfaces of the same application, or may be interfaces belonging to different applications.
  • In step 901, when the system detects that the foreground interface is successfully switched from the first interface to the second interface, the interface attributes of the second interface are acquired.
  • The interface may have two attributes, such as a first attribute and a second attribute, which are used to represent the need to multiplex keys and the need not to multiplex keys, respectively.
  • Alternatively, the interface can have two values under the same attribute, which are used to represent the need to multiplex keys and the need not to multiplex keys.
  • An interface that needs to multiplex keys is called a customized interface,
  • and an interface that does not need to multiplex keys is called a normal interface.
  • An interface attribute table, such as an .xml table,
  • can be established in advance to save the correspondence between the package name of an application and/or application interface and the application attribute and/or interface attribute of that application and/or application interface.
  • The .xml table can also save the keys that need to be multiplexed in the customized applications/interfaces, and the keys can be represented by key values.
  • For example, the data saving format in the .xml table may be <application/application interface package name - application/interface attribute - key values of multiplexed keys>.
  • According to the application/interface attributes, it can be determined whether a certain application/application interface is a customized interface or a normal interface; if it is a customized interface, the key values of the multiplexed keys of the application/application interface can further be determined.
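The interface attribute table described above can be sketched as a lookup keyed by package name, following the <package name - attribute - multiplexed key values> format. The concrete package names, attribute encoding, and key values here are illustrative assumptions, not entries from the patent.

```python
# Sketch of an interface attribute table like the .xml table described above,
# modeled as a dict keyed by package name. All entries are invented examples.

CUSTOMIZED, NORMAL = 1, 0   # attribute values: multiplex keys / do not

INTERFACE_ATTRS = {
    "com.example.photo.MainActivity":  (CUSTOMIZED, ["volume+", "volume-"]),
    "com.example.player.PlayActivity": (NORMAL, []),
}

def lookup(package_name):
    """Return (is_customized, multiplexed key values) for an interface."""
    attr, keys = INTERFACE_ATTRS.get(package_name, (NORMAL, []))
    return attr == CUSTOMIZED, keys

print(lookup("com.example.photo.MainActivity"))   # (True, ['volume+', 'volume-'])
print(lookup("com.example.unknown.Activity"))     # (False, [])
```

An interface absent from the table is treated as a normal interface, which matches the "no need to multiplex keys" default described above.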
  • In step 901, when it is detected that the foreground interface is successfully switched from the first interface to the second interface, the interface name of the second interface, that is, its package name, is acquired, and then the interface attribute corresponding to the interface name is searched in the H_MAP table.
  • Step 902: When the interface attribute of the second interface characterizes that the second interface multiplexes remote control keys, update the key multiplexing strategy according to the keys multiplexed on the second interface; in the key multiplexing strategy,
  • the button state of a multiplexed button is the multiplexed state,
  • and the button state of an unmultiplexed button is the native state.
  • The button multiplexing strategy is preset in the operating system.
  • The button multiplexing strategy includes the button state of each button on the remote control, and the button state includes the native state and the multiplexed state. For a certain key, if it is determined to be in the native state according to the key multiplexing strategy, the key is not multiplexed; if it is determined to be in the multiplexed state according to the key multiplexing strategy, the key is multiplexed.
  • The second interface may be a customized interface or a normal interface.
  • When the interface attribute of the second interface characterizes that the second interface multiplexes the keys of the remote control,
  • the key multiplexing strategy is updated according to the keys multiplexed by the second interface.
  • It can be determined from the interface attributes of the second interface whether the second interface is a preset interface that needs to multiplex keys. For example, if the interface attribute value of the second interface is a predetermined value, the second interface is the preset interface; if the interface attribute value of the second interface is not the predetermined value, the second interface is not the preset interface, that is, there is no need to multiplex keys.
  • In this case, the keys multiplexed on the second interface are acquired. It should be noted that the multiplexed keys on any preset interface, such as the second interface, are all predetermined and pre-stored in the above interface attribute table. Based on this, if the second interface is a preset interface, the key values of the multiplexed buttons can be found in the above H_MAP table. Then, the button state of the multiplexed buttons of the second interface in the button multiplexing strategy is updated to the multiplexed state, and the button states of the remaining buttons are updated to the native state.
  • Otherwise, the button state of each button in the button multiplexing strategy is updated to the native state.
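The strategy update of step 902 can be sketched as rebuilding a per-key state map: keys multiplexed by the second interface become "multiplexed", all others "native". The key names below are illustrative; the actual key set is whatever the remote control provides.

```python
# Sketch of updating the key multiplexing strategy (step 902).
# The key list is an illustrative assumption.

NATIVE, MULTIPLEXED = "native", "multiplexed"
ALL_KEYS = ["volume+", "volume-", "up", "down", "ok", "back", "home"]

def update_strategy(multiplexed_keys):
    """Build the strategy for the interface now in the foreground."""
    return {k: (MULTIPLEXED if k in multiplexed_keys else NATIVE)
            for k in ALL_KEYS}

strategy = update_strategy(["volume+", "volume-"])   # a customized interface
print(strategy["volume+"])  # multiplexed
print(strategy["ok"])       # native

default_strategy = update_strategy([])               # a normal interface
print(default_strategy["volume+"])  # native
```

Calling `update_strategy([])` yields the default strategy in which every key is native, which is also what the method falls back to for a normal interface.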
  • Step 903: In response to receiving a key input event of the remote control, determine the key state corresponding to the key input event according to the updated key multiplexing strategy.
  • When the user operates a key of the remote control, the display device receives the key input event and obtains the key value corresponding to the key input event.
  • According to the key value, the key state corresponding to the key can be found in the updated strategy: the native state or the multiplexed state.
  • Step 904: In response to determining that the key input event corresponds to the multiplexed state, send a customized broadcast including the key input event, where the customized broadcast is used to instruct the application to execute the multiplexing function corresponding to the key input event.
  • For example, the key input event is sent to the application layer through inter-thread communication between the system thread and the UI thread.
  • For example, suppose the second interface is the application interface shown in Figure 10a,
  • the predetermined multiplexed button of the application interface is "Volume +",
  • the attribute value of the interface attribute corresponding to the interface name of the application interface is the predetermined value,
  • and the corresponding multiplexed key value is the key value of "Volume +".
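Steps 903-904 can be sketched end to end: on a key input event, look up the key state in the updated strategy; in the multiplexed state send a customized broadcast to the application, otherwise let the native function run. The broadcast here is just a recorded message; on Android it would be an Intent broadcast, and the action name below is an invented placeholder.

```python
# End-to-end sketch of steps 903-904. The "CUSTOM_KEY_EVENT" action name
# is a hypothetical placeholder, not an action defined by the patent.

def handle_key_event(strategy, key, broadcast_log):
    """Route a key input event according to the current strategy."""
    if strategy.get(key) == "multiplexed":
        # Customized broadcast instructing the app to run the reuse function.
        broadcast_log.append({"action": "CUSTOM_KEY_EVENT", "key": key})
        return "multiplexed"
    return "native"   # unknown keys fall back to their native function

strategy = {"volume+": "multiplexed", "ok": "native"}
log = []
print(handle_key_event(strategy, "volume+", log))  # multiplexed
print(handle_key_event(strategy, "ok", log))       # native
print(log)  # one customized broadcast for "volume+"
```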
  • The embodiment of the present application provides a remote control button multiplexing method, which is applied to a display device controller: in response to the foreground interface being successfully switched from the first interface to the second interface, the interface attributes of the second interface are acquired;
  • when the interface attributes characterize that the second interface multiplexes remote control keys, the key multiplexing strategy is updated according to the keys multiplexed on the second interface; in response to receiving a key input event of the remote control,
  • the key state corresponding to the key input event is determined according to the updated key multiplexing strategy; in response to determining that the key input event corresponds to the multiplexed state, a customized broadcast containing the key input event is sent, where the customized broadcast is used to instruct the application to execute the multiplexing function corresponding to the key input event.
  • Fig. 11 shows a scenario of multiplexing the "Volume +" and "Volume -" buttons on the remote control in the "Album" application according to an exemplary embodiment of this application.
  • The "Album" application is the foreground application,
  • and the application interface shown in Fig. 11 is a preset interface, that is, a customized interface.
  • On this interface, the user can press "Volume +" and "Volume -" on the remote control
  • to zoom in or zoom out the picture being viewed; enlarging or reducing the picture being viewed is the multiplexing function of "Volume +" and "Volume -".
  • In some embodiments, the method further includes: in response to the button state of at least one button in the button multiplexing strategy being updated to the multiplexed state, generating prompt information,
  • where the prompt information is used to prompt the user about the currently multiplexed remote control keys, and the prompt information is displayed on the foreground interface.
  • In some embodiments, the remote control key multiplexing method includes:
  • Step 121: Receive a request for switching the foreground interface from the first interface to the second interface.
  • The request in step 121 may be two requests: one is a request for exiting the first interface from the foreground, and the other is a request for displaying the second interface in the foreground.
  • Step 122: Update the button multiplexing strategy to a default strategy, where the button state corresponding to any key input event in the default strategy is the native state.
  • Step 123: Exit the first interface from the foreground.
  • Step 124: Display the second interface in the foreground.
  • Step 125: Obtain the interface attributes of the second interface.
  • Step 126: Update the button multiplexing strategy according to the interface attributes of the second interface.
  • Step 127: When a key input event of the remote control is received, determine the key state corresponding to the key input event according to the updated key multiplexing strategy.
  • Step 128: If the key state corresponding to the key input event is the multiplexed state, send a customized broadcast containing the key input event, where the customized broadcast is used to instruct the application to perform the multiplexing function corresponding to the key input event.
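The ordering of steps 121-126 can be sketched as follows: on a switch request the strategy is first reset to the default (every key native), the first interface exits, the second is displayed, and only then is the strategy updated from the second interface's attributes. This ordering is what guarantees native key behavior during the switch itself. All names are illustrative.

```python
# Sketch of the switching flow of Fig. 12 (steps 122-126).
# Interface names and the attrs_table format are invented examples.

def switch_foreground(state, second_iface, attrs_table):
    state["strategy"] = {}                     # step 122: default (all native)
    state["foreground"] = None                 # step 123: exit first interface
    state["foreground"] = second_iface         # step 124: display second interface
    customized, keys = attrs_table.get(second_iface, (False, []))  # step 125
    if customized:                             # step 126: update the strategy
        state["strategy"] = {k: "multiplexed" for k in keys}
    return state

attrs = {"photo_app": (True, ["volume+", "volume-"])}
state = switch_foreground({"foreground": "player_app", "strategy": {}},
                          "photo_app", attrs)
print(state["strategy"])  # both volume keys multiplexed
```

Note that a key input event arriving between steps 122 and 126 sees the default strategy, so every key behaves natively during the switch, as the next paragraph explains.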
  • Before the foreground interface is switched, the key multiplexing strategy is updated to the default strategy, so as to ensure that once the customized interface is exited, the remote control keys are restored to the native state, avoiding the situation where some keys are still in the multiplexed state while the foreground interface is being switched, causing these keys to fail.
  • In some cases, foreground application switching is not involved in the process of switching the foreground interface.
  • When foreground application switching is involved, the system receives a request for switching the foreground application from the first application to the second application, first updates the button multiplexing strategy to the default strategy, then exits the first application from the foreground, and runs the second application in the foreground.
  • In addition, the foreground application may have abnormalities such as memory leaks, unresponsiveness, or failure to obtain focus,
  • in which case the buttons in the multiplexed state will not be able to achieve their native functions before the application is stopped by the system.
  • For example, suppose the "photographing application" in the foreground does not respond. If the "Volume +" button is multiplexed by the "photographing application", the user will not be able to use the "Volume +" button to adjust the output volume of the display device during the time when the "photographing application" is unresponsive.
  • In some embodiments, before the display device performs the steps provided in the above embodiment, it also performs the following:
  • when an abnormality of the foreground application is detected, the button multiplexing strategy is updated to the default strategy.
  • The key multiplexing strategy is updated to the default strategy so as to ensure that the remote control keys can achieve their native functions during the interface switching process, improving key reliability.
  • In the above method, the preset button multiplexing strategy is updated according to the interface attributes of the second interface; when a button input event of the remote control is received, the updated button
  • multiplexing strategy determines the key state corresponding to the key input event, where the key state includes the native state and the multiplexed state; if the key state corresponding to the key input event is the multiplexed state, a customized broadcast containing the key input event is sent, so as to realize the multiplexing of any key on the remote control under any application interface.
  • In addition, when the foreground application is abnormal, the button multiplexing strategy is updated to the default strategy. Thus, from the time the foreground application becomes abnormal until it is closed by the system, any button is guaranteed to achieve its native function, thereby improving key reliability.
  • The present application also provides a computer storage medium, where the computer storage medium may store a program, and the program, when executed, may include part or all of the steps in each embodiment of the method provided in the present application.
  • The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), etc.
  • The technology in the embodiments of the present application can be implemented by means of software plus a necessary general hardware platform.
  • The implementation in the embodiments of this application, essentially or in the part that contributes to the prior art, can be embodied in the form of a software product. The computer software product can be stored in a storage medium, such as a ROM/RAM, a magnetic disk, or an optical disc, and includes a number of instructions to enable a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in each embodiment of this application or in some parts of an embodiment.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

A display device and a method for multiplexing keys of a control apparatus. The controller of the display device is configured to: in response to an instruction from a user to switch from a video-playback-related interface to a camera-related interface, receive a key instruction from a control apparatus matched with the display device, and control the display device to switch from a video-playback-related function to a function related to focal length adjustment.

Description

Display device and method for multiplexing keys of a control apparatus
This application claims priority to the Chinese patent application filed with the Chinese Patent Office on August 18, 2019, with application number 201910761467.0 and invention title "Remote control key multiplexing method for a display device and display device", the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the technical field of display devices, and in particular to a display device and a method for multiplexing keys of a control apparatus.
Background
At present, display devices can provide users with playback content such as audio, video, and pictures, and have received widespread attention.
Fig. 1 exemplarily shows an interaction scenario between the control apparatus 100 and the display device 200. As shown in Fig. 1, the user can operate the display device 200 through the control apparatus 100. The control apparatus 100 may be a control apparatus 100A, which can communicate with the display device 200 through infrared protocol communication, Bluetooth protocol communication, ZigBee protocol communication, or other short-distance communication methods, and is used to control the display device 200 wirelessly or by other wired methods. The user can input user instructions through keys on the control apparatus, voice input, etc., to control the display device 200. For example, the user can input corresponding control instructions through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, and power key on the control apparatus, to realize the function of controlling the display device 200.
Usually, due to the small size and simple shape of the control apparatus, the number of keys on the control apparatus is limited. For example, as shown in Fig. 8, the control apparatus only has "Volume +", "Volume -", "Up", "Down", "OK", "Voice", "Return", "Home", and "On/Off" keys. However, as the functions of display devices increase, the limited number of keys on the control apparatus and the inherent function of each key make it difficult to realize flexible and diverse control of the display device.
Summary
In view of this, this application provides a display device and a remote control key multiplexing method.
In a first aspect, this application provides a display device, including:
a camera, configured to: collect image data;
a display, configured to: present a user interface and/or an image interface;
a controller, configured to:
in response to an instruction from a user to switch from a video-playback-related interface to a camera-related interface, receive a key instruction from a control device matched with the display device, and control the display device to switch from a video-playback-related function to a function related to focal length adjustment.
In a second aspect, this application provides a display device, including:
a camera, configured to: collect image data;
a display, configured to: present a user interface;
a controller, configured to:
receive a key instruction sent by a control apparatus matched with the display device;
when the display presents a video-playback-related interface, perform a function related to video playback;
when the display presents an image-data-related interface, perform a function of zooming the image data.
In a third aspect, this application provides a display device, including:
a camera, configured to: collect image data;
a display, configured to: present a user interface;
a controller, configured to:
receive a key instruction sent by a control apparatus matched with the display device;
when the display presents a video-playback-related interface, perform a function related to video playback;
when the display presents an image-data-related interface, perform a function of adjusting the focal length of the camera.
In a fourth aspect, this application provides a display device, including:
a camera, configured to: collect image data;
a display, configured to: present a user interface and/or an image interface;
a controller, configured to:
when the display presents a video-playback-related interface and a key instruction sent by the control apparatus matched with the display device is received, the controller performs a function related to video playback;
when the display presents a camera-related interface and a key instruction sent by the control device matched with the display device is received, the controller performs a function of enlarging or reducing the image data.
In a fifth aspect, this application provides a display device, including:
a camera, configured to: collect image data;
a display, configured to: present a user interface and/or an image interface;
a controller, configured to:
when the display presents a video-playback-related interface and a key instruction sent by the control apparatus matched with the display device is received, the controller performs a function related to video playback;
when the display presents a camera-related interface and a key instruction sent by the control device matched with the display device is received, the controller performs a function of adjusting the focal length of the camera.
In a sixth aspect, this application provides a display device, including:
a display, configured to present a user interface;
a controller, configured to:
in response to the foreground interface being successfully switched from a first interface to a second interface, acquire interface attributes of the second interface, where the interface attributes are used to characterize whether the interface multiplexes remote control keys;
when the interface attributes of the second interface characterize that the second interface multiplexes remote control keys, update a key multiplexing strategy according to the keys multiplexed by the second interface, where in the key multiplexing strategy the key state of a multiplexed key is the multiplexed state and the key state of an unmultiplexed key is the native state;
in response to receiving a key input event of the remote control, determine the key state corresponding to the key input event according to the updated key multiplexing strategy;
in response to determining that the key input event corresponds to the multiplexed state, send a customized broadcast containing the key input event, where the customized broadcast is used to instruct an application to perform the multiplexing function corresponding to the key input event.
In a seventh aspect, this application further provides a remote control key multiplexing method, the method including:
in response to the foreground interface being successfully switched from a first interface to a second interface, acquiring interface attributes of the second interface, where the interface attributes are used to characterize whether the interface multiplexes remote control keys;
when the interface attributes of the second interface characterize that the second interface multiplexes remote control keys, updating a key multiplexing strategy according to the keys multiplexed by the second interface, where in the key multiplexing strategy the key state of a multiplexed key is the multiplexed state and the key state of an unmultiplexed key is the native state;
in response to receiving a key input event of the remote control, determining the key state corresponding to the key input event according to the updated key multiplexing strategy;
in response to determining that the key input event corresponds to the multiplexed state, sending a customized broadcast containing the key input event, where the customized broadcast is used to instruct an application to perform the multiplexing function corresponding to the key input event.
Brief Description of the Drawings
In order to explain the implementations of this application more clearly, the drawings needed in the embodiments will be briefly introduced below. Obviously, for those of ordinary skill in the art, other drawings can be obtained based on these drawings without creative labor.
Fig. 1 exemplarily shows a schematic diagram of an operation scenario between a display device and a control apparatus;
Fig. 2 exemplarily shows a block diagram of the hardware configuration of the control apparatus 100;
Fig. 3 exemplarily shows a block diagram of the hardware configuration of the display device 200;
Fig. 4 exemplarily shows a block diagram of the hardware architecture of the display device 200 according to Fig. 3;
Fig. 5 exemplarily shows a schematic diagram of the functional configuration of the display device 200;
Fig. 6a exemplarily shows a schematic diagram of the software configuration in the display device 200;
Fig. 6b exemplarily shows a schematic diagram of the configuration of applications in the display device 200;
Fig. 7 exemplarily shows a schematic diagram of the user interface of the display device 200;
Fig. 8 is a schematic diagram of remote control keys according to an exemplary embodiment of this application;
Fig. 9 is a flowchart of a remote control key multiplexing method for a display device according to an exemplary embodiment of this application;
Fig. 10a is an application interface entered by the user after opening the "photographing application" in the application center;
Fig. 10b is another user interface according to an exemplary embodiment of this application;
Figs. 11a-11f show a scenario of multiplexing the "Volume +" or "Volume -" key on the remote control in a camera-related application according to an exemplary embodiment of this application;
Fig. 12 is a flowchart of a remote control key multiplexing method according to an exemplary embodiment of this application.
Detailed Description
In order to enable those skilled in the art to better understand the implementations in this application, the implementations in the embodiments of this application will be described clearly and completely below with reference to the drawings in the embodiments of this application. Obviously, the described embodiments are only part of the embodiments of this application, not all of them. Based on the embodiments in this application, all other embodiments obtained by those of ordinary skill in the art without creative labor shall fall within the protection scope of this application.
The display device provided by this application may be a display device with a multi-chip architecture, such as the display device with a dual-chip (dual hardware system) architecture shown in Figs. 3 to 5 of this application, or a display device with a non-dual-chip architecture. In a specific implementation, it is sufficient that the functions described in the following embodiments can be realized, which is not limited in this application.
For the convenience of users, various external device interfaces are usually provided on the display device to facilitate connection of different peripheral devices or cables to realize corresponding functions. When a high-definition camera is connected to an interface of the display device, if the hardware system of the display device does not have a hardware interface for a high-pixel camera that receives source code, the data received by the camera cannot be presented on the display screen of the display device.
The concepts involved in this application will first be described below with reference to the drawings. It should be pointed out here that the following descriptions of the concepts are only intended to make the content of this application easier to understand, and do not limit the protection scope of this application.
It should be understood that the terms "first", "second", "third", etc. in the specification, claims, and drawings of this application are used to distinguish similar objects, and are not necessarily used to describe a specific order or sequence. It should be understood that data used in this way can be interchanged under appropriate circumstances, for example, can be implemented in an order other than those illustrated or described in the embodiments of this application.
In addition, the terms "including" and "having" and any variations thereof are intended to cover non-exclusive inclusion. For example, a product or device that includes a series of components is not necessarily limited to those components clearly listed, but may include other components that are not clearly listed or that are inherent to the product or device.
The term "module" used in this application refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware or/and software code capable of performing the function related to the element.
The term "remote control" used in this application refers to a component of an electronic device (such as the display device disclosed in this application), which can usually wirelessly control the electronic device within a short distance. It is generally connected to the electronic device using infrared and/or radio frequency (RF) signals and/or Bluetooth, and may also include functional modules such as WiFi, wireless USB, Bluetooth, and motion sensors. For example, a handheld touch remote control replaces most of the physical built-in hard keys of a general remote control apparatus with a user interface on a touch screen.
The term "gesture" used in this application refers to a user behavior in which the user expresses an expected idea, action, purpose, and/or result through a change of hand shape or a hand movement.
Fig. 1 exemplarily shows a schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment. As shown in Fig. 1, the user can operate the display device 200 through the control apparatus 100.
The control apparatus 100 may be a remote control 100A, which can communicate with the display device 200 through infrared protocol communication, Bluetooth protocol communication, ZigBee protocol communication, or other short-distance communication methods, and is used to control the display device 200 wirelessly or by other wired methods. The user can input user instructions through keys on the remote control, voice input, control panel input, etc., to control the display device 200. For example, the user can input corresponding control instructions through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, and power key on the remote control, to realize the function of controlling the display device 200.
The control apparatus 100 may also be a smart device, such as a mobile terminal 100B, a tablet computer, a computer, or a laptop computer, which can communicate with the display device 200 through a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), or other networks, and realize control of the display device 200 through an application corresponding to the display device 200. For example, the display device 200 is controlled using an application running on the smart device. The application can provide users with various controls through an intuitive user interface (UI) on a screen associated with the smart device.
The "user interface" is a medium interface for interaction and information exchange between an application or operating system and the user, which realizes conversion between an internal form of information and a form acceptable to the user. A commonly used form of user interface is the graphical user interface (GUI), which refers to a user interface related to computer operations displayed in a graphical manner. It may be an interface element such as an icon, window, or control displayed on the display screen of an electronic device, where controls may include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
For example, both the mobile terminal 100B and the display device 200 can be installed with software applications, so that connection and communication between the two can be realized through a network communication protocol, thereby achieving one-to-one control operation and data communication. For example, a control instruction protocol can be established between the mobile terminal 100B and the display device 200, the remote control keyboard can be synchronized to the mobile terminal 100B, and the function of controlling the display device 200 can be realized by controlling the user interface on the mobile terminal 100B; the audio and video content displayed on the mobile terminal 100B can also be transmitted to the display device 200 to realize a synchronous display function.
As shown in Fig. 1, the display device 200 can also perform data communication with the server 300 through multiple communication methods. In various embodiments of this application, the display device 200 may be allowed to communicate with the server 300 through a local area network, a wireless local area network, or other networks. The server 300 can provide various content and interactions to the display device 200.
For example, the display device 200 interacts by sending and receiving information and through electronic program guide (EPG) interactions, receives software program updates, or accesses a remotely stored digital media library. The server 300 may be one group or multiple groups, and may be one or more types of servers. Other network service content such as video on demand and advertising services are provided through the server 300.
The display device 200 may be a liquid crystal display, an OLED (Organic Light Emitting Diode) display, a projection display device, or a smart TV. The specific display device type, size, resolution, etc. are not limited. Those skilled in the art can understand that the display device 200 can make some changes in performance and configuration as needed.
In addition to providing a broadcast receiving TV function, the display device 200 can additionally provide a smart network TV function with computer support. Examples include network TV, smart TV, Internet protocol TV (IPTV), and the like.
As shown in Fig. 1, a camera can be connected to or provided on the display device for presenting the picture captured by the camera on the display interface of this display device or another display device, so as to realize interactive chat between users. Specifically, the picture captured by the camera can be displayed on the display device in full screen, half screen, or in any optional area.
As an optional connection method, the camera is connected to the rear shell of the display through a connection board and is fixedly installed in the middle of the upper side of the rear shell of the display. As an installable method, it can be fixedly installed at any position of the rear shell of the display, as long as it can be ensured that its image collection area is not blocked by the rear shell; for example, the image collection area has the same orientation as the display of the display device.
As another optional connection method, the camera is connected to the rear shell of the display in a liftable manner through a connection board or another conceivable connector. A lifting motor is installed on the connector. When the user wants to use the camera or an application wants to use the camera, the camera rises above the display; when the camera is not needed, it can be embedded behind the rear shell, so as to protect the camera from damage.
As an embodiment, the camera used in this application may have 16 megapixels, so as to achieve the purpose of ultra-high-definition display. In actual use, a camera with more or fewer than 16 megapixels may also be used.
After the camera is installed on the display device, the content displayed in different application scenarios of the display device can be fused in multiple different ways, so as to achieve functions that cannot be achieved by traditional display devices.
Exemplarily, the user can have a video chat with at least one other user while watching a video program. The presentation of the video program can serve as a background picture, and the video chat window is displayed on top of the background picture. Vividly, this function can be called "chat while watching".
Optionally, in the "chat while watching" scenario, at least one channel of video chat is conducted across terminals while watching a live video or a network video.
In other embodiments, the user can have a video chat with at least one other user while using an education application for learning. For example, a student can realize remote interaction with a teacher while learning content in an education application. Vividly, this function can be called "chat while learning".
In other embodiments, the user has a video chat with players entering the game while playing a card game. For example, when a player enters a game application to participate in a game, remote interaction with other players can be realized. Vividly, this function can be called "play while watching".
Optionally, the game scene is fused with the video picture: the portrait in the video picture is cut out and displayed in the game picture, improving user experience.
Optionally, in motion-sensing games (such as ball games, boxing, running, dancing, etc.), human body postures and movements are acquired through the camera, with limb detection and tracking and detection of human skeleton key point data, which are then fused with animations in the game to realize games in scenarios such as sports and dancing.
In other embodiments, the user can interact with at least one other user through video and voice in a karaoke application. Vividly, this function can be called "sing while watching". Preferably, when at least one user enters the application in a chat scenario, multiple users can jointly complete the recording of a song.
In other embodiments, the user can locally open the camera to obtain pictures and videos. Vividly, this function can be called "mirror".
In other examples, more functions may be added or the above functions may be reduced. This application does not specifically limit the functions of the display device.
图2中示例性示出了根据示例性实施例中控制装置100的配置框图。如图2所示,控制装置100包括控制器110、通信器130、用户输入/输出接口140、存储器190、供电电源180。
控制装置100被配置为可控制所述显示设备200,以及可接收用户的输入操作指令,且将操作指令转换为显示设备200可识别和响应的指令,起到用户与显示设备200之间交互中介作用。如:用户通过操作控制装置100上频道加减键,显示设备200响应频道加减的操作。
在一些实施例中,控制装置100可是一种智能设备。如:控制装置100可根据用户需求安装控制显示设备200的各种应用。
在一些实施例中,如图1所示,移动终端100B或其他智能电子设备,可在安装操控显示设备200的应用之后,起到控制装置100类似功能。如:用户可以通过安装应用,在移动终端100B或其他智能电子设备上可提供的图形用户界面的各种功能键或虚拟按钮,以实现控制装置100实体按键的功能。
控制器110包括处理器112、RAM113和ROM114、通信接口以及通信总线。控制器110用于控制控制装置100的运行和操作,以及内部各部件之间通信协作以及外部和内部的数据处理功能。
通信器130在控制器110的控制下,实现与显示设备200之间控制信号和数据信号的通信。如:将接收到的用户输入信号发送至显示设备200上。通信器130可包括WIFI模块131、蓝牙模块132、NFC模块133等通信模块中至少一种。
用户输入/输出接口140,其中,输入接口包括麦克风141、触摸板142、传感器143、按键144等输入接口中至少一者。如:用户可以通过语音、触摸、手势、按压等动作实现用户指令输入功能,输入接口通过将接收的模拟信号转换为数字信号,以及数字信号转换为相应指令信号,发送至显示设备200。
输出接口包括将接收的用户指令发送至显示设备200的接口。在一些实施例中,可以是红外接口,也可以是射频接口。如:红外信号接口时,需要将用户输入指令按照红外控制协议转化为红外控制信号,经红外发送模块进行发送至显示设备200。再如:射频信号接口时,需将用户输入指令转化为数字信号,然后按照射频控制信号调制协议进行调制后,由射频发送端子发送至显示设备200。
在一些实施例中,控制装置100包括通信器130和输出接口中至少一者。控制装置100中配置通信器130,如:WIFI、蓝牙、NFC等模块,可将用户输入指令通过WIFI协议、或蓝牙协议、或NFC协议编码,发送至显示设备200。
存储器190,用于在控制器110的控制下存储驱动和控制控制装置100的各种运行程序、数据和应用。存储器190,可以存储用户输入的各类控制信号指令。
供电电源180,用于在控制器110的控制下为控制装置100各元件提供运行电力支持。可以是电池及相关控制电路。
在一些实施例中,如图3-图5中所示,给出了采用双芯片的显示设备200中硬件系统的硬件配置框图。
在采用双硬件系统架构时,硬件系统的架构关系可以如图3所示。为便于表述,以下将双硬件系统架构中的一个硬件系统称为第一硬件系统或A系统、A芯片,并将另一个硬件系统称为第二硬件系统或N系统、N芯片。A芯片包含A芯片的控制器及各类接口,N芯片则包含N芯片的控制器及各类接口。A芯片及N芯片中可以各自安装有独立的操作系统,从而使显示设备200中存在两个独立但又相互关联的子系统。
在一些实施例中,A芯片也可被称为第一芯片,其所执行的功能可等同于或者包含于第一控制器,N芯片也可以被称为第二芯片,其所执行的功能可等同于或者包含于第二控制器。
如图3所示,A芯片与N芯片之间可以通过多个不同类型的接口实现连接、通信及供电。A芯片与N芯片之间接口的接口类型可以包括通用输入输出接口(General-purpose input/output,GPIO)、USB接口、HDMI接口、UART接口等。A芯片与N芯片之间可以使用这些接口中的一个或多个进行通信或电力传输。例如图3所示,在双硬件系统架构下,可以由外接的电源(power)为N芯片供电,而A芯片则可以不由外接电源,而由N芯片供电。
除用于与N芯片进行连接的接口之外,A芯片还可以包含用于连接其他设备或组件的接口,例如图3中所示的用于连接摄像头(Camera)的MIPI接口,蓝牙接口等。
类似的,除用于与N芯片进行连接的接口之外,N芯片还可以包含用于连接显示屏TCON(Timing Controller,时序控制器)的VBY接口,用于连接功率放大器(Amplifier,AMP)及扬声器(Speaker)的I2S接口;以及IR/Key接口,USB接口,Wifi接口,蓝牙接口,HDMI接口,Tuner接口等。
下面结合图4对双芯片架构进行进一步的说明。需要说明的是,图4仅仅是对本申请双硬件系统架构的一些示例性说明,并不表示对本申请的限定。在实际应用中,两个硬件系统均可根据需要包含更多或更少的硬件或接口。
图4中示例性示出了根据图3显示设备200的硬件架构框图。如图4所示,显示设备200的硬件系统可以包括A芯片和N芯片,以及通过各类接口与A芯片或N芯片相连接的模块。
N芯片可以包括调谐解调器220、通信器230、外部装置接口250、控制器210、存储器290、用户输入接口、视频处理器260-1、音频处理器260-2、显示器280、音频输出接口270、供电电源。在其他实施例中N芯片也可以包括更多或更少的模块。
其中,调谐解调器220,用于对通过有线或无线方式接收的广播电视信号进行放大、混频和谐振等调制解调处理,从而从多个无线或有线广播电视信号中解调出用户所选择电视频道的频率中所携带的音视频信号,以及附加信息(例如EPG数据信号)。根据电视信号广播制式不同,调谐解调器220的信号途径可以有很多种,诸如:地面广播、有线广播、卫星广播或互联网广播等;以及根据调制类型不同,所述信号的调制方式可以是数字调制方式,也可以是模拟调制方式;以及根据接收电视信号种类不同,调谐解调器220可以解调模拟信号和/或数字信号。
调谐解调器220,还用于根据用户选择,以及由控制器210控制,响应用户选择的电视频道频率以及该频率所携带的电视信号。
在其他一些示例性实施例中,调谐解调器220也可在外置设备中,如外置机顶盒等。这样,机顶盒通过调制解调后输出电视音视频信号,经过外部装置接口250输入至显示设备200中。
通信器230是用于根据各种通信协议类型与外部设备或外部服务器进行通信的组件。例如:通信器230可以包括WIFI模块231,蓝牙通信协议模块232,有线以太网通信协议模块233,及红外通信协议模块等其他网络通信协议模块或近场通信协议模块。
显示设备200可以通过通信器230与外部控制设备或内容提供设备之间建立控制信号和数据信号的连接。例如,通信器可根据控制器的控制接收遥控器100A的控制信号。
外部装置接口250,是提供N芯片控制器210和A芯片及外部其他设备间数据传输的组件。外部装置接口可按照有线/无线方式与诸如机顶盒、游戏装置、笔记本电脑等的外部设备连接,可接收外部设备的诸如视频信号(例如运动图像)、音频信号(例如音乐)、附加信息(例如EPG)等数据。
其中,外部装置接口250可以包括:高清多媒体接口(HDMI)端子251、复合视频消隐同步(CVBS)端子252、模拟或数字分量端子253、通用串行总线(USB)端子254、红绿蓝(RGB)端子(图中未示出)等任一个或多个。本申请不对外部装置接口的数量和类型进行限制。
控制器210,通过运行存储在存储器290上的各种软件控制程序(如操作系统和/或各种应用程序),来控制显示设备200的工作和响应用户的操作。
如图4所示,控制器210包括随机存取存储器RAM214、只读存储器ROM213、图形处理器216、CPU处理器212、通信接口218、以及通信总线。其中,RAM214和ROM213以及图形处理器216、CPU处理器212、通信接口218通过总线相连接。
ROM213,用于存储各种系统启动的指令。如在收到开机信号时,显示设备200电源开始启动,CPU处理器212运行ROM中系统启动指令,将存储在存储器290的操作系统拷贝至RAM214中,以开始运行启动操作系统。当操作系统启动完成后,CPU处理器212再将存储器290中各种应用程序拷贝至RAM214中,然后,开始运行启动各种应用程序。
图形处理器216,用于产生各种图形对象,如:图标、操作菜单、以及用户输入指令显示图形等。图形处理器216包括运算器,通过接收用户输入的各种交互指令进行运算,根据显示属性显示各种对象;以及包括渲染器,对基于运算器得到的各种对象进行渲染,渲染结果显示在显示器280上。
CPU处理器212,用于执行存储在存储器290中操作系统和应用程序指令。以及根据接收外部输入的各种交互指令,来执行各种应用程序、数据和内容,以便最终显示和播放各种音视频内容。
在一些示例性实施例中,CPU处理器212,可以包括多个处理器。所述多个处理器中可包括一个主处理器以及多个或一个子处理器。主处理器,用于在预加电模式中执行显示设备200一些操作,和/或在正常模式下显示画面的操作。多个或一个子处理器,用于执行在待机模式等状态下的一种操作。
通信接口,可包括第一接口218-1到第n接口218-n。这些接口可以是经由网络被连接到外部设备的网络接口。
控制器210可以控制显示设备200的整体操作。例如:响应于接收到用于选择在显示器280上显示UI对象的用户命令,控制器210便可以执行与由用户命令选择的对象有关的操作。
其中,所述对象可以是可选对象中的任何一个,例如超链接或图标。与所选择的对象有关的操作,例如:显示连接到超链接的页面、文档、图像等操作,或者执行与图标相对应程序的操作。用于选择UI对象的用户命令,可以是通过连接到显示设备200的各种输入装置(例如,鼠标、键盘、触摸板等)输入的命令,或者与由用户说出语音相对应的语音命令。
存储器290,包括存储用于驱动和控制显示设备200的各种软件模块。如:存储器290中存储的各种软件模块,包括:基础模块、检测模块、通信模块、显示控制模块、浏览器模块、和各种服务模块等。
其中,基础模块是用于显示设备200中各个硬件之间信号通信、并向上层模块发送处理和控制信号的底层软件模块。检测模块是用于从各种传感器或用户输入接口中收集各种信息,并进行数模转换以及分析管理的管理模块。
例如:语音识别模块中包括语音解析模块和语音指令数据库模块。显示控制模块是用于控制显示器280进行显示图像内容的模块,可以用于播放多媒体图像内容和UI界面等信息。通信模块,是用于与外部设备之间进行控制和数据通信的模块。浏览器模块,是用于执行浏览服务器之间数据通信的模块。服务模块,是用于提供各种服务以及各类应用程序在内的模块。
同时,存储器290还用于存储接收外部数据和用户数据、各种用户界面中各个项目的图像以及焦点对象的视觉效果图等。
用户输入接口,用于将用户的输入信号发送给控制器210,或者,将从控制器输出的信号传送给用户。示例性的,控制装置(例如移动终端或遥控器)可将用户输入的诸如电源开关信号、频道选择信号、音量调节信号等输入信号发送至用户输入接口,再由用户输入接口转送至控制器;或者,控制装置可接收经控制器处理从用户输入接口输出的音频、视频或数据等输出信号,并且显示接收的输出信号或将接收的输出信号输出为音频或振动形式。
在一些实施例中,用户可在显示器280上显示的图形用户界面(GUI)输入用户命令,则用户输入接口通过图形用户界面(GUI)接收用户输入命令。或者,用户可通过输入特定的声音或手势进行输入用户命令,则用户输入接口通过传感器识别出声音或手势,来接收用户输入命令。
视频处理器260-1,用于接收视频信号,根据输入信号的标准编解码协议,进行解压缩、解码、缩放、降噪、帧率转换、分辨率转换、图像合成等视频数据处理,可得到直接在显示器280上显示或播放的视频信号。
示例的,视频处理器260-1,包括解复用模块、视频解码模块、图像合成模块、帧率转换模块、显示格式化模块等。
其中,解复用模块,用于对输入音视频数据流进行解复用处理,如输入MPEG-2,则解复用模块进行解复用成视频信号和音频信号等。
视频解码模块,用于对解复用后的视频信号进行处理,包括解码和缩放处理等。
图像合成模块,如图像合成器,其用于将图形生成器根据用户输入或自身生成的GUI信号,与缩放处理后视频图像进行叠加混合处理,以生成可供显示的图像信号。
帧率转换模块,用于对输入视频的帧率进行转换,如将输入的24Hz、25Hz、30Hz、60Hz视频的帧率转换为60Hz、120Hz或240Hz的帧率,其中,输入帧率可以与源视频流有关,输出帧率可以与显示器的刷新率有关。对于通常格式的输入,可采用如插帧方式实现帧率转换。
显示格式化模块,用于将帧率转换模块输出的信号,改变为符合诸如显示器显示格式的信号,如将帧率转换模块输出的信号进行格式转换以输出RGB数据信号。
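上文帧率转换模块所述的插帧式帧率转换,可用如下 Python 草图作原理性示意(函数名为假设,仅示意按时间位置重复源帧的最简单插帧;实际产品可采用运动补偿插帧等更复杂算法):

```python
def frame_rate_convert(frames, src_fps, dst_fps):
    """帧率转换草图:按时间位置就近重复源帧,
    将较低帧率(如24Hz、30Hz)的输入序列上转换为较高帧率(如60Hz)的输出序列。"""
    n_out = len(frames) * dst_fps // src_fps            # 输出帧数与帧率成正比
    # 第 i 个输出帧对应的源帧下标为 i * src_fps // dst_fps
    return [frames[i * src_fps // dst_fps] for i in range(n_out)]
```

例如,30Hz 输入转 60Hz 输出时,每个源帧被重复两次;24Hz 转 60Hz 时,源帧按 3:2 的规律交替重复,即类似电影到电视常见的 3:2 下拉方式。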
显示器280,用于接收源自视频处理器260-1输入的图像信号,进行显示视频内容和图像以及菜单操控界面。显示器280包括用于呈现画面的显示器组件以及驱动图像显示的驱动组件。显示视频内容,可以来自调谐解调器220接收的广播信号中的视频,也可以来自通信器或外部设备接口输入的视频内容。显示器280,同时显示显示设备200中产生且用于控制显示设备200的用户操控界面UI。
以及,根据显示器280类型不同,还包括用于驱动显示的驱动组件。或者,倘若显示器280为一种投影显示器,还可以包括一种投影装置和投影屏幕。
音频处理器260-2,用于接收音频信号,根据输入信号的标准编解码协议,进行解压缩和解码,以及降噪、数模转换、和放大处理等音频数据处理,得到可以在扬声器272中播放的音频信号。
音频输出接口270,用于在控制器210的控制下接收音频处理器260-2输出的音频信号,音频输出接口可包括扬声器272,或输出至外接设备的发生装置的外接音响输出端子274,如:外接音响端子或耳机输出端子等。
在其他一些示例性实施例中,视频处理器260-1可以由一个或多个芯片组成。音频处理器260-2,也可以由一个或多个芯片组成。
以及,在其他一些示例性实施例中,视频处理器260-1和音频处理器260-2,可以为单独的芯片,也可以与控制器210一起集成在一个或多个芯片中。
供电电源,用于在控制器210控制下,将外部电源输入的电力为显示设备200提供电源供电支持。供电电源可以包括安装显示设备200内部的内置电源电路,也可以是安装在显示设备200外部的电源,如在显示设备200中提供外接电源的电源接口。
与N芯片相类似,如图4所示,A芯片可以包括控制器310、通信器330、检测器340、存储器390。在某些实施例中还可以包括用户输入接口、视频处理器、音频处理器、显示器、音频输出接口。在某些实施例中,也可以存在独立为A芯片供电的供电电源。
通信器330是用于根据各种通信协议类型与外部设备或外部服务器进行通信的组件。例如:通信器330可以包括WIFI模块331,蓝牙通信协议模块332,有线以太网通信协议模块333,及红外通信协议模块等其他网络通信协议模块或近场通信协议模块。
A芯片的通信器330和N芯片的通信器230也有相互交互。例如,N芯片的WiFi模块231用于连接外部网络,与外部服务器等产生网络通信。A芯片的WiFi模块331用于连接至N芯片的WiFi模块231,而不与外界网络等产生直接连接。因此,对于用户而言,一个如上述实施例中的显示设备仅对外显示一个WiFi账号。
检测器340,是显示设备A芯片用于采集外部环境或与外部交互的信号的组件。检测器340可以包括光接收器342,用于采集环境光线强度的传感器,可以通过采集环境光来自适应显示参数变化等;还可以包括图像采集器341,如相机、摄像头等,可以用于采集外部环境场景,以及用于采集用户的属性或与用户交互手势,可以自适应变化显示参数,也可以识别用户手势,以实现与用户之间互动的功能。
外部装置接口350,提供控制器310与N芯片或外部其他设备间数据传输的组件。外部装置接口可按照有线/无线方式与诸如机顶盒、游戏装置、笔记本电脑等的外部设备连接。
控制器310,通过运行存储在存储器390上的各种软件控制程序(如安装的第三方应用等),以及与N芯片的交互,来控制显示设备200的工作和响应用户的操作。
如图4所示,控制器310包括只读存储器ROM313、随机存取存储器RAM314、图形处理器316、CPU处理器312、通信接口318、以及通信总线。其中,ROM313和RAM314以及图形处理器316、CPU处理器312、通信接口318通过总线相连接。
ROM313,用于存储各种系统启动的指令。CPU处理器312运行ROM中系统启动指令,将存储在存储器390的操作系统拷贝至RAM314中,以开始运行启动操作系统。当操作系统启动完成后,CPU处理器312再将存储器390中各种应用程序拷贝至RAM314中,然后,开始运行启动各种应用程序。
CPU处理器312,用于执行存储在存储器390中操作系统和应用程序指令,和与N芯片进行通信、信号、数据、指令等传输与交互,以及根据接收外部输入的各种交互指令,来执行各种应用程序、数据和内容,以便最终显示和播放各种音视频内容。
通信接口,可包括第一接口318-1到第n接口318-n。这些接口可以是经由网络被连接到外部设备的网络接口,也可以是经由网络被连接到N芯片的网络接口。
控制器310可以控制显示设备200的整体操作。例如:响应于接收到用于选择在显示器280上显示UI对象的用户命令,控制器310便可以执行与由用户命令选择的对象有关的操作。
图形处理器316,用于产生各种图形对象,如:图标、操作菜单、以及用户输入指令显示图形等。图形处理器316包括运算器,通过接收用户输入的各种交互指令进行运算,根据显示属性显示各种对象;以及包括渲染器,对基于运算器得到的各种对象进行渲染,渲染结果显示在显示器280上。
A芯片的图形处理器316与N芯片的图形处理器216均能产生各种图形对象。区别性的,若应用1安装于A芯片,应用2安装在N芯片,当用户在应用1的界面,且在应用1内进行用户输入的指令时,由A芯片图形处理器316产生图形对象。当用户在应用2的界面,且在应用2内进行用户输入的指令时,由N芯片的图形处理器216产生图形对象。
图5为本申请根据一些实施例示例性示出的显示设备的功能配置示意图。
如图5所示,A芯片的存储器390和N芯片的存储器290分别用于存储操作系统、应用程序、内容和用户数据等,在A芯片的控制器310和N芯片的控制器210的控制下,驱动显示设备200的系统运行以及响应用户的各种操作。A芯片的存储器390和N芯片的存储器290可以包括易失性和/或非易失性存储器。
对于N芯片,存储器290,具体用于存储驱动显示设备200中控制器210的运行程序,以及存储显示设备200内置各种应用程序,以及用户从外部设备下载的各种应用程序、以及与应用程序相关的各种图形用户界面,以及与图形用户界面相关的各种对象,用户数据信息,以及各种支持应用程序的内部数据。存储器290用于存储操作系统(OS)内核、中间件和应用等系统软件,以及存储输入的视频数据和音频数据、及其他用户数据。
存储器290,具体用于存储视频处理器260-1和音频处理器260-2、显示器280、通信接口230、调谐解调器220、输入/输出接口等驱动程序和相关数据。
在一些实施例中,存储器290可以存储软件和/或程序,用于表示操作系统(OS)的软件程序包括,例如:内核、中间件、应用编程接口(API)和/或应用程序。示例性的,内核可控制或管理系统资源,或其它程序所实施的功能(如所述中间件、API或应用程序),以及内核可以提供接口,以允许中间件和API,或应用访问控制器,以实现控制或管理系统资源。
示例的,存储器290,包括广播接收模块2901、频道控制模块2902、音量控制模块2903、图像控制模块2904、显示控制模块2905、音频控制模块2906、外部指令识别模块2907、通信控制模块2908、电力控制模块2910、操作系统2911、以及其他应用程序2912、界面布局管理模块2913、事件传输系统2914以及浏览器模块等等。控制器210通过运行存储器290中各种软件程序,来执行诸如:广播电视信号接收解调功能、电视频道选择控制功能、音量选择控制功能、图像控制功能、显示控制功能、音频控制功能、外部指令识别功能、通信控制功能、光信号接收功能、电力控制功能、支持各种功能的软件操控平台、以及浏览器功能等各类功能。

存储器390,包括存储用于驱动和控制显示设备200的各种软件模块。如:存储器390中存储的各种软件模块,包括:基础模块、检测模块、通信模块、显示控制模块、浏览器模块、和各种服务模块等。由于存储器390与存储器290的功能比较相似,相关之处参见存储器290即可,在此就不再赘述。
示例的,存储器390,包括图像控制模块3904、音频控制模块3906、外部指令识别模块3907、通信控制模块3908、光接收模块3909、操作系统3911、以及其他应用程序3912、浏览器模块等等。控制器310通过运行存储器390中各种软件程序,来执行诸如:图像控制功能、显示控制功能、音频控制功能、外部指令识别功能、通信控制功能、光信号接收功能、电力控制功能、支持各种功能的软件操控平台、以及浏览器功能等各类功能。区别性的,N芯片的外部指令识别模块2907和A芯片的外部指令识别模块3907可识别不同的指令。
示例性的,由于摄像头等图像接收设备与A芯片连接,因此,A芯片的外部指令识别模块3907可包括图形识别模块3907-1,图形识别模块3907-1内存储有图形数据库,摄像头接收到外界的图形指令时,与图形数据库中的指令建立对应关系,以对显示设备作出指令控制。而由于语音接收设备以及遥控器与N芯片连接,因此,N芯片的外部指令识别模块2907可包括语音识别模块2907-2,语音识别模块2907-2内存储有语音数据库,语音接收设备等接收到外界的语音指令时,与语音数据库中的指令建立对应关系,以对显示设备作出指令控制。同样的,遥控器等控制装置100与N芯片连接,由按键指令识别模块与控制装置100进行指令交互。
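上述外部指令识别模块将识别结果与数据库中的指令建立对应关系的过程,可用如下 Python 草图示意(其中图形/语音数据库的内容、指令名称均为举例假设,并非实际系统中的数据库):

```python
# 示意性的指令数据库:识别结果 → 对显示设备的控制指令(内容为假设)
GRAPHIC_DB = {"挥手": "截图", "OK手势": "确认"}
VOICE_DB = {"调大音量": "音量+", "返回上一页": "返回"}

def recognize_command(external_input, source):
    """外部指令识别草图:将摄像头采集的图形指令或语音设备采集的语音指令,
    与相应数据库中的指令建立对应关系,返回对显示设备的控制指令;
    无法匹配时返回 None。"""
    db = GRAPHIC_DB if source == "graphic" else VOICE_DB
    return db.get(external_input)
```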
在一些实施例中,可以没有第一芯片、第二芯片之分,显示设备具有的控制器在软件层面为一个操作系统,内置的应用可与上述双芯片架构显示设备中的应用相同,并同样设置上述所有接口。
图6a中示例性示出了一些实施例中显示设备200中软件系统的配置框图。
对N芯片,如图6a中所示,操作系统2911,包括用于处理各种基础系统服务和用于实施硬件相关任务的执行操作软件,充当应用程序和硬件组件之间完成数据处理的媒介。
一些实施例中,部分操作系统内核可以包含一系列软件,用以管理显示设备硬件资源,并为其他程序或软件代码提供服务。
其他一些实施例中,部分操作系统内核可包含一个或多个设备驱动器,设备驱动器可以是操作系统中的一组软件代码,帮助操作或控制显示设备关联的设备或硬件。驱动器可以包含操作视频、音频和/或其他多媒体组件的代码。示例的,包括显示器、摄像头、Flash、WiFi和音频驱动器。
如图6a所示,在一些实施例中,操作系统2911具体可以包括:可访问性模块2911-1、通信模块2911-2、用户界面模块2911-3和控制应用程序2911-4。
在一些实施例中,操作系统2911还可以包括:摄像头调度模块2911-5、摄像头驱动模块2911-6和摄像头开关模块2911-7。
其中,可访问性模块2911-1,用于修改或访问应用程序,以实现应用程序的可访问性和对其显示内容的可操作性。
通信模块2911-2,用于经由相关通信接口和通信网络与其他外设的连接。
用户界面模块2911-3,用于提供显示用户界面的对象,以供各应用程序访问,可实现用户可操作性。
控制应用程序2911-4,用于控制进程管理、切换前台应用,包括运行时间应用程序等。
摄像头调度模块2911-5,用于控制摄像头开启或者关闭,以及升起或者降落。
和摄像头驱动模块2911-6,用于在摄像头调度模块2911-5的控制下,驱动与摄像头机械连接的马达以使摄像头升起或者降落;
摄像头开关模块2911-7,用于在摄像头调度模块2911-5的控制下,开启摄像头,即使其进入开启状态,或者关闭摄像头,即使其进入关闭状态。
如图6a所示,在一些实施例中,事件传输系统2914,可在操作系统2911内或应用程序2912中实现。一些实施例中,事件传输系统2914一方面在操作系统2911内实现,同时在应用程序2912中实现,用于监听各种用户输入事件,并根据各类事件或子事件的识别结果,调用实施一组或多组预定义操作的处理程序。
具体的,事件传输系统2914可以包括事件监听模块2914-1和事件识别模块2914-2。其中,事件监听模块2914-1,用于监听用户输入接口输入事件或子事件。
事件识别模块2914-2,用于对各种用户输入接口输入的各类事件进行定义,识别出各种事件或子事件,且将其传输给用以执行其相应一组或多组操作的处理程序。
需要说明的是,事件或子事件,是指显示设备200中一个或多个传感器检测的输入,以及外界控制设备(如控制装置100等)的输入。如:语音输入各种子事件,手势识别的手势输入子事件,以及控制装置的遥控按键指令输入的子事件等。示例的,遥控器中一个或多个子事件包括多种形式,包括但不限于按键按上/下/左/右、确定键、按键按住等中的一个或组合,以及非实体按键的操作,如移动、按住、释放等操作。
界面布局管理模块2913,直接或间接接收来自事件传输系统2914监听到的各用户输入事件或子事件,用于更新用户界面的布局,包括但不限于界面中各控件或子控件的位置,以及容器的大小或位置、层级等与界面布局相关的各种执行操作。
由于A芯片的操作系统3911与N芯片的操作系统2911的功能比较相似,相关之处参见操作系统2911即可,在此就不再赘述。
如图6b中所示,显示设备的应用程序层包含可在显示设备200执行的各种应用程序。
N芯片的应用程序层2912可包含但不限于一个或多个应用程序,如:视频点播应用程序、应用程序中心、游戏应用等。A芯片的应用程序层3912可包含但不限于一个或多个应用程序,如:直播电视应用程序、媒体中心应用程序等。需要说明的是,A芯片和N芯片上分别包含什么应用程序是根据操作系统和其他设计确定的,本申请无需对A芯片和N芯片上所包含的应用程序做具体的限定和划分。
直播电视应用程序,可以通过不同的信号源提供直播电视。例如,直播电视应用程可以使用来自有线电视、无线广播、卫星服务或其他类型的直播电视服务的输入提供电视信号。以及,直播电视应用程序可在显示设备200上显示直播电视信号的视频。
视频点播应用程序,可以提供来自不同存储源的视频。不同于直播电视应用程序,视频点播提供来自某些存储源的视频显示。例如,视频点播可以来自云存储的服务器端、来自包含已存视频节目的本地硬盘储存器。
媒体中心应用程序,可以提供各种多媒体内容播放的应用程序。例如,媒体中心,可以为不同于直播电视或视频点播,用户可通过媒体中心应用程序访问各种图像或音频所提供服务。
应用程序中心,可以提供储存各种应用程序。应用程序可以是一种游戏、应用程序,或某些和计算机系统或其他设备相关但可以在显示设备中运行的其他应用程序。应用程序中心可从不同来源获得这些应用程序,将它们储存在本地储存器中,然后在显示设备200上可运行。
在一些实施例中,由于A芯片及N芯片中可能分别安装有独立的操作系统,从而使显示设备200中存在两个独立但又相互关联的子系统。例如,A芯片和N芯片均可以独立安装有安卓(Android)及各类APP,使得每个芯片均可以实现一定的功能,并且使A芯片和N芯片协同实现某项功能。
图7中示例性示出了根据示例性实施例中显示设备200中用户界面的示意图。如图7所示,用户界面包括多个视图显示区,示例的,第一视图显示区201和播放画面202,其中,播放画面包括布局一个或多个不同项目。以及用户界面中还包括指示项目被选择的选择器,可通过用户输入而移动选择器的位置,以改变选择不同的项目。
需要说明的是,多个视图显示区可以呈现不同层级的显示画面。如,第一视图显示区可呈现视频聊天项目内容,第二视图显示区可呈现应用层项目内容(如,网页视频、VOD展示、应用程序画面等)。
可选的,不同视图显示区的呈现存在优先级区别,即不同视图显示区的显示优先级不同。如,系统层的优先级高于应用层的优先级,当用户在应用层进行选择器获取和画面切换时,不遮挡系统层的视图显示区的画面展示;以及,根据用户的选择使应用层的视图显示区的大小和位置发生变化时,系统层的视图显示区的大小和位置不受影响。
也可以呈现相同层级的显示画面,此时,选择器可以在第一视图显示区和第二视图显示区之间做切换,以及当第一视图显示区的大小和位置发生变化时,第二视图显示区的大小和位置可随即发生改变。
在一些实施例中,图7中的任意一个区域可以显示摄像头获取的画面。
“项目”是指在显示设备200中用户界面的各视图显示区中显示以表示,诸如图标、缩略图、视频剪辑等对应内容的视觉对象。例如:项目可以表示电影、电视剧的图像内容或视频剪辑、音乐的音频内容、应用程序,或其他用户访问内容历史信息。
一些实施例中,“项目”可显示图像缩略图。如:当项目为电影或电视剧时,项目可显示为电影或电视剧的海报。如项目为音乐时,可显示音乐专辑的海报。如项目为应用程序时,可显示为应用程序的图标,或当应用程序最近被执行时捕捉到的应用程序内容截图。如项目为用户访问历史时,可显示为最近执行过程中的内容截图。“项目”可显示为视频剪辑。如:项目为电影或电视剧的预告片的视频剪辑动态画面。
此外,项目可以表示显示设备200与外接设备连接的接口或接口集合显示,或可表示连接至显示设备的外部设备名称等。如:信号源输入接口集合、或HDMI接口、USB接口、PC端子接口等。
“选择器”用于指示其中任意项目已被选择,如:光标或焦点对象。根据用户通过控制装置100的输入,控制显示设备200上光标的移动来选择或控制项目;或者,可使显示设备200中显示的焦点对象移动来选择或控制其中一个或多个项目。如:用户可通过控制装置100上的方向键控制焦点对象在项目之间的移动来选择和控制项目。
示例性地,图8为本申请根据一示例性实施例示出的遥控器按键示意图。如图8所示,遥控器上仅具有“音量+”、“音量-”、“向上”、“向下”、“确认OK”、“语音”、“返回”、“主页”以及“开/关”按键,其中,每个按键对应一个固有的控制功能,即其原生功能。例如,“音量+”的原生功能是提升音量,“向上”的原生功能是向上移动焦点。
随着显示设备业务场景、所安装软件应用、系统功能的增多,同时,随着以遥控器为主的控制装置趋向极简化、小巧化的设计趋势,遥控器上的按键数量及其原生功能,已难以满足各类场景中对显示设备的控制需求。
有鉴于此,在一些实施例中,显示设备的控制器响应于前台界面由第一界面成功切换到第二界面,获取第二界面的界面属性,界面属性用于表征界面是否复用遥控器按键;在第二界面的界面属性表征第二界面复用遥控器按键时,根据第二界面所复用的按键更新按键复用策略,按键复用策略中被复用的按键的按键状态为复用状态,未被复用的按键的按键状态为原生状态;响应于接收到遥控器的按键输入事件,根据更新后的按键复用策略确定按键输入事件对应的按键状态;响应于确定按键输入事件对应于复用状态,实现指示应用执行按键输入事件对应的复用功能。
在一些示例性的实施方式中,响应于确定按键输入事件对应复用状态,发送包含按键输入事件的定制化广播,其中,该定制化广播用于指示应用执行按键输入事件对应的复用功能。
具体实现中,显示设备可根据其所处的应用状态,或者所在界面,复用任意遥控器上的按键。
在一些示例性的实施方式中,对遥控器上按键复用时,显示设备接收到的遥控器发出的键值是不区分应用和界面的,只是显示设备在接收后,根据其所处的应用或者界面解析成不同的命令。
在一些示例性的实施方式中,显示设备对音量键复用。显示设备在非摄像头相关的应用下,呈现非摄像头相关的用户界面,如呈现视频播放的界面,在该界面下用户触发音量键,如“音量+”或者“音量-”,实现对整机音量的增大(或减小)的功能。而当用户触发进入摄像头相关的应用时,显示器呈现摄像头相关的用户界面,如呈现摄像头采集的图像数据,在该界面下,用户触发同样的上述音量键,实现对摄像头焦距的放大(或缩小),以在显示器上呈现与摄像头焦距相匹配的第二图像数据;或者实现摄像头焦距保持不变而是对呈现画面的放大(或缩小)。当用户退出该摄像头相关的应用进入视频播放类的应用时,解除对音量键的复用,触发音量键实现对整机音量的增大(或减小)的功能。
其中,非摄像头相关的应用包括视频播放相关应用和音乐播放相关应用,如点播应用,直播应用等。摄像头相关的应用包括上文提到的照镜子应用,录屏应用等。
一些实施例中,在摄像头相关界面中,图像全屏显示,如图11a。在该场景中,音量键对应摄像头焦距的调整。具体的,“音量-”对应摄像头焦距减小,“音量+”对应摄像头焦距放大。当用户在遥控器上触发“音量-”时,呈现图11b的画面。如图11b所示,摄像头焦距变小,获取的图像数据范围更大,在显示器上呈现比之前较大范围的图像数据,画面右侧出现画面大小调整条,提示用户当前处于0.5倍焦距的状态。当用户触发“音量+”时,呈现图11c所示的画面。如图11c所示,摄像头焦距变大,获取的图像范围变小,在显示器上呈现比之前较小范围的图像数据,画面右侧出现画面大小调整条,提示用户当前处于1.5倍或者更大焦距的状态。
另一些实施例中,在摄像头相关界面中,图片全屏显示,呈现图11a的画面。在该场景中,音量键对应预览画面的缩放调整。具体的,“音量-”对应预览图片缩小,“音量+”对应预览图片放大。当用户在遥控器上触发“音量-”时,如图11d所示,图像缩小为原全屏画面的0.5倍,画面右侧出现画面大小调整条,提示用户当前处于画面缩小0.5倍的状态。当用户按“音量+”至“1.0×”时,图片回到全屏显示状态。当用户按“音量+”至“2.0×”时,图像放大至原图片的2倍,显示器上呈现图片的局部部分,画面右侧出现画面大小调整条,提示用户当前处于画面放大2倍的状态。
在上一实施例的另一种实施方式中,当用户按“音量+”至“2.0×”时,图像放大至原图片的2倍,显示器上呈现图片的局部部分,画面右侧出现画面大小调整条,提示用户当前处于画面放大2倍的状态。另外,在显示器上呈现放大区域选择栏,可以通过遥控器上下左右的方向键实现对局部图像的锁定浏览,如图11f所示。
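上文音量键在摄像头相关界面下对预览画面(或焦距)的档位调节,可用如下 Python 草图示意(0.5×、1.0×、1.5×、2.0× 为上文示例中出现的倍数,档位表本身为举例假设):

```python
# 示例档位表:与上文示例的 0.5×、1.0×、1.5×、2.0× 对应(具体档位为假设)
ZOOM_LEVELS = [0.5, 1.0, 1.5, 2.0]

def on_volume_key(zoom, key):
    """摄像头相关界面下音量键复用草图:
    “音量+”使预览画面(或焦距)放大一档,“音量-”缩小一档,
    并限制在档位表范围内,即最小 0.5 倍、最大 2.0 倍。"""
    i = ZOOM_LEVELS.index(zoom)
    if key == "音量+":
        i = min(i + 1, len(ZOOM_LEVELS) - 1)
    elif key == "音量-":
        i = max(i - 1, 0)
    return ZOOM_LEVELS[i]
```

每次按键后,可将返回的倍数用于刷新画面右侧的画面大小调整条。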
由于上述实施例将每个应用界面是否需要复用按键以及复用的按键集中保存在系统中,并根据前台界面是否需要复用按键以及复用的按键对按键复用策略进行即时更新,当接收到按键输入事件时,根据最新的按键复用策略,对转发按键输入事件进行集中控制,不仅能够实现遥控器上任意按键在任意应用界面下的复用,还可以保证复用按键的可靠性。
本申请实施例还提供一种遥控器按键复用方法,显示设备的控制器被配置为执行该方法的部分或者全部步骤,通过该方法,可以在预定的场景中,实现特定遥控器按键的复用,可靠性高,使遥控器可以对显示设备进行灵活多样的控制。
图9为本申请根据一示例性实施例示出的遥控器按键复用方法流程图。如图9所示,该方法可以包括:
步骤901,响应于前台界面由第一界面成功切换到第二界面,获取所述第二界面的界面属性,所述界面属性用于表征界面是否复用遥控器按键。
显示设备开机后,呈现用户界面,所呈现的用户界面中,可以仅包括一个应用的应用界面,此时,该应用界面在显示器屏幕上全屏显示,也可以包括多个应用的应用界面,此时,其中至少一个应用的应用界面以小于显示屏幕尺寸的窗口界面进行显示。
显示设备所呈现的一个或多个应用界面中,可以获取到遥控器信号焦点的界面为前台界面,前台界面所属的应用为前台应用。
示例性地,图10a为用户在应用中心打开“拍照应用”后进入的应用界面。如图10a所示,该显示设备用户界面中,仅包括一个应用界面,该应用界面可以获取到遥控器信号焦点,为前台界面。
另一示例性地,图10b为本申请根据一示例性实施例示出的另一用户界面,包括两个应用界面,其中,A界面以窗口形式浮于B界面显示,B界面作为背景界面在显示器上全屏显示,A界面可以获取到遥控器信号焦点,属于前台界面。
基于用户操作,在前台显示的应用界面可以发生切换。例如,参阅图10b,当用户通过遥控器将焦点位置由A界面移动到B界面时,前台界面由A界面切换至B界面。再如,继续参阅图10b,当用户通过遥控器选择A界面中的某一功能图标而进入A界面所属应用的另一界面C(图中未示出)时,前台界面由A界面切换至C界面。
在步骤901中,系统监测前台界面是否切换成功。通常,前台界面的切换过程包括:应用向系统发送将第一界面退出前台界面的请求;系统接收该请求后,向应用发送响应消息;应用接收到响应消息时,将第一界面从前台界面退出;应用向系统发送将第二界面在前台界面显示的请求;系统接收该请求后,向应用发送响应消息;应用接收到响应消息时,将第二界面在前台界面显示。鉴于此,当系统监测到第二界面成功在前台界面显示时,确定前台界面由第一界面成功切换到第二界面。
需要说明的是,第一界面和第二界面可以为同一应用的两个不同界面,也可以是分别属于不同应用的界面。
在步骤901中,当系统监测到前台界面由第一界面成功切换到第二界面时,获取第二界面的界面属性。
本实施例中,界面可以有两种属性,例如第一属性和第二属性,分别用于表征需要复用按键和不需要复用按键。界面也可以在同一属性下具有两种取值,分别用于表征需要复用按键和不需要复用按键。为了便于说明,本申请将需要复用按键的界面称为客制化界面,将不需要复用按键的界面称为正常界面。
具体实现时,可以预先建立一个界面属性表,如.xml表格,用于对应保存应用和/或应用界面的包名名称和应用和/或应用界面的应用属性和/或界面属性,此外,对于客制化应用/界面,该.xml表格中还可以保存客制化应用/界面需要复用的按键,其中按键可以以键值来表示。
示例性地,.xml表格中数据保存格式可以为,<应用/应用界面包名-应用/界面属性-复用按键键值>。其中,根据应用/界面属性可以判定某一应用/应用界面是为客制化界面,或为正常界面,若为客制化界面,根据复用按键键值可以确定该应用/应用界面所复用的按键。
当显示设备启动后,将上述.xml表格数据读取到系统框架存储的H_MAP表格中。进而,在步骤901中,当监测到前台界面由第一界面成功切换至第二界面时,获取第二界面的界面名称,即其包名,然后在H_MAP表格中查找界面名称对应的界面属性。
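上述“.xml 表格开机读入 H_MAP、切换时按包名查找界面属性”的过程,可用如下 Python 草图示意(其中的 XML 结构、包名、属性名与键值均为举例假设,H_MAP 以字典示意):

```python
import xml.etree.ElementTree as ET

# 示意性的界面属性表,对应 <应用/应用界面包名-应用/界面属性-复用按键键值> 格式
SAMPLE_XML = """
<interfaces>
  <interface package="com.example.camera.photo" custom="true" keys="音量+,音量-"/>
  <interface package="com.example.vod.player" custom="false" keys=""/>
</interfaces>
"""

def load_h_map(xml_text):
    """开机时将 .xml 表格数据读入系统框架存储的 H_MAP 表格。"""
    h_map = {}
    for node in ET.fromstring(xml_text):
        keys = [k for k in node.get("keys", "").split(",") if k]
        h_map[node.get("package")] = {"custom": node.get("custom") == "true",
                                      "keys": keys}
    return h_map

def lookup_interface(h_map, package):
    """前台界面切换成功后,按界面包名在 H_MAP 中查找界面属性;
    未登记的界面按正常界面(不复用按键)处理。"""
    return h_map.get(package, {"custom": False, "keys": []})
```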
步骤902,在所述第二界面的界面属性表征第二界面复用遥控器按键时,根据所述第二界面所复用的按键更新按键复用策略,所述按键复用策略中被复用的按键的按键状态为复用状态,未被复用的按键的按键状态为原生状态。
按键复用策略预置在操作系统中,按键复用策略中包括遥控器上每个按键的按键状态,按键状态包括原生状态和复用状态。对于某一按键,如果根据按键复用策略确定其处于原生状态,则表示该按键未被复用,如果根据按键复用策略确定其处于复用状态,则表示该按键被复用。
在本实施例中,第二界面可能为客制化界面,也可能为正常界面,在所述第二界面的界面属性表征第二界面复用遥控器按键时,根据所述第二界面所复用的按键更新按键复用策略。
具体实现时,首先根据第二界面的界面属性判断第二界面是否为需要复用按键的预设界面。例如,如果第二界面的界面属性值为预定值,则第二界面为预设界面,如果第二界面的界面属性值非预定值,则第二界面非预设界面,即不需要复用按键。
如果第二界面是预设界面,则获取第二界面所复用的按键。需要说明的是,诸如第二界面的任意预设界面所复用的按键,均为预先设定,并且预先保存在上述界面属性列表中。基于此,如果第二界面是预设界面,则在上述H_MAP表格中,可以查找到其所复用按键的键值。然后,将所述按键复用策略中所述第二界面所复用按键的按键状态更新为复用状态,其余按键的按键状态更新为原生状态。
如果第二界面非预设界面,则将所述按键复用策略中每一按键的按键状态更新为原生状态。
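步骤902更新按键复用策略的逻辑可用如下 Python 草图示意(按键键值集合与图8示例的按键对应,状态取值沿用文中“原生状态/复用状态”的表述,均为示意性假设):

```python
NATIVE, MULTIPLEXED = "原生状态", "复用状态"
# 假设的遥控器按键键值集合(与图8示例的按键对应)
ALL_KEYS = ["音量+", "音量-", "向上", "向下", "确认OK", "语音", "返回", "主页", "开/关"]

def update_policy(is_preset, multiplexed_keys):
    """步骤902草图:第二界面为预设界面(客制化界面)时,
    其所复用按键的按键状态更新为复用状态,其余按键更新为原生状态;
    第二界面非预设界面时,每一按键的按键状态均更新为原生状态。"""
    if not is_preset:
        return {k: NATIVE for k in ALL_KEYS}
    return {k: (MULTIPLEXED if k in multiplexed_keys else NATIVE) for k in ALL_KEYS}
```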
步骤903,响应于接收到遥控器的按键输入事件,根据更新后的按键复用策略确定所述按键输入事件对应的按键状态。
在本申请场景中,如果用户使用遥控器操作显示设备,显示设备会接收到按键输入事件,并获取到按键输入事件对应的按键键值。当接收到遥控器的按键输入事件时,在步骤903中,根据按键输入事件对应的按键键值,在更新后的按键复用策略中,可以查找到该按键键值对应的按键状态,是为原生状态,或为复用状态。
在步骤904中,响应于确定所述按键输入事件对应于所述复用状态,发送包含所述按键输入事件的定制化广播,所述定制化广播用于指示应用执行所述按键输入事件对应的复用功能。
或者,相应地,响应于确定所述按键输入事件对应于所述原生状态,则通过系统线程与UI线程之间的线程间通信,将该按键输入事件发送给应用层。
例如,假设第二界面为图10a所示的应用界面,该应用界面预定的复用按键为“音量+”,那么,在界面属性列表中,该应用界面的界面名称对应的界面属性的属性值为预定值,对应的复用按键键值为“音量+”的键值。当该应用界面成功在前台界面显示时,系统将按键复用策略中“音量+”的按键状态修改为复用状态,其余按键的按键状态均为原生状态。当系统接收到按键输入事件时,如果为“音量+”的按键输入,则发送包含该按键输入事件的广播。图10a所示应用界面所属的应用接收到该广播后,执行广播中该按键输入事件对应的处理逻辑。
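步骤903-904的集中转发逻辑可用如下 Python 草图示意(以回调函数代替实际的定制化广播与线程间通信,仅示意两条转发路径,并非实际实现):

```python
def dispatch_key_event(policy, key, send_broadcast, send_native):
    """步骤903-904草图:根据更新后的按键复用策略确定按键状态;
    复用状态时发送包含按键输入事件的定制化广播(由前台客制化界面
    所属应用接收并执行复用功能),原生状态时经线程间通信转发给
    应用层执行原生功能。"""
    if policy.get(key) == "复用状态":
        send_broadcast(key)
    else:
        send_native(key)
```

例如,在复用了“音量+”的界面下,“音量+”走广播路径执行复用功能,而“返回”仍走原生路径。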
由上述实施例可知,本申请实施例提供一种遥控器按键复用方法,应用于显示设备控制器,响应于前台界面由第一界面成功切换到第二界面,获取第二界面的界面属性;在第二界面的界面属性表征第二界面复用遥控器按键时,根据第二界面所复用的按键更新按键复用策略;响应于接收到遥控器的按键输入事件,根据更新后的按键复用策略确定按键输入事件对应的按键状态;响应于确定按键输入事件对应于复用状态,发送包含按键输入事件的定制化广播,该定制化广播用于指示应用执行所述按键输入事件对应的复用功能。
由于本申请方法将每个应用界面是否需要复用按键以及复用的具体按键集中保存在系统中,并通过按键复用策略,对转发按键输入事件进行集中控制,不仅能够实现遥控器上任意按键在任意应用界面下的复用,还可以保证复用按键的可靠性。
图11为本申请根据一示例性实施例示出的在“相册”应用中,复用遥控器上“音量+”和“音量-”按键的场景。在该场景中,“相册”应用为前台应用,图11所呈现的应用界面为预设界面,即客制化界面,在该界面下,用户可以通过按压遥控器上的“音量+”和“音量-”,放大或缩小正在查看的图片,“音量+”和“音量-”的放大或缩小正在查看的图片的功能,即为“音量+”和“音量-”的复用功能。
另外,为了提高用户体验的友好度,在本申请另外的实施例中,上述步骤902之后,还包括:响应于按键复用策略中至少一个按键的按键状态被更新为复用状态,生成提示信息,所述提示信息用于提示用户当前已复用的遥控器按键;将所述提示信息在前台界面上显示。
为了进一步提高遥控器按键的可靠性,在上述实施例基础上,本申请还提供以下实施例,如图12所示,在该实施例中,遥控器按键复用方法包括:
步骤121,接收用于将前台界面由第一界面切换到第二界面的请求。
需要说明的是,步骤121所述的请求,可以为两个请求,其一为用于将第一界面从前台界面退出的请求,其二为用于将第二界面从前台界面显示的请求。
步骤122,更新所述按键复用策略为默认策略,所述默认策略中任意按键输入事件对应的按键状态均为原生状态;
步骤123,将所述第一界面从前台界面退出;
步骤124,将所述第二界面在前台界面显示;
步骤125,获取所述第二界面的界面属性;
步骤126,根据所述第二界面的界面属性,更新按键复用策略;
步骤127,当接收到遥控器的按键输入事件时,根据更新后的按键复用策略,确定所述按键输入事件对应的按键状态;
步骤128,如果所述按键输入事件对应的按键状态为复用状态,则发送包含所述按键输入事件的定制化广播,所述定制化广播用于指示应用执行所述按键输入事件对应的复用功能。
在图12所示实施例中,由于切换前台界面的过程中(步骤121之后,步骤124之前),将按键复用策略更新为默认策略,从而保证一旦退出客制化界面,就使遥控器按键恢复到原生状态,避免在切换前台界面的时间里,由于某一些按键仍然处于复用状态而导致这些按键失效。
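图12所示的切换流程(步骤121-126)可用如下 Python 草图示意(按键集合与状态取值为示意性假设;返回切换期间与切换完成后的两份策略,以体现切换中间态下所有按键均恢复原生可用):

```python
NATIVE, MULTIPLEXED = "原生状态", "复用状态"
ALL_KEYS = ["音量+", "音量-", "向上", "向下", "确认OK", "返回"]  # 假设的按键集合

def switch_foreground(second_iface_keys):
    """图12流程草图:接收到切换请求后,先将按键复用策略更新为默认策略
    (任意按键均为原生状态),再退出第一界面、显示第二界面,
    最后按第二界面所复用的按键重新更新策略。"""
    during = {k: NATIVE for k in ALL_KEYS}                       # 步骤122:默认策略
    # 步骤123-124:退出第一界面、显示第二界面(界面操作从略)
    after = {k: (MULTIPLEXED if k in second_iface_keys else NATIVE)
             for k in ALL_KEYS}                                  # 步骤125-126
    return during, after
```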
在图12所示实施例中,由于第一界面和第二界面属于同一应用的不同界面,因此在切换前台界面的过程中不涉及前台应用切换。基于此,在本申请的另外的实施例中,如果第一界面和第二界面分别属于不同的应用,那么,当系统接收到用于将前台应用由第一应用切换到第二应用的请求时,先行将按键复用策略更新为默认策略,然后再将第一应用从前台退出,将所述第二应用在前台运行。从而,能够保证一旦前台应用退出,就使遥控器按键恢复到原生状态,避免在切换前台应用的时间里,由于某一些按键仍然处于复用状态而导致这些按键失效。
在一些场景中,前台应用可能出现诸如内存泄露、无响应、无法获取焦点等异常,此时,如果存在按键状态为复用状态的按键,则在应用被系统停止之前,处于复用状态的按键将无法实现其原生功能。例如,处于前台的“拍照应用”出现无响应,若“音量+”为该“拍照应用”复用的按键,则在“拍照应用”无响应的时间里,用户将无法使用“音量+”按键来调节显示设备的输出音量。
为了解决该问题,在本申请另外的实施例中,显示设备在执行以上实施例所提供步骤的同时,还执行:
检测前台应用是否异常;
如果检测到前台应用异常,更新所述按键复用策略为默认策略。
进而,在前台应用出现异常到其被系统关闭的时间里,可以保证任意按键均能实现其原生功能,从而提高按键可靠性。
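上述异常处理可用如下 Python 草图示意(策略与状态取值均为示意性假设,异常检测本身从略,仅以布尔参数代替):

```python
def on_app_state_check(policy, app_responsive):
    """前台应用异常处理草图:检测到前台应用异常(如无响应、内存泄露)时,
    将按键复用策略更新为默认策略,使处于复用状态的按键恢复原生功能;
    前台应用正常时策略保持不变。"""
    if not app_responsive:
        return {k: "原生状态" for k in policy}   # 默认策略:全部按键原生状态
    return policy
```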
由上述实施例可知,本申请提供一种遥控器按键复用方法:
当接收到将前台界面由第一界面切换到第二界面的请求时,更新按键复用策略为默认策略,从而保证在界面切换过程中,遥控器按键可以实现其原生功能,提高按键可靠性。
当前台界面由第一界面成功切换到第二界面时,根据所述第二界面的界面属性,更新预置的按键复用策略;当接收到遥控器的按键输入事件时,根据更新后的按键复用策略,确定所述按键输入事件对应的按键状态,所述按键状态包括原生状态和复用状态;如果所述按键输入事件对应的按键状态为复用状态,则发送包含所述按键输入事件的广播,从而实现遥控器上任意按键在任意应用界面下的复用。
当检测到前台应用异常时,更新所述按键复用策略为默认策略,进而,在前台应用出现异常到其被系统关闭的时间里,可以保证任意按键均能实现其原生功能,从而提高按键可靠性。
具体实现中,本申请还提供一种计算机存储介质,其中,该计算机存储介质可存储有程序,该程序执行时可包括本申请提供的方法的各实施例中的部分或全部步骤。所述的存储介质可为磁碟、光盘、只读存储记忆体(英文:read-only memory,简称:ROM)或随机存储记忆体(英文:random access memory,简称:RAM)等。
本领域的技术人员可以清楚地了解到本申请实施例中的技术可借助软件加必需的通用硬件平台的方式来实现。基于这样的理解,本申请实施例中的实现方式本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品可以存储在存储介质中,如ROM/RAM、磁碟、光盘等,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例或者实施例的某些部分所述的方法。
本说明书中各个实施例之间相同相似的部分互相参见即可。尤其,对于显示设备实施例而言,由于其基本相似于方法实施例,所以描述的比较简单,相关之处参见方法实施例中的说明即可。
以上所述的本申请实施方式并不构成对本申请保护范围的限定。

Claims (18)

  1. 一种显示设备,包括:
    摄像头,被配置为:采集图像数据;
    显示器,被配置为:呈现视频播放相关界面,和/或,摄像头相关界面;
    控制器,被配置为:
    响应于用户由视频播放相关界面切换至摄像头相关界面的指令,接收与所述显示设备匹配的控制装置的按键指令,控制所述显示设备由视频播放相关功能切换至焦距调节相关的功能。
  2. 一种显示设备,包括:
    摄像头,被配置为:采集图像数据;
    显示器,被配置为:呈现用户界面;
    控制器,被配置为:
    接收与所述显示设备匹配的控制装置发送的按键指令,
    当所述显示器呈现视频播放相关界面时,执行与视频播放相关的功能;
    当所述显示器呈现图像数据相关界面时,执行对所述图像数据的缩放的功能。
  3. 根据权利要求2所述的显示设备,所述控制器还被配置为:
    接收所述按键指令,当所述显示器呈现图像数据相关界面时,控制所述显示器呈现经过缩放的图像数据。
  4. 根据权利要求2所述的显示设备,所述控制器还被配置为:
    接收与所述显示设备匹配的控制装置发送的按键指令,判断当前是否处于图像数据相关应用,若是,则对显示器上呈现图像数据进行缩放。
  5. 根据权利要求2所述的显示设备,所述控制器还被配置为:
    当所述显示器从呈现视频播放相关界面切换至图像数据相关界面时,
    根据所述图像数据相关界面所复用的按键更新按键复用策略,所述按键复用策略中被复用的按键的按键状态为复用状态,未被复用的按键的按键状态为原生状态;
    响应于接收到控制装置的按键输入事件,根据更新后的按键复用策略确定所述按键输入事件对应的按键状态;
    响应于确定所述按键输入事件对应于所述复用状态,实现用于指示应用执行所述按键输入事件对应的复用功能。
  6. 一种显示设备,包括:
    摄像头,被配置为:采集图像数据;
    显示器,被配置为:呈现用户界面;
    控制器,被配置为:
    接收与所述显示设备匹配的控制装置发送的按键指令,
    当所述显示器呈现视频播放相关界面时,执行与视频播放相关的功能;
    当所述显示器呈现图像数据相关界面时,执行对所述摄像头焦距的调节功能。
  7. 根据权利要求6所述的显示设备,所述控制器还被配置为:
    接收所述按键指令,当所述显示器呈现图像数据相关界面时,控制所述显示器呈现与所述摄像头焦距相对应的第二图像数据。
  8. 根据权利要求6所述的显示设备,所述控制器还被配置为:
    接收与所述显示设备匹配的控制装置发送的按键指令,判断当前是否处于图像数据相关应用,若是,则控制所述显示器呈现与所述摄像头焦距相对应的第二图像数据。
  9. 根据权利要求6所述的显示设备,所述控制器还被配置为:
    当所述显示器从呈现视频播放相关界面切换至图像数据相关界面时,
    根据所述图像数据相关界面所复用的按键更新按键复用策略,所述按键复用策略中被复用的按键的按键状态为复用状态,未被复用的按键的按键状态为原生状态;
    响应于接收到控制装置的按键输入事件,根据更新后的按键复用策略确定所述按键输入事件对应的按键状态;
    响应于确定所述按键输入事件对应于所述复用状态,实现用于指示应用执行所述按键输入事件对应的复用功能。
  10. 一种显示设备,包括:
    显示器,用于呈现用户界面;
    控制器,被配置为:
    响应于前台界面由第一界面成功切换到第二界面,获取所述第二界面的界面属性,所述界面属性用于表征界面是否复用遥控器按键;
    在所述第二界面的界面属性表征第二界面复用遥控器按键时,根据所述第二界面所复用的按键更新按键复用策略,所述按键复用策略中被复用的按键的按键状态为复用状态,未被复用的按键的按键状态为原生状态;
    响应于接收到遥控器的按键输入事件,根据更新后的按键复用策略确定所述按键输入事件对应的按键状态;
    响应于确定所述按键输入事件对应于所述复用状态,实现用于指示应用执行所述按键输入事件对应的复用功能。
  11. 根据权利要求10所述的显示设备,其特征在于,所述控制器还被配置为,在所述获取第二界面的界面属性之前:
    响应于接收到用于将前台界面由第一界面切换到第二界面的请求,将按键复用策略更新为默认策略,所述默认策略中任意按键对应的按键状态均为原生状态;
    以及,将所述第一界面从前台退出,将所述第二界面在前台显示。
  12. 根据权利要求10所述的显示设备,其特征在于,所述控制器被配置为,根据所述第二界面所复用的按键按照下述步骤更新按键复用策略:
    将所述按键复用策略中所述第二界面所复用按键的按键状态更新为复用状态,其余按键的按键状态更新为原生状态。
  13. 根据权利要求10所述的显示设备,其特征在于,所述控制器还被配置为:
    响应于按键复用策略中至少一个按键的按键状态被更新为复用状态,生成提示信息,所述提示信息用于提示用户当前已复用的遥控器按键;
    将所述提示信息在前台界面上显示。
  14. 根据权利要求10所述的显示设备,其特征在于,所述控制器还被配置为:
    检测前台应用是否异常;
    响应于检测到前台应用异常,将所述按键复用策略更新为默认策略。
  15. 根据权利要求10所述的显示设备,其特征在于,所述控制器被配置为按照下述步骤获取所述第二界面的界面属性:
    获取所述第二界面的界面名称;
    在预先建立的界面属性表中,查找所述界面名称对应的界面属性;
    以及,所述控制器还被配置为,从所述界面属性表获取所述第二界面所复用的按键。
  16. 根据权利要求10所述的显示设备,其特征在于,所述控制器还被配置为,在根据所述第二界面所复用的按键更新按键复用策略之前,根据所述第二界面的界面属性判断所述第二界面是否复用遥控器按键,其中,如果所述第二界面的界面属性值为预定值,则所述第二界面为复用遥控器按键的预设界面;如果所述第二界面的界面属性值非预定值,则所述第二界面为不复用遥控器按键的非预设界面。
  17. 根据权利要求10所述的显示设备,其特征在于,所述第一界面和第二界面为同一应用的不同界面,或者,所述第一界面和第二界面分别为不同应用的界面。
  18. 一种遥控器按键复用方法,包括:
    响应于前台界面由第一界面成功切换到第二界面,获取所述第二界面的界面属性,所述界面属性用于表征界面是否复用遥控器按键;
    在所述第二界面的界面属性表征第二界面复用遥控器按键时,根据所述第二界面所复用的按键更新按键复用策略,所述按键复用策略中被复用的按键的按键状态为复用状态,未被复用的按键的按键状态为原生状态;
    响应于接收到遥控器的按键输入事件,根据更新后的按键复用策略确定所述按键输入事件对应的按键状态;
    响应于确定所述按键输入事件对应于所述复用状态,发送包含所述按键输入事件的定制化广播,所述定制化广播用于指示应用执行所述按键输入事件对应的复用功能。
PCT/CN2020/090468 2019-08-18 2020-05-15 显示设备和控制装置按键复用方法 WO2021031629A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910761467 2019-08-18
CN201910761467.0 2019-08-18


