Detailed Description
In order to make those skilled in the art better understand the technical solutions in the present application, the technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. It is obvious that the described embodiments are only some, and not all, of the embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Moreover, while the disclosure herein is presented in terms of one or more exemplary examples, it is to be understood that each aspect of the disclosure can be utilized independently and separately from the other aspects to provide a complete disclosure.
It should be understood that the terms "first," "second," "third," and the like in the description and claims of this application and in the foregoing drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used are interchangeable under appropriate circumstances and can, for example, be implemented in sequences other than those illustrated or otherwise described herein with reference to the embodiments of the application.
Furthermore, the terms "comprises" and "comprising," as well as any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
The term module, as used herein, refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "remote control" as used in this application refers to a component of an electronic device (such as the display device disclosed in this application) that is typically wirelessly controllable over a relatively short range of distances. Typically using infrared and/or Radio Frequency (RF) signals and/or bluetooth to connect with the electronic device, and may also include WiFi, wireless USB, bluetooth, motion sensor, etc. For example: the hand-held touch remote controller replaces most of the physical built-in hard keys in a common remote control device with a user interface in a touch screen.
The term "gesture" as used in this application refers to a user's behavior through a change in hand shape or an action such as hand motion to convey a desired idea, action, purpose or result.
Before detailed description is given to specific implementation manners of the technical scheme of the present application, a basic application scenario of the technical scheme of the present application is described.
Fig. 1 is a diagram illustrating an operation scenario between a display device and a control apparatus according to an exemplary embodiment. As shown in fig. 1, a user may operate the display apparatus 200 through the control device 100.
The control device 100 may be a remote controller 100A, which includes infrared protocol communication, Bluetooth protocol communication, other short-distance communication methods, and the like, and controls the display apparatus 200 in a wireless manner or other wired manners. The user may input a user instruction through a key on the remote controller, voice input, control panel input, etc., to control the display apparatus 200. For example, the user can input a corresponding control command through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power on/off key, etc. on the remote controller, to implement the function of controlling the display device 200.
The control device 100 may also be a smart device, such as a mobile terminal 100B, a tablet computer, a notebook computer, and so on. For example, the display device 200 is controlled using an application program running on the smart device. The application may provide the user with various controls through an intuitive User Interface (UI) on a screen associated with the smart device.
For example, the mobile terminal 100B may install a software application with the display device 200, implement connection communication through a network communication protocol, and implement the purpose of one-to-one control operation and data communication. Such as: the mobile terminal 100B and the display device 200 may establish a control instruction protocol, synchronize the remote control keyboard to the mobile terminal 100B, and control the function of the display device 200 by controlling the user interface on the mobile terminal 100B. The audio and video content displayed on the mobile terminal 100B may also be transmitted to the display device 200, so as to implement a synchronous display function.
As shown in fig. 1, the display apparatus 200 also performs data communication with the server 300 through various communication means. The display device 200 may be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), and other networks. The server 300 may provide various contents and interactions to the display apparatus 200. Illustratively, the display device 200 receives software program updates, or accesses a remotely stored digital media library, by sending and receiving information, as well as Electronic Program Guide (EPG) interactions. The servers 300 may be one or more groups, and may be of one or more types. Other web service contents such as video on demand and advertisement services are provided through the server 300.
The display device 200 may be a liquid crystal display, an OLED display, a projection display device. The specific display device type, size, resolution, etc. are not limiting, and those skilled in the art will appreciate that the display device 200 may be modified in performance and configuration as desired.
The display apparatus 200 may additionally provide an intelligent network TV function that provides a computer support function in addition to the broadcast receiving TV function. Examples include a network television, a smart television, an Internet Protocol Television (IPTV), and the like.
As shown in fig. 1, a camera may be connected or disposed on the display device, and is used to present a picture taken by the camera on a display interface of the display device or other display devices, so as to implement interactive chat between users. Specifically, the picture shot by the camera can be displayed on the display device in a full screen mode, a half screen mode or any optional area.
In other examples, more or less functionality may be added. The function of the display device is not particularly limited in the present application.
Fig. 2 is a block diagram illustrating a hardware configuration of a display device 200 according to an exemplary embodiment of the present application. As shown in fig. 2, the display apparatus 200 may include a tuner demodulator 220, a communicator 230, a detector 240, an external device interface 250, a controller 210, a memory 290, a user input interface, a video processor 260-1, an audio processor 260-2, a display 280, an audio output interface 270, and a power supply.
The tuner/demodulator 220 receives the broadcast television signal in a wired or wireless manner, and may perform modulation/demodulation processing such as amplification, mixing, resonance, and the like, so as to demodulate, from a plurality of wireless or wired broadcast television signals, the audio/video signal carried in the frequency of the television channel selected by the user, and additional information (e.g., EPG data signal).
The tuner demodulator 220 responds to the television channel frequency selected by the user and to the television signal carried by that frequency, under the control of the controller 210.
The tuner/demodulator 220 may receive signals in various ways according to the broadcasting system of the television signal, such as terrestrial broadcasting, cable broadcasting, satellite broadcasting, or internet broadcasting; according to the modulation type, it may use digital or analog modulation; and it may demodulate analog or digital signals according to the type of the received television signal.
In other exemplary embodiments, the tuning demodulator 220 may be in an external device, such as an external set-top box. In this way, the set-top box outputs television audio/video signals after modulation and demodulation, and the television audio/video signals are input into the display device 200 through the external device interface 250.
The communicator 230 is a component for communicating with an external device or an external server according to various communication protocol types. For example, the communicator 230 may include a WiFi module 231, a Bluetooth (communication protocol) module 232, a wired Ethernet (communication protocol) module 233, and other network communication protocol modules or near field communication protocol modules.
The display apparatus 200 may establish a connection of a control signal and a data signal with an external control apparatus or a content providing apparatus through the communicator 230. For example, the communicator may receive a control signal of the remote controller 100A according to the control of the controller.
The detector 240 is a component of the display apparatus 200 for collecting signals from the external environment or for interaction with the outside. The detector 240 may include a light receiver 242, a sensor for collecting the intensity of ambient light, which may be used to adapt display parameters, etc.; it may further include an image collector 241, such as a camera, which may be used for collecting external environment scenes, collecting attributes of the user or gestures used to interact with the user, adaptively changing display parameters, and recognizing user gestures, so as to realize the function of interaction with the user.
In some other exemplary embodiments, the detector 240 may further include a temperature sensor, such as by sensing an ambient temperature, and the display device 200 may adaptively adjust a display color temperature of the image. For example, when the temperature is higher, the display apparatus 200 may be adjusted to display a color temperature of an image that is cooler; when the temperature is lower, the display device 200 may be adjusted to display a warmer color temperature of the image.
In other exemplary embodiments, the detector 240 may further include a sound collector, such as a microphone, which may be used to receive the user's voice, including a voice signal carrying a control instruction for controlling the display device 200, or to collect environmental sounds for identifying the type of environmental scene, so that the display device 200 can adapt to environmental noise.
The external device interface 250 provides a component for the controller 210 to control data transmission between the display apparatus 200 and other external apparatuses. The external device interface may be connected with an external apparatus such as a set-top box, a game device, a notebook computer, etc. in a wired/wireless manner, and may receive data such as a video signal (e.g., moving image), an audio signal (e.g., music), additional information (e.g., EPG), etc. of the external apparatus.
The external device interface 250 may include: a high definition multimedia interface terminal 251 (HDMI), a Composite Video Blanking Sync (CVBS) terminal 252 (AV), an analog or digital component terminal 253 (Component), a universal serial bus terminal 254 (USB), a Red Green Blue (RGB) terminal (not shown), and the like.
The controller 210 controls the operation of the display device 200 and responds to the user's operation by running various software control programs, such as an operating system and various application programs, stored on the memory 290.
As shown in FIG. 2, the controller 210 includes a random access memory RAM214, a read only memory ROM213, a graphics processor 216, a CPU processor 212, communication interfaces (218-1 ~ 218-n), and a communication bus. The RAM214, the ROM213, the graphic processor 216, the CPU processor 212 and the communication interfaces (218-1 to 218-n) are connected through a bus.
The ROM 213 is used for storing instructions for various system boots. When the power-on signal is received and the display device 200 starts to power on, the CPU processor 212 executes the system boot instructions in the ROM and copies the operating system stored in the memory 290 to the RAM 214 to start running the operating system. After the start of the operating system is completed, the CPU processor 212 copies the various application programs in the memory 290 to the RAM 214 and then starts running the various application programs.
The graphics processor 216 is used for generating various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It includes an arithmetic unit, which performs operations by receiving various interactive instructions input by the user and displays various objects according to their display attributes, and a renderer, which generates the various objects based on the arithmetic unit and displays the rendered result on the display 280.
The CPU processor 212 is used for executing operating system and application program instructions stored in the memory 290, and for executing various application programs, data, and contents according to various interactive instructions received from the outside, so as to finally display and play various audio and video contents.
In some exemplary embodiments, the CPU processor 212 may include a plurality of processors. The plurality of processors may include one main processor and one or more sub-processors. The main processor is used for performing some operations of the display apparatus 200 in a pre-power-up mode and/or for displaying a screen in the normal mode. The one or more sub-processors are used for performing operations in a standby mode or the like.
The communication interfaces may include a first interface 218-1 through an nth interface 218-n. These interfaces may be network interfaces that are connected to external devices via a network.
The controller 210 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 280, the controller 210 may perform an operation related to the object selected by the user command.
The object may be any selectable object, such as a hyperlink or an icon. Operations related to the selected object include, for example, displaying the page, document, or image linked to a hyperlink, or executing the program corresponding to an icon. The user command for selecting the UI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch pad, etc.) connected to the display apparatus 200 or a voice command corresponding to a voice spoken by the user.
The memory 290 stores various software modules for driving and controlling the display apparatus 200, for example: a basic module, a detection module, a communication module, a display control module, a browser module, various service modules, and the like.
The basic module is a bottom layer software module for signal communication between hardware in the display device 200 and sending processing and control signals to an upper layer module. The detection module is a management module used for collecting various information from various sensors or user input interfaces, and performing digital-to-analog conversion and analysis management.
For example: the voice recognition module comprises a voice analysis module and a voice instruction database module. The display control module is a module for controlling the display 280 to display image content, and may be used to play information such as multimedia image content and UI interface. The communication module is used for carrying out control and data communication with external equipment. And the browser module is used for executing data communication between the browsing servers. The service module is a module for providing various services and various application programs.
Meanwhile, the memory 290 is also used to store received external data and user data, images of respective items in various user interfaces, and visual effect maps of the focus object, etc.
A user input interface for transmitting an input signal of a user to the controller 210 or transmitting a signal output from the controller to the user. For example, the control device (e.g., a mobile terminal or a remote controller) may transmit an input signal input by a user, such as a power switch signal, a channel selection signal, a volume adjustment signal, etc., to the user input interface, and then the input signal is forwarded to the controller through the user input interface; alternatively, the control device may receive an output signal such as audio, video, or data output from the user input interface via the controller, and display the received output signal or output the received output signal in audio or vibration form.
In some embodiments, the user may input a user command on a Graphical User Interface (GUI) displayed on the display 280, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input a user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
The video processor 260-1 is configured to receive a video signal, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to a standard codec protocol of the input signal, so as to obtain a video signal that is directly displayed or played on the display 280.
Illustratively, the video processor 260-1 includes a demultiplexing module, a video decoding module, an image synthesizing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is used for demultiplexing the input audio/video data stream; for example, if an MPEG-2 stream is input, the demultiplexing module demultiplexes it into a video signal, an audio signal, and the like.
And the video decoding module is used for processing the demultiplexed video signal, including decoding, scaling and the like.
The image synthesis module, such as an image synthesizer, is used for superimposing and mixing the GUI signal generated by the graphics generator in response to user input with the scaled video image, so as to generate an image signal for display.
The frame rate conversion module is configured to convert the frame rate of the input video, such as a 24 Hz, 25 Hz, 30 Hz, or 60 Hz video, into an output frame rate of 60 Hz, 120 Hz, or 240 Hz, where the input frame rate may be related to the source video stream and the output frame rate may be related to the refresh rate of the display. Frame interpolation is a common way of implementing this conversion.
The display formatting module is used for converting the signal output by the frame rate conversion module into a signal conforming to the display format of the display, for example converting it into RGB data signals for output.
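The frame rate conversion described above can be illustrated with a brief sketch. The code below covers only the integer-ratio case and is not taken from the disclosed implementation; the class and method names are assumptions, and real products use motion-compensated frame interpolation when the rates do not divide evenly.

    // Illustrative sketch only (assumed names): when the output rate is an integer multiple of
    // the input rate, each source frame can simply be repeated; otherwise intermediate frames
    // would have to be interpolated.
    class FrameRateConverter {
        // How many output frames to emit per input frame, e.g. 30 Hz -> 120 Hz gives 4.
        static int repeatFactor(int inputHz, int outputHz) {
            if (inputHz <= 0 || outputHz % inputHz != 0) {
                return -1; // non-integer ratio: frame interpolation (e.g. motion compensation) is needed
            }
            return outputHz / inputHz;
        }

        public static void main(String[] args) {
            System.out.println(repeatFactor(30, 120)); // 4
            System.out.println(repeatFactor(24, 60));  // -1: 24 Hz -> 60 Hz needs pull-down or interpolation
        }
    }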
The display 280 is used for receiving the image signal input from the video processor 260-1, displaying the video content and images, and displaying the menu manipulation interface. The display 280 includes a display component for presenting pictures and a driving component for driving image display. The displayed video content may come from the video in the broadcast signal received by the tuner/demodulator 220, or from video content input through the communicator or the external device interface. The display 280 also displays the user manipulation interface (UI) generated in the display apparatus 200 and used to control the display apparatus 200.
The driving component drives the display according to the type of the display 280. Alternatively, if the display 280 is a projection display, it may also include a projection device and a projection screen.
The audio processor 260-2 is configured to receive an audio signal, decompress and decode the audio signal according to a standard codec protocol of the input signal, and perform noise reduction, digital-to-analog conversion, amplification and other audio data processing to obtain an audio signal that can be played in the speaker 272.
The audio output interface 270 is used for receiving the audio signal output by the audio processor 260-2 under the control of the controller 210. The audio output interface may include a speaker 272, or an external sound output terminal 274 for outputting to a sound-producing device of an external apparatus, such as an external sound terminal or an earphone output terminal.
In other exemplary embodiments, video processor 260-1 may comprise one or more chip components. The audio processor 260-2 may also include one or more chips.
And, in other exemplary embodiments, the video processor 260-1 and the audio processor 260-2 may be separate chips or may be integrated in one or more chips with the controller 210.
And a power supply for supplying power to the display apparatus 200 with power input from an external power source under the control of the controller 210. The power supply may include a built-in power supply circuit installed inside the display apparatus 200, or may be a power supply installed outside the display apparatus 200, such as a power supply interface for providing an external power supply in the display apparatus 200.
Fig. 3 is a block diagram illustrating a hardware configuration of the control apparatus 100 according to an exemplary embodiment of the present application. As shown in fig. 3, the control device 100 includes a controller 110, a communicator 130, a user input/output interface 140, a memory 190, and a power supply 180.
The control apparatus 100 is configured to control the display device 200 and may receive an input operation instruction of a user and convert the operation instruction into an instruction recognizable and responsive by the display device 200, serving as an interaction intermediary between the user and the display device 200. Such as: the user operates the channel up/down key on the control device 100, and the display device 200 responds to the channel up/down operation.
In some embodiments, the control device 100 may be a smart device. Such as: the control apparatus 100 may install various applications that control the display device 200 according to user demands.
In some embodiments, as shown in fig. 1, the mobile terminal 100B or another intelligent electronic device may perform a function similar to that of the control apparatus 100 after an application for manipulating the display device 200 is installed. For example, by installing such an application, the user can use various function keys or virtual buttons of a graphical user interface provided on the mobile terminal 100B or other intelligent electronic devices to implement the functions of the physical keys of the control apparatus 100.
The controller 110 includes a processor 112, a RAM 113 and a ROM 114, a communication interface, and a communication bus. The controller 110 is used to control the operation of the control device 100, to coordinate communication among its internal components, and to handle internal and external data processing functions.
The communicator 130 enables communication of control signals and data signals with the display apparatus 200 under the control of the controller 110. Such as: the received user input signal is transmitted to the display apparatus 200. The communicator 130 may include at least one of a WIFI module 131, a bluetooth module 132, an NFC module 133, and the like.
A user input/output interface 140, wherein the input interface includes at least one of a microphone 141, a touch pad 142, a sensor 143, a key 144, and the like. Such as: the user can realize a user instruction input function through actions such as voice, touch, gesture, pressing, and the like, and the input interface converts the received analog signal into a digital signal and converts the digital signal into a corresponding instruction signal, and sends the instruction signal to the display device 200.
The output interface includes an interface that transmits the received user instruction to the display apparatus 200. In some embodiments, it may be an infrared interface or a radio frequency interface. For example, when the infrared signal interface is used, the user input instruction needs to be converted into an infrared control signal according to an infrared control protocol, and the infrared control signal is sent to the display device 200 through the infrared sending module. As another example, when the radio frequency signal interface is used, a user input command needs to be converted into a digital signal, which is then modulated according to the radio frequency control signal modulation protocol and transmitted to the display device 200 through the radio frequency transmitting terminal.
In some embodiments, the control device 100 includes at least one of a communicator 130 and an output interface. The communicator 130 is configured in the control device 100, such as: the WIFI module, the Bluetooth module, the NFC module and the like can encode the user input command through a WIFI protocol, a Bluetooth protocol or an NFC protocol and send the encoded user input command to the display device 200.
The memory 190 stores various operation programs, data, and applications for driving and controlling the control apparatus 100 under the control of the controller 110. The memory 190 may store various control signal commands input by a user.
The power supply 180 provides operational power support to the components of the control device 100 under the control of the controller 110, and may include a battery and associated control circuitry.
Fig. 4 is a functional configuration diagram of a display device 200 according to an exemplary embodiment of the present application. As shown in fig. 4, the memory 290 is used to store an operating system, an application program, contents, user data, and the like, and performs system operations for driving the display device 200 and various operations in response to a user under the control of the controller 210. The memory 290 may include volatile and/or nonvolatile memory.
The memory 290 is specifically used for storing an operating program for driving the controller 210 in the display device 200, and storing various applications installed in the display device 200, various applications downloaded by a user from an external device, various graphical user interfaces related to the applications, various objects related to the graphical user interfaces, user data information, and internal data of various supported applications. The memory 290 is used to store system software such as an Operating System (OS) kernel, middleware, and applications, and to store input video data and audio data, and other user data.
The memory 290 is specifically used for storing drivers and related data such as the video processor 260-1 and the audio processor 260-2, the display 280, the communication interface 230, the tuner demodulator 220, the detector 240, the input/output interface, etc.
In some embodiments, memory 290 may store software and/or programs representing software programs for an Operating System (OS) including, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. For example, the kernel may control or manage system resources, or functions implemented by other programs (e.g., the middleware, APIs, or applications), and the kernel may provide interfaces to allow the middleware and APIs, or applications, to access the controller to implement controlling or managing system resources.
The memory 290, for example, includes a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external instruction recognition module 2907, a communication control module 2908, a light receiving module 2909, a power control module 2910, an operating system 2911, and other applications 2912, a browser module, and the like. The external command recognition module 2907 includes a pattern recognition module 2907-1, a voice recognition module 2907-2, and a key command recognition module 2907-3. The controller 210 performs functions such as: a broadcast television signal reception demodulation function, a television channel selection control function, a volume selection control function, an image control function, a display control function, an audio control function, an external instruction recognition function, a communication control function, an optical signal reception function, an electric power control function, a software control platform supporting various functions, a browser function, and the like.
Fig. 5a is a block diagram illustrating a software configuration of a display device 200 according to an exemplary embodiment of the present application.
As shown in fig. 5 a:
the operating system includes operating software for handling various basic system services and performing hardware-related tasks, and acts as an intermediary between application programs and hardware components for data processing.
In some embodiments, portions of the operating system kernel may contain a series of software to manage the display device hardware resources and provide services to other programs or software code.
In other embodiments, portions of the operating system kernel may include one or more device drivers, which may be a set of software code in the operating system that assists in operating or controlling the devices or hardware associated with the display device. The drivers may contain code that operates the video, audio, and/or other multimedia components. Examples include a display screen, a camera, Flash, WiFi, and audio drivers.
The accessibility module 511 is used for modifying or accessing an application program to realize the accessibility of the application program and the operability of the displayed content. The communication module 512 is used for connecting with other peripherals via the relevant communication interfaces and communication networks. The user interface module 513 is configured to provide objects that display a user interface for access by each application program, so that user operability can be achieved; the user interface module 513 further includes a user layer, which provides a plurality of user-operable controls, and a video layer for displaying video frames. The interface bottom layer module 514 includes a bottom layer module of the user layer and a bottom layer module of the video layer, where the bottom layer module of the video layer is used for providing the video picture to be displayed to the video layer. The control applications 515 are used for controlling process management, including runtime applications and the like.
An interface layout management module 520, configured to manage user interface objects to be displayed, and control a user interface layout so that the user interface can be displayed in response to a user operation.
The event transmission system 530 may be implemented within the operating system or in the application layer 540. In some embodiments, part of it is implemented within the operating system and part within the application layer 540, where it listens for various user input events, and handlers that carry out one or more sets of predefined operations are invoked in response to the recognition of various types of events or sub-events.
Fig. 5b is a schematic diagram of an application center in the display device 200 according to an exemplary embodiment of the present application. As shown in FIG. 5b, the application layer 540 contains various applications that may be executed on the display device. The application may include, but is not limited to, one or more applications such as: live television applications, Video On Demand (VOD) applications, media center applications, application centers, gaming applications, and the like.
The live television application program can provide live television through different signal sources. For example, a live television application may provide television signals using input from cable television, radio broadcasts, satellite services, or other types of live television services. And, the live television application may display a video of the live television signal on the display device.
A video-on-demand application may provide video from different storage sources. Unlike live television applications, video on demand provides video from a storage source. For example, the video on demand may come from the server side of cloud storage, or from local hard disk storage containing stored video programs.
The media center application program can provide various applications for playing multimedia contents. For example, a media center may provide services other than live television or video on demand, allowing a user to access various images or audio through the media center application.
At least one application program of the live television application program, the video-on-demand application program and the media center application program is provided with a zooming function module for responding to a zooming instruction triggered by a user to realize a zooming function.
The application center can provide and store various application programs. An application may be a game, an application program, or some other application associated with a computer system or other device that can run on the smart television. The application center may obtain these applications from different sources, store them in local storage, and make them operable on the display device.
Fig. 6a is a schematic view of a user interface of a display device 200 according to an exemplary embodiment of the present application. As shown in fig. 6a, the user interface is displayed by overlapping display frames of different levels, and the multiple layers are respectively used for presenting different image contents; for example, a first layer 610 may present system layer item contents, such as current attributes, a second layer 620 may be used for presenting application layer item contents, such as web videos, VOD presentations, and application program frames, and a third layer 630 may further be included.
Since the images presented by the user interface, the content of the image presented by each layer, the source of the image, and the like are different according to different operations performed by the user on the display device 200, for convenience of clear description of the technical solution of the present application, the user interface when the display device presents the video images is referred to as a video display interface in the present application.
Fig. 6b is a schematic view of a video display interface of the display device 200 according to an exemplary embodiment of the present application. As shown in fig. 6b, the video display interface includes at least two layers, i.e., a video layer and a user layer, wherein the video layer is used for displaying a video frame, and the user layer is used for providing at least a video profile, recommendation information, and other information.
Further, the user layer can be displayed by overlapping a plurality of sub-layer pictures, and each sub-layer is used for displaying a part of content of the video display interface, so that other pictures except the video pictures can be displayed in a layered manner, and the display effect is ensured.
In the embodiment of the application, through the parental control function of the display device, a user can select target objects one by one and set parental locks on them, such as applications, signal source channels, live television channels, television program contents, and the like, so that users without the operation right are prevented from operating the target objects.
However, after the user sets the target objects, each target object has an independent parental lock, so the parent user often needs to repeatedly input unlocking information while using the display device, which makes the operation cumbersome and the user experience unfriendly.
For example, during a single use of the display device between power-on and power-off, when the user exits a locked application and enters it again, the unlocking information needs to be input again; when the user exits a locked application and selects another locked application in the application center, the unlocking information also needs to be input again; or, when the user exits a locked application and switches the signal source channel of the display device from the current HDMI1 channel to an HDMI4 channel provided with a parental lock, the unlocking information again needs to be input, so the user experience is not friendly.
In order to solve this problem, the embodiment of the application provides a method for controlling parental locks of a display device, which can implement unified control over a plurality of objects provided with parental locks. For a user with the operation right, repeated input of unlocking information during a single use of the display device is avoided, so that user operations are reduced and the experience is improved.
In the embodiment of the present application, a plurality of objects of different types are involved in the display device, for example, an application installed on the display device, a plurality of signal source channels (such as HDMI1, HDMI2, HDMI3, HDMI4, AV and/or TV, etc.) of the display device, live television channels (such as channel 1, channel 2, etc.) corresponding to different live broadcast signals that can be received by the display device, and television program content that can be played by the display device.
The user can optionally set a parental lock on some of these objects, and when a certain object is provided with a parental lock, a user who cannot provide the unlocking information cannot operate that object.
Since various operations can be performed on an object (for example, when an object is an application program, multiple operations such as starting, exiting, and setting can be performed on it), in order to describe the technical solution of the present application more clearly, the present application introduces the concept of a preset operation instruction.
The preset operation instruction is used to instruct the target object to execute the processing action corresponding to the preset operation instruction, and specifically, according to the difference of the target object, the preset operation instruction and the processing action corresponding to the preset operation instruction may include:
a first instruction, configured to instruct to start a target application of an application center of a display device, where the target application is the target object, and a processing action corresponding to the first instruction is to start the target application;
a second instruction, configured to instruct to receive and play a program signal of a target signal source channel, where the target signal source channel is the target object, and a processing action corresponding to the second instruction is to receive and play the program signal of the target signal source channel;
a third instruction for instructing to receive and play the program signal of the target live television channel;
and a fourth instruction for instructing to play the target television program content.
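The correspondence between the four preset operation instructions and their processing actions can be pictured as a simple dispatch table. The following sketch is illustrative only; the enum constants, the InstructionDispatcher class, and the dispatch method are hypothetical names and not part of the disclosed system.

    // Hypothetical sketch of the four preset operation instructions and their processing actions.
    enum PresetInstruction {
        LAUNCH_APPLICATION,      // first instruction: start a target application
        SWITCH_SIGNAL_SOURCE,    // second instruction: receive and play a target signal source channel
        SWITCH_LIVE_TV_CHANNEL,  // third instruction: receive and play a target live television channel
        PLAY_PROGRAM_CONTENT     // fourth instruction: play target television program content
    }

    class InstructionDispatcher {
        // Maps each preset instruction to its processing action on a target object identified by id.
        void dispatch(PresetInstruction instruction, String targetId) {
            switch (instruction) {
                case LAUNCH_APPLICATION:     System.out.println("launch application " + targetId); break;
                case SWITCH_SIGNAL_SOURCE:   System.out.println("tune to signal source " + targetId); break;
                case SWITCH_LIVE_TV_CHANNEL: System.out.println("tune to live TV channel " + targetId); break;
                case PLAY_PROGRAM_CONTENT:   System.out.println("play program content " + targetId); break;
            }
        }
    }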
Fig. 7 and 10 are flowcharts of a display device parental lock control method according to an exemplary embodiment of the present application, respectively, where fig. 7 is a flowchart of a method when a preset operation instruction is a first instruction and a target object is an application. As shown in fig. 7, the method may include:
step 701, receiving a first instruction, where the first instruction is used to instruct to start a target application of a display device application center.
Fig. 8 is a diagram illustrating an interaction scenario of a user using a remote control and a display device according to an exemplary embodiment of the present application, where as shown in fig. 8, a screen of the display device presents an application center interface and a selector, UI icons of a plurality of applications are displayed in the application center, the selector is used for indicating a currently selected application, and when a target application is selected, the user presses an OK key on the remote control, and the remote control sends a first instruction corresponding to a key value of the OK key to the display device.
In this embodiment, the UI layer of the display device system receives the first instruction and executes the corresponding processing logic.
Step 702, determine whether the target application has a parental lock, if yes, execute step 703, if no, execute step 708.
Step 703, acquiring the parent lock state of the target application.
For an object provided with a parental lock, for example an application program in this embodiment, or a signal source channel, live television channel, or television program content in other embodiments, the parental lock may be in one of two states. One is the locked state, in which the object can only be operated after unlocking information has been input and verified. The other is the unlocked state, that is, the state after unlocking information has been input and verified, in which the object can be operated.
The parental lock state is used by the UI layer as the basis for judging whether the user needs to be prompted to input unlocking information before starting the application. If the application is judged to be in the locked state according to the parental lock state, the application can be started only if the user inputs the correct unlocking information; otherwise, the application can be operated without inputting unlocking information.
In a specific implementation, the parental lock state of the application may be saved as an attribute of the application: when the object needs to be locked, its parental lock state is set to the value representing the locked state, and when the object does not need to be locked, its parental lock state is set to the value representing the unlocked state.
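A minimal sketch of how such a per-object parental lock attribute might look is given below, assuming a generic LockableObject type; the class, field, and method names are illustrative and not the actual attribute used by the display device system.

    // Illustrative sketch: the parental lock saved as an attribute of a lockable object
    // (an application, signal source channel, live TV channel, or program content item).
    class LockableObject {
        enum LockState { NONE, LOCKED, UNLOCKED } // NONE means no parental lock has been configured

        private final String name;
        private LockState parentalLockState = LockState.NONE;

        LockableObject(String name) { this.name = name; }

        // The user sets a parental lock on this object: it starts in the locked state.
        void enableParentalLock() { parentalLockState = LockState.LOCKED; }

        // Correct unlocking information was verified (or a unified unlocking instruction was received).
        void markUnlocked() { if (parentalLockState != LockState.NONE) parentalLockState = LockState.UNLOCKED; }

        // Called on shutdown: every object that has a parental lock returns to the locked state.
        void relock() { if (parentalLockState != LockState.NONE) parentalLockState = LockState.LOCKED; }

        boolean requiresUnlock() { return parentalLockState == LockState.LOCKED; }
        String getName() { return name; }
    }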
Step 704, determining the parental lock state of the target application: if the parental lock state is the locked state, step 705 is executed; if the parental lock state is the unlocked state, step 708 is executed.
Step 705, generating an interface prompt for prompting a user to input unlocking information, and presenting the interface prompt on a screen of the display device.
Fig. 9 is an exemplary illustration of a possible interface prompt for prompting a user to input unlocking information, in which an input box for inputting an unlocking password and a number option constituting the password are displayed, and the user selects and inputs a number by controlling a mobile selector through a remote controller.
Step 706, receiving the unlocking information input by the user according to the displayed interface prompt. As described in step 705, the unlocking information is generally an unlocking password, but in other embodiments the unlocking information may also be gesture information or voice information.
Step 707, the unlocking information is verified, if the verification is passed, step 708 is executed, and if the verification is not passed, step 705 is executed.
At step 708, the target application is launched.
Step 709 is executed in parallel with step 708: it is judged whether a unified unlocking instruction is received; if so, step 710 is executed, and if not, no action is taken.
In step 710, the parental lock status of each object provided with the parental lock is modified to an unlocked status.
The objects provided with the parental lock include application programs, signal source channels, live television channels, and television program contents. For an application program provided with a parental lock, the UI layer modifies the parental lock state; for a signal source channel, live television channel, or television program content provided with a parental lock, the parental lock state is modified by the target middleware, which directly controls the display device chip to play signals.
The unified unlocking instruction is used for instructing the display device to modify the parental lock state of each object provided with a parental lock into the unlocked state during this use. As a result, if these objects need to be operated later in the same use, the unlocking information does not need to be input again. For example, after step 708, if the user exits the target application and re-enters it, the application can be launched directly without re-entering the unlocking information; as another example, if the user exits the target application 2 and selects application 4 in the application center, there is no need to enter the unlocking information again; or, if the user exits the target application 2 and switches the signal source channel of the display device from the current HDMI1 channel to the HDMI4 channel provided with a parental lock, the unlocking information does not need to be input again. User operations are thus greatly simplified and user experience is improved.
It should be noted that "this use" may be understood as the period from when the display device is turned on to before it is turned off; specifically, the user's use of the display device starts after it is turned on, and the use ends after it is turned off.
The unified unlocking instruction is input by the user through a function control provided by the UI interface. For example, referring to fig. 9, fig. 9 further includes a function control through which the user inputs the instruction by checking a box; when the user checks "remember the unlock state before shutdown", the unified unlocking instruction is input to the display device.
In step 711, it is detected whether a shutdown command is received, if so, step 712 is executed, and if not, no action is taken.
In step 712, the parental lock status of each of the objects provided with the parental lock is modified to a locked status, and shutdown is performed.
As can be seen from the foregoing embodiment, in the embodiments of the present application, when a first instruction instructing to start a target application of the display device application center is received, the parental lock state of the target application is first determined. If the target application is in the locked state, the user is prompted to input unlocking information and a unified unlocking instruction; the target application is started when the unlocking information passes verification, and the parental lock state of each object provided with a parental lock is modified into the unlocked state when the unified unlocking instruction is received. The user therefore does not need to repeatedly input unlocking information during this use of the display device, which reduces user operations and improves user experience.
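The flow of steps 701-712 can be summarized in a short sketch. The sketch below reuses the LockableObject class from the earlier sketch and relies on assumed helper names (promptForUnlockInfo, verifyUnlockInfo, launch); it is an illustration of the described logic, not the actual UI-layer implementation.

    import java.util.List;

    // Illustrative sketch of steps 701-712: prompt for unlocking information only when the target
    // application is locked, launch it once verification passes, and apply the unified unlocking
    // instruction to every object that has a parental lock until shutdown.
    class AppLaunchFlow {
        private final List<LockableObject> lockedObjects; // all objects provided with a parental lock

        AppLaunchFlow(List<LockableObject> lockedObjects) { this.lockedObjects = lockedObjects; }

        void onLaunchRequested(LockableObject targetApp, boolean unifiedUnlockReceived) {
            while (targetApp.requiresUnlock()) {             // steps 702-704: parental lock set and locked?
                String input = promptForUnlockInfo();        // steps 705-706: interface prompt and user input
                if (verifyUnlockInfo(input)) break;          // step 707: verification passed
            }
            launch(targetApp);                               // step 708

            if (unifiedUnlockReceived) {                     // steps 709-710: unified unlocking instruction
                lockedObjects.forEach(LockableObject::markUnlocked);
            }
        }

        void onShutdown() {                                  // steps 711-712: re-lock everything, then power off
            lockedObjects.forEach(LockableObject::relock);
        }

        private String promptForUnlockInfo() { return "0000"; }             // placeholder for the UI prompt
        private boolean verifyUnlockInfo(String s) { return "0000".equals(s); }
        private void launch(LockableObject app) { System.out.println("launching " + app.getName()); }
    }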
Fig. 10 is a flowchart illustrating the method when the preset operation instruction is the second instruction and the target object is a signal source channel. The target middleware is used for controlling the switching of the signal source channel and controlling whether the display equipment receives and plays the program signal. As shown in fig. 10, the method may include:
in step 101, the UI layer receives a second instruction, where the second instruction is used to instruct to switch a signal source channel of the display device to a target signal source channel, so as to receive and play a program signal of the target signal source channel.
Fig. 11 is another interaction scenario of a user using a remote controller and a display device according to an exemplary embodiment of the present application, where as shown in fig. 11, a screen of the display device presents a signal source channel selection interface and a selector, a UI icon of a plurality of signal source channels is displayed in the interface, the selector is used to indicate a currently selected signal source channel, and in a case that a target signal source channel is selected, the user presses an OK key on the remote controller, and the remote controller sends a second instruction corresponding to a key value of the OK key to the display device.
Step 102, the UI layer sends a second instruction to the target middleware.
Step 103, the target middleware switches the signal source channel from the current channel to the target signal source channel.
Step 104, the target middleware judges whether the switched target signal source channel is provided with a parental lock; if so, step 105 is executed, and if not, step 111 is executed.
Step 105, the target middleware acquires the parental lock state of the switched target signal source channel.
Step 106, the target middleware determines the parental lock state of the target signal source channel; if the target signal source channel is in the locked state, step 107 is executed, and if it is in the unlocked state, step 111 is executed.
Step 107, the target middleware generates an instruction for instructing the UI layer to acquire the unlocking information, and sends the instruction to the UI layer.
And step 108, the UI layer receives the instruction, generates an interface prompt for prompting a user to input unlocking information, and presents the interface prompt on a display device screen.
And step 109, the UI layer receives unlocking information input by a user and sends the unlocking information to the target middleware.
And step 110, the target middleware verifies the unlocking information, if the unlocking information passes the verification, step 111 is executed, and if the unlocking information does not pass the verification, step 107 is executed.
And step 111, the target middleware controls the display equipment chip to receive and play the program signal of the target signal source channel.
Step 112, the target middleware judges whether a unified unlocking instruction is received; if so, step 113 is executed, and if not, no action is taken.
Step 113, the parental lock state of each object provided with a parental lock is modified into the unlocked state by the UI layer and/or the target middleware.
For application-class objects, the UI layer sets their parental lock state to the unlocked state; for channel-class objects or television program content, the target middleware sets their parental lock state to the unlocked state.
This embodiment further includes: detecting whether a shutdown instruction is received; if so, modifying the parental lock state of each object provided with a parental lock into the locked state and executing shutdown; if not, taking no action.
As can be seen from the foregoing embodiment, in the embodiments of the present application, when a second instruction instructing to switch the signal source channel of the display device is received, the display device is first switched to the target signal source channel, and the parental lock state of the target signal source channel is then obtained. If the target signal source channel is in the locked state, the user is prompted to input unlocking information and a unified unlocking instruction; the program signal of the target signal source channel is received and played when the unlocking information passes verification, and the parental lock state of each object provided with a parental lock is modified into the unlocked state when the unified unlocking instruction is received. The user therefore does not need to repeatedly input unlocking information while using the display device, which reduces user operations and improves user experience.
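The division of work between the UI layer and the target middleware in Fig. 10 can be sketched as follows. The UiLayer interface, the SourceChannelMiddleware class, and their methods are assumed names used only for illustration, again reusing the LockableObject sketch; a real middleware would talk to the display chip driver rather than print messages.

    import java.util.List;

    // Hypothetical sketch of steps 102-113: the UI layer forwards the second instruction and collects
    // unlocking information, while the middleware switches the channel and drives the display chip.
    interface UiLayer {
        String promptForUnlockInfo();     // steps 108-109: show the prompt and return the user's input
        void unlockApplicationObjects();  // step 113 (UI-layer side): unlock application-class objects
    }

    class SourceChannelMiddleware {
        private final UiLayer ui;
        private final List<LockableObject> channelObjects; // signal sources, live TV channels, program content

        SourceChannelMiddleware(UiLayer ui, List<LockableObject> channelObjects) {
            this.ui = ui;
            this.channelObjects = channelObjects;
        }

        void onSecondInstruction(LockableObject targetSource, boolean unifiedUnlockReceived) {
            switchTo(targetSource);                              // step 103: switch channels first
            while (targetSource.requiresUnlock()) {              // steps 104-106: parental lock set and locked?
                String info = ui.promptForUnlockInfo();          // steps 107-109
                if (verify(info)) break;                         // step 110
            }
            playProgramSignal(targetSource);                     // step 111: play the program signal

            if (unifiedUnlockReceived) {                         // steps 112-113
                channelObjects.forEach(LockableObject::markUnlocked); // middleware side
                ui.unlockApplicationObjects();                        // UI-layer side
            }
        }

        private void switchTo(LockableObject s) { System.out.println("switch to " + s.getName()); }
        private void playProgramSignal(LockableObject s) { System.out.println("play " + s.getName()); }
        private boolean verify(String info) { return "0000".equals(info); }
    }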
In order to further ensure that the content of the display device can be safely browsed by minors, a time lock can also be set using the parental control function; during a locked time period, a user without the operation right cannot operate any object on the display device.
Based on this, on the basis of the above embodiment, in another embodiment, the parent lock control method for a display device provided by the present application further includes the steps shown in fig. 12:
in step 121, an arbitrary operation instruction is received, where the arbitrary operation instruction includes the preset operation instruction and a non-preset operation instruction.
In step 122, it is determined whether the time at which the operation instruction is received falls within a locked time period; if so, step 123 is executed, and if not, step 125 is executed.
In step 123, unlock information is acquired.
For a specific implementation process of obtaining the unlocking information, reference may be made to the above embodiments, which are not described herein again.
In step 124, the unlocking information is verified. If the verification is passed, step 125 is performed, and if the verification is not passed, step 123 is performed.
In step 125, the processing action corresponding to the operation instruction is executed.
In step 126, in the case that the unified unlocking instruction is received, the parental lock state of each object provided with a parental lock is modified into the unlocked state.
It can be seen from the above embodiment that, when the user sets a time lock on the display device using the parental control function, the processing logic described in steps 121-126 prevents the user from repeatedly inputting unlocking information before shutdown, thereby simplifying user operations and improving user experience.
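The check in step 122 amounts to testing whether the current time falls inside a configured locked period. The sketch below is a minimal illustration under the assumption that a locked period is defined by a start and an end time of day; the class name and the handling of periods crossing midnight are assumptions, not the disclosed implementation.

    import java.time.LocalTime;

    // Illustrative sketch of the time-lock check (step 122).
    class TimeLockGuard {
        private final LocalTime lockStart;
        private final LocalTime lockEnd;

        TimeLockGuard(LocalTime lockStart, LocalTime lockEnd) {
            this.lockStart = lockStart;
            this.lockEnd = lockEnd;
        }

        // Does the moment at which the operation instruction is received fall inside the locked period?
        boolean isLockedNow(LocalTime now) {
            if (lockStart.isBefore(lockEnd)) {
                return !now.isBefore(lockStart) && now.isBefore(lockEnd);
            }
            return !now.isBefore(lockStart) || now.isBefore(lockEnd); // period crossing midnight, e.g. 22:00-06:00
        }

        public static void main(String[] args) {
            TimeLockGuard guard = new TimeLockGuard(LocalTime.of(22, 0), LocalTime.of(6, 0));
            System.out.println(guard.isLockedNow(LocalTime.of(23, 30))); // true: unlocking information required
        }
    }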
In another embodiment, when the display device receives the third instruction instructing to receive and play the program signal of a target live television channel, the live television channel is switched from the current channel to the target live television channel, and the parental lock state of the target live television channel is then acquired. If the target live television channel is in the locked state, the user is prompted to input unlocking information and a unified unlocking instruction; the program signal of the live television channel is received and played when the unlocking information passes verification, and the parental lock state of each object provided with a parental lock is modified into the unlocked state when the unified unlocking instruction is received. The user therefore does not need to repeatedly input unlocking information while using the display device, which reduces user operations and improves user experience.
In another embodiment, when the display device receives the fourth instruction instructing to play target television program content, the display device first obtains the program signal corresponding to the target television program content, and then obtains the parental lock state of the target television program content according to the program signal; for example, it obtains the rating of the program content from the program signal and then obtains the parental lock state corresponding to that rating.
If the target television program content is in the locked state, the user is prompted to input unlocking information and a unified unlocking instruction; the television program content is played when the unlocking information passes verification, and the parental lock state of each object provided with a parental lock is modified into the unlocked state when the unified unlocking instruction is received, so that the user does not need to repeatedly input unlocking information while using the display device, which reduces user operations and improves user experience.
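One possible way to map a program rating to a parental lock state is sketched below; the age-based rating and the threshold are assumptions chosen only to illustrate the idea of deriving the lock state from the rating carried in the program signal.

    // Illustrative sketch: deriving the parental lock state of program content from its rating.
    class ContentRatingLock {
        private final int lockedFromAge; // contents rated at or above this age are treated as locked

        ContentRatingLock(int lockedFromAge) { this.lockedFromAge = lockedFromAge; }

        boolean isLocked(int contentAgeRating) {
            return contentAgeRating >= lockedFromAge;
        }

        public static void main(String[] args) {
            ContentRatingLock lock = new ContentRatingLock(16);
            System.out.println(lock.isLocked(18)); // true: unlocking information is required before playback
        }
    }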
Based on the parental lock control method for the display device provided in the embodiments of the present application, an embodiment of the present application further provides a display device. Fig. 13 is a schematic diagram of the hardware structure of the display device; as shown in fig. 13, the display device includes:
a display 131, configured to display a user interface, where the user interface includes a selector used to indicate that a control in the user interface is selected;
a memory 132, at least for storing program instructions corresponding to the method to be executed by the controller 133;
the controller 133 is configured to perform some or all steps in the embodiments of the method for controlling parental locks of a display device provided by the present application.
In a specific implementation, the present invention further provides a computer storage medium, where the computer storage medium may store a program which, when executed, may perform some or all of the steps in the embodiments of the control method provided by the present invention. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a Random Access Memory (RAM).
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented as software plus a required general purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention may be essentially or partially implemented in the form of a software product, which may be stored in a storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments or some parts of the embodiments.
The same and similar parts in the various embodiments in this specification may be referred to each other. In particular, for the embodiment of the display device, since it is substantially similar to the embodiment of the method, the description is simple, and for the relevant points, refer to the description in the embodiment of the method.
The above-described embodiments of the present invention should not be construed as limiting the scope of the present invention.