WO2022073392A1 - Image display method and display device - Google Patents

Image display method and display device

Info

Publication number
WO2022073392A1
WO2022073392A1 · PCT/CN2021/113762 · CN2021113762W
Authority
WO
WIPO (PCT)
Prior art keywords
target image
function
display device
display
image quality
Prior art date
Application number
PCT/CN2021/113762
Other languages
English (en)
French (fr)
Inventor
王丽娟
魏强
Original Assignee
青岛海信传媒网络技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 青岛海信传媒网络技术有限公司
Publication of WO2022073392A1 publication Critical patent/WO2022073392A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1415Digital output to display device ; Cooperation and interconnection of the display device with other functional units with means for detecting differences between the image stored in the host and the images displayed on the displays

Definitions

  • the present application relates to the field of display technology, and in particular, to an image display method and a display device.
  • the display device usually uses the ACR (Auto Content Recognition) function to capture the displayed content for content recognition, and finally uses the content recognition result to enhance the display device's AQ (Audio Quality) and PQ (Picture Quality), as well as for content recommendation, so as to improve the user's experience of using the display device.
  • the present application provides an image display method and display device
  • the present application provides a display device, including: a display, configured to display a target image that a user needs to watch on the display device; and a controller, configured to: acquire the target image that the user needs to watch; set a target image quality parameter corresponding to the target image by using an intelligent image mode switching AIPQ function and/or an automatic content recognition ACR function; and control the display to display the target image according to the target image quality parameter.
  • the present application also provides an image display method, including: acquiring the target image that the user needs to watch; setting a target image quality parameter corresponding to the target image by using the intelligent image mode switching AIPQ function and/or the automatic content recognition ACR function; and controlling the display to display the target image according to the target image quality parameter.
  • FIG. 1 exemplarily shows a schematic diagram of an operation scene between a display device and a control apparatus according to some embodiments
  • FIG. 2 exemplarily shows a hardware configuration block diagram of a display device 200 according to some embodiments
  • FIG. 3 exemplarily shows a hardware configuration block diagram of the control apparatus 100 according to some embodiments
  • FIG. 4 exemplarily shows a schematic diagram of software configuration in the display device 200 according to some embodiments
  • FIG. 5 exemplarily shows a schematic diagram of displaying an icon control interface of an application in the display device 200 according to some embodiments
  • FIG. 6 is a schematic diagram of a first control flow of the controller 250 in the display device 200 according to the embodiment of the application;
  • FIG. 7 is a schematic diagram of a second control flow of the controller 250 in the display device 200 according to the embodiment of the application;
  • FIG. 8 is a schematic diagram of a third control flow of the controller 250 in the display device 200 according to the embodiment of the application;
  • FIG. 9 is a schematic diagram of a fourth control flow of the controller 250 in the display device 200 according to the embodiment of the application.
  • FIG. 10 is a schematic diagram of a fifth control flow of the controller 250 in the display device 200 according to the embodiment of the application;
  • FIG. 11 is a schematic diagram of a sixth control flow of the controller 250 in the display device 200 according to the embodiment of the application;
  • FIG. 12 is a flowchart of an image display method according to an embodiment of the present application;
  • FIG. 13 is a flowchart of another image display method according to an embodiment of the present application.
  • module refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code capable of performing the function associated with that element.
  • remote control refers to a component of an electronic device, such as the display device disclosed in this application, that can wirelessly control the electronic device, usually over a short distance.
  • infrared and/or radio frequency (RF) signals and/or Bluetooth are generally used to connect with the electronic device, and functional modules such as WiFi, wireless USB, Bluetooth, and motion sensors may also be included.
  • for example, a hand-held touch remote control replaces most of the physical built-in hard keys of a general remote control device with a user interface on a touch screen.
  • gesture used in this application refers to a user behavior used to express an expected thought, action, purpose, and/or result through an action such as a change of hand shape or a hand movement.
  • FIG. 1 exemplarily shows a schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment.
  • a user may operate the display apparatus 200 through the mobile terminal 300 and the control apparatus 100 .
  • the control device 100 may be a remote control, and the communication between the remote control and the display device includes infrared protocol communication or Bluetooth protocol communication, and other short-range communication methods, etc., and controls the display device 200 by wireless or other wired methods.
  • the user can control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, and the like.
  • the user can input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, and power key on the remote control to realize the function of controlling the display device 200.
  • mobile terminals, tablet computers, computers, notebook computers, and other smart devices may also be used to control the display device 200 .
  • the display device 200 is controlled using an application running on the smart device.
  • the app can be configured to provide users with various controls in an intuitive user interface (UI) on the screen associated with the smart device.
  • the mobile terminal 300 may install a software application with the display device 200 to implement connection communication through a network communication protocol, so as to achieve the purpose of one-to-one control operation and data communication.
  • a control command protocol can be established between the mobile terminal 300 and the display device 200, the remote control keyboard can be synchronized to the mobile terminal 300, and the function of controlling the display device 200 can be realized by controlling the user interface on the mobile terminal 300.
  • the audio and video content displayed on the mobile terminal 300 may also be transmitted to the display device 200 to implement a synchronous display function.
  • the display device 200 also performs data communication with the server 400 through various communication methods.
  • the display device 200 may be allowed to communicate via local area network (LAN), wireless local area network (WLAN), and other networks.
  • the server 400 may provide various contents and interactions to the display device 200 .
  • the display device 200 interacts by sending and receiving information and electronic program guide (EPG) interactions, receiving software program updates, or accessing a remotely stored digital media library.
  • the server 400 may be a cluster or multiple clusters, and may include one or more types of servers. Other network service contents such as video-on-demand and advertising services are provided through the server 400 .
  • the display device 200 may be a liquid crystal display, an OLED display, or a projection display device.
  • the specific display device type, size and resolution are not limited. Those skilled in the art can understand that the display device 200 can make some changes in performance and configuration as required.
  • in addition to the broadcast receiving TV function, the display device 200 may additionally provide a smart network TV function with computer support, including but not limited to network TV, smart TV, Internet Protocol TV (IPTV), and the like.
  • FIG. 2 exemplarily shows a block diagram of the hardware configuration of the display device 200 according to the exemplary embodiment.
  • the display device 200 includes a controller 250, a tuner 210, a communicator 220, a detector 230, an input/output interface 255, a display 275, an audio output interface 285, a memory 260, a power supply 290, At least one of the user interface 265 and the external device interface 240 .
  • the display 275 is a component for receiving the image signal output from the first processor and displaying video content, images, and a menu manipulation interface.
  • the display 275 includes a display screen component for presenting pictures, and a driving component for driving image display.
  • the video content displayed may come from broadcast television content, that is, various broadcast signals that may be received via wired or wireless communication protocols.
  • alternatively, various image content sent from a network server may be received via network communication protocols and displayed.
  • display 275 is used to present a user-manipulated UI interface generated in display device 200 and used to control display device 200 .
  • a driving component for driving the display is also included.
  • display 275 is a projection display and may also include a projection device and projection screen.
  • communicator 220 is a component for communicating with external devices or external servers according to various communication protocol types.
  • the communicator 220 may include at least one of a Wifi module 221, a Bluetooth module 222, a wired Ethernet module 223 and other network communication protocol modules or near field communication protocol modules, and an infrared receiver.
  • the display apparatus 200 may establish control signal and data signal transmission and reception between the communicator 220 and the external control apparatus 100 or the content providing apparatus.
  • the user interface 265 may be used to receive infrared control signals from the control device 100 (eg, an infrared remote control, etc.).
  • the detector 230 is a component used by the display device 200 to collect signals from the external environment or signals for interaction with the outside.
  • the detector 230 includes a light receiver, a sensor for collecting ambient light intensity, so that display parameters and the like can be adapted according to the collected ambient light.
  • the detector 230 may further include an image collector 232, such as a camera, which can be used to collect external environment scenes and to collect user attributes or interactive gestures, can adaptively change display parameters, and can also recognize user gestures to implement functions for interacting with the user.
  • detector 230 may also include a temperature sensor or the like, for example for sensing the ambient temperature.
  • the display device 200 can adaptively adjust the display color temperature of the image. For example, when the ambient temperature is relatively high, the display device 200 can be adjusted to display the image with a cooler color temperature, and when the temperature is relatively low, the display device 200 can be adjusted to display the image with a warmer color tone.
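  • As a rough illustration of the ambient-temperature adaptation described above, the following Kotlin sketch maps a temperature reading to a display color temperature; the thresholds and the ColorTemperature values are illustrative assumptions and are not specified in this application.

```kotlin
// Hypothetical sketch: pick a cooler or warmer display color temperature from the ambient temperature.
enum class ColorTemperature { COOL, NEUTRAL, WARM }

fun selectColorTemperature(ambientCelsius: Double): ColorTemperature = when {
    ambientCelsius >= 28.0 -> ColorTemperature.COOL   // relatively high temperature -> cooler image tone
    ambientCelsius <= 16.0 -> ColorTemperature.WARM   // relatively low temperature -> warmer image tone
    else -> ColorTemperature.NEUTRAL
}

fun main() {
    println(selectColorTemperature(31.0))  // COOL
    println(selectColorTemperature(12.5))  // WARM
}
```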
  • the detector 230 may further include a sound collector 231 or the like, such as a microphone, which may be used to receive the user's voice.
  • exemplarily, this includes a voice signal containing the user's control instructions for controlling the display device 200, or ambient sounds collected to identify the type of the environment scene, so that the display device 200 can adapt to the ambient noise.
  • the input/output interface 255 is configured to enable data transfer between the controller 250 and other external devices or other controllers 250 . Such as receiving video signal data and audio signal data of external equipment, or command instruction data, etc.
  • the external device interface 240 may include, but is not limited to, any one or more of the following: a high-definition multimedia interface (HDMI) interface, an analog or data high-definition component input interface, a composite video input interface, a USB input interface, an RGB port, and the like. A composite input/output interface may also be formed by a plurality of the above-mentioned interfaces.
  • the tuner-demodulator 210 is configured to receive broadcast television signals by wired or wireless reception, can perform modulation and demodulation processing such as amplification, frequency mixing, and resonance, and demodulates the audio and video signal from among multiple wireless or wired broadcast television signals.
  • the audio and video signal may include the TV audio and video signal carried in the frequency of the TV channel selected by the user, and the EPG data signal.
  • the frequency demodulated by the tuner-demodulator 210 is controlled by the controller 250, and the controller 250 can send a control signal according to the user's selection, so that the tuner-demodulator responds to the television signal frequency selected by the user and demodulates the television signal carried at that frequency.
  • broadcast television signals may be classified into terrestrial broadcast signals, cable broadcast signals, satellite broadcast signals, or Internet broadcast signals, etc. according to different broadcast formats of the television signals. Or according to different modulation types, it can be divided into digital modulation signal, analog modulation signal, etc. Alternatively, it can be divided into digital signals, analog signals, etc. according to different signal types.
  • the controller 250 and the tuner 210 may be located in different separate devices, that is, the tuner 210 may also be located in an external device of the main device where the controller 250 is located, such as an external set-top box.
  • the set-top box outputs the modulated and demodulated television audio and video signals of the received broadcast television signals to the main device, and the main device receives the audio and video signals through the first input/output interface.
  • the controller 250 controls the operation of the display device and responds to user operations.
  • the controller 250 may control the overall operation of the display apparatus 200 .
  • the controller 250 may perform an operation related to the object selected by the user command.
  • the object may be any of the selectable objects, such as a hyperlink or an icon.
  • Operations related to the selected object such as displaying operations linked to hyperlinked pages, documents, images, etc., or executing operations corresponding to the icon.
  • the user command for selecting the UI object may be an input command through various input devices (eg, a mouse, a keyboard, a touchpad, etc.) connected to the display device 200 or a voice command corresponding to a voice spoken by the user.
  • the controller 250 includes a random access memory 251 (Random Access Memory, RAM), a read-only memory 252 (Read-Only Memory, ROM), a graphics processor 253 (Graphics Processing Unit, GPU), a central processing unit At least one of a processor 254 (Central Processing Unit, CPU), an input/output interface 255 and a communication bus 256 (Bus).
  • the communication bus connects the various components.
  • RAM 251 is used to store temporary data for the operating system or other running programs.
  • ROM 252 is used to store various system startup instructions.
  • ROM 252 is used to store a Basic Input Output System (BIOS), which is used to complete the power-on self-check of the system, the initialization of each functional module in the system, the driving of the basic input/output of the system, and the booting of the operating system.
  • when the power supply of the display device 200 is switched on, the CPU executes the system start-up instructions in ROM 252 and copies the operating system stored in memory to RAM 251, so that the operating system can be started or run.
  • after the operating system is started, the CPU copies the data of the various application programs in memory to RAM 251, so that the application programs can be started or run.
  • processor 254 executes operating system and application program instructions stored in memory, and executes various application programs, data, and content according to the interactive instructions received from external input, so as to finally display and play various audio and video content.
  • processor 254 may include multiple processors.
  • the plurality of processors may include a main processor and one or more sub-processors.
  • the main processor is used to perform some operations of the display device 200 in the pre-power-on mode and/or the operation of displaying a picture in the normal mode.
  • the one or more sub-processors are used for operations in states such as standby mode.
  • the graphics processor 253 is used to generate various graphic objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It includes an operator, which performs operations on the various interactive instructions input by the user and displays the resulting objects according to their display attributes, and a renderer, which renders the objects produced by the operator so that the rendered objects can be displayed on the display.
  • the video processor 270 is configured to receive the external video signal and perform decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, image synthesis, etc. according to the standard codec protocol of the input signal. After video processing, a signal that can be directly displayed or played on the display device 200 can be obtained.
  • the video processor 270 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like.
  • the demultiplexing module is used for demultiplexing the input audio and video data stream. For example, if MPEG-2 is input, the demultiplexing module demultiplexes it into video signals and audio signals.
  • the video decoding module is used to process the demultiplexed video signal, including decoding and scaling.
  • the image synthesizing module such as an image synthesizer, is used for superimposing and mixing the GUI signal generated by the graphics generator according to the user's input or itself, and the zoomed video image, so as to generate an image signal that can be displayed.
  • the frame rate conversion module is used to convert the frame rate of the input video, such as converting a 60 Hz frame rate to a 120 Hz or 240 Hz frame rate. This is usually implemented by means of frame insertion.
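  • The following Kotlin sketch illustrates frame insertion for frame rate conversion (for example 60 Hz to 120 Hz) by inserting one interpolated frame between each pair of source frames. Real FRC modules typically use motion-compensated interpolation; the per-pixel average below is only a simplifying assumption for illustration.

```kotlin
// Hypothetical sketch: double the frame rate by inserting an averaged frame between neighboring frames.
fun interpolate(prev: IntArray, next: IntArray): IntArray =
    IntArray(prev.size) { i -> (prev[i] + next[i]) / 2 }   // naive blend of two frames

fun doubleFrameRate(frames: List<IntArray>): List<IntArray> {
    if (frames.isEmpty()) return emptyList()
    val out = mutableListOf<IntArray>()
    for (i in 0 until frames.size - 1) {
        out += frames[i]                                   // original frame
        out += interpolate(frames[i], frames[i + 1])       // inserted frame
    }
    out += frames.last()
    return out
}
```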
  • the display formatting module is used to convert the frame-rate-converted video output signal into a signal that conforms to the display format, such as an RGB data signal.
  • the graphics processor 253 may be integrated with the video processor, or may be configured separately. When they are integrated, the processing of the graphics signal output to the display can be performed together; when they are configured separately, different functions can be performed respectively, for example, a GPU + FRC (Frame Rate Conversion) architecture.
  • the audio processor 280 is configured to receive an external audio signal and perform decompression, decoding, noise reduction, digital-to-analog conversion, and amplification processing according to the standard codec protocol of the input signal, so as to obtain a sound signal that can be played in the speaker.
  • the video processor 270 may comprise one or more chips.
  • the audio processor may also include one or more chips.
  • the video processor 270 and the audio processor 280 may be separate chips, or may be integrated into one or more chips together with the controller.
  • the audio output, under the control of the controller 250, receives the sound signal output by the audio processor 280, such as the speaker 286. In addition to the speaker carried by the display device 200 itself, sound can also be output to the external audio output terminal of an external device, such as an external audio interface or an earphone interface; it may also include a short-range communication module in the communication interface, such as a Bluetooth module for outputting sound to a Bluetooth speaker.
  • the power supply 290 under the control of the controller 250, provides power supply support for the display device 200 with the power input from the external power supply.
  • the power supply 290 may include a built-in power supply circuit installed inside the display device 200, or may be an external power supply, in which case an external power supply interface is provided in the display device 200.
  • the user interface 265 is used for receiving user input signals, and then sending the received user input signals to the controller 250 .
  • the user input signal may be a remote control signal received through an infrared receiver, and various user control signals may be received through a network communication module.
  • when the user inputs user commands through the control device 100 or the mobile terminal 300, the user input interface receives the input, and the display device 200 responds to the user's input through the controller 250.
  • the user may input user commands on a graphical user interface (GUI) displayed on the display 275, and the user input interface receives the user input commands through the graphical user interface.
  • the user may input a user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through a sensor to receive the user input command.
  • the memory 260 stores various software modules for driving the display device 200.
  • various software modules stored in the first memory include at least one of a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules.
  • the basic module is used for signal communication between various hardwares in the display device 200, and is a low-level software module that sends processing and control signals to the upper-layer module.
  • the detection module is a management module used to collect various information from various sensors or user input interfaces, perform digital-to-analog conversion, and analyze and manage.
  • the speech recognition module includes a speech parsing module and a speech instruction database module.
  • the display control module is a module used to control the display to display image content, and can be used to play information such as multimedia image content and UI interface.
  • the communication module is a module used for control and data communication with external devices.
  • the browser module is a module for performing data communication between browsing servers. Service modules are used to provide various services and modules including various applications.
  • the memory 260 is also used to store and receive external data and user data, images of various items in various user interfaces, and visual effect diagrams of focus objects, and the like.
  • FIG. 3 exemplarily shows a configuration block diagram of the control apparatus 100 according to an exemplary embodiment.
  • the control device 100 includes a controller 110 , a communication interface 130 , a user input/output interface 140 , a memory 190 , and a power supply 180 .
  • the control apparatus 100 is configured to control the display device 200 , and can receive the user's input operation instructions, and convert the operation instructions into instructions that the display device 200 can recognize and respond to, so as to play an interactive intermediary role between the user and the display device 200 .
  • for example, the user operates the channel up/down keys on the control device 100, and the display device 200 responds to the channel up/down operation.
  • control apparatus 100 may be a smart device.
  • control apparatus 100 may install various applications for controlling the display device 200 according to user requirements.
  • the mobile terminal 300 or other intelligent electronic device can perform a similar function of the control apparatus 100 after installing the application for operating the display device 200 .
  • by installing such an application, the user can use the various function keys or virtual buttons of the graphical user interface available on the mobile terminal 300 or another intelligent electronic device to realize the functions of the physical keys of the control apparatus 100.
  • Controller 110 includes a processor 112, a RAM 113, and a ROM 114.
  • the controller is used to control the running of the control device 100, the communication and cooperation among the internal components, and the external and internal data processing functions.
  • the communication interface 130 realizes the communication of control signals and data signals with the display device 200 under the control of the controller 110 .
  • the received user input signal is sent to the display device 200 .
  • the communication interface 130 may include at least one of other near field communication modules such as a WiFi chip 131 , a Bluetooth module 132 , and an NFC module 133 .
  • in the user input/output interface 140, the input interface includes at least one of a microphone 141, a touch panel 142, a sensor 143, a key 144, and other input interfaces.
  • the user can implement the user command input function through actions such as voice, touch, gesture, pressing, etc.
  • the input interface converts the received analog signal into a digital signal, and converts the digital signal into a corresponding command signal, and sends it to the display device 200.
  • the output interface includes an interface for transmitting received user instructions to the display device 200 .
  • it can be an infrared interface or a radio frequency interface.
  • when an infrared signal interface is used, the user input instruction needs to be converted into an infrared control signal according to the infrared control protocol and sent to the display device 200 through the infrared sending module.
  • when a radio frequency signal interface is used, the user input command needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then sent to the display device 200 by the radio frequency transmission terminal.
  • control device 100 includes at least one of a communication interface 130 and an input-output interface 140.
  • the control device 100 is configured with a communication interface 130, such as modules such as WiFi, Bluetooth, NFC, etc., which can send user input instructions to the display device 200 through WiFi protocol, Bluetooth protocol, or NFC protocol encoding.
  • the memory 190 is used to store various operating programs, data, and applications for driving and controlling the control device 100 under the control of the controller.
  • the memory 190 can store various control signal instructions input by the user.
  • the power supply 180 is used to provide operating power support for each element of the control device 100 under the control of the controller.
  • a system may include a kernel (Kernel), a command parser (shell), a file system, and applications.
  • the kernel, shell, and file system together make up the basic operating system structures that allow users to manage files, run programs, and use the system.
  • the kernel starts, activates the kernel space, abstracts hardware, initializes hardware parameters, etc., runs and maintains virtual memory, scheduler, signals and inter-process communication (IPC).
  • the shell and user applications are loaded.
  • An application is compiled into machine code after startup, forming a process.
  • the system is divided into four layers, which from top to bottom are the application layer (referred to as the "application layer"), the application framework layer (referred to as the "framework layer"), the Android runtime and system library layer (referred to as the "system runtime layer"), and the kernel layer.
  • the framework layer provides an application programming interface (API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer is equivalent to a processing center, which decides to let the applications in the application layer take action. Through the API interface, the application can access the resources in the system and obtain the services of the system during execution.
  • the application framework layer in the embodiment of the present application includes managers (Managers), content providers (Content Provider), and the like, where the managers include at least one of the following modules: an Activity Manager, used to interact with all activities running in the system; a Location Manager, used to provide system services or applications with access to system location services; a Package Manager, used to retrieve various information related to the application packages currently installed on the device; a Notification Manager, used to control the display and clearing of notification messages; and a Window Manager, used to manage the icons, windows, toolbars, wallpapers, and desktop widgets on the user interface.
  • the activity manager is used to manage the life cycle of each application and the usual navigation and back functions, such as controlling the exit of an application (including switching the user interface currently displayed in the display window to the system desktop), opening an application, and going back (including switching the user interface currently displayed in the display window to the upper-level user interface of the currently displayed user interface), and the like.
  • the window manager is used to manage all window programs, such as obtaining the size of the display screen, judging whether there is a status bar, locking the screen, taking screenshots, and controlling changes of the display window (for example, shrinking the display window, shaking the display, distorting and deforming the display, etc.).
  • the system runtime layer provides support for the upper layer, that is, the framework layer.
  • the Android operating system will run the C/C++ library included in the system runtime layer to implement the functions to be implemented by the framework layer.
  • the kernel layer is the layer between hardware and software. As shown in Figure 4, the kernel layer at least includes at least one of the following drivers: audio driver, display driver, Bluetooth driver, camera driver, WIFI driver, USB driver, HDMI driver, sensor driver (such as fingerprint sensor, temperature sensor, touch sensors, pressure sensors, etc.), etc.
  • the kernel layer further includes a power driver module for power management.
  • the display device receives an input operation (such as a split-screen operation) performed by the user on the display screen, and the kernel layer can generate a corresponding input event according to the input operation and report the event to the application framework layer.
  • the window mode (such as multi-window mode) and window position and size corresponding to the input operation are set by the activity manager of the application framework layer.
  • the window manager of the application framework layer draws the window according to the settings of the activity manager, and then sends the drawn window data to the display driver of the kernel layer, and the display driver displays the corresponding application interface in different display areas of the display screen.
  • the application layer contains at least one application whose corresponding icon control can be displayed on the display, such as: a live TV application icon control, a video-on-demand application icon control, a media center application icon control, an application center icon control, a game application icon control, and the like.
  • the live TV application may provide live TV from different sources.
  • a live TV application may provide a TV signal using input from cable, over-the-air, satellite services, or other types of live TV services.
  • the live TV application may display the video of the live TV signal on the display device 200 .
  • a video-on-demand application may provide video from various storage sources. Unlike live TV applications, video-on-demand provides a display of video from certain storage sources. For example, video-on-demand can come from the server side of cloud storage, from local hard disk storage containing existing video programs.
  • the media center application may provide various multimedia content playback applications.
  • a media center may provide services other than live TV or video-on-demand, where users can access various images or audio through a media center application.
  • the application center may provide storage of various applications.
  • An application can be a game, an application, or some other application that is related to a computer system or other device but can be run on a Smart TV.
  • the application center can obtain these applications from various sources, store them in local storage, and then run them on the display device 200 .
  • the display device 200 usually uses the ACR (Auto Content Recognition) function to capture the displayed content for content recognition, and finally uses the content recognition result to enhance the AQ (Audio Quality) and PQ (Picture Quality) of the display device 200, as well as for content recommendation, so as to improve the user's experience of using the display device 200.
  • the ACR function also has some disadvantages; for example, its content recognition depends on a third-party service provider, and it is only supported in specific countries, so its use is limited.
  • some display devices 200 currently use the AIPQ (Artificial Intelligence Picture Quality, intelligent image mode switching) function.
  • the AIPQ function uses the machine learning model to identify the scene of the content currently being played by the display device 200, and automatically applies scene-specific PQ parameters according to the identified scene, thereby providing users with a better viewing experience. This function is not restricted by third-party service providers and can be used in any region, with a wider range of use.
  • when no scene can be recognized, the display device 200 will use the pre-stored default PQ parameters to display images.
  • the images displayed according to the default PQ parameters often cannot satisfy the user's viewing requirements. Therefore, the current use of the AIPQ function by the display device 200 also has the problem of affecting the user's experience.
  • the embodiments of the present application provide an image display method and a display device, which combine the intelligent image mode switching AIPQ function with the automatic content recognition ACR function.
  • when the intelligent image mode switching AIPQ function cannot recognize the scene, the solution of the present application can still achieve high-quality image display without using the default image parameters, ensuring that the target image can meet the user's experience requirements for viewing the display device 200, and at the same time avoiding conflicts in the adjustment process of image quality parameters when the AIPQ function and the ACR function are used at the same time.
  • the display device 200 provided in this embodiment of the present application includes at least a display 275 and a controller 250, where the display 275 is used to display a target image that the user needs to watch, and the controller 250 is used to control the display device 200 to respond to a control instruction input by the user and set image quality parameters and display the target image according to the image quality parameters, etc.
  • when the user wants to view a certain image, he or she will input an instruction to the display device 200 to adjust the content currently displayed on the display device 200.
  • the user can input an instruction to the display device 200 by pressing a button on the remote control, or by speaking the content to be selected to the display device 200.
  • after receiving the user's instruction, the display device 200 will select the corresponding signal source channel and play the target image that the user wants to watch.
  • the controller 250 needs to acquire the target image, then use the intelligent image mode switching AIPQ function and/or the automatic content recognition ACR function to set the target image quality parameters corresponding to the target image, and finally control the display 275 to display the above-mentioned target image according to the target image quality parameters.
  • both the AIPQ function and the ACR function can match or calculate a series of image quality parameters for the identified content.
  • these image quality parameters can make the target image display better: the image is clearer, the RGB brightness of the image is better optimized, and the target image appears more realistic.
  • the AIPQ function can use the machine learning model to identify the scene of the target image, and match the scene-specific image quality parameters according to the scene.
  • some image quality parameters are correspondingly set in advance for different scenarios that can be recognized by the AIPQ function.
  • the scenes that can be recognized by the AIPQ function include grass, sky, face, buildings, etc., then some better image quality parameters will be pre-configured for grass, sky, face and buildings.
  • the AIPQ function can match the image quality parameters corresponding to the grass scene, and then adjust the clarity, contrast, image RGB brightness, chromaticity, etc. of the grass scene image, so that the grass and other objects in the target image are presented more realistically.
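  • A minimal Kotlin sketch of such pre-configured, scene-specific picture quality parameters is shown below. The parameter names and numeric values are assumptions for illustration; this application only states that sharpness, contrast, image RGB brightness, chromaticity, and the like are adjusted per scene.

```kotlin
// Hypothetical sketch: scene-specific PQ parameters pre-configured for the scenes the AIPQ model can recognize.
data class PqParams(val sharpness: Int, val contrast: Int, val brightness: Int, val chroma: Int)

val scenePqTable = mapOf(
    "grass"    to PqParams(sharpness = 60, contrast = 55, brightness = 50, chroma = 65),
    "sky"      to PqParams(sharpness = 45, contrast = 50, brightness = 55, chroma = 60),
    "face"     to PqParams(sharpness = 40, contrast = 45, brightness = 52, chroma = 48),
    "building" to PqParams(sharpness = 70, contrast = 60, brightness = 50, chroma = 45)
)

// Returns the pre-configured parameters for a recognized scene, or null if the scene is not covered.
fun pqForScene(scene: String): PqParams? = scenePqTable[scene]
```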
  • the ACR function can use computer algorithms to directly identify multimedia content, and then calculate a series of parameters corresponding to the content according to the identified content.
  • the ACR function in the embodiment of the present application is mainly used to identify the content of the target image and then calculate and set some image quality parameters according to the identified content. For example, if the ACR function identifies a face, grass, and the like, it can take into account the color characteristics of the face and the grass, and then calculate image quality parameters that match those characteristics, so as to adjust the sharpness, contrast, image RGB brightness, chromaticity, etc. of the face and grass displayed in the target image and make objects such as faces and grass appear more realistic.
  • after acquiring the target image, the controller 250 also needs to determine whether the target signal source where the target image is located is in the signal source whitelist, so as to determine whether the display device 200 can currently use the AIPQ function to set the target image parameters.
  • to use the AIPQ function, the display device 200 needs to capture the image it is currently displaying. However, some third-party applications such as Netflix, Amazon, and Youtube do not allow screenshots of their video based on CSP (Content-Security-Policy) requirements, so the display device 200 cannot perform the AIPQ operation on such copyright-protected content, and forcing the AIPQ operation would cause complaints from partners and legal problems. Based on this situation, signal sources without CSP requirements can be added to the signal source whitelist, and whether the content provided by a signal source is allowed to undergo the AIPQ operation is determined by judging whether the signal source exists in the whitelist.
  • if the target signal source exists in the signal source whitelist, the AIPQ function and/or the ACR function can be used to set the target image quality parameter, and the controller 250 then controls the display 275 to display the target image according to the target image quality parameter; if the target signal source does not exist in the signal source whitelist, the target signal source cannot support the AIPQ operation, the controller 250 can only use the ACR function to set the target image quality parameter, and the controller 250 then controls the display 275 to display the target image according to the target image quality parameter.
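  • A minimal Kotlin sketch of the whitelist check is given below. The SignalSource type and the whitelist entries are assumptions; the description above only requires that sources without CSP restrictions are listed, and that non-whitelisted sources are never captured for AIPQ.

```kotlin
// Hypothetical sketch: decide whether the current signal source may be captured for the AIPQ operation.
data class SignalSource(val name: String)

val aipqWhitelist = setOf("HDMI1", "HDMI2", "ATV", "DTV")   // illustrative entries only

fun canUseAipq(source: SignalSource): Boolean = source.name in aipqWhitelist
// Whitelisted source: AIPQ and/or ACR may set the parameters; otherwise only the ACR function is used.
```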
  • the controller 250 will then continue to determine whether the scene of the target image can be recognized using the AIPQ function, because although the machine learning model trained for the AIPQ function is based on a large number of sample image scenes, it cannot be guaranteed to cover all scenes, and the scenes that the AIPQ function can recognize are limited. For example, if only the four scenes of grass, sky, face, and building were trained in the machine learning model, AIPQ cannot recognize scenes other than these four.
  • if the scene of the target image can be recognized using the AIPQ function, the controller 250 can use the AIPQ function to set the target image quality parameter and then control the display 275 to display the target image according to the target image quality parameter; if the scene of the target image cannot be recognized using the AIPQ function, the controller 250 needs to use the ACR function to set the target image quality parameter and then control the display 275 to display the target image according to the target image quality parameter.
  • the controller 250 can use the ACR function to set the target image quality parameter.
  • the controller 250 also needs to determine whether the ACR function of the display device 200 is available, that is, to determine whether the ACR function on the display device 200 is enabled.
  • the display device 200 supporting the ACR function will display the option of the ACR function on its setting interface. The ACR function is disabled by default. If it needs to be enabled, the user needs to control it by inputting an instruction.
  • the controller 250 continues to determine whether the ACR function of the display device 200 is available. If the ACR function is available, the controller 250 can use the ACR function to set the target image quality parameter, and then the controller 250 controls the display 275 to display the target image according to the target image quality parameter; if the ACR function is unavailable, the controller 250 needs to obtain the preset image quality parameter in the display device 200 as the target image quality parameter, and then control the display 275 to display the target image according to the target image quality parameter.
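  • The availability check and fallback described above might look like the Kotlin sketch below, reusing the PqParams type from the earlier sketch. The AcrService interface and its member names are hypothetical; the description only requires that ACR is used when it is enabled and that the preset parameters are used otherwise.

```kotlin
// Hypothetical sketch: use ACR-computed parameters when ACR is enabled, otherwise fall back to the preset.
interface AcrService {
    val isEnabled: Boolean              // ACR is disabled by default until the user switches it on
    fun computePqParams(): PqParams     // parameters calculated from the recognized content
}

fun pqWhenAipqCannotBeUsed(acr: AcrService, presetPqParams: PqParams): PqParams =
    if (acr.isEnabled) acr.computePqParams() else presetPqParams
```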
  • the embodiment of the present application can directly use the ACR function to set the target image quality parameter when the AIPQ function is unavailable, so as to avoid the problem that the target image cannot meet the user's needs because default image parameters are used when the AIPQ function is unavailable.
  • the controller 250 can use the AIPQ function to set the target image quality parameter.
  • the ACR function of the display device 200 may be available. If the ACR function is available, the controller 250 can use the AIPQ function and the ACR function to jointly set the target image quality parameters, so that the image quality parameters can be optimized to the greatest extent.
  • the controller 250 controls the display 275 to display according to the first image quality parameter, and the image displayed at this time is regarded as the image to be processed. Then, the controller 250 may continue to determine whether the ACR function of the display device 200 is available.
  • if the ACR function is unavailable, the controller 250 directly uses the first image quality parameter set by the AIPQ function as the target image quality parameter, and the image to be processed is the target image at this time; if the ACR function is available, the controller 250 continues to use the ACR function to set the target image quality parameter of the image to be processed, and then controls the display 275 to display the target image according to the target image quality parameter.
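  • The two-stage flow above (AIPQ sets a first image quality parameter, the image to be processed is displayed, then ACR refines it when available) is sketched below in Kotlin, reusing PqParams and AcrService from the earlier sketches. The AipqEngine interface, its methods, and the display callback are hypothetical names.

```kotlin
// Hypothetical sketch: first-pass AIPQ parameters, then an optional ACR pass over the displayed image.
interface AipqEngine {
    fun recognizeScene(frame: IntArray): String?    // null when the scene cannot be recognized
    fun firstPqParams(scene: String): PqParams      // scene-specific first image quality parameter
}

fun applyAipqThenAcr(
    scene: String, aipq: AipqEngine, acr: AcrService, display: (PqParams) -> Unit
): PqParams {
    val first = aipq.firstPqParams(scene)           // AIPQ first-pass parameters for the recognized scene
    display(first)                                  // the image shown now is the image to be processed
    val target = if (acr.isEnabled) acr.computePqParams() else first
    display(target)                                 // final target image quality parameter
    return target
}
```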
  • the embodiment of the present application can use the ACR function again on the basis of the AIPQ function, which can optimize the image quality parameters to the greatest extent, so that the target image displayed according to the image quality parameters better meets the needs of the user.
  • using the ACR function to further set the target image quality parameters in this way can effectively prevent conflicts that would otherwise arise if the controller 250 used the AIPQ function and the ACR function to set the image quality parameters of the same target image at the same time.
  • the controller 250 can only set the target image quality parameter by using the ACR function.
  • the ACR function of the display device 200 may not be available. For example, the function is disabled in a default state. If there is no user control to enable the ACR function, the ACR function is always unavailable. Therefore, in some embodiments, as shown in FIG. 11 , if the scene in the target image cannot be recognized by the AIPQ function, before the controller 250 uses the ACR function, it can continue to determine whether the ACR function of the display device 200 is available.
  • if the ACR function is available, the controller 250 can use the ACR function to set the target image quality parameter, and then the controller 250 controls the display 275 to display the target image according to the target image quality parameter; if the ACR function is not available, the controller 250 needs to acquire the preset image quality parameter of the display device 200 as the target image quality parameter, and then control the display 275 to display the target image according to the target image quality parameter.
  • the embodiment of the present application can use the ACR function to set the target image quality parameters, so as to avoid the problem that the target image cannot meet the user's needs when default image parameters are used because the AIPQ function cannot recognize the scene.
  • the AIPQ function on the current display device 200 can also be controlled to be turned on or off. For example, if the user has not turned off the AIPQ function in advance and the controller 250 detects that the current target signal source is in the signal source whitelist, the AIPQ option on the setting interface is switched on; if the target signal source is not in the signal source whitelist, the AIPQ option on the setting interface is hidden, that is, it becomes unavailable.
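  • A small Kotlin sketch of keeping the AIPQ switch on the setting interface in step with the whitelist state is shown below; SettingsUi and its methods are hypothetical names, and canUseAipq/SignalSource come from the earlier whitelist sketch.

```kotlin
// Hypothetical sketch: show and switch on the AIPQ option for whitelisted sources, hide it otherwise.
interface SettingsUi {
    fun setAipqSwitchVisible(visible: Boolean)
    fun setAipqSwitchOn(on: Boolean)
}

fun syncAipqSwitch(ui: SettingsUi, source: SignalSource, userTurnedAipqOff: Boolean) {
    if (canUseAipq(source)) {
        ui.setAipqSwitchVisible(true)
        if (!userTurnedAipqOff) ui.setAipqSwitchOn(true)   // the user did not turn AIPQ off in advance
    } else {
        ui.setAipqSwitchVisible(false)                     // hidden, i.e. the option becomes unavailable
    }
}
```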
  • the display device 200 provided by the embodiment of the present application can set the target image quality parameter of the target image in combination with the AIPQ function and the ACR function, so that the target image finally displayed according to the target image quality parameter can meet the viewing needs of the user.
  • moreover, when the AIPQ function cannot recognize the scene, the solution of the embodiment of the present application can also use the target image quality parameters set by the ACR function and avoid using the default image parameters, thereby realizing the display of high-quality images, ensuring that the target image displayed according to the target image quality parameter can meet the user's experience requirements for viewing the display device 200, and also avoiding conflicts in the adjustment process of the image quality parameters when the AIPQ function and the ACR function are used at the same time.
  • the embodiment of the present application also provides an image display method, the method mainly includes the steps performed by the controller 250 in the foregoing embodiment, as shown in FIG. 12 , the method mainly includes:
  • Step S101: acquiring the target image that the user needs to watch;
  • Step S102: using the intelligent image mode switching AIPQ function and/or the automatic content recognition ACR function to set the target image quality parameter corresponding to the target image;
  • Step S103: controlling the display 275 to display the target image according to the target image quality parameter.
  • in step S102, it can also be judged whether the target signal source where the target image is located exists in the signal source whitelist, whether the scene of the target image can be recognized using the AIPQ function, and whether the ACR function of the display device 200 is available, and different image quality parameter setting operations are performed according to the different judgment results.
  • if the target signal source exists in the signal source whitelist, it is further determined whether the scene of the target image can be recognized using the AIPQ function. If the scene of the target image can be recognized using the AIPQ function, the target image quality parameter can be set using the AIPQ function, and the display 275 is then controlled to display the target image according to the target image quality parameter; if the scene of the target image cannot be recognized by the AIPQ function, the target image quality parameter needs to be set using the ACR function, and the display 275 is then controlled to display the target image according to the target image quality parameter.
  • when the target signal source exists in the signal source whitelist and the scene of the target image can be recognized using the AIPQ function, the AIPQ function is used to set the first image quality parameter corresponding to the target image, and the display 275 is controlled to display according to the first image quality parameter; the image displayed at this time is the image to be processed. Then, it can be further judged whether the ACR function of the display device 200 is available. If the ACR function is unavailable, the first image quality parameter set by the AIPQ function is directly used as the target image quality parameter, and the image to be processed is the target image at this time; if the ACR function is available, the ACR function is further used to set the target image quality parameter of the image to be processed, and the display 275 is then controlled to display the target image according to the target image quality parameter.
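  • The overall method of FIG. 12/FIG. 13 can be summarized as the Kotlin sketch below, reusing the types from the earlier sketches (SignalSource, PqParams, AipqEngine, AcrService, canUseAipq). All names are hypothetical; only the branch structure, which follows the description above, is taken from this application.

```kotlin
// Hypothetical end-to-end sketch: S101 acquire the target image, S102 choose the target image quality
// parameter from the whitelist / scene-recognition / ACR-availability checks, S103 display the image.
fun displayTargetImage(
    frame: IntArray,                    // S101: the acquired target image the user wants to watch
    source: SignalSource,
    aipq: AipqEngine,
    acr: AcrService,
    presetPqParams: PqParams,
    display: (PqParams) -> Unit
) {
    // S102: set the target image quality parameter
    val target: PqParams = if (canUseAipq(source)) {
        val scene = aipq.recognizeScene(frame)
        if (scene != null) {
            val first = aipq.firstPqParams(scene)                   // AIPQ first image quality parameter
            display(first)                                          // image to be processed
            if (acr.isEnabled) acr.computePqParams() else first     // ACR refines it when available
        } else {
            if (acr.isEnabled) acr.computePqParams() else presetPqParams
        }
    } else {
        // Source not whitelisted: the AIPQ operation is not allowed, so only ACR (or the preset) is used.
        if (acr.isEnabled) acr.computePqParams() else presetPqParams
    }
    // S103: display the target image according to the target image quality parameter
    display(target)
}
```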
  • the embodiment of the present application can directly use the ACR function to set the target image quality parameter when the AIPQ function is unavailable, avoiding the problem that the target image cannot meet the user's needs because default image parameters are used when the AIPQ function is unavailable.
  • the embodiment of the present application can use the ACR function again on the basis of the AIPQ function, which can optimize the image quality parameters to the greatest extent. Further, the target image displayed according to the image quality parameter can better meet the needs of the user, and at the same time, the conflict in the adjustment process of the image quality parameter when the AIPQ function and the ACR function are used at the same time can be avoided.
  • the embodiment of the present application can use the ACR function to set the target image quality parameter, so as to avoid the problem that the target image cannot meet the user's needs when default image parameters are used because the AIPQ function cannot recognize the scene.

Abstract

An image display method and a display device, which can set the target image quality parameter of a target image by combining the artificial intelligence picture quality (AIPQ) function and the automatic content recognition (ACR) function, so that the target image finally displayed according to the target image quality parameter can meet the user's viewing needs.

Description

Image display method and display device
This application claims priority to the Chinese patent application No. 202011078919.4, entitled "Image display method and display device", filed with the Chinese Patent Office on October 10, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of display technology, and in particular to an image display method and a display device.
Background
A display device usually uses the ACR (Auto Content Recognition) function to capture the displayed content for content recognition, and finally uses the content recognition result to enhance the display device's AQ (Audio Quality) and PQ (Picture Quality), as well as for content recommendation, so as to improve the user's experience of using the display device.
Summary
The present application provides an image display method and a display device.
In a first aspect, the present application provides a display device, including:
a display, configured to display a target image that a user needs to watch on the display device;
a controller, configured to:
acquire the target image that the user needs to watch;
set a target image quality parameter corresponding to the target image by using an intelligent image mode switching AIPQ function and/or an automatic content recognition ACR function; and
control the display to display the target image according to the target image quality parameter.
In a second aspect, the present application further provides an image display method, including:
acquiring the target image that the user needs to watch;
setting a target image quality parameter corresponding to the target image by using the intelligent image mode switching AIPQ function and/or the automatic content recognition ACR function; and
controlling the display to display the target image according to the target image quality parameter.
Brief Description of the Drawings
The drawings needed in the embodiments are briefly introduced below. Obviously, for a person of ordinary skill in the art, other drawings can also be obtained from these drawings without creative effort.
FIG. 1 exemplarily shows a schematic diagram of an operation scenario between a display device and a control apparatus according to some embodiments;
FIG. 2 exemplarily shows a hardware configuration block diagram of a display device 200 according to some embodiments;
FIG. 3 exemplarily shows a hardware configuration block diagram of a control apparatus 100 according to some embodiments;
FIG. 4 exemplarily shows a schematic diagram of the software configuration in the display device 200 according to some embodiments;
FIG. 5 exemplarily shows a schematic diagram of the icon control interface display of applications in the display device 200 according to some embodiments;
FIG. 6 is a schematic diagram of a first control flow of the controller 250 in the display device 200 according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a second control flow of the controller 250 in the display device 200 according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a third control flow of the controller 250 in the display device 200 according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a fourth control flow of the controller 250 in the display device 200 according to an embodiment of the present application;
FIG. 10 is a schematic diagram of a fifth control flow of the controller 250 in the display device 200 according to an embodiment of the present application;
FIG. 11 is a schematic diagram of a sixth control flow of the controller 250 in the display device 200 according to an embodiment of the present application;
FIG. 12 is a flowchart of an image display method according to an embodiment of the present application;
FIG. 13 is a flowchart of another image display method according to an embodiment of the present application.
Detailed Description
To make the purposes, implementations, and advantages of the present application clearer, the exemplary implementations of the present application will be described clearly and completely below with reference to the drawings of the exemplary embodiments of the present application. Obviously, the described exemplary embodiments are only a part of the embodiments of the present application, not all of them.
Based on the exemplary embodiments described in the present application, all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the protection scope of the claims appended to the present application. In addition, although the disclosure of the present application is introduced by way of one or several exemplary examples, it should be understood that various aspects of the disclosure may also separately constitute a complete implementation.
It should be noted that the brief explanations of terms in the present application are only intended to facilitate understanding of the implementations described below, and are not intended to limit the implementations of the present application. Unless otherwise indicated, these terms should be understood according to their ordinary and usual meanings.
The terms "first", "second", "third", and the like in the specification, claims, and drawings of the present application are used to distinguish similar or same-kind objects or entities, and do not necessarily imply a specific order or sequence, unless otherwise indicated. It should be understood that terms used in this way are interchangeable where appropriate, for example, the embodiments can be implemented in an order other than those given in the illustrations or descriptions of the embodiments of the present application.
In addition, the terms "include" and "have" and any variations thereof are intended to cover a non-exclusive inclusion; for example, a product or device that includes a series of components is not necessarily limited to the components clearly listed, but may include other components that are not clearly listed or that are inherent to the product or device.
The term "module" used in the present application refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code capable of performing the function associated with that element.
The term "remote control" used in the present application refers to a component of an electronic device (such as the display device disclosed in the present application) that can generally wirelessly control the electronic device within a relatively short distance. It is generally connected with the electronic device using infrared and/or radio frequency (RF) signals and/or Bluetooth, and may also include functional modules such as WiFi, wireless USB, Bluetooth, and motion sensors. For example, a hand-held touch remote control replaces most of the physical built-in hard keys of a general remote control device with a user interface on a touch screen.
The term "gesture" used in the present application refers to a user behavior used by the user to express an expected thought, action, purpose, and/or result through an action such as a change of hand shape or a hand movement.
FIG. 1 exemplarily shows a schematic diagram of an operation scene between the display device and the control apparatus according to an embodiment. As shown in FIG. 1, a user can operate the display device 200 through the mobile terminal 300 and the control apparatus 100.
In some embodiments, the control apparatus 100 may be a remote control. The communication between the remote control and the display device includes infrared protocol communication, Bluetooth protocol communication and other short-range communication methods, and the display device 200 is controlled in a wireless or other wired manner. The user can input user instructions through keys on the remote control, voice input, control panel input and the like to control the display device 200. For example, the user can input corresponding control instructions through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key and power key on the remote control to control the functions of the display device 200.
In some embodiments, a mobile terminal, tablet computer, computer, notebook computer and other smart devices may also be used to control the display device 200, for example through an application running on the smart device. Through configuration, the application can provide the user with various controls in an intuitive user interface (UI) on the screen associated with the smart device.
In some embodiments, software applications may be installed on the mobile terminal 300 and the display device 200 to implement connection and communication through a network communication protocol, so as to achieve one-to-one control operation and data communication. For example, a control instruction protocol can be established between the mobile terminal 300 and the display device 200, the remote-control keyboard can be synchronized to the mobile terminal 300, and the display device 200 can be controlled by operating the user interface on the mobile terminal 300. The audio and video content displayed on the mobile terminal 300 can also be transmitted to the display device 200 to implement a synchronous display function.
As also shown in FIG. 1, the display device 200 further performs data communication with the server 400 through multiple communication methods. The display device 200 may be allowed to communicate via a local area network (LAN), a wireless local area network (WLAN) and other networks. The server 400 can provide various content and interaction to the display device 200. For example, the display device 200 sends and receives information, interacts with an electronic program guide (EPG), receives software program updates, or accesses a remotely stored digital media library. The server 400 may be one cluster or multiple clusters, and may include one or more types of servers. Video on demand, advertising services and other network service content are provided through the server 400.
The display device 200 may be a liquid crystal display, an OLED display or a projection display device. The specific type, size and resolution of the display device are not limited, and persons skilled in the art will understand that the display device 200 may be modified in performance and configuration as needed.
In addition to the broadcast receiving television function, the display device 200 may additionally provide computer-supported smart network television functions, including but not limited to network television, smart television, Internet protocol television (IPTV) and the like.
FIG. 2 exemplarily shows a hardware configuration block diagram of the display device 200 according to an exemplary embodiment.
In some embodiments, the display device 200 includes at least one of a controller 250, a tuner-demodulator 210, a communicator 220, a detector 230, an input/output interface 255, a display 275, an audio output interface 285, a memory 260, a power supply 290, a user interface 265 and an external device interface 240.
In some embodiments, the display 275 is a component for receiving image signals output from the first processor and displaying video content, images and a menu control interface.
In some embodiments, the display 275 includes a display screen assembly for presenting a picture and a driving assembly for driving image display.
In some embodiments, the displayed video content may come from broadcast television content, that is, various broadcast signals received through wired or wireless communication protocols, or various image content received from a network server through network communication protocols may be displayed.
In some embodiments, the display 275 is used to present a user-operable UI interface that is generated in the display device 200 and used to control the display device 200.
In some embodiments, depending on the type of the display 275, a driving assembly for driving the display is also included.
In some embodiments, the display 275 is a projection display and may further include a projection apparatus and a projection screen.
In some embodiments, the communicator 220 is a component for communicating with external devices or external servers according to various communication protocol types. For example, the communicator 220 may include at least one of a WiFi module 221, a Bluetooth module 222, a wired Ethernet module 223 and other network communication protocol modules or near-field communication protocol modules, and an infrared receiver.
In some embodiments, the display device 200 may establish the sending and receiving of control signals and data signals with the external control apparatus 100 or a content providing device through the communicator 220.
In some embodiments, the user interface 265 may be used to receive infrared control signals from the control apparatus 100 (such as an infrared remote control).
In some embodiments, the detector 230 is used by the display device 200 to collect signals from the external environment or for interaction with the outside.
In some embodiments, the detector 230 includes a light receiver, a sensor for collecting ambient light intensity, so that display parameter changes and the like can be adapted based on the collected ambient light.
In some embodiments, the detector 230 may further include an image collector 232, such as a camera, which may be used to collect external environment scenes and to collect user attributes or interaction gestures, so as to adaptively change display parameters and recognize user gestures, thereby realizing interaction with the user.
In some embodiments, the detector 230 may further include a temperature sensor or the like, for example for sensing the ambient temperature.
In some embodiments, the display device 200 may adaptively adjust the display color temperature of the image. For example, when the ambient temperature is relatively high, the color temperature of the displayed image may be adjusted toward a cooler tone; when the ambient temperature is relatively low, the displayed image may be adjusted toward a warmer tone.
In some embodiments, the detector 230 may further include a sound collector 231 or the like, such as a microphone, which may be used to receive the user's voice, for example a voice signal of a control instruction for controlling the display device 200, or to collect ambient sound for recognizing the type of the ambient scene, so that the display device 200 can adapt to the ambient noise.
In some embodiments, as shown in FIG. 2, the input/output interface 255 is configured to enable data transmission between the controller 250 and other external devices or other controllers 250, such as receiving video signal data, audio signal data or command instruction data from an external device.
In some embodiments, the external device interface 240 may include, but is not limited to, any one or more of the following: a high-definition multimedia interface (HDMI), an analog or data high-definition component input interface, a composite video input interface, a USB input interface, an RGB port, etc. It may also be a composite input/output interface formed by the above multiple interfaces.
In some embodiments, as shown in FIG. 2, the tuner-demodulator 210 is configured to receive broadcast television signals in a wired or wireless manner, perform modulation and demodulation processing such as amplification, mixing and resonance, and demodulate, from multiple wireless or wired broadcast television signals, audio and video signals, which may include the television audio and video signals carried on the frequency of the television channel selected by the user, as well as EPG data signals.
In some embodiments, the frequency demodulated by the tuner-demodulator 210 is controlled by the controller 250. The controller 250 can send a control signal according to the user's selection, so that the modem responds to the frequency of the television signal selected by the user and modulates and demodulates the television signal carried on that frequency.
In some embodiments, broadcast television signals may be classified into terrestrial broadcast signals, cable broadcast signals, satellite broadcast signals or Internet broadcast signals according to different broadcast systems, into digital modulation signals, analog modulation signals, etc. according to different modulation types, or into digital signals, analog signals, etc. according to different signal types.
In some embodiments, the controller 250 and the tuner-demodulator 210 may be located in different separate devices, that is, the tuner-demodulator 210 may also be in an external device of the main device where the controller 250 is located, such as an external set-top box. In this case, the set-top box outputs the television audio and video signals obtained after modulation and demodulation of the received broadcast television signals to the main device, and the main device receives the audio and video signals through the first input/output interface.
In some embodiments, the controller 250 controls the operation of the display device and responds to the user's operations through various software control programs stored in the memory. The controller 250 may control the overall operation of the display device 200. For example, in response to receiving a user command for selecting a UI object to be displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink or an icon. Operations related to the selected object include, for example, displaying an operation connected to a hyperlink page, a document or an image, or executing a program corresponding to the icon. The user command for selecting the UI object may be a command input through various input apparatuses (for example, a mouse, a keyboard or a touch pad) connected to the display device 200, or a voice command corresponding to a voice spoken by the user.
As shown in FIG. 2, the controller 250 includes at least one of a random access memory 251 (RAM), a read-only memory 252 (ROM), a graphics processor 253 (GPU), a central processing unit 254 (CPU), an input/output interface 255 and a communication bus 256 (Bus), where the communication bus connects the components.
In some embodiments, the RAM 251 is used to store temporary data of the operating system or other running programs.
In some embodiments, the ROM 252 is used to store instructions for various system startups.
In some embodiments, the ROM 252 is used to store a basic input output system (BIOS), which is used to complete the power-on self-test of the system, the initialization of each functional module in the system, the drivers of the basic input/output of the system and the booting of the operating system.
In some embodiments, upon receiving a power-on signal, the power supply of the display device 200 starts up, and the CPU runs the system startup instructions in the ROM 252 and copies the temporary data of the operating system stored in the memory to the RAM 251 so as to start or run the operating system. After the operating system has started, the CPU copies the temporary data of the various applications in the memory to the RAM 251 so as to start or run the various applications.
In some embodiments, the processor 254 is used to execute the operating system and application instructions stored in the memory, and to execute various applications, data and content according to the various interactive instructions received from external input, so as to finally display and play various audio and video content.
In some exemplary embodiments, the processor 254 may include multiple processors, which may include one main processor and one or more sub-processors. The main processor is used to perform some operations of the display device 200 in the pre-power-on mode and/or operations of displaying pictures in the normal mode. The one or more sub-processors are used for operations in states such as standby mode.
In some embodiments, the graphics processor 253 is used to generate various graphic objects, such as icons, operation menus and graphics displayed in response to user input instructions. It includes an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays various objects according to display attributes, and a renderer, which renders the various objects obtained from the arithmetic unit; the rendered objects are used for display on the display.
In some embodiments, the video processor 270 is configured to receive external video signals and perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion and image synthesis according to the standard codec protocol of the input signal, so as to obtain a signal that can be directly displayed or played on the display device 200.
In some embodiments, the video processor 270 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, etc.
The demultiplexing module is used to demultiplex the input audio and video data stream; for example, if MPEG-2 is input, the demultiplexing module demultiplexes it into a video signal, an audio signal, etc.
The video decoding module is used to process the demultiplexed video signal, including decoding and scaling processing.
The image synthesis module, such as an image synthesizer, is used to superimpose and mix the GUI signal, generated by the graphics generator according to user input or by itself, with the scaled video image, so as to generate an image signal for display.
The frame rate conversion module is used to convert the frame rate of the input video, for example converting a 60 Hz frame rate to a 120 Hz or 240 Hz frame rate, which is usually implemented by frame interpolation.
The display formatting module is used to convert the video output signal after frame rate conversion into a signal conforming to the display format, such as outputting an RGB data signal.
In some embodiments, the graphics processor 253 may be integrated with the video processor or set separately. When integrated, the processing of graphic signals output to the display can be performed together; when set separately, different functions can be performed respectively, for example a GPU + FRC (Frame Rate Conversion) architecture.
In some embodiments, the audio processor 280 is used to receive external audio signals and perform decompression and decoding, as well as noise reduction, digital-to-analog conversion and amplification, according to the standard codec protocol of the input signal, so as to obtain a sound signal that can be played through the speaker.
In some embodiments, the video processor 270 may consist of one or more chips, and the audio processor may also consist of one or more chips.
In some embodiments, the video processor 270 and the audio processor 280 may be separate chips, or may be integrated together with the controller in one or more chips.
In some embodiments, the audio output receives, under the control of the controller 250, the sound signal output by the audio processor 280, such as the speaker 286, and, in addition to the speaker carried by the display device 200 itself, an external sound output terminal for the sound generating apparatus of an external device, such as an external sound interface or an earphone interface; it may further include a near-field communication module in the communication interface, for example a Bluetooth module for sound output through a Bluetooth speaker.
The power supply 290, under the control of the controller 250, provides power supply support for the display device 200 with power input from an external power source. The power supply 290 may include a built-in power supply circuit installed inside the display device 200, or may be a power supply installed outside the display device 200, that is, a power interface providing an external power supply in the display device 200.
The user interface 265 is used to receive the user's input signal and then send the received user input signal to the controller 250. The user input signal may be a remote control signal received through an infrared receiver, or various user control signals received through a network communication module.
In some embodiments, the user inputs a user command through the control apparatus 100 or the mobile terminal 300, the user input interface acts according to the user's input, and the display device 200 responds to the user's input through the controller 250.
In some embodiments, the user may input a user command through a graphical user interface (GUI) displayed on the display 275, and the user input interface receives the user input command through the graphical user interface. Alternatively, the user may input a user command by making a specific sound or gesture, and the user input interface recognizes the sound or gesture through a sensor to receive the user input command.
The memory 260 stores various software modules for driving the display device 200, such as the various software modules stored in the first memory, including at least one of a basic module, a detection module, a communication module, a display control module, a browser module and various service modules.
The basic module is the underlying software module for signal communication between the various hardware components of the display device 200 and for sending processing and control signals to upper-layer modules. The detection module is a management module used to collect various information from various sensors or user input interfaces and to perform digital-to-analog conversion, analysis and management.
For example, the voice recognition module includes a voice parsing module and a voice instruction database module. The display control module is used to control the display to display image content and can be used to play multimedia image content, UI interfaces and other information. The communication module is used for control and data communication with external devices. The browser module is used to perform data communication with browsing servers. The service module is used to provide various services and various applications. In addition, the memory 260 also stores received external data and user data, images of the various items in various user interfaces, visual effect diagrams of focus objects, etc.
FIG. 3 exemplarily shows a configuration block diagram of the control apparatus 100 according to an exemplary embodiment. As shown in FIG. 3, the control apparatus 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory 190 and a power supply 180.
The control apparatus 100 is configured to control the display device 200, receive the user's input operation instructions, and convert the operation instructions into instructions that the display device 200 can recognize and respond to, acting as an intermediary in the interaction between the user and the display device 200. For example, the user operates the channel up/down keys on the control apparatus 100, and the display device 200 responds to the channel up/down operation.
In some embodiments, the control apparatus 100 may be a smart device. For example, the control apparatus 100 may be installed with various applications for controlling the display device 200 according to user requirements.
In some embodiments, as shown in FIG. 1, the mobile terminal 300 or another smart electronic device, after installing an application for operating the display device 200, can perform functions similar to the control apparatus 100. For example, by installing the application, the user can use the various function keys or virtual buttons of the graphical user interface provided on the mobile terminal 300 or another smart electronic device to implement the functions of the physical keys of the control apparatus 100.
The controller 110 includes a processor 112, a RAM 113 and a ROM 114. The controller is used to control the operation of the control apparatus 100, the communication and cooperation among the internal components, and the external and internal data processing functions.
The communication interface 130, under the control of the controller 110, implements communication of control signals and data signals with the display device 200, for example sending the received user input signal to the display device 200. The communication interface 130 may include at least one of a WiFi chip 131, a Bluetooth module 132, an NFC module 133 and other near-field communication modules.
The user input/output interface 140 includes an input interface with at least one of a microphone 141, a touch pad 142, a sensor 143, keys 144 and other input interfaces. For example, the user can input user instructions through actions such as voice, touch, gesture and pressing; the input interface converts the received analog signal into a digital signal, converts the digital signal into a corresponding instruction signal, and sends it to the display device 200.
The output interface includes an interface for sending the received user instructions to the display device 200. In some embodiments, it may be an infrared interface or a radio frequency interface. For an infrared signal interface, the user input instruction needs to be converted into an infrared control signal according to the infrared control protocol and sent to the display device 200 via an infrared sending module. For a radio frequency signal interface, the user input instruction needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then sent to the display device 200 by a radio frequency sending terminal.
In some embodiments, the control apparatus 100 includes at least one of the communication interface 130 and the input/output interface 140. The control apparatus 100 is configured with the communication interface 130, such as WiFi, Bluetooth and NFC modules, and can encode the user input instruction using the WiFi protocol, Bluetooth protocol or NFC protocol and send it to the display device 200.
The memory 190 is used to store, under the control of the controller, the various operating programs, data and applications for driving and controlling the control apparatus 100, and can store various control signal instructions input by the user.
The power supply 180 is used to provide operating power support for the components of the control apparatus 100 under the control of the controller. It may be a battery and related control circuits.
In some embodiments, the system may include a kernel, a command parser (shell), a file system and applications. The kernel, shell and file system together form the basic operating system structure, which allows users to manage files, run programs and use the system. After power-on, the kernel starts, activates the kernel space, abstracts the hardware, initializes hardware parameters, and runs and maintains virtual memory, the scheduler, signals and inter-process communication (IPC). After the kernel starts, the shell and user applications are loaded. An application is compiled into machine code after startup and forms a process.
Referring to FIG. 4, in some embodiments the system is divided into four layers, which are, from top to bottom, the applications layer (the "application layer"), the application framework layer (the "framework layer"), the Android runtime and system library layer (the "system runtime library layer"), and the kernel layer.
The framework layer provides an application programming interface (API) and a programming framework for the applications in the application layer. The application framework layer includes some predefined functions and is equivalent to a processing center that decides the actions of the applications in the application layer. Through the API interface, an application can access system resources and obtain system services during execution.
As shown in FIG. 4, in the embodiments of the present application the application framework layer includes managers, a content provider, etc., where the managers include at least one of the following modules: an activity manager, used to interact with all activities running in the system; a location manager, used to provide system services or applications with access to the system location service; a package manager, used to retrieve various information related to the application packages currently installed on the device; a notification manager, used to control the display and clearing of notification messages; and a window manager, used to manage icons, windows, toolbars, wallpapers and desktop widgets on the user interface.
In some embodiments, the activity manager is used to manage the life cycle of each application and the usual navigation back-off function, such as controlling the exit of an application (including switching the user interface currently displayed in the display window to the system desktop), opening, and going back (including switching the user interface currently displayed in the display window to the upper-level user interface of the currently displayed one).
In some embodiments, the window manager is used to manage all window programs, such as obtaining the display screen size, determining whether there is a status bar, locking the screen, capturing the screen, and controlling display window changes (for example, shrinking the display window, displaying with dithering, displaying with distortion, etc.).
In some embodiments, the system runtime library layer provides support for the upper layer, that is, the framework layer. When the framework layer is used, the Android operating system runs the C/C++ libraries contained in the system runtime library layer to implement the functions to be implemented by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in FIG. 4, the kernel layer contains at least one of the following drivers: audio driver, display driver, Bluetooth driver, camera driver, WIFI driver, USB driver, HDMI driver, sensor drivers (such as fingerprint sensor, temperature sensor, touch sensor, pressure sensor, etc.), and so on.
In some embodiments, the kernel layer further includes a power driver module for power management.
In some embodiments, for a display device with a touch function, taking a split-screen operation as an example, the display device receives an input operation (such as a split-screen operation) performed by the user on the display screen, and the kernel layer generates a corresponding input event according to the input operation and reports the event to the application framework layer. The activity manager of the application framework layer sets the window mode (such as multi-window mode) corresponding to the input operation, as well as the window position and size. The window manager of the application framework layer draws the window according to the settings of the activity manager, and then sends the drawn window data to the display driver of the kernel layer, and the display driver displays the corresponding application interfaces in different display areas of the display screen.
In some embodiments, as shown in FIG. 5, the application layer contains at least one application that can display a corresponding icon control on the display, such as a live TV application icon control, a video-on-demand application icon control, a media center application icon control, an application center icon control, a game application icon control, etc.
In some embodiments, the live TV application can provide live TV through different signal sources. For example, the live TV application can provide TV signals using input from cable TV, wireless broadcasting, satellite services or other types of live TV services, and can display the video of the live TV signal on the display device 200.
In some embodiments, the video-on-demand application can provide videos from different storage sources. Unlike the live TV application, video-on-demand provides video display from certain storage sources, for example from a cloud storage server or from a local hard disk containing stored video programs.
In some embodiments, the media center application can provide playback of various multimedia content. For example, the media center, different from live TV or video-on-demand, allows the user to access services provided for various images or audio through the media center application.
In some embodiments, the application center can provide storage for various applications. An application may be a game, an application program, or some other application related to a computer system or other device but runnable on a smart TV. The application center can obtain these applications from different sources, store them in local storage, and then run them on the display device 200.
At present, the display device 200 usually uses the ACR (Auto Content Recognition) function to capture the displayed content for content recognition, and then uses the recognition result for AQ (Audio Quality) and PQ (Picture Quality) enhancement of the display device 200, as well as for content recommendation and the like, so as to improve the user's experience of using the display device 200.
However, the ACR function also has some shortcomings: for example, its content recognition relies on a third-party service provider, and the ACR function is only supported in specific countries, so its use is limited. To avoid the problems caused by these shortcomings of the ACR function, some display devices 200 currently also use the AIPQ (Artificial Intelligence Picture Quality, intelligent image mode switching) function. The AIPQ function uses a machine learning model to recognize the scene of the content currently being played on the display device 200, and automatically applies scene-specific PQ parameters according to the recognized scene, providing the user with a better viewing experience. This function is not restricted by third-party service providers and can be used in any region, so its scope of application is wider.
However, if a scene that cannot be recognized appears while the AIPQ function is in use, the display device 200 will use pre-stored default PQ parameters for image display, and the image displayed according to the default PQ parameters often cannot meet the user's experience requirements for watching the display device 200. Therefore, the current use of the AIPQ function by the display device 200 also affects the user experience.
Based on the above problems, the embodiments of the present application provide an image display method and a display device that combine the intelligent image mode switching AIPQ function with the automatic content recognition ACR function. When the AIPQ function cannot recognize the scene, the solution of the present application can achieve high-quality image display without using the default image parameters, ensuring that the target image can meet the user's experience requirements for watching the display device 200, while also avoiding the conflict in the image quality parameter adjustment process that arises when the AIPQ function and the ACR function are used at the same time.
The display device 200 provided in the embodiments of the present application includes at least a display 275 and a controller 250, where the display 275 is used to display the target image that the user wants to watch, and the controller 250 is used to control the display device 200 to respond to control instructions input by the user, set image quality parameters, display the target image according to the image quality parameters, and so on.
When the user wants to watch an image, the user inputs an instruction to the display device 200 to adjust the content currently displayed. The user may input the instruction by pressing a key on the remote control, or by speaking the content to be selected to the display device 200.
After receiving the user's instruction, the display device 200 selects the corresponding signal source channel to play the target image that the user wants to watch. To ensure that the quality of the target image can meet the user's requirements, as shown in FIG. 6, the controller 250 needs to obtain the target image, then set the target image quality parameter corresponding to the target image by using the intelligent image mode switching AIPQ function and/or the automatic content recognition ACR function, and finally control the display 275 to display the target image according to the target image quality parameter. Generally, both the AIPQ function and the ACR function can match or calculate a series of image quality parameters for the recognized content. Compared with the default image quality parameters in the display device 200, such image quality parameters can make the target image clearer, better optimize the RGB brightness and other attributes of the image, and make the target image more realistic. A rough code sketch of this top-level control flow is given after this paragraph.
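The sketch below illustrates this top-level flow in Python for illustration only; the class and helper names (QualityParams, select_quality_params, display.render, and so on) are hypothetical stand-ins, not an API of the display device 200.

```python
from dataclasses import dataclass

@dataclass
class QualityParams:
    # Simplified stand-in for the picture-quality parameters mentioned above
    # (clarity/sharpness, contrast, RGB brightness, chroma).
    sharpness: int
    contrast: int
    rgb_brightness: int
    chroma: int

def show_target_image(controller, target_image):
    # Set the target image quality parameter with AIPQ and/or ACR,
    # then control the display to show the image with that parameter.
    params = controller.select_quality_params(target_image)
    controller.display.render(target_image, params)
```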
The AIPQ function can use a machine learning model to recognize the scene of the target image and match scene-specific image quality parameters to that scene. In practice, image quality parameters are preconfigured for each of the scenes that the AIPQ function can recognize. For example, if the recognizable scenes include grass, sky, face and building, preferred image quality parameters are preconfigured for grass, sky, face and building respectively. If the scene of the target image is recognized as grass, the AIPQ function can match the image quality parameters corresponding to the grass scene and then adjust the clarity, contrast, RGB brightness, chroma and the like of the image, so that the grass and other objects in the target image are rendered more realistically.
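Continuing the sketch above, the AIPQ path can be pictured as a scene classifier plus a lookup table of preconfigured, scene-specific parameters. The scene labels follow the example in this paragraph; the numeric values and function names are placeholders rather than parameters taken from the patent.

```python
# Hypothetical preset table: one preferred parameter set per recognizable scene.
AIPQ_PRESETS = {
    "grass":    QualityParams(sharpness=60, contrast=55, rgb_brightness=50, chroma=65),
    "sky":      QualityParams(sharpness=50, contrast=60, rgb_brightness=55, chroma=60),
    "face":     QualityParams(sharpness=45, contrast=50, rgb_brightness=52, chroma=48),
    "building": QualityParams(sharpness=70, contrast=58, rgb_brightness=50, chroma=50),
}

def aipq_set_params(scene_classifier, target_image):
    # The machine learning model labels the scene of the target image;
    # a label outside the preset table means the scene is not recognized.
    scene = scene_classifier.predict(target_image)
    return AIPQ_PRESETS.get(scene)  # None when the scene cannot be recognized
```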
The ACR function can use computer algorithms to directly recognize multimedia content, and then calculate a series of parameters for the recognized content. In the embodiments of the present application, the ACR function is mainly used to recognize the content of the target image and then calculate and set image quality parameters based on that content. For example, if the ACR function recognizes a face and grass in the target image, it can take the color characteristics of the face and the grass into account, calculate image quality parameters that fit those characteristics, and adjust the clarity, contrast, RGB brightness and chroma of the face and the grass in the target image, making them appear more realistic.
In some embodiments, as shown in FIG. 7, after obtaining the target image, the controller 250 also needs to determine whether the target signal source where the target image is located exists in the signal source whitelist, so as to determine whether the display device 200 can currently use the AIPQ function to set the target image parameters. Using the AIPQ function requires capturing the image currently displayed by the display device 200, but some third-party applications such as Netflix, Amazon and Youtube do not allow screenshots of video content based on the CSP (Content-Security-Policy), so the display device 200 cannot perform AIPQ operations on copyright-protected content; forcing AIPQ operations would cause complaints from partners and legal issues. For this reason, signal sources not subject to CSP requirements can be added to a signal source whitelist, and whether the content provided by a signal source allows AIPQ operations can be determined by checking whether the signal source exists in the whitelist, as sketched below. When the target signal source exists in the signal source whitelist, the AIPQ function and/or the ACR function can be used to set the target image quality parameter, and the controller 250 then controls the display 275 to display the target image according to the target image quality parameter; if the target signal source does not exist in the signal source whitelist, the target signal source cannot support AIPQ operations, and the controller 250 can only use the ACR function to set the target image quality parameter, after which the controller 250 controls the display 275 to display the target image according to the target image quality parameter.
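A minimal sketch of this whitelist gate follows; the source identifiers are invented for illustration, since the actual whitelist contents are device-specific and not specified here.

```python
# Sources assumed not subject to CSP screenshot restrictions; third-party apps
# with copyright-protected content (e.g. Netflix, Amazon, Youtube) stay off
# the whitelist, so AIPQ screen capture is never attempted for them.
SOURCE_WHITELIST = {"HDMI1", "HDMI2", "DTV", "ATV", "USB"}

def aipq_allowed(target_source: str) -> bool:
    # AIPQ needs to capture the displayed picture, so it is only permitted
    # for sources whose content may be captured.
    return target_source in SOURCE_WHITELIST
```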
In some embodiments, as shown in FIG. 8, if the target signal source exists in the signal source whitelist, the controller 250 further determines whether the scene of the target image can be recognized by the AIPQ function. Although the machine learning model trained for the AIPQ function is based on a large number of sample image scenes, it cannot be guaranteed to cover all scenes, so the scenes that the AIPQ function can recognize are limited. For example, if the machine learning model is trained only on the four scenes of grass, sky, face and building, AIPQ cannot recognize scenes other than these four. Only if the scene of the target image can be recognized by the AIPQ function can the controller 250 use the AIPQ function to set the target image quality parameter and then control the display 275 to display the target image according to the target image quality parameter; if the scene cannot be recognized by the AIPQ function, the controller 250 needs to use the ACR function to set the target image quality parameter and then control the display 275 to display the target image according to the target image quality parameter.
As shown in FIG. 7, if the target signal source does not exist in the signal source whitelist, the target signal source cannot currently support the AIPQ function, and the controller 250 may use the ACR function to set the target image quality parameter. However, before using the ACR function, the controller 250 also needs to determine whether the ACR function of the display device 200 is available, that is, whether the ACR function on the display device 200 is turned on. Generally, a display device 200 that supports the ACR function shows an ACR option on its settings interface; the ACR function is turned off by default, and the user needs to turn it on through an input instruction.
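This availability check can be sketched as follows; the settings keys are assumptions used purely for illustration.

```python
def acr_available(settings: dict) -> bool:
    # ACR counts as available only if the device supports it and the user
    # has switched it on, since the option is off by default.
    return settings.get("acr_supported", False) and settings.get("acr_enabled", False)
```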
In some embodiments, as shown in FIG. 9, when the target signal source does not exist in the signal source whitelist, the controller 250 continues by determining whether the ACR function of the display device 200 is available. If the ACR function is available, the controller 250 can use the ACR function to set the target image quality parameter, and then controls the display 275 to display the target image according to the target image quality parameter; if the ACR function is not available, the controller 250 needs to obtain the preset image quality parameters in the display device 200 as the target image quality parameter, and then control the display 275 to display the target image according to the target image quality parameter.
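For this branch of FIG. 9, a sketch under the same assumptions as the earlier snippets might be:

```python
def params_for_non_whitelisted_source(controller, target_image):
    # The target source is outside the whitelist, so AIPQ cannot be used.
    if controller.acr_available():
        return controller.acr_set_params(target_image)  # ACR computes the parameters
    return controller.default_params()                   # fall back to preset defaults
```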
For the case where the target signal source does not exist in the signal source whitelist and the ACR function is available, the embodiments of the present application can directly use the ACR function to set the target image quality parameter when the AIPQ function is unavailable, avoiding the problem that the target image cannot meet the user's needs when default image parameters are used because the AIPQ function is unavailable.
As shown in FIG. 8, only if the scene of the target image can be recognized by the AIPQ function can the controller 250 use the AIPQ function to set the target image quality parameter. In some cases, however, the ACR function of the display device 200 may also be available. If the ACR function is available, the controller 250 can use the AIPQ function and the ACR function together to set the target image quality parameter, which optimizes the image quality parameters to the greatest extent.
Therefore, in some embodiments, as shown in FIG. 10, if the scene of the target image can be recognized by the AIPQ function, the AIPQ function should first be used to set the first image quality parameter corresponding to the target image, and the controller 250 controls the display 275 to display according to the first image quality parameter; the image displayed at this point serves as the image to be processed. The controller 250 can then continue by determining whether the ACR function of the display device 200 is available. If the ACR function is not available, the controller 250 directly uses the first image quality parameter set by the AIPQ function as the target image quality parameter, and the image to be processed is then the target image; if the ACR function is available, the controller 250 continues by using the ACR function to set the target image quality parameter of the image to be processed, and then controls the display 275 to display the target image according to the target image quality parameter.
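The flow of FIG. 10 can be sketched as follows, again with assumed helper names; the "image to be processed" is modeled here as a capture of what the display currently shows.

```python
def params_with_aipq_then_acr(controller, target_image, first_params):
    # Display the target image with the first image quality parameter from AIPQ.
    controller.display.render(target_image, first_params)
    if not controller.acr_available():
        # The first parameter becomes the target parameter; the image to be
        # processed is already the target image.
        return first_params
    # ACR further refines the parameters of the image currently displayed.
    pending_image = controller.display.capture()
    return controller.acr_set_params(pending_image)
```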
For the case where the scene of the target image can be recognized by the AIPQ function and the ACR function is available, the embodiments of the present application can apply the ACR function again on the basis of the AIPQ function, which optimizes the image quality parameters to the greatest extent, so that the target image displayed according to these image quality parameters better meets the user's needs.
In addition, further setting the target image quality parameter with the ACR function on the basis of the image quality parameter first set by the AIPQ function can effectively avoid the conflict in image quality parameter setting that would arise, due to the overlap between the content recognized by the two functions, if the controller 250 used the AIPQ function and the ACR function separately to set the image quality parameters of the same target image.
As shown in FIG. 8, if the scene in the target image cannot be recognized by the AIPQ function, the controller 250 can only use the ACR function to set the target image quality parameter. In some cases, however, the ACR function of the display device 200 may not be available; for example, the function is turned off by default, and if no user ever turns it on, it remains unavailable. Therefore, in some embodiments, as shown in FIG. 11, if the scene in the target image cannot be recognized by the AIPQ function, the controller 250 can, before using the ACR function, continue by determining whether the ACR function of the display device 200 is available. If the ACR function is available, the controller 250 can use the ACR function to set the target image quality parameter, and then controls the display 275 to display the target image according to the target image quality parameter; if the ACR function is not available, the controller 250 needs to obtain the preset image quality parameters in the display device 200 as the target image quality parameter, and then control the display 275 to display the target image according to the target image quality parameter.
For the case where the scene of the target image cannot be recognized by the AIPQ function and the ACR function is available, the embodiments of the present application can use the ACR function to set the target image quality parameter, avoiding the problem that the target image cannot meet the user's needs when default image parameters are used because the AIPQ function cannot recognize the scene.
Moreover, the AIPQ function on the current display device 200 can also be turned on or off under control. For example, if the user has not turned off the AIPQ function in advance and the controller 250 detects that the current target signal source is in the signal source whitelist, the AIPQ function on the settings interface is set to the on state; if the target signal source is not in the signal source whitelist, the AIPQ function on the settings interface is hidden, that is, it becomes unavailable.
Based on the above, the display device 200 provided in the embodiments of the present application can combine the AIPQ function and the ACR function to set the target image quality parameter of the target image, so that the target image finally displayed according to the target image quality parameter can meet the user's viewing needs. Moreover, when the AIPQ function cannot recognize the scene of the target image but the ACR function is available, the solution of the embodiments of the present application can also use the target image quality parameter set by the ACR function instead of the default image parameters, thereby achieving high-quality image display, ensuring that the target image displayed according to the target image quality parameter meets the user's experience requirements for watching the display device 200, and also avoiding the conflict in the image quality parameter adjustment process when the AIPQ function and the ACR function are used at the same time.
The embodiments of the present application further provide an image display method, which mainly includes the steps performed by the controller 250 in the foregoing embodiments. As shown in FIG. 12, the method mainly includes:
Step S101, obtaining a target image that the user wants to watch; Step S102, setting a target image quality parameter corresponding to the target image by using the intelligent image mode switching AIPQ function and/or the automatic content recognition ACR function; Step S103, controlling the display 275 to display the target image according to the target image quality parameter.
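Putting the earlier sketches together, a hypothetical end-to-end version of steps S101 to S103 could look like the following; all helper names remain assumptions, and the authoritative control flow is the one shown in FIG. 12 and FIG. 13.

```python
def image_display_method(controller, target_source, target_image):
    # S101: the target image the user wants to watch is obtained (passed in here).
    # S102: set the target image quality parameter with AIPQ and/or ACR.
    if not aipq_allowed(target_source):
        params = params_for_non_whitelisted_source(controller, target_image)
    else:
        first_params = aipq_set_params(controller.scene_classifier, target_image)
        if first_params is not None:
            params = params_with_aipq_then_acr(controller, target_image, first_params)
        elif controller.acr_available():
            params = controller.acr_set_params(target_image)
        else:
            params = controller.default_params()
    # S103: control the display to show the target image with the chosen parameter.
    controller.display.render(target_image, params)
```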
In addition, in step S102, it is also possible to determine whether the target signal source where the target image is located exists in the signal source whitelist, whether the scene of the target image can be recognized by the AIPQ function, and whether the ACR function of the display device 200 is available, and to perform different image quality parameter setting operations according to the different determination results.
In some embodiments, when the target signal source does not exist in the signal source whitelist, it is determined whether the ACR function of the display device 200 is available. If the ACR function is available, the ACR function can be used to set the target image quality parameter, and the display 275 is then controlled to display the target image according to the target image quality parameter; if the ACR function is not available, the preset image quality parameters in the display device 200 need to be obtained as the target image quality parameter, and the display 275 is then controlled to display the target image according to the target image quality parameter.
In some embodiments, when the target signal source exists in the signal source whitelist, it is further determined whether the scene of the target image can be recognized by the AIPQ function. If the scene can be recognized, the AIPQ function can be used to set the target image quality parameter, and the display 275 is then controlled to display the target image according to the target image quality parameter; if the scene cannot be recognized by the AIPQ function, the ACR function needs to be used to set the target image quality parameter, and the display 275 is then controlled to display the target image according to the target image quality parameter.
In some embodiments, when the target signal source exists in the signal source whitelist and the scene of the target image can be recognized by the AIPQ function, the AIPQ function should first be used to set the first image quality parameter corresponding to the target image, and the display 275 is controlled to display according to the first image quality parameter; the image displayed at this point serves as the image to be processed. It can then be further determined whether the ACR function of the display device 200 is available. If the ACR function is not available, the first image quality parameter set by the AIPQ function is used directly as the target image quality parameter, and the image to be processed is then the target image; if the ACR function is available, the ACR function is further used to set the target image quality parameter of the image to be processed, and the display 275 is then controlled to display the target image according to the target image quality parameter.
In some embodiments, when the target signal source exists in the signal source whitelist and the scene of the target image cannot be recognized by the AIPQ function, it is also determined whether the ACR function of the display device is available. If the ACR function is available, the ACR function can be used to set the target image quality parameter, and the display 275 is then controlled to display the target image according to the target image quality parameter; if the ACR function is not available, the preset image quality parameters in the display device 200 need to be obtained as the target image quality parameter, and the display 275 is then controlled to display the target image according to the target image quality parameter.
In the embodiments of the present application, combining the various outcomes of whether the target signal source where the target image is located exists in the signal source whitelist, whether the scene of the target image can be recognized by the AIPQ function, and whether the ACR function of the display device 200 is available, the image display method shown in FIG. 13 can also be formed.
As shown in FIG. 13, for the case where the target signal source does not exist in the signal source whitelist and the ACR function is available, the embodiments of the present application can directly use the ACR function to set the target image quality parameter when the AIPQ function is unavailable, avoiding the problem that the target image cannot meet the user's needs when default image parameters are used because the AIPQ function is unavailable.
As shown in FIG. 13, for the case where the scene of the target image can be recognized by the AIPQ function and the ACR function is available, the embodiments of the present application can apply the ACR function again on the basis of the AIPQ function, which optimizes the image quality parameters to the greatest extent, so that the target image displayed according to these image quality parameters better meets the user's needs, while also avoiding the conflict in the image quality parameter adjustment process when the AIPQ function and the ACR function are used at the same time.
As shown in FIG. 13, for the case where the scene of the target image cannot be recognized by the AIPQ function and the ACR function is available, the embodiments of the present application can use the ACR function to set the target image quality parameter, avoiding the problem that the target image cannot meet the user's needs when default image parameters are used because the AIPQ function cannot recognize the scene.
For convenience of explanation, the above description has been given in combination with specific implementations. However, the above exemplary discussion is not intended to be exhaustive or to limit the implementations to the specific forms disclosed above. Various modifications and variations can be obtained in light of the above teachings. The above implementations are selected and described to better explain the principles and practical applications, so that persons skilled in the art can better use the described implementations and the various variant implementations suited to specific uses.

Claims (10)

  1. A display device, comprising:
    a display, configured to display a target image that a user wants to watch on the display device;
    a controller, configured to:
    obtain the target image that the user wants to watch;
    set a target image quality parameter corresponding to the target image by using an intelligent image mode switching (AIPQ) function and/or an automatic content recognition (ACR) function; and
    control the display to display the target image according to the target image quality parameter.
  2. The display device according to claim 1, wherein the controller is further configured to:
    determine whether a target signal source where the target image is located exists in a signal source whitelist, wherein the signal source whitelist represents a set of signal sources that can support the intelligent image mode switching AIPQ function; and
    when the target signal source exists in the signal source whitelist, set the target image quality parameter corresponding to the target image by using the intelligent image mode switching AIPQ function and/or the automatic content recognition ACR function.
  3. The display device according to claim 2, wherein the controller is further configured to:
    when the target signal source exists in the signal source whitelist, determine whether a scene of the target image can be recognized by the intelligent image mode switching AIPQ function; and
    when the scene of the target image is recognized by the intelligent image mode switching AIPQ function, set a first image quality parameter corresponding to the target image by using the intelligent image mode switching AIPQ function.
  4. The display device according to claim 3, wherein the controller is further configured to:
    after setting the first image quality parameter corresponding to the target image by using the intelligent image mode switching AIPQ function, determine whether the automatic content recognition ACR function of the display device is available; and
    when the automatic content recognition ACR function is available, set the target image quality parameter of an image to be processed currently displayed by the display by using the automatic content recognition ACR function, wherein the image to be processed is an image displayed by the display according to the first image quality parameter.
  5. The display device according to claim 4, wherein the controller is further configured to:
    after setting the first image quality parameter corresponding to the target image by using the intelligent image mode switching AIPQ function, when the automatic content recognition ACR function is not available, use the first image quality parameter as the target image quality parameter.
  6. The display device according to claim 3, wherein the controller is further configured to:
    when the scene of the target image cannot be recognized by the intelligent image mode switching AIPQ function and the automatic content recognition ACR function of the display device is not available, obtain a preset image quality parameter in the display device as the target image quality parameter.
  7. The display device according to claim 3, wherein the controller is further configured to:
    when the scene of the target image cannot be recognized by the intelligent image mode switching AIPQ function and the automatic content recognition ACR function of the display device is available, set the target image quality parameter of the target image by using the automatic content recognition ACR function.
  8. The display device according to claim 2, wherein the controller is further configured to:
    when the target signal source does not exist in the signal source whitelist and the automatic content recognition ACR function of the display device is available, set the target image quality parameter of the target image by using the automatic content recognition ACR function.
  9. The display device according to claim 2, wherein the controller is further configured to:
    when the target signal source does not exist in the signal source whitelist and the automatic content recognition ACR function of the display device is not available, obtain a preset image quality parameter in the display device as the target image quality parameter.
  10. An image display method, comprising:
    obtaining a target image that a user wants to watch;
    setting a target image quality parameter corresponding to the target image by using an intelligent image mode switching (AIPQ) function and/or an automatic content recognition (ACR) function; and
    controlling a display to display the target image according to the target image quality parameter.
PCT/CN2021/113762 2020-10-10 2021-08-20 Image display method and display device WO2022073392A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011078919.4A CN112214189B (zh) 2020-10-10 2020-10-10 Image display method and display device
CN202011078919.4 2020-10-10

Publications (1)

Publication Number Publication Date
WO2022073392A1 true WO2022073392A1 (zh) 2022-04-14

Family

ID=74053122

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/113762 WO2022073392A1 (zh) 2020-10-10 2021-08-20 Image display method and display device

Country Status (2)

Country Link
CN (1) CN112214189B (zh)
WO (1) WO2022073392A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115334351A (zh) * 2022-08-02 2022-11-11 Vidaa International Holding (Netherlands) Co., Ltd. Display device and adaptive picture quality adjustment method

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111757024A (zh) * 2020-07-30 2020-10-09 Qingdao Hisense Media Network Technology Co., Ltd. Method for controlling intelligent image mode switching and display device
CN112214189B (zh) 2020-10-10 2023-10-31 Qingdao Hisense Media Network Technology Co., Ltd. Image display method and display device
WO2023281091A2 (en) * 2021-07-09 2023-01-12 VIDAA (Netherlands) International Holdings B.V. Refreshing method and display apparatus
CN117917085A (zh) 2021-07-20 2024-04-19 Hisense Visual Technology Co., Ltd. Display device and display method for a display device
CN113434240B (zh) * 2021-07-21 2022-09-09 Hisense Visual Technology Co., Ltd. Display method for image modes and display device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150249870A1 (en) * 2014-02-28 2015-09-03 Kabushiki Kaisha Toshiba Image display apparatus, external information terminal and program to be executed thereby
US20180139493A1 (en) * 2015-05-05 2018-05-17 Viaccess Method for setting the level of definition of the images of a multimedia programme
CN111131889A (zh) * 2019-12-31 2020-05-08 Shenzhen Skyworth-RGB Electronic Co., Ltd. Method and system for scene-adaptive adjustment of image and sound, and readable storage medium
US20200304883A1 (en) * 2017-09-15 2020-09-24 Samsung Electronics Co., Ltd. Method and apparatus for executing content
CN111757024A (zh) * 2020-07-30 2020-10-09 Qingdao Hisense Media Network Technology Co., Ltd. Method for controlling intelligent image mode switching and display device
CN112214189A (zh) 2020-10-10 2021-01-12 Qingdao Hisense Media Network Technology Co., Ltd. Image display method and display device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103905883A (zh) * 2012-12-30 2014-07-02 Qingdao Haier Software Co., Ltd. Automatic image mode switching system and switching method based on DTV program category
KR102509072B1 (ko) * 2018-10-05 2023-03-13 Samsung Electronics Co., Ltd. Image display device providing broadcast program information and method therefor
EP3644616A1 (en) * 2018-10-22 2020-04-29 Samsung Electronics Co., Ltd. Display apparatus and operating method of the same

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150249870A1 (en) * 2014-02-28 2015-09-03 Kabushiki Kaisha Toshiba Image display apparatus, external information terminal and program to be executed thereby
US20180139493A1 (en) * 2015-05-05 2018-05-17 Viaccess Method for setting the level of definition of the images of a multimedia programme
US20200304883A1 (en) * 2017-09-15 2020-09-24 Samsung Electronics Co., Ltd. Method and apparatus for executing content
CN111131889A (zh) * 2019-12-31 2020-05-08 Shenzhen Skyworth-RGB Electronic Co., Ltd. Method and system for scene-adaptive adjustment of image and sound, and readable storage medium
CN111757024A (zh) * 2020-07-30 2020-10-09 Qingdao Hisense Media Network Technology Co., Ltd. Method for controlling intelligent image mode switching and display device
CN112214189A (zh) 2020-10-10 2021-01-12 Qingdao Hisense Media Network Technology Co., Ltd. Image display method and display device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115334351A (zh) * 2022-08-02 2022-11-11 Vidaa International Holding (Netherlands) Co., Ltd. Display device and adaptive picture quality adjustment method
CN115334351B (zh) * 2022-08-02 2023-10-31 Vidaa International Holding (Netherlands) Co., Ltd. Display device and adaptive picture quality adjustment method

Also Published As

Publication number Publication date
CN112214189B (zh) 2023-10-31
CN112214189A (zh) 2021-01-12

Similar Documents

Publication Publication Date Title
WO2022073392A1 (zh) Image display method and display device
CN111752518A (zh) Screen projection method for display device and display device
CN112019782B (zh) Control method for enhanced audio return channel and display device
WO2022021669A1 (zh) Method for controlling intelligent image mode switching and display device
WO2022048203A1 (zh) Display method for operation prompt information of input method control and display device
CN112118400B (zh) Display method for images on display device and display device
CN112118468A (zh) Method for peripheral device color to follow picture color change and display device
CN111954059A (zh) Screen saver display method and display device
WO2022078065A1 (zh) Resource playing method for display device and display device
WO2022028060A1 (zh) Display device and display method
CN112306604B (zh) Progress display method for file transmission and display device
CN112272331B (zh) Method for quickly displaying program channel list and display device
CN112040535B (zh) Wifi processing method and display device
CN111818654B (zh) Channel access method and display device
CN111669662A (zh) Display device, video call method and server
CN111988646B (zh) User interface display method for application and display device
WO2020147507A1 (zh) Display device and display method
CN114302197A (zh) Voice separation control method and display device
CN114390190A (zh) Display device and method for monitoring an application starting the camera
CN111918056A (zh) Camera state detection method and display device
CN113436564B (zh) EPOS display method and display device
WO2022105410A1 (zh) Display device and device parameter memorizing method and restoring method thereof
CN111935519B (zh) Channel switching method and display device
CN111970554B (zh) Picture display method and display device
WO2022100252A1 (zh) Display device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21876912

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21876912

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 19/07/2023)