CN112565915B - Display apparatus and display method - Google Patents

Display apparatus and display method

Info

Publication number
CN112565915B
Authority
CN
China
Prior art keywords
image
displayed
noise
brightness
display
Prior art date
Legal status
Active
Application number
CN202011394145.6A
Other languages
Chinese (zh)
Other versions
CN112565915A
Inventor
王烨东
沈海杰
刁玉洁
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd
Publication of CN112565915A
Application granted
Publication of CN112565915B
Legal status: Active
Anticipated expiration

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47: End-user applications
    • H04N21/485: End-user interface for client configuration
    • H04N21/4854: End-user interface for client configuration for modifying image parameters, e.g. image brightness, contrast
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44: Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402: Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The embodiments of the application provide a display device and a display method. The display device comprises a display and a controller coupled to the display, the controller being configured to: acquire an image to be displayed; if the image to be displayed has noise, acquire a first average brightness of the image to be displayed after noise removal and a first brightness histogram of the image to be displayed after noise removal; correct the contrast of the image to be displayed according to the first average brightness and the first brightness histogram to obtain a first image to be displayed; and control the display to display based on the first image to be displayed. According to the display device and the display method, when the image to be displayed is determined to have noise, the contrast of the image is corrected according to the first average brightness after noise removal and the brightness histogram after noise removal, so that the influence of the noise is eliminated, the problem of dark field whitening is solved, and the user's viewing experience is improved.

Description

Display apparatus and display method
The present application claims priority to Chinese patent application No. 202010498838.X, entitled "Display device and display method", filed with the Chinese Patent Office on 4/6/2020, the entire contents of which are incorporated herein by reference.
Technical Field
The embodiment of the application relates to the intelligent television technology, in particular to display equipment and a display method.
Background
Smart television devices are equipped with an independent operating system and support function expansion. Users can install various applications on the smart television according to their own needs, such as traditional video applications, social applications such as short-video apps, and reading applications such as comics and e-books. These applications can use the screen of the smart television to display their pages, providing rich media resources for the smart television. Meanwhile, the smart television can also perform data interaction and resource sharing with different terminals. For example, the smart television may be connected to a mobile phone through wireless communication such as a local area network or Bluetooth, so as to play resources stored on the mobile phone or directly mirror the mobile phone's screen.
When a smart television device plays an image, the contrast of the image is an important factor affecting the user's viewing experience. However, when the smart television device plays an image shot by a shooting device in a low-brightness scene, the response characteristics of the shooting device's camera sensor introduce a large amount of noise into the image, causing problems such as dark field whitening. In the prior art, the smart television device generally corrects the contrast of the image by using a static contrast curve, a histogram gain curve, or a combination of the two. However, these correction methods do not consider the dark field whitening caused by image noise, so the corrected image still suffers from dark field whitening and the user's viewing experience is poor.
Disclosure of Invention
The embodiments of the application provide a display device and a display method, which are used for solving the problem that the prior art cannot eliminate dark field whitening, resulting in a poor viewing experience for the user.
In a first aspect, embodiments of the present application provide a display device, including:
a display;
a controller, coupled to the display, configured to:
acquiring an image to be displayed;
if the image to be displayed has noise, acquiring first average brightness of the image to be displayed after noise elimination and a first brightness histogram of the image to be displayed after noise elimination;
correcting the contrast of the image to be displayed according to the first average brightness and the first brightness histogram to obtain a first image to be displayed;
and controlling the display to display based on the first image to be displayed.
In a second aspect, an embodiment of the present application provides a display method, where the method includes:
acquiring an image to be displayed;
if the image to be displayed has noise, acquiring first average brightness of the image to be displayed after noise elimination and a first brightness histogram of the image to be displayed after noise elimination;
and correcting the contrast of the image to be displayed according to the first average brightness and the first brightness histogram.
According to the display device and the display method, when the image to be displayed is determined to have noise, the contrast of the image is corrected according to the first average brightness after noise removal and the brightness histogram after noise removal, so that the influence of the noise is eliminated, the problem of dark field whitening is solved, and the user's viewing experience is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, a brief description will be given below of the drawings that are needed in the embodiments or the prior art descriptions, and it is obvious that the drawings in the following description are some embodiments of the present application, and that other drawings can be obtained according to these drawings without inventive effort to a person skilled in the art.
FIG. 1 is a schematic illustration of a static contrast curve;
FIG. 2 is a flow chart of contrast correction in the prior art;
fig. 3 is an application scenario schematic diagram of a display device according to an embodiment of the present application;
fig. 4 is a block diagram of the configuration of the control apparatus 100 of the embodiment of the present application;
fig. 5 is a hardware configuration block diagram of the display device 200 of the embodiment of the present application;
FIG. 6 is a block diagram of the architectural configuration of an operating system in the memory of the display device 200 of an embodiment of the present application;
Fig. 7 is a schematic flow chart of a display method according to an embodiment of the present application;
FIG. 8 is a schematic diagram of an exemplary image to be displayed and a corrected first image to be displayed;
FIG. 9 is a flow chart of another display method according to an embodiment of the present disclosure;
FIG. 10 is a flowchart of another display method according to an embodiment of the present disclosure;
fig. 11 is a flow chart of another display method according to an embodiment of the present application.
Detailed Description
In order to better understand the technical solutions in the present application, the following description will clearly and completely describe the technical solutions in the embodiments of the present application with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, shall fall within the scope of the present application.
For a better understanding of the technical solutions of the present application, the terms and prior art solutions to which the present application relates will first be described in detail.
Brightness: brightness, also called lightness, represents the degree of darkness of a color.
Contrast ratio: the greater the brightness difference, the higher the contrast, the smaller the brightness difference, and the smaller the contrast of the pixel points in the image.
Static contrast curve: used for statically correcting the brightness of each pixel point in the image. The horizontal axis of the static contrast curve represents the brightness before correction, and the vertical axis represents the brightness after correction. Typically, different static contrast curves may be employed for different average brightnesses of the image. Illustratively, FIG. 1 is a schematic diagram of static contrast curves. As shown in fig. 1, the three static contrast curves are a static contrast curve H, a static contrast curve M, and a static contrast curve L. When the average brightness (Yavg) is less than or equal to the average brightness low threshold, the static contrast curve H is adopted; when the average brightness is equal to the average brightness middle threshold, the static contrast curve M is adopted; when the average brightness is greater than or equal to the average brightness high threshold, the static contrast curve L is adopted; when the average brightness is greater than the average brightness low threshold and less than the average brightness middle threshold, a static contrast curve obtained by combining the static contrast curve H and the static contrast curve M according to a preset proportion may be adopted; and when the average brightness is greater than the average brightness middle threshold and less than the average brightness high threshold, a static contrast curve obtained by combining the static contrast curve L and the static contrast curve M according to a preset proportion may be adopted.
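The curve selection and blending described above can be illustrated with a short sketch. The following Python snippet is an illustration only: the lookup-table shapes, the threshold values and the helper names (select_static_curve, blend) are assumptions for demonstration, not values taken from this application.

```python
import numpy as np

# Hypothetical static contrast curves, expressed as 256-entry lookup tables that
# map input brightness (0-255) to corrected brightness. The gamma values are
# illustrative assumptions only.
LUT_H = (255.0 * (np.arange(256) / 255.0) ** 0.8).astype(np.uint8)  # lifts low-brightness scenes
LUT_M = np.arange(256, dtype=np.uint8)                              # neutral curve
LUT_L = (255.0 * (np.arange(256) / 255.0) ** 1.2).astype(np.uint8)  # tames high-brightness scenes

# Hypothetical average-brightness thresholds (low / middle / high).
Y_LOW, Y_MID, Y_HIGH = 64.0, 128.0, 192.0

def blend(lut_a, lut_b, weight_a):
    """Combine two static contrast curves according to a preset proportion."""
    return (weight_a * lut_a + (1.0 - weight_a) * lut_b).astype(np.uint8)

def select_static_curve(y_avg):
    """Pick (or blend) a static contrast curve from the image's average brightness Yavg."""
    if y_avg <= Y_LOW:
        return LUT_H
    if y_avg >= Y_HIGH:
        return LUT_L
    if y_avg <= Y_MID:
        # between the low and middle thresholds: mix curve H with curve M
        return blend(LUT_H, LUT_M, (Y_MID - y_avg) / (Y_MID - Y_LOW))
    # between the middle and high thresholds: mix curve L with curve M
    return blend(LUT_L, LUT_M, (y_avg - Y_MID) / (Y_HIGH - Y_MID))
```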
Histogram gain curve: for expanding the brightness of each pixel point in the image according to the distribution condition of the histogram, namely, performing histogram equalization.
Fig. 2 is a schematic flow chart of contrast correction in the prior art. As shown in fig. 2, the method comprises the following steps:
s701, inputting an image, namely inputting a frame of image.
S702, carrying out brightness statistical analysis on an input image to obtain average brightness of pixel points in the image and a brightness histogram of the pixel points in the image.
S703, comparing the average brightness with an average brightness low threshold value, an average brightness middle threshold value and an average brightness high threshold value to obtain a corresponding static contrast curve.
S704, acquiring a histogram gain curve corresponding to the image according to the histogram distribution.
And S705, summing the static contrast curve and the histogram gain curve according to a preset proportionality coefficient to obtain a contrast control curve, and correcting the contrast of the image by using the contrast control curve. Illustratively, the contrast control curve can be obtained by the following formula (1):

Y_out = α * Y_APL + (1 - α) * Y_hist    (1)

where Y_out is the contrast control curve, Y_APL is the static contrast curve, Y_hist is the histogram gain curve, and α is a preset proportionality coefficient with 0 < α < 1. A sketch of this flow is given after step S706.
S706, outputting an image.
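The S701-S706 flow can be summarized in a brief sketch. It is a minimal illustration under assumed helpers (select_static_curve from the previous sketch and a simple CDF-based histogram gain curve); it is not the exact prior-art implementation.

```python
import numpy as np

def histogram_gain_curve(y):
    """S704: histogram-equalization style mapping built from the brightness histogram."""
    hist, _ = np.histogram(y, bins=256, range=(0, 256))
    cdf = np.cumsum(hist).astype(np.float64)
    cdf /= cdf[-1]                                   # normalize the cumulative histogram to 0..1
    return (255.0 * cdf).astype(np.uint8)

def prior_art_contrast_correction(y, alpha=0.5):
    """y: HxW uint8 luminance plane; alpha: preset proportionality coefficient, 0 < alpha < 1."""
    y_avg = float(y.mean())                          # S702: brightness statistics
    y_apl = select_static_curve(y_avg)               # S703: static contrast curve (see sketch above)
    y_hist = histogram_gain_curve(y)                 # S704: histogram gain curve
    # S705: formula (1), Y_out = alpha * Y_APL + (1 - alpha) * Y_hist
    y_out = (alpha * y_apl + (1.0 - alpha) * y_hist).astype(np.uint8)
    return y_out[y]                                  # apply the contrast control curve as a lookup table
```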
However, when the smart television device plays an image shot by the shooting device in a low-light scene, a large amount of noise is introduced into the image. Based on the prior art shown in fig. 2, the introduction of noise increases the average brightness of the image, which affects the choice of static contrast curve; at the same time, the introduction of noise also changes the brightness histogram of the image, thereby affecting the histogram gain curve. As a result, when the contrast of the image is corrected by the prior art method shown in fig. 2, the corrected image still suffers from dark field whitening, and the user's viewing experience is poor.
According to the display device and the display method provided by the embodiments of the application, whether the image contains noise is judged; when the image contains noise, the noise is removed, and the contrast of the image after noise removal is then corrected based on the static contrast curve and the histogram gain curve, so that the problem of dark field whitening is solved and the user's viewing experience is improved.
It should be noted that the method provided by the present application is not only applicable to televisions, but also applicable to other display devices, such as computers, tablet computers, and the like.
The term "module" as used in various embodiments of the present application may refer to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware or/and software code that is capable of performing the functionality associated with that element.
The term "hardware system" as used in the various embodiments of the present application may refer to a physical component comprising mechanical, optical, electrical, magnetic devices such as integrated circuits (Integrated Circuit, ICs), printed circuit boards (Printed circuit board, PCBs) with computing, control, storage, input and output functions. In various embodiments of the present application, the hardware system will also be generally referred to as a motherboard (or a motherboard) or a host chip or controller.
Fig. 3 is an application scenario schematic diagram of a display device according to an embodiment of the present application. As shown in fig. 3, communication between the control apparatus 100 and the display device 200 may be performed in a wired or wireless manner.
The control apparatus 100 is configured to control the display device 200: it can receive an operation instruction input by a user, convert the operation instruction into an instruction that the display device 200 can recognize and respond to, and mediate the interaction between the user and the display device 200. For example: when the user operates the channel up/down keys on the control apparatus 100, the display device 200 responds to the channel up/down operation.
The control device 100 may be a remote control 100A, including an infrared protocol communication or a bluetooth protocol communication, and other short-range communication modes, and the display apparatus 200 is controlled by a wireless or other wired mode. The user may control the display device 200 by inputting user instructions through keys on a remote control, voice input, control panel input, etc. Such as: the user can input corresponding control instructions through volume up-down keys, channel control keys, up/down/left/right movement keys, voice input keys, menu keys, on-off keys, etc. on the remote controller to realize the functions of the control display device 200.
The control device 100 may also be an intelligent device, such as a mobile terminal 100B, a tablet computer, a notebook computer, or the like. For example, the display device 200 is controlled using an application running on a smart device. The application program, by configuration, can provide various controls to the user through an intuitive User Interface (UI) on a screen associated with the smart device.
For example, the mobile terminal 100B may install a software application with the display device 200, implement connection communication through a network communication protocol, and achieve the purpose of one-to-one control operation and data communication. Such as: the mobile terminal 100B may be caused to establish a control instruction protocol with the display device 200 to implement functions such as physical buttons arranged by the remote controller 100A by operating various function keys or virtual controls of a user interface provided on the mobile terminal 100B. The audio/video content displayed on the mobile terminal 100B may also be transmitted to the display device 200, so as to implement a synchronous display function.
The display device 200 may provide a broadcast receiving function and a computer-supported network television function. The display device may be implemented as a digital television, a web television, an Internet Protocol Television (IPTV), or the like.
The display device 200 may be a liquid crystal display, an organic light emitting display, a projection device. The specific display device type, size, resolution, etc. are not limited.
The display device 200 is also in data communication with the server 300 via a variety of communication means. Display device 200 may be permitted to communicate via a Local Area Network (LAN), a Wireless Local Area Network (WLAN), and other networks. The server 300 may provide various contents and interactions to the display device 200. By way of example, the display device 200 may send and receive information, such as: receiving Electronic Program Guide (EPG) data, receiving software program updates, or accessing a remotely stored digital media library. The servers 300 may be one group, may be multiple groups, and may be one or more types of servers. Other web service content such as video on demand and advertising services are provided through the server 300.
Fig. 4 is a block diagram of the configuration of the control apparatus 100 of the embodiment of the present application. As shown in fig. 4, the control device 100 includes a controller 110, a memory 120, a communicator 130, a user input interface 140, a user output interface 150, and a power supply 160.
The controller 110 includes a Random Access Memory (RAM) 111, a Read Only Memory (ROM) 112, a processor 113, a communication interface, and a communication bus. The controller 110 is used to control the operation and operation of the control device 100, as well as the communication collaboration between the internal components, external and internal data processing functions.
For example, when an interaction in which a user presses a key arranged on the remote controller 100A or an interaction in which a touch panel arranged on the remote controller 100A is touched is detected, the controller 110 may control to generate a signal corresponding to the detected interaction and transmit the signal to the display device 200.
The memory 120 stores various operation programs, data, and applications for driving and controlling the control device 100 under the control of the controller 110. The memory 120 may store various control signal instructions input by a user.
The communicator 130 performs communication of control signals and data signals with the display device 200 under the control of the controller 110. For example: the control apparatus 100 transmits a control signal (e.g., a touch signal or a key signal) to the display device 200 via the communicator 130, and the control apparatus 100 may receive a signal transmitted by the display device 200 via the communicator 130. The communicator 130 may include an infrared signal interface 131 and a radio frequency signal interface 132. For example: when the infrared signal interface is used, the user input instruction needs to be converted into an infrared control signal according to an infrared control protocol and sent to the display device 200 through the infrared sending module. For another example: when the radio frequency signal interface is used, the user input instruction is converted into a digital signal, which is then modulated according to a radio frequency control signal modulation protocol and transmitted to the display device 200 through the radio frequency transmission terminal.
The user input interface 140 may include at least one of a microphone 141, a touch pad 142, a sensor 143, keys 144, etc., so that a user may input user instructions regarding controlling the display apparatus 200 to the control device 100 through voice, touch, gesture, press, etc.
The user output interface 150 outputs a user instruction received by the user input interface 140 to the display device 200 or outputs an image or voice signal received by the display device 200. Here, the user output interface 150 may include an LED interface 151, a vibration interface 152 generating vibrations, a sound output interface 153 outputting sound, a display 154 outputting an image, and the like. For example, the remote controller 100A may receive an output signal of audio, video, or data from the user output interface 150, and display the output signal as an image on the display 154, as an audio at the sound output interface 153, or as a vibration at the vibration interface 152.
A power supply 160 for providing operating power support for the various elements of the control device 100 under the control of the controller 110. May be in the form of a battery and associated control circuitry.
Fig. 5 is a hardware configuration block diagram of the display device 200 of the embodiment of the present application. As shown in fig. 5, a modem 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a memory 260, a user interface 265, a video processor 270, a display 275, a rotating assembly 276, an audio processor 280, an audio output interface 285, a power supply 290 may be included in the display apparatus 200.
Wherein the rotating assembly 276 may include a drive motor, a rotating shaft, etc. The driving motor may be connected to the controller 250, and the controller 250 outputs a rotation angle under control; one end of the rotating shaft is connected to a power output shaft of the driving motor, and the other end is connected to the display 275, so that the display 275 can be fixedly mounted on a wall or a bracket through the rotating assembly 276.
The rotating assembly 276 may also include other components, such as a transmission component, a detection component, and the like. Wherein, the transmission component can adjust the rotation speed and torque output by the rotating component 276 through a specific transmission ratio, and can be in a gear transmission mode; the detection means may be constituted by a sensor provided on the rotation shaft, such as an angle sensor, an attitude sensor, or the like. These sensors may detect parameters such as the angle at which the rotating assembly 276 rotates and send the detected parameters to the controller 250 to enable the controller 250 to determine or adjust the status of the display device 200 based on the detected parameters. In practice, the rotating assembly 276 may include, but is not limited to, one or more of the components described above.
The modem 210 receives broadcast television signals through a wired or wireless manner, and may perform modulation and demodulation processes such as amplification, mixing, and resonance, for demodulating an audio/video signal carried in a frequency of a television channel selected by a user and additional information (e.g., EPG data) from among a plurality of wireless or wired broadcast television signals.
The tuning demodulator 210, under the control of the controller 250, responds to the television channel frequency selected by the user and the television signal carried by that frequency.
The tuning demodulator 210 can receive signals in various ways according to broadcasting systems of television signals, such as: terrestrial broadcasting, cable broadcasting, satellite broadcasting, internet broadcasting, or the like; according to different modulation types, a digital modulation mode or an analog modulation mode can be adopted; and the analog signal and the digital signal can be demodulated according to the kind of the received television signal.
In other exemplary embodiments, the modem 210 may also be in an external device, such as an external set-top box or the like. In this way, the set-top box outputs a television signal after modulation and demodulation, and inputs the television signal to the display apparatus 200 through the external device interface 240.
The communicator 220 is a component for communicating with an external device or an external server according to various communication protocol types. For example, the display device 200 may transmit content data to an external device connected via the communicator 220, or browse and download content data from an external device connected via the communicator 220. The communicator 220 may include a network communication protocol module or a near field communication protocol module such as a WIFI module 221, a bluetooth communication protocol module 222, a wired ethernet communication protocol module 223, etc., so that the communicator 220 may receive a control signal of the control device 100 according to the control of the controller 250 and implement the control signal as a WIFI signal, a bluetooth signal, a radio frequency signal, etc.
The detector 230 is a component of the display device 200 for collecting signals of the external environment or of interaction with the outside. The detector 230 may include a sound collector 231, such as a microphone, that may be used to receive the user's sound, such as a voice signal of a control instruction for the user to control the display device 200; alternatively, ambient sounds for identifying the type of ambient scene may be collected, so that the display device 200 can adapt to ambient noise.
In other exemplary embodiments, the detector 230 may further include an image collector 232, such as a camera or webcam, that may be used to collect external environmental scenes so as to adaptively change the display parameters of the display device 200, and to collect user attributes or gestures for interaction with the user, so as to realize interaction between the display device and the user.
In other exemplary embodiments, the detector 230 may further include a light receiver for collecting ambient light intensity to adapt to changes in display parameters of the display device 200, etc.
In other exemplary embodiments, the detector 230 may further include a temperature sensor; for example, by sensing the ambient temperature, the display device 200 may adaptively adjust the display color temperature of the image. Illustratively, when the ambient temperature is high, the display device 200 may be adjusted to display images in a cooler color tone; when the ambient temperature is low, the display device 200 may be adjusted to display images in a warmer color tone.
The external device interface 240 is a component that provides the controller 250 to control data transmission between the display apparatus 200 and an external device. The external device interface 240 may be connected to an external device such as a set-top box, a game device, a notebook computer, etc., in a wired/wireless manner, and may receive data such as a video signal (e.g., a moving image), an audio signal (e.g., music), additional information (e.g., an EPG), etc., of the external device.
The external device interface 240 may include: any one or more of a High Definition Multimedia Interface (HDMI) terminal 241, a Composite Video Blanking Sync (CVBS) terminal 242, an analog or digital Component terminal 243, a Universal Serial Bus (USB) terminal 244, a Component terminal (not shown), a Red Green Blue (RGB) terminal (not shown), and the like.
The controller 250 controls the operation of the display device 200 and responds to the user's operations by running various software control programs (e.g., an operating system and various application programs) stored on the memory 260.
As shown in fig. 5, the controller 250 includes a Random Access Memory (RAM) 251, a Read Only Memory (ROM) 252, a graphic processor 253, a CPU processor 254, a communication interface 255, and a communication bus 256. The RAM251, the ROM252, the graphics processor 253, and the CPU 254 are connected to each other via a communication bus 256.
A ROM252 for storing various system boot instructions. When the display apparatus 200 is powered on upon receiving a power-on signal, the CPU processor 254 runs a system start instruction in the ROM252 and copies the operating system stored in the memory 260 into the RAM251 to start running the operating system. When the operating system is started, the CPU processor 254 copies various applications in the memory 260 to the RAM251 and then starts running the various applications.
The graphic processor 253 generates various graphic objects such as icons, operation menus, and user input instruction display graphics, etc. The graphic processor 253 may include an operator for performing an operation by receiving user input of various interactive instructions, thereby displaying various objects according to display attributes; and a renderer for generating various objects based on the operator, and displaying the result of rendering on the display 275.
The CPU processor 254 is configured to execute operating system and application program instructions stored in the memory 260, and to process various application programs, data and content according to received user input instructions, so as to finally display and play various audio and video content.
In some exemplary embodiments, the CPU processor 254 may comprise a plurality of processors. The plurality of processors may include one main processor and one or more sub-processors. The main processor is configured to perform some initialization operations of the display device 200 in a display device preloading mode and/or to display a picture in the normal mode. The one or more sub-processors are configured to perform operations when the display device is in standby mode or the like.
Communication interface 255 may include a first interface through an nth interface. These interfaces may be network interfaces that are connected to external devices via a network.
The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user input command for selecting a GUI object displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user input command.
Wherein the object may be any one of selectable objects, such as a hyperlink or an icon. The operation related to the selected object, for example, an operation of displaying a link to a hyperlink page, a document, an image, or the like, or an operation of executing a program corresponding to the object. The user input command for selecting the GUI object may be a command input through various input means (e.g., mouse, keyboard, touch pad, etc.) connected to the display apparatus 200 or a voice command corresponding to a voice uttered by the user.
The memory 260 is used to store various types of data, software programs, or applications that drive and control the operation of the display device 200. Memory 260 may include volatile and/or nonvolatile memory. And the term "memory" includes memory 260, RAM251 and ROM252 of controller 250, or a memory card in display device 200.
In some embodiments, the memory 260 is specifically configured to store an operating program that drives the controller 250 in the display device 200; various application programs built in the display device 200 and downloaded from an external device by a user are stored; data for configuring various GUIs provided by the display 275, various objects related to the GUIs, visual effect images of selectors for selecting GUI objects, and the like are stored.
In some embodiments, the memory 260 is specifically configured to store drivers and related data for the modem 210, the communicator 220, the detector 230, the external device interface 240, the video processor 270, the display 275, the audio processor 280, etc., such as external data (e.g., audio-visual data) received from the external device interface or user data (e.g., key information, voice information, touch information, etc.) received from the user interface.
In some embodiments, memory 260 specifically stores software and/or programs for representing an Operating System (OS), which may include, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. Illustratively, the kernel may control or manage system resources, as well as other program implemented functions (e.g., middleware, APIs, or application programs); at the same time, the kernel may provide an interface to allow middleware, APIs, or applications to access the controller to implement control or management of system resources.
Fig. 6 is a block diagram of an architectural configuration of an operating system in a memory of the display apparatus 200 of the embodiment of the present application. The operating system architecture is an application layer, a middleware layer and a kernel layer in sequence from top to bottom.
Application layer: both the applications built into the system and non-system applications belong to the application layer, which is responsible for direct interaction with the user. The application layer may include a plurality of applications, such as a setup application, an electronic post application, a media center application, and the like. These applications may be implemented as Web applications that execute based on a WebKit engine, and in particular may be developed and executed based on HTML5, Cascading Style Sheets (CSS), and JavaScript.
Here, HTML, which is called a hypertext markup language (HyperText Markup Language) in its entirety, is a standard markup language for creating web pages, which are described by markup tags for describing words, graphics, animations, sounds, tables, links, etc., and a browser reads an HTML document, interprets the contents of tags within the document, and displays them in the form of web pages.
CSS, collectively referred to as cascading style sheets (Cascading Style Sheets), is a computer language used to represent the style of HTML files and may be used to define style structures such as fonts, colors, positions, and the like. The CSS style can be directly stored in an HTML webpage or a separate style file, so that the control of the style in the webpage is realized.
JavaScript is a language applied to Web page programming that can be inserted into HTML pages and interpreted by a browser. The interaction logic of a Web application is implemented through JavaScript. By encapsulating the JavaScript extension interface through the browser, JavaScript can be used to realize communication with the kernel layer.
Middleware layer: provides some standardized interfaces to support the operation of various environments and systems. For example, the middleware layer may be implemented as the multimedia and hypermedia information coding expert group (MHEG) middleware related to data broadcasting, as the DLNA middleware related to communication with external devices, or as middleware providing a browser environment in which the applications within the display device run, and the like.
A kernel layer providing core system services such as: file management, memory management, process management, network management, system security authority management and other services. The kernel layer may be implemented as a kernel based on various operating systems, such as a kernel based on the Linux operating system.
The kernel layer also provides communication between system software and hardware at the same time, providing device driver services for various hardware, such as: providing a display driver for a display, providing a camera driver for a camera, providing a key driver for a remote control, providing a WIFI driver for a WIFI module, providing an audio driver for an audio output interface, providing a Power Management (PM) module with a power management driver, and the like.
In fig. 5, a user interface 265 receives various user interactions. Specifically, an input signal for a user is transmitted to the controller 250, or an output signal from the controller 250 is transmitted to the user. Illustratively, the remote control 100A may send input signals such as a power switch signal, a channel selection signal, a volume adjustment signal, etc., input by a user to the user interface 265, and then forwarded by the user interface 265 to the controller 250; alternatively, the remote controller 100A may receive an output signal such as audio, video, or data, which is processed by the controller 250 to be output from the user interface 265, and display the received output signal or output the received output signal in the form of audio or vibration.
In some embodiments, a user may input a user command through a Graphical User Interface (GUI) displayed on the display 275, and the user interface 265 receives the user input command through the GUI. In particular, the user interface 265 may receive user input commands for controlling the position of a selector in a GUI to select different objects or items. Wherein a "user interface" is a media interface for interaction and exchange of information between an application or operating system and a user, which enables conversion between an internal form of information and a user-acceptable form. A commonly used presentation form of the user interface is a graphical user interface (graphic user interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in a display of the electronic device, where the control may include a visual interface element such as an icon, a control, a menu, a tab, a text box, a dialog box, a status bar, a channel bar, a Widget, etc.
Alternatively, the user may enter a user command by entering a particular sound or gesture, and the user interface 265 recognizes the sound or gesture through the sensor to receive the user input command.
The video processor 270 is configured to receive an external video signal, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image composition according to a standard codec protocol of an input signal, so as to obtain a video signal that is directly displayed or played on the display 275.
By way of example, video processor 270 includes a demultiplexing module, a video decoding module, an image compositing module, a frame rate conversion module, a display formatting module, and the like.
Wherein, the demultiplexing module is used for demultiplexing the input audio/video data stream, such as the input MPEG-2 stream (based on the compression standard of the digital storage media moving image and voice), and then the demultiplexing module demultiplexes the input audio/video data stream into video signals, audio signals and the like.
And the video decoding module is used for processing the demultiplexed video signal, including decoding, scaling and the like.
And an image synthesis module, such as an image synthesizer, for superimposing and mixing the scaled video image with the GUI signal input by the user or generated by the graphics generator, so as to generate an image signal for display.
The frame rate conversion module is configured to convert the frame rate of the input video, for example, converting a 60 Hz input video to a frame rate of 120 Hz or 240 Hz, usually by means of frame interpolation.
And a display formatting module for converting the signal output by the frame rate conversion module into a signal conforming to a display format such as a display, for example, format converting the signal output by the frame rate conversion module to output an RGB data signal.
And a display 275 for receiving image signals from the video processor 270 and displaying video content, images and the menu manipulation interface. The displayed video content may come from the broadcast signal received by the modem 210, or from video content input via the communicator 220 or the external device interface 240. The display 275 also displays the user manipulation interface (UI) generated in the display device 200 and used to control the display device 200.
And, the display 275 may include a display assembly for presenting pictures and a drive assembly for driving the display of images. Alternatively, if the display 275 is a projection display, a projection device and a projection screen may be included.
As for the rotating assembly 276, the controller may issue control signals to cause the rotating assembly 276 to rotate the display 275.
The audio processor 280 is configured to receive an external audio signal, decompress and decode according to a standard codec of an input signal, and perform audio data processing such as noise reduction, digital-to-analog conversion, and amplification, so as to obtain an audio signal that can be played in the speaker 286.
Illustratively, the audio processor 280 may support various audio formats. Such as MPEG-2, MPEG-4, advanced Audio Coding (AAC), high efficiency AAC (HE-AAC), etc.
An audio output interface 285 for receiving the audio signal output from the audio processor 280 under the control of the controller 250; the audio output interface 285 may include a speaker 286, or an external audio output terminal 287, such as a headphone output terminal, for outputting audio to a sound-producing device of an external apparatus.
In other exemplary embodiments, video processor 270 may include one or more chip components. Audio processor 280 may also include one or more chip components.
And, in other exemplary embodiments, video processor 270 and audio processor 280 may be separate chips or integrated with controller 250 in one or more chips.
The power supply 290 is used for providing power supply support for the display device 200 by power input by an external power supply under the control of the controller 250. The power supply 290 may be a built-in power supply circuit mounted inside the display device 200 or may be a power supply mounted outside the display device 200.
Fig. 7 is a flow chart of a display method according to an embodiment of the present application. The controller 250 is connected to the display 275. In the actual control process, as shown in fig. 7, the controller 250 may be configured to perform the following program steps:
S101, acquiring an image to be displayed.
The image to be displayed may be a frame of image; for example, the image to be displayed may be a photo to be displayed. The image to be displayed may also be a frame of image in a video. Fig. 8 is a schematic diagram of an exemplary image to be displayed and a corrected first image to be displayed. As shown in fig. 8 (a), the image to be displayed includes noise pixel points.
S102, judging whether noise exists in the image to be displayed.
The method for determining whether the image to be displayed has noise will be described in detail in the following embodiments.
If so, it indicates that dark field whitening would occur if contrast correction were performed on the image to be displayed by the prior art method (for example, the method shown in fig. 2), and step S103 may be performed at this time; if not, it indicates that dark field whitening would not occur when contrast correction is performed on the image to be displayed by the prior art method (for example, the method shown in fig. 2), and step S106 may be performed.
S103, obtaining first average brightness of the images to be displayed after noise is removed and a first brightness histogram of the images to be displayed after noise is removed.
For the first average brightness, the first average brightness of the image to be displayed after noise removal may be obtained by the following formula (2):

Y_avg = ( Σ_{i=1}^{P_max} Y_i - Σ_{j=1}^{N} Y_j ) / ( P_max - N )    (2)

where Y_avg is the first average brightness, Y_i is the brightness of the i-th pixel point of the image to be displayed, Y_j is the brightness of the j-th noise pixel point of the image to be displayed, P_max is the number of pixel points of the image to be displayed, and N is the number of noise pixel points of the image to be displayed.
The first luminance histogram is obtained, and in one possible implementation, the first luminance histogram may be obtained based on non-noise pixels in the image to be displayed. That is, the first luminance histogram includes only non-noise pixels and does not include noise pixels.
In another possible implementation, the average value of the brightness of the pixels adjacent to the noise pixel may be used as the brightness of the noise pixel, and then the first brightness histogram may be obtained. That is, the first luminance histogram includes both non-noise pixels and noise pixels after the average processing. For example, the average value of the luminance of eight pixel points adjacent to a noise pixel point may be taken as the luminance of the noise pixel point.
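A minimal sketch of the statistics gathered in S103 is given below. It assumes a boolean noise mask has already been obtained (see the detection sketches in the later embodiments); the function and variable names are illustrative only.

```python
import numpy as np

def noise_removed_stats(y, noise_mask):
    """y: HxW uint8 brightness plane; noise_mask: HxW bool, True at noise pixel points.
    Returns the first average brightness (formula (2)) and two candidate first brightness histograms."""
    valid = ~noise_mask

    # Formula (2): average the brightness of the non-noise pixel points only.
    first_avg = float(y[valid].mean())

    # Option A: build the first brightness histogram from non-noise pixel points only.
    hist_non_noise, _ = np.histogram(y[valid], bins=256, range=(0, 256))

    # Option B: replace each noise pixel point with the mean brightness of its
    # eight adjacent pixel points, then build the histogram over all pixel points.
    padded = np.pad(y.astype(np.float64), 1, mode='edge')
    neighbour_sum = sum(
        padded[1 + di:padded.shape[0] - 1 + di, 1 + dj:padded.shape[1] - 1 + dj]
        for di in (-1, 0, 1) for dj in (-1, 0, 1) if not (di == 0 and dj == 0))
    y_filled = y.copy()
    y_filled[noise_mask] = (neighbour_sum[noise_mask] / 8.0).astype(np.uint8)
    hist_filled, _ = np.histogram(y_filled, bins=256, range=(0, 256))

    return first_avg, hist_non_noise, hist_filled
```

Both histogram options described above are shown; either one may serve as the first brightness histogram.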
S104, correcting the contrast of the image to be displayed according to the first average brightness and the first brightness histogram to obtain a first image to be displayed.
For example, a contrast control curve of the image to be displayed may be obtained according to the first average brightness and the first brightness histogram. The prior-art method of S703-S705 shown in fig. 2 may be used to obtain the contrast control curve of the image to be displayed, with the average brightness of the pixel points in the image replaced by the first average brightness and the brightness histogram of the pixel points in the image replaced by the first brightness histogram.
And correcting the image to be displayed according to the contrast control curve to obtain a first image to be displayed.
Alternatively, in one possible implementation, the brightness of the pixel points in the noise area may be reduced after step S104.
Wherein the noise area is determined according to the noise pixel points, and the following embodiments can be referred to for specific ways.
For example, the brightness of each pixel point in the noise area may be multiplied by a preset noise reduction coefficient, which may be set according to the actual situation. By this alternative method, the brightness of the noise-existing region can be further reduced, and the problem of dark field whitening can be solved.
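S104, together with the optional noise-area dimming step, might be sketched as follows. The sketch reuses the assumed helpers noise_removed_stats and select_static_curve from the earlier sketches, and the noise-reduction coefficient value is an illustrative assumption.

```python
import numpy as np

def correct_contrast_without_noise(y, noise_mask, alpha=0.5, noise_gain=0.8):
    """S103/S104 sketch: correct contrast with noise-removed statistics, then
    optionally dim the noise area. noise_gain is a hypothetical preset noise reduction coefficient."""
    first_avg, first_hist, _ = noise_removed_stats(y, noise_mask)

    # Build the contrast control curve as in S703-S705, but from the first
    # average brightness and the first brightness histogram.
    y_apl = select_static_curve(first_avg)
    cdf = np.cumsum(first_hist).astype(np.float64)
    cdf /= cdf[-1]
    y_hist = (255.0 * cdf).astype(np.uint8)
    curve = (alpha * y_apl + (1.0 - alpha) * y_hist).astype(np.uint8)

    first_image = curve[y]                           # the first image to be displayed

    # Optional step: reduce the brightness of the pixel points in the noise area.
    dimmed = first_image.astype(np.float64)
    dimmed[noise_mask] *= noise_gain
    return dimmed.astype(np.uint8)
```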
S105, controlling the display to display based on the first image to be displayed.
Illustratively, the first image to be displayed, as shown in fig. 8 (b), overcomes the problem of dark field whitening compared with the image to be displayed shown in fig. 8 (a).
Optionally, in a possible implementation, the backlight of the display may also be controlled according to the first image to be displayed. Because the first to-be-displayed image is obtained by correcting the contrast of the to-be-displayed image after noise is removed, the brightness of the low-brightness part of the first to-be-displayed image can be reduced to a preset degree by controlling the backlight of the display according to the first to-be-displayed image, so that the accurate backlight control of the first to-be-displayed image is realized, the contrast of the first to-be-displayed image is optimized, and the watching effect of a user is improved.
S106, acquiring a second average brightness of the image to be displayed and a second brightness histogram of the image to be displayed.
The second average luminance may be average luminance of all pixels in the image to be displayed, and the second luminance histogram of the image to be displayed may be luminance histogram of all pixels in the image to be displayed.
And S107, correcting the contrast of the image to be displayed according to the second average brightness and the second brightness histogram to obtain a second image to be displayed.
For example, the second average luminance may be replaced with the average luminance of the pixels in the image shown in fig. 2, the second luminance histogram may be replaced with the luminance histogram of the pixels in the image shown in fig. 2, and then steps S703 to S705 shown in fig. 2 are sequentially performed, so that the second image to be displayed may be obtained.
S108, controlling the display to display based on the second image to be displayed.
According to the display device and the display method provided by the embodiments of the application, whether the image to be displayed has noise is judged; when the image to be displayed has noise, the first average brightness of the image to be displayed after noise removal and the first brightness histogram of the image to be displayed after noise removal are obtained, the contrast of the image to be displayed is then corrected according to the first average brightness and the first brightness histogram to obtain the first image to be displayed, and the display is controlled to display based on the first image to be displayed. In this way, the contrast of the image is corrected according to the first average brightness after noise removal and the brightness histogram after noise removal, the influence of noise is eliminated, the problem of dark field whitening is solved, and the user's viewing experience is improved.
The following embodiment will focus on how to determine whether or not noise is present in an image to be displayed.
A first possible implementation:
fig. 9 is a flow chart of another display method provided in the embodiment of the present application, as shown in fig. 9, may include the following steps:
s201, determining noise pixel points of an image to be displayed.
In one possible implementation manner, a first difference absolute value between the brightness average corresponding to a first pixel point in the image to be displayed and the brightness of the first pixel point may be obtained. The brightness average corresponding to the first pixel point is the average brightness of the pixel points adjacent to the first pixel point.
When the absolute value of the first difference is larger than the first difference threshold and smaller than the second difference threshold, determining the first pixel point as a noise pixel point of the image to be displayed; the first difference threshold is less than the second difference threshold.
For example, for a first pixel point P in the image to be displayed, the average brightness of the 8 adjacent pixel points closest to the first pixel point P may be taken as the average brightness Y_avg corresponding to the first pixel point P. The absolute value of the difference between the average brightness Y_avg corresponding to the first pixel point P and the brightness Y_x of the first pixel point is then calculated to obtain the first difference absolute value ΔY, as shown in formula (3):

ΔY = |Y_avg - Y_x|    (3)

For non-noise pixel points in the image to be processed, the brightness difference between adjacent pixel points should be either smooth (a gradual region) or sharp (an edge). When ΔY satisfies the following condition (4), the brightness of the first pixel point P and that of its adjacent pixel points do not conform to the characteristics of non-noise pixel points; at this time, the first pixel point P is determined to be a noise pixel point of the image to be displayed. Condition (4) is as follows:

Y_L < ΔY < Y_H    (4)

where Y_L is the first difference threshold, Y_H is the second difference threshold, and Y_L < Y_H. The first difference threshold Y_L and the second difference threshold Y_H can be set according to the actual situation.
By this possible implementation, the noise pixel point of the image to be displayed can be accurately determined.
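A sketch of this first detection approach is given below; the concrete threshold values stand in for Y_L and Y_H and would be tuned in practice.

```python
import numpy as np

def detect_noise_pixels(y, y_low=4.0, y_high=40.0):
    """Flag pixel points whose brightness differs from the mean of their eight
    adjacent pixel points by more than Y_L but less than Y_H (condition (4)).
    y: HxW uint8 brightness plane; y_low and y_high are illustrative threshold values."""
    padded = np.pad(y.astype(np.float64), 1, mode='edge')
    neighbour_mean = sum(
        padded[1 + di:padded.shape[0] - 1 + di, 1 + dj:padded.shape[1] - 1 + dj]
        for di in (-1, 0, 1) for dj in (-1, 0, 1) if not (di == 0 and dj == 0)) / 8.0
    delta = np.abs(neighbour_mean - y.astype(np.float64))   # formula (3)
    return (delta > y_low) & (delta < y_high)                # condition (4)
```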
In another possible implementation manner, average brightness of pixel points in a preset area of the image to be displayed may be obtained, and if the absolute value of the difference between the brightness of the second pixel point in the preset area of the image to be displayed and the average brightness of the pixel points in the preset area is greater than a third difference threshold, the second pixel point is determined to be a noise pixel point of the image to be displayed.
The preset area can be set according to the actual situation; its shape can be a regular geometric figure or an irregular figure, and its size can be set according to the actual situation.
In this possible implementation manner, since the average brightness in the preset area can represent the average level of the brightness of the non-noise pixel, if the absolute value of the difference between the brightness of the second pixel in the preset area of the image to be displayed and the average brightness of the pixel in the preset area is greater than the third difference threshold, it is indicated that the brightness of the second pixel does not conform to the characteristic of the non-noise pixel, and at this time, it may be determined that the second pixel is the noise pixel of the image to be displayed.
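This second detection approach can be sketched in the same way; the square block used as the preset area and the value of the third difference threshold are illustrative assumptions.

```python
import numpy as np

def detect_noise_pixels_by_region(y, block=32, third_threshold=48.0):
    """Split the image into preset areas (square blocks here) and flag pixel points
    whose brightness deviates from the area's average brightness by more than the
    third difference threshold. Block size and threshold are illustrative."""
    height, width = y.shape
    mask = np.zeros((height, width), dtype=bool)
    for top in range(0, height, block):
        for left in range(0, width, block):
            area = y[top:top + block, left:left + block].astype(np.float64)
            mask[top:top + block, left:left + block] = np.abs(area - area.mean()) > third_threshold
    return mask
```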
S202, acquiring noise intensity of an image to be displayed according to the noise pixel points.
In one possible implementation, the noise region of the image to be displayed may be obtained from the noise pixel points. If the noise area is one, taking the ratio of the noise pixel points to the non-noise pixel points in the noise area as the noise intensity of the image to be displayed; if the number of the noise areas is at least two, taking the ratio of the noise pixel points to the non-noise pixel points in each noise area as the noise intensity of the noise area; and taking the maximum value of the noise intensities of the at least two noise areas as the noise intensity of the image to be displayed.
Since the brightness of each non-noise pixel point in the image to be displayed is determined by the brightness of the photographed object, while the brightness of a noise pixel point is random, when a dark area of the image to be displayed is affected by noise in a low-brightness scene, the brightness of the noise pixel points differs obviously from the average brightness of the dark area containing them. Therefore, when noise appears in a dark area of the image to be displayed, a corresponding noise area can be fitted according to the distribution of the noise pixel points based on an existing region-fitting method. For example, assuming that the image to be displayed contains three noise areas, for each noise area the ratio of the noise pixel points to the non-noise pixel points in that area may be taken as the noise intensity of the area, and the maximum value of the noise intensities of the three noise areas may then be taken as the noise intensity of the image to be displayed.
By this possible implementation, the noise intensity of the image to be displayed can be accurately determined.
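The region-fitting step is not spelled out in detail here, so the sketch below uses connected-component labelling as a stand-in for it and counts each region's non-noise pixel points inside its bounding box; both choices are assumptions made for illustration.

```python
# Illustrative sketch of the region-based noise intensity. Connected-component
# labelling stands in for the region-fitting step, and each region's bounding
# box supplies the non-noise pixel count; both are assumptions.
import numpy as np
from scipy import ndimage

def region_noise_intensities(noise_mask: np.ndarray) -> list:
    """One value per noise area: (# noise pixels) / (# non-noise pixels)."""
    labels, n_regions = ndimage.label(noise_mask)
    intensities = []
    for k in range(1, n_regions + 1):
        rows, cols = np.nonzero(labels == k)
        box = noise_mask[rows.min():rows.max() + 1, cols.min():cols.max() + 1]
        noise_px = int(box.sum())
        non_noise_px = box.size - noise_px
        intensities.append(noise_px / max(non_noise_px, 1))
    return intensities

def image_noise_intensity(noise_mask: np.ndarray) -> float:
    # One noise area: its own ratio; several areas: the maximum ratio.
    return max(region_noise_intensities(noise_mask), default=0.0)
```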
In another possible implementation manner, the ratio of the noise pixel point to the non-noise pixel point in the image to be displayed may be used as the noise intensity of the image to be displayed.
S203, judging whether the noise intensity of the image to be displayed is larger than or equal to a noise threshold value.
If so, it is indicated that the noise in the image to be displayed will affect the viewing effect of the user, step S204 may be performed at this time, and if not, it is indicated that the noise in the image to be displayed will not affect the viewing effect of the user, step S205 may be performed at this time.
S204, determining that noise exists in the image to be displayed.
S205, determining that the image to be displayed has no noise.
According to this embodiment of the application, the noise pixel points of the image to be displayed are determined, the noise intensity of the image to be displayed is obtained according to the noise pixel points, and when the noise intensity of the image to be displayed is greater than or equal to the noise threshold, it is determined that the image to be displayed has noise. The method can accurately determine the noise pixel points of the image to be displayed and accurately acquire the noise intensity of the image to be displayed, and thus accurately judge whether the image to be displayed has noise, so that the problem of dark field whitening can be solved when noise exists and the viewing effect for the user is improved.
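Putting the steps of Fig. 9 together, a decision function might look like the sketch below, reusing the helper functions sketched above; the noise threshold value is an assumption.

```python
# Illustrative end-to-end decision for Fig. 9, reusing noise_pixel_mask and
# image_noise_intensity from the earlier sketches. The threshold is assumed.
def image_has_noise(luma, noise_threshold: float = 0.1) -> bool:
    mask = noise_pixel_mask(luma)              # S201: noise pixel points
    intensity = image_noise_intensity(mask)    # S202: noise intensity of the image
    return intensity >= noise_threshold        # S203: yes -> S204, no -> S205
```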
A second possible implementation:
Fig. 10 is a flow chart of another display method provided in an embodiment of the present application. As shown in Fig. 10, the method may include the following steps:
S301, determining noise pixel points of an image to be displayed.
The step may refer to step S201, and will not be described herein.
S302, judging whether the first ratio is larger than or equal to a preset ratio threshold.
The first ratio is a ratio of the number of noise pixels of the image to be displayed to the number of pixels of the image to be displayed, and the preset ratio threshold can be determined according to actual conditions.
If so, it is indicated that the noise in the image to be displayed will affect the viewing effect of the user, step S303 may be executed at this time, and if not, it is indicated that the noise in the image to be displayed will not affect the viewing effect of the user, step S304 may be executed at this time.
S303, determining that noise exists in the image to be displayed.
S304, determining that the image to be displayed has no noise.
According to this method, the noise pixel points of the image to be displayed are determined, and when the ratio of the number of noise pixel points to the number of pixel points of the image to be displayed is greater than or equal to the preset ratio threshold, it is determined that the image to be displayed has noise. The method can accurately determine the noise pixel points of the image to be displayed, and thus accurately judge whether the image to be displayed has noise, so that the problem of dark field whitening can be solved when noise exists and the viewing effect for the user is improved.
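A sketch of the Fig. 10 decision is shown below; noise_pixel_mask is the helper sketched earlier, and the preset ratio threshold value is an assumption.

```python
# Illustrative decision for Fig. 10: the first ratio is the share of noise
# pixel points in the whole image. The 0.05 threshold is an assumed value.
def image_has_noise_by_ratio(luma, ratio_threshold: float = 0.05) -> bool:
    mask = noise_pixel_mask(luma)            # S301: noise pixel points
    first_ratio = mask.sum() / mask.size     # noise pixels / all pixels
    return first_ratio >= ratio_threshold    # S302: yes -> S303, no -> S304
```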
A third possible implementation:
Fig. 11 is a flow chart of another display method provided in an embodiment of the present application. As shown in Fig. 11, the method may include the following steps:
S401, determining noise pixel points of an image to be displayed.
The step may refer to step S201, and will not be described herein.
S402, acquiring the noise intensity of the noise area according to the noise pixel points.
In this step, one noise area may be acquired, or at least two noise areas may be acquired, depending on the image to be displayed.
S403, judging whether the noise intensity of at least one noise area is larger than or equal to a noise threshold value.
If so, it is indicated that the noise in the image to be displayed will affect the viewing effect of the user, step S404 may be executed at this time, and if not, it is indicated that the noise in the image to be displayed will not affect the viewing effect of the user, step S405 may be executed at this time.
S404, determining that noise exists in the image to be displayed.
S405, determining that the image to be displayed has no noise.
According to this method, the noise pixel points of the image to be displayed are determined, the noise intensity of each noise area is obtained according to the noise pixel points, and when the noise intensity of at least one noise area is greater than or equal to the noise threshold, it is determined that the image to be displayed has noise. The method can accurately determine the noise pixel points of the image to be displayed, and thus accurately judge whether the image to be displayed has noise, so that the problem of dark field whitening can be solved when noise exists and the viewing effect for the user is improved.
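For Fig. 11 the decision can be expressed as in the sketch below, reusing region_noise_intensities from the earlier sketch; the noise threshold value is again an assumption.

```python
# Illustrative decision for Fig. 11: the image is treated as noisy as soon as
# any single noise area reaches the noise threshold (an assumed value here).
def image_has_noise_by_region(luma, noise_threshold: float = 0.1) -> bool:
    mask = noise_pixel_mask(luma)                            # S401
    intensities = region_noise_intensities(mask)             # S402, one per noise area
    return any(i >= noise_threshold for i in intensities)    # S403: yes -> S404, no -> S405
```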
Based on the display device 200, the present application further provides a display method, which includes:
Acquiring an image to be displayed.
And judging whether noise exists in the image to be displayed.
If yes, acquiring a first average brightness of the image to be displayed after noise is removed and a first brightness histogram of the image to be displayed after noise is removed; and correcting the contrast of the image to be displayed according to the first average brightness and the first brightness histogram to obtain a first image to be displayed.
If not, acquiring a second average brightness of the image to be displayed and a second brightness histogram of the image to be displayed; and correcting the contrast of the image to be displayed according to the second average brightness and the second brightness histogram to obtain a second image to be displayed.
Optionally, in a possible implementation manner, determining whether the image to be displayed has noise may specifically include:
noise pixels of an image to be displayed are determined.
And acquiring the noise intensity of the image to be displayed according to the noise pixel points.
And judging whether the noise intensity of the image to be displayed is larger than or equal to a noise threshold value.
If yes, determining that noise exists in the image to be displayed; if not, determining that the image to be displayed has no noise.
Optionally, in a possible implementation manner, determining whether the image to be displayed has noise may specifically include:
noise pixels of an image to be displayed are determined.
And judging whether the first ratio is larger than or equal to a preset ratio threshold.
If yes, determining that noise exists in the image to be displayed; if not, determining that the image to be displayed has no noise.
Optionally, in a possible implementation manner, determining whether the image to be displayed has noise may specifically include:
noise pixels of an image to be displayed are determined.
And acquiring the noise intensity of the noise area according to the noise pixel points.
And judging whether the noise intensity of at least one noise area is larger than or equal to a noise threshold value.
If yes, determining that noise exists in the image to be displayed; if not, determining that the image to be displayed has no noise.
Optionally, in a possible implementation manner, determining a noise pixel point of an image to be displayed may specifically include:
acquiring a brightness average value corresponding to a first pixel point in the image to be displayed and a first difference absolute value between the brightness average value and the brightness of the first pixel point; when the first difference absolute value is larger than a first difference threshold and smaller than a second difference threshold, determining the first pixel point as a noise pixel point of the image to be displayed; the brightness average value corresponding to the first pixel point is the average brightness of the pixel points adjacent to the first pixel point, and the first difference threshold is smaller than the second difference threshold.
Optionally, in a possible implementation manner, determining a noise pixel point of an image to be displayed may specifically include:
acquiring the average brightness of the pixel points in a preset area of the image to be displayed, and determining a second pixel point as a noise pixel point of the image to be displayed if the absolute value of the difference between the brightness of the second pixel point in the preset area and the average brightness of the pixel points in the preset area is larger than a third difference threshold.
Optionally, in a possible implementation manner, acquiring the noise intensity of the image to be displayed according to the noise pixel point may specifically include:
acquiring a noise area of an image to be displayed according to the noise pixel points; if the noise area is one, taking the ratio of the noise pixel points to the non-noise pixel points in the noise area as the noise intensity of the image to be displayed; if the number of the noise areas is at least two, taking the ratio of the noise pixel points to the non-noise pixel points in each noise area as the noise intensity of the noise area; and taking the maximum value of the noise intensities of the at least two noise areas as the noise intensity of the image to be displayed.
Optionally, in a possible implementation manner, acquiring the noise intensity of the image to be displayed according to the noise pixel point may specifically include:
And taking the ratio of the noise pixel points to the non-noise pixel points in the image to be displayed as the noise intensity of the image to be displayed.
Optionally, in one possible implementation, after the first image to be displayed is acquired according to the first average luminance and the first luminance histogram, the luminance of the pixel point in the noise area may be reduced.
Alternatively, in one possible implementation manner, the first average brightness after noise is removed from the image to be displayed may be obtained by the following formula (5):
Y_avg = ( Σ(i=1..P_max) Y_i - Σ(j=1..N) Y_j ) / ( P_max - N )  (5)

where Y_avg is the first average brightness, Y_i is the brightness of the i-th pixel point of the image to be displayed, Y_j is the brightness of the j-th noise pixel point of the image to be displayed, P_max is the number of pixel points of the image to be displayed, and N is the number of noise pixel points of the image to be displayed.
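Formula (5) simply averages the brightness of the non-noise pixel points. A direct transcription is sketched below, where noise_mask is a boolean mask of the noise pixel points such as the one produced by the earlier sketches.

```python
# Direct transcription of formula (5): total brightness minus the brightness of
# the noise pixel points, divided by the number of remaining pixel points.
import numpy as np

def first_average_brightness(luma: np.ndarray, noise_mask: np.ndarray) -> float:
    p_max = luma.size                              # P_max: pixel count of the image
    n = int(noise_mask.sum())                      # N: number of noise pixel points
    total = float(luma.sum())                      # sum of Y_i over all pixel points
    noise_total = float(luma[noise_mask].sum())    # sum of Y_j over noise pixel points
    return (total - noise_total) / max(p_max - n, 1)
```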
Alternatively, in one possible implementation, the backlight of the display may be controlled according to the first image to be displayed.
Optionally, in a possible implementation manner, a contrast control curve of the image to be displayed may be obtained according to the first average brightness and the first brightness histogram; and correcting the image to be displayed according to the contrast control curve to obtain a first image to be displayed.
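The construction of the contrast control curve is not detailed here, so the sketch below is only an assumption for illustration: a histogram-equalisation style curve is blended with the identity mapping, with the first average brightness controlling the blend so that dark scenes are corrected more gently.

```python
# Assumed sketch only: build a 256-entry look-up table from the first brightness
# histogram and blend it with the identity curve according to the first average
# brightness, then correct the image through the table.
import numpy as np

def contrast_control_curve(hist: np.ndarray, avg_brightness: float) -> np.ndarray:
    cdf = np.cumsum(hist).astype(np.float64)
    cdf /= max(cdf[-1], 1.0)                        # normalised cumulative histogram
    equalised = cdf * 255.0                         # equalisation-style curve
    identity = np.arange(256, dtype=np.float64)
    weight = avg_brightness / 255.0                 # darker image -> weaker correction
    return weight * equalised + (1.0 - weight) * identity

def correct_contrast(luma: np.ndarray, curve: np.ndarray) -> np.ndarray:
    return curve[luma.astype(np.uint8)]             # luminance of the first image to be displayed
```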
The principle of the display method provided in the embodiments of the present application is similar to that of the display device when executing the display method; for details, reference may be made to the foregoing description, which is not repeated here.
For the display method provided in the embodiments of the present application, the execution body of the method may be the controller itself, or may be a chip of the controller.
The foregoing detailed description of the embodiments is merely illustrative of the general principles of the present application and should not be taken in any way as limiting the scope of the invention. Any other embodiment that a person skilled in the art develops in accordance with the present application without inventive effort falls within the scope of protection of the present application.

Claims (7)

1. A display device, characterized by comprising:
a display;
a controller, coupled to the display, configured to:
acquiring an image to be displayed; if the image to be displayed has noise, acquiring first average brightness of the image to be displayed after noise elimination and a first brightness histogram of the image to be displayed after noise elimination;
correcting the contrast of the image to be displayed according to the first average brightness and the first brightness histogram to obtain a first image to be displayed;
controlling the display to display based on the first image to be displayed;
The controller is specifically configured to:
determining noise pixel points of the image to be displayed;
acquiring the noise intensity of the image to be displayed according to the noise pixel points;
if the noise intensity of the image to be displayed is greater than or equal to a noise threshold value, determining that the image to be displayed has noise;
the controller is specifically configured to:
acquiring a brightness average value corresponding to a first pixel point in an image to be displayed, and obtaining a first difference absolute value between the brightness average value and the brightness of the first pixel point, wherein the brightness average value corresponding to the first pixel point is a brightness average value of pixel points adjacent to the first pixel point;
when the first difference absolute value is larger than a first difference threshold value and smaller than a second difference threshold value, determining the first pixel point as a noise pixel point of the image to be displayed; the first difference threshold is less than the second difference threshold;
the controller is specifically configured to:
obtaining the first average brightness of the image to be displayed after noise is removed through the following formula:
Y_avg = ( Σ(i=1..P_max) Y_i - Σ(j=1..N) Y_j ) / ( P_max - N )
wherein Y_avg is the first average brightness, Y_i is the brightness of the i-th pixel point of the image to be displayed, Y_j is the brightness of the j-th noise pixel point of the image to be displayed, P_max is the number of pixel points of the image to be displayed, and N is the number of noise pixel points of the image to be displayed.
2. The display device of claim 1, wherein the controller is further configured to:
if the image to be displayed does not have noise, acquiring second average brightness of the image to be displayed and a second brightness histogram of the image to be displayed;
correcting the contrast of the image to be displayed according to the second average brightness and the second brightness histogram to obtain a second image to be displayed;
and controlling the display to display based on the second image to be displayed.
3. The display device of claim 1, wherein the controller is specifically configured to:
acquiring a noise area of the image to be displayed according to the noise pixel points;
if the noise area is one, taking the ratio of the noise pixel points to the non-noise pixel points in the noise area as the noise intensity of the image to be displayed;
if the number of the noise areas is at least two, taking the ratio of the noise pixel points to the non-noise pixel points in each noise area as the noise intensity of the noise area; and taking the maximum value of the noise intensities of the at least two noise areas as the noise intensity of the image to be displayed.
4. The display device of claim 3, wherein the controller is further configured to:
and after the first image to be displayed is acquired according to the first average brightness and the first brightness histogram, reducing the brightness of the pixel points in the noise area.
5. The display device of any one of claims 1-4, wherein the controller is further configured to:
and controlling the backlight of the display according to the first image to be displayed.
6. The display device of any of claims 1-4, wherein the controller is specifically configured to:
acquiring a contrast control curve of the image to be displayed according to the first average brightness and the first brightness histogram;
and correcting the image to be displayed according to the contrast control curve to obtain the first image to be displayed.
7. A display method, the method comprising:
acquiring an image to be displayed;
if the image to be displayed has noise, acquiring first average brightness of the image to be displayed after noise elimination and a first brightness histogram of the image to be displayed after noise elimination;
correcting the contrast of the image to be displayed according to the first average brightness and the first brightness histogram;
The method further comprises the steps of:
determining noise pixel points of the image to be displayed;
acquiring the noise intensity of the image to be displayed according to the noise pixel points;
if the noise intensity of the image to be displayed is greater than or equal to a noise threshold value, determining that the image to be displayed has noise;
the determining the noise pixel point of the image to be displayed specifically includes:
acquiring a brightness average value corresponding to a first pixel point in an image to be displayed, and obtaining a first difference absolute value between the brightness average value and the brightness of the first pixel point, wherein the brightness average value corresponding to the first pixel point is a brightness average value of pixel points adjacent to the first pixel point;
when the first difference absolute value is larger than a first difference threshold value and smaller than a second difference threshold value, determining the first pixel point as a noise pixel point of the image to be displayed; the first difference threshold is less than the second difference threshold;
obtaining the first average brightness of the image to be displayed after noise is removed through the following formula:
Y_avg = ( Σ(i=1..P_max) Y_i - Σ(j=1..N) Y_j ) / ( P_max - N )
wherein Y_avg is the first average brightness, Y_i is the brightness of the i-th pixel point of the image to be displayed, Y_j is the brightness of the j-th noise pixel point of the image to be displayed, P_max is the number of pixel points of the image to be displayed, and N is the number of noise pixel points of the image to be displayed.
CN202011394145.6A 2020-06-04 2020-12-02 Display apparatus and display method Active CN112565915B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010498338X 2020-06-04
CN202010498338 2020-06-04

Publications (2)

Publication Number Publication Date
CN112565915A CN112565915A (en) 2021-03-26
CN112565915B true CN112565915B (en) 2023-05-05

Family

ID=75047461

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011394145.6A Active CN112565915B (en) 2020-06-04 2020-12-02 Display apparatus and display method

Country Status (1)

Country Link
CN (1) CN112565915B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113837990B (en) * 2021-06-11 2022-09-30 荣耀终端有限公司 Noise monitoring method, electronic equipment, chip system and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008182350A (en) * 2007-01-23 2008-08-07 Fujifilm Corp Imaging apparatus and method
JP2012039440A (en) * 2010-08-09 2012-02-23 Aisin Seiki Co Ltd Dark area noise correction device
CN104123699A (en) * 2013-04-26 2014-10-29 富士通株式会社 Method of reducing image noise and device
CN105227805A (en) * 2015-09-28 2016-01-06 广东欧珀移动通信有限公司 A kind of image processing method and mobile terminal
CN106303156A (en) * 2016-08-29 2017-01-04 厦门美图之家科技有限公司 Method, application and mobile terminal to video denoising
CN108012050A (en) * 2017-12-19 2018-05-08 深圳创维-Rgb电子有限公司 Brightness of image method of adjustment, system and computer-readable recording medium
CN108389552A (en) * 2018-03-19 2018-08-10 京东方科技集团股份有限公司 Backlight illumination processing method and system, back light brightness regulating method, storage medium
CN111031256A (en) * 2019-11-15 2020-04-17 Oppo广东移动通信有限公司 Image processing method, image processing device, storage medium and electronic equipment

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3270609B2 (en) * 1993-01-19 2002-04-02 松下電器産業株式会社 Image display method and apparatus
JPH08317250A (en) * 1995-05-19 1996-11-29 Fujitsu General Ltd Dynamic picture control circuit
KR100213109B1 (en) * 1996-06-20 1999-08-02 윤종용 Circuit for improving picture quality by using noise reduction and histogram equalization and method thereof
JP4277773B2 (en) * 2004-09-21 2009-06-10 株式会社日立製作所 Video display device
JP4746575B2 (en) * 2007-03-16 2011-08-10 株式会社東芝 Image processing apparatus and method
JP2010220030A (en) * 2009-03-18 2010-09-30 Mitsubishi Electric Corp Video correction circuit, and video display device
CN105184747B (en) * 2015-09-09 2019-05-10 天津光电高斯通信工程技术股份有限公司 The method for improving of low-light (level) picture contrast
CN107403422B (en) * 2017-08-04 2020-03-27 上海兆芯集成电路有限公司 Method and system for enhancing image contrast
CN110400260B (en) * 2018-04-25 2022-07-29 展讯通信(上海)有限公司 Image processing method and device
CN109146811A (en) * 2018-08-14 2019-01-04 长沙全度影像科技有限公司 A kind of Adaptive contrast enhancement method of color image
CN110889802B (en) * 2018-09-10 2022-12-06 北京微播视界科技有限公司 Image processing method and device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A night scene image enhancement algorithm based on weighted mean detection; 孙伟; 吴翰; 李雨; 孙威; 黄星; Journal of Pingxiang University (No. 03); full text *
A parallel pipeline algorithm for real-time star map preprocessing; 秦天沐; 王伟东; 王海涌; 刘佳琪; Semiconductor Optoelectronics (No. 02); full text *

Also Published As

Publication number Publication date
CN112565915A (en) 2021-03-26

Similar Documents

Publication Publication Date Title
CN113395558B (en) Display equipment and display picture rotation adaptation method
CN111913608B (en) Touch screen rotation control interaction method and display device
CN114827707B (en) Display equipment and startup animation display method
CN112165644B (en) Display device and video playing method in vertical screen state
CN112565839A (en) Display method and display device of screen projection image
CN111866593B (en) Display device and startup interface display method
WO2021212463A1 (en) Display device and screen projection method
CN113556593B (en) Display device and screen projection method
CN111857502B (en) Image display method and display device
WO2021189708A1 (en) Method for turning on screen protection for display device, and display device
CN111866590A (en) Display device
CN114285985B (en) Method for determining preview direction of camera and display device
CN115836528A (en) Display device and screen projection method
CN111064982B (en) Display control method, storage medium and display device
CN112565915B (en) Display apparatus and display method
CN116600157A (en) Display apparatus
CN113542824B (en) Display equipment and display method of application interface
CN114501087B (en) Display equipment
CN113573118B (en) Video picture rotating method and display equipment
CN113556590B (en) Method for detecting effective resolution of screen-projected video stream and display equipment
CN113497958B (en) Display equipment and picture display method
CN113395600B (en) Interface switching method of display equipment and display equipment
CN113015006B (en) Display apparatus and display method
CN113542823B (en) Display equipment and application page display method
CN115396704B (en) Display equipment and power-on signal source display adaptation method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant