CN118277169A - Display method and related device - Google Patents

Display method and related device

Info

Publication number
CN118277169A
Authority
CN
China
Prior art keywords
image
terminal equipment
smear
brightness
terminal device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211730500.1A
Other languages
Chinese (zh)
Inventor
杨苏
王兴一
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Filing date
Publication date
Application filed by Honor Device Co Ltd
Publication of CN118277169A


Abstract

The embodiment of the application provides a display method and a related device, which are applied to the technical field of terminals. The method comprises the following steps: the terminal device displays a test object on a display screen; the terminal device determines a smear compensation parameter based on a first type image and a second type image, wherein the first type image is an image displayed by the terminal device before the position of the test object moves, the second type image is an image displayed by the terminal device while the position of the test object is moving, and the smear compensation parameter is used for adjusting a display parameter of the display screen of the terminal device when the position of the test object moves. The terminal device can determine the smear compensation parameter according to the brightness of the test object before and after sliding, so as to compensate the display of the display screen. In this way, interference from human factors is reduced, smear compensation is performed accurately and objectively, the problem of overcompensation can be reduced, and the display effect of the display screen is improved. In addition, the terminal device can determine the smear compensation parameter without using an additional camera for smear evaluation.

Description

Display method and related device
Technical Field
The application relates to the technical field of terminals, in particular to a display method and a related device.
Background
Currently, a user can view various content through the display screen of a terminal device. When there is a large amount of content, the display screen cannot display all of it at once. The terminal device can respond to a sliding operation of the user on the display screen and control the displayed content to slide, either following the finger or continuing after the finger is lifted, so that the user can conveniently browse the related content. During such sliding display, a smear phenomenon may occur.
Currently, smear is mainly measured by using a simple animation to simulate the formation of smear, and the severity of the smear is judged by naked-eye comparison.
However, this method cannot accurately and effectively judge the degree of smear, so the display screen cannot be improved comprehensively and specifically for the smear phenomenon, and the user experience is poor.
Disclosure of Invention
The embodiment of the application provides a display method and a related device, which are applied to the technical field of terminals. The terminal device can determine a smear compensation parameter according to the brightness of the test object before and after sliding, so as to compensate the display of the display screen. In this way, interference from human factors is reduced, smear compensation is performed accurately and objectively, the problem of overcompensation can be reduced, and the display effect of the display screen is improved. In addition, the terminal device can determine the smear compensation parameter without using an additional camera for smear evaluation.
In a first aspect, an embodiment of the present application provides a display method. The method comprises the following steps: at a first moment, the terminal device displays a test object on a display screen; the terminal device determines a smear compensation parameter based on a first type image and a second type image, wherein the first type image is an image displayed by the terminal device before the position of the test object moves, the second type image is an image displayed by the terminal device while the position of the test object is moving, and the smear compensation parameter is used for adjusting a display parameter of the display screen of the terminal device when the position of the test object moves.
The display parameters include, but are not limited to, the current of the light emitting element in the display screen, the driving time of the light emitting element, the driving voltage of the light emitting element, and the like.
In this way, the terminal device can determine the smear compensation parameter according to the brightness of the test object before and after sliding so as to compensate the display of the display screen. Therefore, interference of human factors is reduced, smear compensation is accurately and objectively carried out, the problem of excessive compensation can be reduced, and the display effect of the display screen is improved. In addition, the terminal equipment can determine the smear compensation parameters without using an additional camera for smear evaluation.
Optionally, the terminal device displays the test object on a display screen, including: the terminal equipment displays a first interface on a display screen, wherein the first interface comprises a first control; the terminal equipment receives a first operation aiming at a first control; and responding to the first operation, and displaying the test object on the display screen by the terminal equipment.
The first interface may correspond to the interface shown in fig. 6. The first control may correspond to "smear calibration". The first operation may be any operation such as a click operation or a touch operation, or may be a voice operation, and is not limited herein.
Thus, after the user starts the smear calibration function, the terminal device can perform smear calibration.
Optionally, the smear compensation parameter includes a first smear compensation parameter, where the first smear compensation parameter is used to adjust a display parameter of the display screen of the terminal device when the terminal device is at a first brightness and the object moves at a first moving speed; the terminal device determining the smear compensation parameter based on the first type image and the second type image comprises the following steps: the terminal device displays a first image, wherein the first image is a first type image displayed by the terminal device at the first brightness; the terminal device displays a second image, wherein the second image is a second type image displayed when the terminal device is at the first brightness and the test object moves at the first moving speed; the terminal device processes a pixel parameter corresponding to the first image and a pixel parameter corresponding to the second image to obtain a first color coordinate corresponding to the first image and a second color coordinate corresponding to the second image; and the terminal device determines the first smear compensation parameter based on a first gray scale difference and a correspondence, wherein the first gray scale difference is related to the first color coordinate and the second color coordinate, and the correspondence is the relation between gray scale differences and smear compensation parameters.
The first brightness may be any brightness, and the first moving speed may be any speed; neither is limited herein.
In this way, the terminal equipment acquires the smear compensation parameter under a certain brightness and a certain speed, compensates the display under the brightness and the moving speed, reduces the corresponding smear phenomenon, improves the display effect under the brightness and the moving speed, and improves the user experience.
Optionally, the terminal device processing the pixel parameter corresponding to the first image and the pixel parameter corresponding to the second image to obtain the first color coordinate corresponding to the first image and the second color coordinate corresponding to the second image comprises: the terminal device obtains a monochrome gray scale corresponding to the first image according to the brightness of the pixels corresponding to the first image or the driving voltage applied by the terminal device to the pixels corresponding to the first image; the terminal device obtains the first color coordinate from the monochrome gray scale corresponding to the first image according to a conversion relation, wherein the conversion relation is a conversion relation between gray scales and color coordinates; the terminal device obtains a monochrome gray scale corresponding to the second image according to the brightness of the pixels corresponding to the second image or the driving voltage applied by the terminal device to the pixels of the second image; and the terminal device obtains the second color coordinate from the monochrome gray scale corresponding to the second image according to the conversion relation.
In this way, the terminal device can obtain the color shift before and after movement based on the brightness of, or the driving voltage applied to, the pixels, and further obtain the smear compensation parameter. By converting the pixel gray scales of the images before and during the movement of the test object from electrical signals into readable and measurable color coordinates, the smear degree is evaluated more accurately, and the smear compensation is therefore more accurate.
Optionally, the pixel parameter corresponding to the first image is a pixel parameter corresponding to a first region of the test object in the first image; the pixel parameters corresponding to the second image are the pixel parameters corresponding to the first region of the test object in the second image.
In this way, the smear compensation parameter is determined from the pixels of a selected partial region, which reduces the calculation amount of the terminal device.
Optionally, the method further comprises: the terminal device obtains a just noticeable color difference (JNCD) value based on the first color coordinate and the second color coordinate; and the terminal device determining the first smear compensation parameter based on the first gray scale difference and the correspondence comprises: when the JNCD value is greater than or equal to a preset threshold, the terminal device determines the first smear compensation parameter based on the first gray scale difference and the correspondence.
In this way, when the JNCD value is large, the smear compensation parameter is adjusted; when the JNCD value is small, the smear compensation parameter is not adjusted.
Optionally, the terminal device obtaining the JNCD value based on the first color coordinate and the second color coordinate comprises: the terminal device obtains a first distance based on the first color coordinate and the second color coordinate, wherein the first distance is the distance between the first color coordinate and the second color coordinate; and the terminal device obtains the JNCD value based on the first distance, wherein the JNCD value is the ratio of the first distance to a first threshold.
Optionally, the smear compensation parameter includes a second smear compensation parameter, where the second smear compensation parameter is used to adjust a display parameter of the display screen of the terminal device when the terminal device is at a second brightness and the object moves at a second moving speed; the terminal device determining the smear compensation parameter based on the first type image and the second type image comprises the following steps: the terminal device displays a third image, wherein the third image is a first type image displayed by the terminal device at the second brightness; the terminal device displays a fourth image, wherein the fourth image is a second type image displayed when the terminal device is at the second brightness and the test object moves at the second moving speed; the terminal device processes a pixel parameter corresponding to the third image and a pixel parameter corresponding to the fourth image to obtain a third color coordinate corresponding to the third image and a fourth color coordinate corresponding to the fourth image; and the terminal device determines the second smear compensation parameter based on a second gray scale difference and the correspondence, wherein the second gray scale difference is related to the third color coordinate and the fourth color coordinate, and the correspondence is the relation between gray scale differences and smear compensation parameters.
In this way, the terminal device can also change the brightness and the moving speed and acquire smear compensation parameters at other brightnesses and other moving speeds, which improves the display effect of the terminal device.
Optionally, the method further comprises: the terminal device acquires the brightness of the display screen; and when the brightness of the display screen is less than or equal to a first threshold, the terminal device displays an interface based on the smear compensation parameter.
In this way, when the brightness of the terminal device is low, smear compensation is performed, which improves the display effect at low brightness; when the brightness is high, smear compensation is not performed, which reduces the calculation amount of the terminal device.
Optionally, the terminal device acquiring the brightness of the display screen comprises: the terminal device acquires the backlight and ambient light parameters of the display screen.
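Purely for illustration (this sketch is not part of the application's disclosure), the low-brightness gating described above might be expressed in Python as follows; the numeric threshold and the function name are assumptions:

    FIRST_BRIGHTNESS_THRESHOLD_NITS = 20.0   # assumed value; the application only names a "first threshold"

    def should_compensate(display_brightness_nits: float) -> bool:
        # Apply smear compensation only when the screen is dim, where smear and
        # color cast are most visible; skip it at higher brightness to save computation.
        return display_brightness_nits <= FIRST_BRIGHTNESS_THRESHOLD_NITS

    print(should_compensate(4.0))    # True: dark screen, compensation enabled
    print(should_compensate(90.0))   # False: bright screen, compensation skipped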
Optionally, the method further comprises: the terminal device adjusts the smear compensation parameter based on a third type image and a fourth type image, and the adjusted smear compensation parameter is used, after a second moment, for adjusting a display parameter of the display screen of the terminal device when an object moves; the third type image is an image displayed by the terminal device before the position of the test object moves, the fourth type image is an image displayed by the terminal device while the position of the test object is moving, and the difference between the second moment and the first moment is greater than or equal to a preset duration.
In this way, the terminal device can adjust the smear compensation parameter after a period of time, which reduces problems such as color cast and smear caused by the decrease in luminous efficiency of the light emitting devices in the display screen and improves the display effect.
Optionally, the adjusted smear compensation parameter includes a third smear compensation parameter, where the third smear compensation parameter is used to adjust a display parameter of the display screen of the terminal device when the terminal device is at the first brightness and the object moves at the first moving speed; the terminal device adjusting the smear compensation parameter based on the third type image and the fourth type image comprises: the terminal device displays a fifth image, wherein the fifth image is a third type image displayed by the terminal device at the first brightness; the terminal device displays a sixth image, wherein the sixth image is a fourth type image displayed when the terminal device is at the first brightness and the test object moves at the first moving speed; the terminal device processes a pixel parameter corresponding to the fifth image and a pixel parameter corresponding to the sixth image to obtain a fifth color coordinate corresponding to the fifth image and a sixth color coordinate corresponding to the sixth image; and the terminal device determines the third smear compensation parameter based on a third gray scale difference and the correspondence, wherein the third gray scale difference is related to the fifth color coordinate and the sixth color coordinate.
In a second aspect, an embodiment of the present application provides an electronic device, where the electronic device includes a terminal device, and the terminal device may also be referred to as a terminal, a user equipment (UE), a mobile station (MS), a mobile terminal (MT), and so on. The terminal device may be a mobile phone, a smart television, a wearable device, a tablet (Pad), a computer with a wireless transceiving function, a virtual reality (VR) terminal device, an augmented reality (AR) terminal device, a wireless terminal in industrial control, a wireless terminal in self-driving, a wireless terminal in remote medical surgery, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, or the like.
The electronic device includes a processor and a memory; the memory stores computer-executable instructions; and the processor executes the computer-executable instructions stored in the memory, so that the electronic device performs the method according to the first aspect.
In a third aspect, embodiments of the present application provide a computer-readable storage medium storing a computer program. The computer program, when executed by a processor, implements a method as in the first aspect.
In a fourth aspect, embodiments of the present application provide a computer program product comprising a computer program which, when run, causes a computer to perform the method as in the first aspect.
In a fifth aspect, an embodiment of the application provides a chip comprising a processor for invoking a computer program in memory to perform a method as in the first aspect.
It should be understood that the second to fifth aspects of the present application correspond to the technical solutions of the first aspect of the present application, and the advantages obtained by each aspect and the corresponding possible embodiments are similar, and are not repeated.
Drawings
Fig. 1 is a schematic hardware structure of a terminal device according to an embodiment of the present application;
fig. 2 is a schematic software structure of a terminal device according to an embodiment of the present application;
FIG. 3 is an interface diagram of a smear phenomenon according to an embodiment of the present application;
fig. 4 is a schematic flow chart of a display method according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a process for converting pixels into color coordinates according to an embodiment of the present application;
fig. 6 is an interface schematic diagram of a terminal device according to an embodiment of the present application;
fig. 7 is a schematic flow chart of a display method according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a display device according to an embodiment of the present application.
Detailed Description
In order to facilitate the clear description of the technical solutions of the embodiments of the present application, the following simply describes some terms and techniques involved in the embodiments of the present application:
1. Smear phenomenon: when an image displayed on a display screen contains a moving object, the moving object may leave a smear (trail) on the display screen, which degrades the viewing experience of the user.
It can be understood that, when indoor brightness is low, the brightness of the mobile phone picture is automatically dimmed to its darkest level. If the screen is then slid to switch the picture, the hysteresis effect of the OLED causes the first dragged frame to have lower brightness and to be accompanied by smear. Meanwhile, the PWM dimming configuration of the device can aggravate the smear phenomenon, which worsens the glare and dizziness caused by watching the screen for a long time and thus affects the user experience.
2. Color accuracy (just noticeable color difference, JNCD): the minimum unit of color change that the human eye can distinguish. It can be appreciated that the smaller the JNCD value, the closer the screen is to reproducing true colors.
3. Gray scale: the brightness range between the brightest and the darkest is divided into a number of levels so that the screen brightness can be controlled according to the input signal. Each digital image is composed of a number of dots, also known as pixels, and each pixel can typically display a number of different colors, being composed of three sub-pixels: red, green, and blue (RGB). The light source behind each sub-pixel can exhibit different brightness levels, and gray scale represents the levels of different brightness from darkest to brightest. The more such intermediate levels there are, the finer the picture effect that can be presented.
Taking an 8-bit panel as an example, it can represent 2 to the 8th power, that is, 256 brightness levels, which are called 256 gray levels. Each pixel on an LCD screen is composed of red, green and blue sub-pixels with different brightness levels, which together form points of different colors. That is, the color change of each point on the screen is actually produced by the gray scale changes of the three RGB sub-pixels that constitute the point.
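As a small numerical illustration of the above (the pixel values below are arbitrary and not from the application), an 8-bit panel's gray levels and a pixel's RGB triple can be written in Python as:

    BIT_DEPTH = 8
    GRAY_LEVELS = 2 ** BIT_DEPTH            # 2 to the 8th power = 256 gray levels (0..255)

    # One pixel is an (R, G, B) triple of sub-pixel gray levels; the color of the
    # corresponding point on the screen is determined by these three gray scales.
    pixel = (255, 128, 0)                   # arbitrary example values
    assert all(0 <= level < GRAY_LEVELS for level in pixel)
    print(GRAY_LEVELS)                      # 256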
4. Other terms
In embodiments of the present application, the words "first," "second," and the like are used to distinguish between identical or similar items that have substantially the same function and effect. For example, the first chip and the second chip are merely used to distinguish different chips, without limiting their order. It will be appreciated by those skilled in the art that the words "first," "second," and the like do not limit the quantity or the order of execution, and do not necessarily imply a difference.
It should be noted that, in the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may represent: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. "At least one of the following items" or similar expressions refer to any combination of these items, including any combination of a single item or plural items. For example, at least one of a, b, or c may represent: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b and c may each be singular or plural.
5. Terminal equipment
The terminal device of the embodiment of the application can also be any form of electronic device, for example, the electronic device can include a handheld device with an image processing function, a vehicle-mounted device and the like. For example, some electronic devices are: a mobile phone, a tablet, a palmtop, a notebook, a mobile internet device (mobile INTERNET DEVICE, MID), a wearable device, a Virtual Reality (VR) device, an augmented reality (augmented reality, AR) device, a wireless terminal in industrial control (industrial control), a wireless terminal in unmanned (SELF DRIVING), a wireless terminal in teleoperation (remote medical surgery), a wireless terminal in smart grid (SMART GRID), a wireless terminal in transportation security (transportation safety), a wireless terminal in smart city (SMART CITY), a wireless terminal in smart home (smart home), a cellular phone, a cordless phone, a session initiation protocol (session initiation protocol, SIP) phone, a wireless local loop (wireless local loop, WLL) station, a personal digital assistant (personal DIGITAL ASSISTANT, PDA), a handheld device with wireless communication functionality, a computing device or other processing device connected to a wireless modem, a vehicle-mounted device, a wearable device, a terminal device in a 5G network, or a future evolved land mobile network (public land mobile network), and the like, without limiting the application.
By way of example and not limitation, in embodiments of the application, the electronic device may also be a wearable device. A wearable device may also be called a wearable smart device, and is a general term for wearable devices developed by applying wearable technology to the intelligent design of daily wear, such as glasses, gloves, watches, clothes, and shoes. A wearable device is a portable device that is worn directly on the body or integrated into the user's clothing or accessories. A wearable device is not only a hardware device, but can also provide powerful functions through software support, data interaction and cloud interaction. Generalized wearable smart devices include devices that are full-featured, large in size, and able to implement complete or partial functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on only a certain type of application function and need to be used together with other devices such as smartphones, for example various smart bracelets and smart jewelry for physical sign monitoring.
In addition, in the embodiment of the application, the electronic device may also be a terminal device in an internet of things (IoT) system. IoT is an important component of the future development of information technology, and its main technical characteristic is that items are connected to a network through communication technology, thereby realizing an intelligent network of human-machine interconnection and interconnection of things.
The electronic device in the embodiment of the application may also be referred to as: a terminal device, a User Equipment (UE), a Mobile Station (MS), a Mobile Terminal (MT), an access terminal, a subscriber unit, a subscriber station, a mobile station, a remote terminal, a mobile device, a user terminal, a wireless communication device, a user agent, a user equipment, or the like.
In an embodiment of the present application, the electronic device or each network device includes a hardware layer, an operating system layer running on top of the hardware layer, and an application layer running on top of the operating system layer. The hardware layer includes hardware such as a central processing unit (central processing unit, CPU), a memory management unit (memory management unit, MMU), and a memory (also referred to as a main memory). The operating system may be any one or more computer operating systems that implement business processes through processes (processes), such as a Linux operating system, a Unix operating system, an Android operating system, an iOS operating system, or a windows operating system. The application layer comprises applications such as a browser, an address book, word processing software, instant messaging software and the like.
By way of example, fig. 1 shows a schematic diagram of a terminal device.
The terminal device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It will be appreciated that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the terminal device. In other embodiments of the application, the terminal device may include more or less components than illustrated, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, among others.
It should be understood that the connection relationship between the modules illustrated in the embodiment of the present application is only illustrative, and does not limit the structure of the terminal device. In other embodiments of the present application, the terminal device may also use different interfacing manners in the foregoing embodiments, or a combination of multiple interfacing manners.
The terminal device implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
In the embodiment of the present application, the type of the display screen may be any of the following: a liquid crystal display, an organic light-emitting diode (OLED) display, an in-plane switching (IPS) display, a twisted nematic (TN) display, a vertical alignment (VA) display, electronic paper, a quantum dot light emitting diode (QLED) display, a micro LED display, or the like, to which the present application is not limited. The light emitting mode of the display screen may be top emission, bottom emission, or double-sided emission.
The terminal device may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to realize expansion of the memory capability of the terminal device. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code that includes instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the terminal device (such as audio data, phonebook, etc.), etc. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the terminal device and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor. For example, the display method of the embodiment of the present application may be performed.
The ambient light sensor 180L is used to sense ambient light level. The terminal device may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect if the terminal device is in a pocket to prevent false touches.
It will be appreciated that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the terminal device. In other embodiments of the application, the terminal device may include more or less components than illustrated, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The software system of the terminal device can adopt a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture or a cloud architecture. In the embodiment of the application, an Android system with a layered architecture is taken as an example, and the software structure of terminal equipment is illustrated.
Fig. 2 is a block diagram of the software structure of a terminal device according to an embodiment of the present application. The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system may be divided into four layers: an application layer (applications), an application framework layer (application framework), Android runtime (Android runtime) and system libraries, and a kernel layer (kernel).
The application layer may include a series of application packages. The application layer runs applications by calling the application programming interfaces (APIs) provided by the application framework layer. As shown in fig. 2, the application packages may include applications such as camera, calendar, phone, map, music, settings, mailbox, video, and social applications.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for the application of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, a content provider, a resource manager, a view system, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the display screen size, determine whether there is a status bar, lock the screen, touch the screen, drag the screen, capture the screen, and so on.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages, which can automatically disappear after a short stay without requiring user interaction. For example, the notification manager is used to notify that a download is complete, to give message reminders, and so on. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the terminal device vibrates, or an indicator light blinks.
Android runtime includes a core library and a virtual machine. Android runtime is responsible for the scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is the layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
It should be understood that in some embodiments, layers that implement the same function may be referred to by other names, or layers that implement the functions of multiple layers may be taken as one layer, or layers that implement the functions of multiple layers may be divided into multiple layers. The embodiments of the present application are not limited in this regard.
Currently, a user can view various content through the display screen of a terminal device. When there is a large amount of content, the display screen cannot display all of it at once. The terminal device can respond to a sliding operation of the user on the display screen and control the displayed content to slide, either following the finger or continuing after the finger is lifted, so that the user can conveniently browse the related content. During such sliding display, a smear phenomenon may occur (as shown in fig. 3). It will be appreciated that the smear phenomenon may be more severe when the user uses the terminal device in a dark environment.
Currently, a smear color cast test judges the severity of smear by naked-eye comparison. This method can intuitively observe the smear phenomenon and the color cast phenomenon, but during dragging the picture is always dynamically blurred to some degree and there is no standard for judging the color cast, so the degree of smear cannot be accurately and effectively judged. Moreover, because the human eye has different sensitivities to different colors in a dark environment, the original standard cannot cover all color cast problems.
In addition, when a target area of the next frame is rendered brighter in advance during dragging, according to the brightness of its pixels, in order to reduce the smear phenomenon, an overcompensation problem may occur, which increases the power consumption of the terminal device.
In view of this, an embodiment of the present application provides a display method, which determines the smear degree according to the brightness of a test object before and after sliding, and further determines a corresponding smear compensation parameter to compensate the display of the display screen. In this way, interference from human factors is reduced, the smear degree is determined accurately and objectively, the accuracy of subsequent smear compensation is improved, the problem of overcompensation is reduced, the display effect of the display screen is improved, and the user experience is improved. In addition, the terminal device can evaluate and compensate the smear degree without using an additional camera for smear evaluation.
The display method according to the embodiment of the present application will be described in detail by way of specific examples. The following embodiments may be combined with each other or implemented independently, and the same or similar concepts or processes may not be described in detail in some embodiments.
For convenience of understanding, a display method provided by the embodiment of the present application is described below with reference to fig. 4.
Fig. 4 is a schematic flow chart of a display method according to an embodiment of the present application. As shown in fig. 4, taking the first smear compensation parameter at a first brightness and a first moving speed as an example, the method includes the following steps:
S401, the terminal device displays the test object on a display screen.
It will be appreciated that the terminal device displays a first image, where the first image is an image displayed by the terminal device at the first brightness before the position of the test object moves. The terminal device displays a second image, where the second image is an image displayed when the terminal device is at the first brightness and the test object moves at the first moving speed.
In the embodiment of the application, the image displayed before the position of the test object moves can be understood as the image displayed when the test object is still.
In some embodiments, the terminal device does not display the first image and the second image simultaneously. Illustratively, a first image of the test object before movement is displayed, and then a second image of the test object as it moves is displayed. In other embodiments, the terminal device displays the first image and the second image simultaneously. The left region of the terminal device displays a first image before movement, and the right region displays a second image of the test object when movement is performed.
In the embodiment of the application, the test object can be any object such as characters, symbols, animals, figures and the like; the test object may be any color of white, pink, black, etc. The test object can be square, round or any other shape; the moving direction of the test object can be any direction of horizontal, vertical, bending and the like. The embodiment of the application does not specifically limit the shape, size, color, moving direction, moving speed and the like of the test object.
It is understood that the color or brightness of the test object is distinguishable from the background.
In some embodiments, the terminal device may automatically control the movement of the test object. For example, the terminal device may control the test object to move according to a preset moving speed and a preset direction. The preset moving speed may be set according to actual situations or experience, and the embodiment of the present application is not limited herein with respect to a specific value of the preset moving speed. The preset direction may be up, down, left, right, etc., which is not particularly limited in the embodiment of the present application.
In other embodiments, the terminal device may control the movement of the test object based on a sliding operation of the user. The moving speed and moving direction of the test object are related to the sliding operation of the user.
S402, the terminal equipment respectively processes the pixel parameters corresponding to the first image and the pixel parameters corresponding to the second image to obtain a first color coordinate and a second color coordinate.
The pixel parameter may be a luminance corresponding to a pixel, a driving voltage applied to the pixel, or the like, and is not limited herein.
The terminal device may process pixel parameters corresponding to a partial region in the image. It can be understood that the terminal device can determine the monochrome gray scale of red, green and blue (R/G/B) according to the brightness corresponding to the pixel or the applied driving voltage, and further calculate the color coordinates based on the conversion relationship between the gray scale and the color coordinates.
Illustratively, the terminal device calculates the corresponding color coordinates based on a certain row or column of pixels of the test object. For example, the pixel gray scales of the pixels in the top row of the test object are taken as the row gray scales, the R/G/B monochrome gray scales are calculated according to the voltage applied to, or the brightness of, the pixels in that row, and the color coordinates of the test object are then calculated according to the conversion relation between gray scales and color coordinates. The embodiment of the application does not specifically limit the region of the test object selected for calculating the color coordinates.
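Purely as an illustrative sketch (the application fixes neither the region choice nor the aggregation method), taking the top row of the test object and averaging each channel might look like this in Python; the sample gray levels are invented:

    def row_monochrome_gray(row_pixels):
        # row_pixels: (R, G, B) sub-pixel gray levels of the pixels in one row of the
        # test object; averaging each channel gives the row's R/G/B monochrome gray scales.
        n = len(row_pixels)
        r = sum(p[0] for p in row_pixels) / n
        g = sum(p[1] for p in row_pixels) / n
        b = sum(p[2] for p in row_pixels) / n
        return r, g, b

    top_row = [(200, 40, 40), (205, 38, 42), (198, 41, 39)]   # invented gray levels
    print(row_monochrome_gray(top_row))                       # (201.0, 39.66..., 40.33...)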
It can be understood that the terminal device may process the pixel parameter corresponding to the first image and the pixel parameter corresponding to the second image at the same time; the pixel parameter corresponding to the first image and the pixel parameter corresponding to the second image may be processed sequentially, which is not limited herein.
For ease of understanding, the process of converting pixel to color coordinates is described below in conjunction with fig. 5.
It will be appreciated that the image displayed by the display screen is composed of pixels, each of which may take on a number of different colours, which are composed of three sub-pixels, red, green and blue (RGB). Illustratively, the pixel corresponding to the test object may be as shown as a in fig. 5.
The light source behind each sub-pixel may exhibit a different brightness level. By way of example, the different brightnesses from darkest to brightest can be divided into 256 brightness levels (shown as b in fig. 5).
The terminal device may obtain the color coordinates (shown as c in fig. 5) corresponding to the test object in the chromaticity space according to the brightness levels of the three sub-pixels of red, green and blue (RGB).
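The application does not specify the conversion relation between gray scales and color coordinates; purely as a stand-in, the sketch below uses the standard sRGB-to-CIE-xy conversion to illustrate mapping R/G/B gray levels to a point in chromaticity space:

    def srgb_to_xy(r, g, b):
        # r, g, b are 8-bit gray levels (0..255).
        def linearize(c):
            c = c / 255.0
            return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
        rl, gl, bl = linearize(r), linearize(g), linearize(b)
        # Linear sRGB (D65) to CIE XYZ.
        X = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
        Y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
        Z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
        s = X + Y + Z
        return (X / s, Y / s) if s else (0.3127, 0.3290)   # fall back to the D65 white point

    print(srgb_to_xy(255, 255, 255))   # roughly (0.3127, 0.3290), i.e. white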
S403, the terminal equipment determines a first smear compensation parameter based on the first gray level difference and a corresponding relation, wherein the corresponding relation is the relation between the smear compensation parameter and the gray level difference.
The gray scale difference refers to the difference between the color coordinates. The first gray scale difference is the difference between the first color coordinate and the second color coordinate.
It will be appreciated that the smear degree of the terminal device is related to the brightness of the display screen, the moving speed of the object, and so on. For example, when the moving speed (sliding speed) on the terminal device is very fast or very slow, the user cannot see the smear color cast phenomenon, or can see only a very slight one; when sliding at a certain suitable speed, the user can see an obvious color cast problem.
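For illustration only, the correspondence between gray scale differences and smear compensation parameters could be held as a simple lookup table; the bounds and parameter values below are invented placeholders rather than values from the application:

    import bisect

    # Invented correspondence: upper bounds of gray-scale-difference ranges paired with
    # a compensation strength (e.g. a fractional pre-brightening factor).
    GRAY_DIFF_BOUNDS = [2.0, 5.0, 10.0, 20.0]
    COMPENSATION_PARAMS = [0.00, 0.05, 0.10, 0.20, 0.35]

    def lookup_compensation(gray_diff: float) -> float:
        # Pick the compensation parameter of the range the gray scale difference falls into.
        return COMPENSATION_PARAMS[bisect.bisect_right(GRAY_DIFF_BOUNDS, gray_diff)]

    print(lookup_compensation(7.3))   # 0.10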
S404, the terminal equipment displays the image based on the first smear compensation parameter.
The first smear compensation parameter is used to adjust the display parameter of the display screen of the terminal device when the terminal device is at the first brightness and the object moves at the first moving speed; when the terminal device is at the first brightness and the object moves at the first moving speed, the image is displayed based on the adjusted display parameter.
The object may be any object, may be a list, or may be a picture, a text, or the like displayed in a moving manner, which is not limited herein.
The first smear compensation parameter may be used to perform current adjustment of a light emitting element, driving time adjustment of the light emitting element, driving voltage adjustment, and the like in the display screen.
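As a rough sketch of how one smear compensation parameter might be mapped onto such display-parameter adjustments (the proportional scaling below is an assumption; the application gives no formula):

    from dataclasses import dataclass

    @dataclass
    class DisplayParams:
        drive_current_ma: float
        drive_voltage_v: float
        drive_time_us: float

    def apply_smear_compensation(params: DisplayParams, k: float) -> DisplayParams:
        # k is the smear compensation parameter obtained from the correspondence;
        # here it is treated as a fractional boost of current and driving time.
        return DisplayParams(
            drive_current_ma=params.drive_current_ma * (1.0 + k),
            drive_voltage_v=params.drive_voltage_v,          # left unchanged in this sketch
            drive_time_us=params.drive_time_us * (1.0 + k),
        )

    print(apply_smear_compensation(DisplayParams(2.0, 4.6, 120.0), 0.10))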
It will be appreciated that the terminal device may determine smear compensation parameters at a plurality of brightness, a plurality of movement speeds. The specific flow may refer to S401 to S403, which are not described herein.
It can be understood that the terminal device executes the above-mentioned S401-S403 under the condition that the first brightness and the test object are the first moving speed, so as to obtain the smear compensation parameters corresponding to the first brightness and the first moving speed. And the terminal equipment obtains smear compensation parameters corresponding to the first brightness and the second moving speed under the first brightness and the second moving speed. And the terminal equipment executes the steps similar to the steps S401-S403 under the condition that the second brightness and the test object are at the first moving speed, and the smear compensation parameters corresponding to the second brightness and the first moving speed are obtained. And the terminal equipment executes the steps similar to the steps S401-S403 under the condition that the second brightness and the test object are at the second moving speed, and obtains the smear compensation parameters corresponding to the second brightness and the second moving speed.
The terminal device displays a third image, where the third image is an image displayed by the terminal device at the second brightness; the terminal device displays a fourth image, where the fourth image is an image displayed when the terminal device is at the second brightness and the test object moves at the second moving speed; the terminal device processes the pixel parameter corresponding to the third image and the pixel parameter corresponding to the fourth image to obtain a third color coordinate corresponding to the third image and a fourth color coordinate corresponding to the fourth image; and the terminal device determines a second smear compensation parameter based on a second gray scale difference and the correspondence, where the second gray scale difference is related to the third color coordinate and the fourth color coordinate, and the correspondence is the relation between gray scale differences and smear compensation parameters.
Taking the smear compensation parameters at seven brightness levels as an example, the brightness levels may include, for example, 2 nits (nit), 4 nits, 10 nits, 20 nits, 50 nits and 90 nits. The embodiment of the application does not limit the values of the brightness levels, the intervals between the brightness levels, and the like.
In some embodiments, the terminal device may determine and store N smear compensation parameter values corresponding to the movement speeds at each brightness. N may be 100 or any number, and is not limited herein. In this way, the terminal device can adjust the smear compensation parameter based on different moving speeds, thereby reducing the smear phenomenon.
The N moving speeds may be uniformly divided speed intervals. Illustratively, the speed interval is 0 pixels per millisecond to 100 pixels per millisecond, N is 5, and the N moving speeds are 20 pixels per millisecond, 40 pixels per millisecond, 60 pixels per millisecond, 80 pixels per millisecond, 100 pixels per millisecond, respectively.
The N moving speeds may also be non-uniform division of the speed interval based on terminal device usage. Illustratively, the speed interval is 0 pixels per millisecond to 100 pixels per millisecond, N is 5, and the N moving speeds are 20 pixels per millisecond, 30 pixels per millisecond, 40 pixels per millisecond, 70 pixels per millisecond, 100 pixels per millisecond, respectively. In the embodiment of the application, the value and the number of the moving speeds of the terminal equipment are not particularly limited under the same brightness.
The N moving speeds may also take values of moving speeds (sliding speeds) based on the use situation of the terminal device. For example, the movement speed range frequently used by the user corresponds to more movement speed values.
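For illustration, the uniform division of the speed interval, together with a non-uniform alternative, might be written as follows (the numbers repeat the examples above):

    def uniform_speeds(v_max_px_per_ms: float, n: int):
        # Evenly divide (0, v_max] into n moving speeds, e.g. 20/40/60/80/100 px/ms for n = 5.
        step = v_max_px_per_ms / n
        return [step * (i + 1) for i in range(n)]

    # Non-uniform division biased toward frequently used sliding speeds (values from the text).
    NON_UNIFORM_SPEEDS = [20, 30, 40, 70, 100]   # pixels per millisecond

    print(uniform_speeds(100, 5))   # [20.0, 40.0, 60.0, 80.0, 100.0]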
In some embodiments, the terminal device may perform fitting on the obtained plurality of smear compensation parameters to obtain a smear compensation curve; and determining the smear compensation parameters of the object during display according to the display brightness of the terminal equipment, the moving speed of the object and the smear compensation curve so as to carry out smear compensation.
In some embodiments, the terminal device may respectively correspond to the plurality of brightness intervals and the plurality of speed intervals when the obtained plurality of smear compensation parameters; and the terminal equipment determines smear compensation parameters when the object is displayed according to a brightness interval corresponding to the display brightness of the terminal equipment and a speed interval corresponding to the moving speed of the object so as to compensate.
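The two ways of using the measured parameters described in the preceding two paragraphs might look roughly like the sketch below; the sample data are invented, and the quadratic fit is merely one possible fitting method, not one named by the application:

    import bisect
    import numpy as np

    # Measured (speed, compensation parameter) pairs at one brightness -- invented samples.
    speeds = [20.0, 40.0, 60.0, 80.0, 100.0]
    params = [0.02, 0.06, 0.12, 0.16, 0.18]

    # Embodiment 1: fit a smear compensation curve and evaluate it at the current speed.
    curve = np.polynomial.Polynomial.fit(speeds, params, deg=2)
    print(float(curve(55.0)))          # compensation parameter interpolated at 55 px/ms

    # Embodiment 2: pick the parameter of the speed interval the current speed falls into.
    idx = min(bisect.bisect_left(speeds, 55.0), len(params) - 1)
    print(params[idx])                 # parameter of the 60 px/ms interval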
The specific use modes of the obtained plurality of smear compensation parameters are not limited in the embodiment of the application.
In conclusion, the terminal device can accurately evaluate the degree of smear and perform smear compensation for smear phenomena of different degrees, so that the smear phenomenon of the terminal device is reduced and user experience is improved. The compensation is highly targeted and accurate. In addition, the problem of over-compensation can be reduced, and the power consumption of the terminal device can be reduced. By capturing the gray scale of the pixels in the region of the test object, the electrical signal is converted into readable and measurable color coordinates, so that the evaluation and compensation of the smear degree can be realized without the participation of other equipment.
It is understood that the terminal device may perform the above-described S401-S403 only when the smear calibration is first turned on. After the terminal device turns on the smear calibration, S401-S403 may also be executed periodically. The period duration may be, for example, one month or three months; the embodiment of the application does not limit the period duration of the smear calibration.
The subsequent smear calibration process is similar to the process shown in fig. 4, and will not be described again here.
The smear calibration process is as follows: the terminal device displays a fifth image, where the fifth image is an image displayed by the terminal device at the first brightness; the terminal device displays a sixth image, where the sixth image is an image displayed by the terminal device at the first brightness while the test object moves at the first moving speed; the terminal device processes the pixel parameters corresponding to the fifth image and the pixel parameters corresponding to the sixth image to obtain a fifth color coordinate corresponding to the fifth image and a sixth color coordinate corresponding to the sixth image; the terminal device determines a third smear compensation parameter based on a third gray scale difference and the corresponding relation, where the third gray scale difference is related to the fifth color coordinate and the sixth color coordinate.
Therefore, the smear compensation parameters can be adjusted according to the use condition of the terminal equipment, the color cast fluctuation caused by the light-emitting efficiency attenuation of the display screen light-emitting material is reduced, and the user experience is improved.
On the basis of the above embodiment, the terminal device also executes S405.
S405, the terminal device obtains an obvious color difference (just noticeable color difference, JNCD) value of the image according to the first color coordinate and the second color coordinate.
In the embodiment of the application, when the JNCD value is greater than or equal to a preset threshold, the terminal device adjusts the smear compensation parameter based on the gray level difference and the corresponding relation. When the JNCD value is smaller than the preset threshold, the terminal device does not adjust the smear compensation parameter.
In this way, the frequency of the terminal device adjusting the smear compensation parameter can be reduced.
In a possible implementation manner, the terminal device may calculate a distance Δuv between the first color coordinate and the second color coordinate, and calculate the JNCD value according to the distance Δuv between the color coordinates. In some embodiments, the JNCD value satisfies JNCD = Δuv/0.004.
For example, as shown by d in fig. 5, the two color coordinates correspond to coordinate point 1 and coordinate point 2, respectively, and the distance between coordinate point 1 and coordinate point 2, namely the length of line segment 501, may be taken as the distance between color coordinate 1 and color coordinate 2.
In this way, the amount of computation of the terminal device can be reduced.
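A minimal sketch of this JNCD computation, assuming (u', v')-style color coordinates, is shown below; the 0.004 divisor follows the relation given above, while the example coordinates and threshold are placeholders.

```python
# Sketch of the JNCD computation from two color coordinates.
# Assumes (u', v')-style coordinates; the example coordinates and threshold are placeholders.
import math

def jncd(coord1, coord2, jncd_unit=0.004):
    """JNCD = delta_uv / 0.004, where delta_uv is the distance between the coordinates."""
    du = coord1[0] - coord2[0]
    dv = coord1[1] - coord2[1]
    delta_uv = math.hypot(du, dv)
    return delta_uv / jncd_unit

# Example: decide whether the smear compensation parameter needs adjusting.
PRESET_THRESHOLD = 1.0                               # placeholder value
value = jncd((0.197, 0.468), (0.201, 0.471))
adjust_needed = value >= PRESET_THRESHOLD
```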
In some embodiments, the terminal device further performs S406 before performing S404.
S406, the terminal equipment acquires the brightness of the display screen.
In some embodiments, the terminal device obtains backlight and ambient light parameters of the display screen to determine brightness of the display screen.
When the brightness of the display screen is less than or equal to the first threshold, the terminal device performs S404. And when the brightness of the display screen is larger than the first threshold value, the terminal equipment does not carry out smear compensation.
It will be appreciated that the degree of smear and color cast is related to the brightness of the terminal device. The lower the display brightness of the terminal device, the more severe the smear and color cast. When the display brightness of the terminal device is higher, the degree of smear and color cast is lower.
In this way, the terminal equipment performs smear compensation in a scene with low brightness, so that the smear phenomenon of the terminal equipment can be reduced, and the problem of image color cast caused by the luminous efficiency difference of the display device can be reduced. In a scene with high brightness, the terminal equipment does not carry out smear compensation, so that the calculated amount displayed by the terminal equipment can be reduced.
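A minimal sketch of this low-brightness gate is shown below; the threshold value is a placeholder, since the actual first threshold is device-specific.

```python
# Hypothetical sketch of the low-brightness gate for smear compensation.
# The threshold value is a placeholder, not a value from this application.

FIRST_THRESHOLD_NIT = 50.0   # placeholder low-brightness threshold

def should_compensate(display_brightness_nit):
    """Run smear compensation only when the display brightness is at or below the threshold."""
    return display_brightness_nit <= FIRST_THRESHOLD_NIT
```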
On the basis of the above embodiment, the terminal device also acquires the moving speed of each object in the image before displaying the image. The terminal device respectively determines the smear compensation parameter of each object based on the brightness of the display screen and the moving speed of that object, and then displays the image based on the smear compensation parameter of each object.
It will be appreciated that the terminal device may start automatic calibration of the smear compensation parameter by default, or may not start automatic calibration of the smear compensation parameter by default.
In some embodiments, the smear calibration function is turned on through a settings application.
Fig. 6 is an interface schematic diagram of a terminal device according to an embodiment of the present application. As shown in fig. 6, when the terminal device receives an operation of the user clicking the setting application 601 on the main interface shown in a in fig. 6, the terminal device enters the setting interface shown in b in fig. 6. The setting interface includes a settings tab bar 602 and setting items; for example, the setting items may include one or more of the following: system and updates, notifications, biometrics and passwords, applications, battery, storage, security, privacy, smear calibration, or other types of setting items.
When the user triggers the smear calibration 603 by clicking, touching, or the like in the setting interface shown in b in fig. 6, the terminal device receives the operation of the user entering the smear calibration page and enters the smear calibration interface shown in c in fig. 6. The smear calibration interface includes a smear calibration tab bar and smear calibration setting items; for example, the smear calibration setting items may include one or more of the following: a control 604 for turning on the smear calibration, advanced settings, or other types of smear calibration setting items.
When the user triggers the on control 604 by clicking, touching, or the like in the smear calibration interface shown in c in fig. 6, the terminal device receives the operation of turning on the smear calibration and enters the interface shown in d in fig. 6. The interface includes: a control for turning off the smear calibration, advanced settings, or other types of smear calibration setting items.
In some embodiments, the frequency of the smear calibration, the time of the smear calibration, etc. may be set by "advanced settings".
It can be understood that when the terminal device receives an operation of starting the smear calibration, the smear compensation parameter may be determined through an interface shown by e in fig. 6 and an interface shown by f in fig. 6; after calibration is completed, the interface shown as d in fig. 6 is entered. The interface shown as e in fig. 6 displays a test object in a first position, and the interface shown as f in fig. 6 displays a test object in a second position, the first position and the second position being different. The test object may be a picture, a text, or a combination of a picture and a text, which is not limited herein.
In some embodiments, the terminal device may adjust the brightness of the display screen and the moving speed of the test object when performing the smear calibration, so as to perform the smear calibration on the display with different brightness and different moving speed.
In some embodiments, the terminal device performs the smear calibration every time the smear calibration function is turned on. In some embodiments, the terminal device may further perform smear calibration when receiving an operation for indicating smear calibration. For example, a click operation by the user to turn on the smear calibration 604 is received, or a voice operation of "perform smear calibration" is received.
In other embodiments, the terminal device performs the smear calibration when the smear calibration function is first turned on, and performs the smear calibration again after a period of time. The period of time may be one month or three months, which is not limited in the embodiment of the present application.
In a possible implementation, the terminal device performs the smear calibration when the device is idle, for example, when no foreground application is running or the device is dormant. In this way, the disturbance to the user can be reduced.
In other embodiments, the terminal device is provided with an application for smear calibration. Smear calibration is performed by this application.
Fig. 7 is a schematic flow chart of a display method according to an embodiment of the present application. As shown in fig. 7, the method includes:
S701, when the brightness of the display screen is less than or equal to the first threshold, capturing a first type image before the test object moves and a second type image during the movement of the test object.
In the embodiment of the application, the terminal equipment acquires the pixel parameters corresponding to the dynamic image in the moving process of the test object.
The first type of image may include the first image in the above-described embodiment; the second type of image may comprise the second image of the above-described embodiments.
S702, acquiring pixel parameters corresponding to the first type of image and pixel parameters corresponding to the second type of image.
S703, fitting color coordinates.
The conversion of the color coordinates may refer to the descriptions related to S401 and S402, and will not be repeated here.
Therefore, the pixel gray value of the dynamic image is converted into a readable and quantized color coordinate value, and the subsequent evaluation of color cast is facilitated.
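For illustration, the following sketch converts per-channel pixel gray levels into a CIE 1976 (u', v') color coordinate; the sRGB-style gamma and the RGB-to-XYZ matrix are generic assumptions, whereas the actual conversion relation of the embodiment would be specific to the display panel.

```python
# Hypothetical sketch: converting R/G/B gray levels of the test-object region into
# a CIE 1976 (u', v') color coordinate. The gamma and RGB->XYZ matrix below are
# generic sRGB-style assumptions, not the panel-specific conversion relation.

def gray_to_linear(gray, max_gray=255):
    """Map a gray level to linear light using an sRGB-style transfer curve (assumption)."""
    c = gray / max_gray
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def rgb_gray_to_uv(r_gray, g_gray, b_gray):
    """Convert per-channel gray levels to a (u', v') color coordinate."""
    r, g, b = (gray_to_linear(v) for v in (r_gray, g_gray, b_gray))
    # Generic sRGB/D65 RGB->XYZ matrix (assumption).
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    denom = x + 15 * y + 3 * z
    if denom == 0:
        return 0.0, 0.0
    return 4 * x / denom, 9 * y / denom   # (u', v')
```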
S704, JNCD values before and after the movement are calculated.
The calculation of JNCD values may refer to the above description of S405, and will not be repeated here.
When the JNCD value is greater than or equal to the preset threshold, S705-S708 are performed. When the JNCD value is less than the preset threshold, the smear compensation parameter is not adjusted.
The preset threshold may be referred to the above related description, and will not be described herein.
S705, triggering a low brightness compensation function, and determining the smear compensation parameter according to the gray level difference.
In the embodiment of the application, when the brightness of the display screen is low, the display is based on the smear compensation parameter.
S706, displaying based on the smear compensation parameter.
S707, effect acceptance.
When the terminal device displays an image based on the smear compensation parameter, S701-S704 are performed again to obtain the JNCD value after smear compensation. When the JNCD value after smear compensation is smaller than the preset threshold, the display continues to be based on the smear compensation parameter. When the JNCD value after smear compensation is greater than or equal to the preset threshold, the terminal device continues to adjust the smear compensation parameter until the JNCD value obtained based on the adjusted smear compensation parameter is smaller than the preset threshold.
S708, periodic monitoring.
It can be understood that, after a period of time, in the case of displaying the dynamic image based on the smear compensation parameter, the terminal device executes S701-S704 again to obtain the JNCD value after smear compensation. When the JNCD value after smear compensation is smaller than the preset threshold, the display continues to be based on the smear compensation parameter. When the JNCD value after smear compensation is greater than or equal to the preset threshold, the terminal device continues to adjust the smear compensation parameter until the JNCD value obtained based on the adjusted smear compensation parameter is smaller than the preset threshold.
Therefore, smear compensation parameters can be timely adjusted, color cast fluctuation caused by luminous efficiency attenuation of the display screen luminous material is reduced, and user experience is improved.
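Putting the steps of fig. 7 together, the overall flow could be sketched roughly as follows; the device hooks, thresholds, and monitoring period are placeholders, and the loop is a simplified reading of S701-S708 rather than a definitive implementation.

```python
# Simplified, hypothetical sketch of the S701-S708 flow in fig. 7. The `device`
# object and its methods are placeholders, not a real API of the terminal device.
import math
import time

PRESET_JNCD_THRESHOLD = 1.0        # placeholder
FIRST_BRIGHTNESS_THRESHOLD = 50.0  # placeholder brightness gate, in nits
MONITOR_PERIOD_S = 30 * 24 * 3600  # placeholder period, roughly one month

def jncd(c1, c2):
    """JNCD from two (u', v')-style color coordinates."""
    return math.hypot(c1[0] - c2[0], c1[1] - c2[1]) / 0.004

def calibration_pass(device):
    if device.display_brightness() > FIRST_BRIGHTNESS_THRESHOLD:
        return                                   # compensate only at low brightness (S701 gate)
    c1, c2 = device.capture_color_coordinates()  # S701-S703: capture images, fit color coordinates
    if jncd(c1, c2) < PRESET_JNCD_THRESHOLD:     # S704: color shift small enough, nothing to adjust
        return
    param = device.parameter_from_gray_difference(c1, c2)   # S705
    device.apply_compensation(param)                         # S706
    # S707: effect acceptance - re-measure and refine until JNCD drops below the threshold.
    c1, c2 = device.capture_color_coordinates()
    while jncd(c1, c2) >= PRESET_JNCD_THRESHOLD:
        param = device.refine(param)
        device.apply_compensation(param)
        c1, c2 = device.capture_color_coordinates()

def periodic_monitoring(device):
    while True:                                  # S708: periodic monitoring
        calibration_pass(device)
        time.sleep(MONITOR_PERIOD_S)
```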
It can be understood that the method provided by the embodiment of the application for converting the brightness or applied voltage of a pixel into color coordinates can also be applied to solving other problems, such as residual images (afterimages). It can also provide technical support for other display testing methods.
It should be noted that "at … …" in the embodiment of the present application may be an instant when a certain situation occurs, or may be a period of time after a certain situation occurs, which is not particularly limited in the embodiment of the present application. In addition, the display interface provided by the embodiment of the application is only used as an example, and the display interface can also comprise more or less contents.
The display method according to the embodiment of the present application has been described above, and the device for executing the display method according to the embodiment of the present application is described below. It will be appreciated by those skilled in the art that the methods and apparatus may be combined and referred to, and that the related apparatus provided in the embodiments of the present application may perform the steps of the above-described display method.
As shown in fig. 8, fig. 8 is a schematic structural diagram of a display device according to an embodiment of the present application, where the display device may be a terminal device in an embodiment of the present application, or may be a chip or a chip system in the terminal device.
As shown in fig. 8, the display apparatus 2100 may be used in a communication device, a circuit, a hardware component, or a chip, and includes: a display unit 2101 and a processing unit 2102. The display unit 2101 is configured to support the display device 2100 in performing the displaying steps; the processing unit 2102 is configured to support the display device 2100 in performing the information processing steps.
In a possible implementation, the display device 2100 may also include a communication unit 2103. Specifically, the communication unit is for supporting the display device 2100 to perform the steps of transmitting data and receiving data. The communication unit 2103 may be an input or output interface, a pin or circuit, or the like.
In a possible embodiment, the display device may further include: a storage unit 2104. The processing unit 2102 and the storage unit 2104 are connected by a line. The memory unit 2104 may include one or more memories, which may be one or more devices, circuits, or means for storing programs or data. The storage unit 2104 may exist independently and be connected to the processing unit 2102 provided in the display device through a communication line. The memory unit 2104 may also be integrated with the processing unit 2102.
The storage unit 2104 may store computer-executed instructions of the method in the terminal apparatus to cause the processing unit 2102 to execute the method in the above-described embodiment. The storage unit 2104 may be a register, a cache, a RAM, or the like, and the storage unit 2104 may be integrated with the processing unit 2102. The storage unit 2104 may be a read-only memory (ROM) or other type of static storage device that may store static information and instructions, and the storage unit 2104 may be independent of the processing unit 2102.
The display method provided by the embodiment of the application can be applied to the electronic equipment with the display function. The electronic device includes a terminal device, and specific device forms and the like of the terminal device may refer to the above related descriptions, which are not repeated herein.
The embodiment of the application provides an electronic device, including: a processor and a memory; the memory stores computer-executable instructions; the processor executes the computer-executable instructions stored in the memory to cause the electronic device to perform the method described above.
The embodiment of the application provides a chip. The chip comprises a processor for invoking a computer program in a memory to perform the technical solutions in the above embodiments. The principle and technical effects of the present application are similar to those of the above-described related embodiments, and will not be described in detail herein.
The embodiment of the application also provides a computer readable storage medium. The computer-readable storage medium stores a computer program. The computer program realizes the above method when being executed by a processor. The methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer readable media can include computer storage media and communication media and can include any medium that can transfer a computer program from one place to another. The storage media may be any target media that is accessible by a computer.
In one possible implementation, the computer readable medium may include RAM, ROM, a compact disc read-only memory (CD-ROM) or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store the desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
Embodiments of the present application provide a computer program product comprising a computer program which, when executed, causes a computer to perform the above-described method.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should be noted that, the user information (including but not limited to user equipment information, user personal information, etc.) and the data (including but not limited to data for analysis, stored data, presented data, etc.) related to the present application are information and data authorized by the user or fully authorized by each party, and the collection, use and processing of the related data need to comply with related laws and regulations and standards, and provide corresponding operation entries for the user to select authorization or rejection.
The foregoing detailed description has been presented for purposes of illustration and description only; it should be understood that it is not intended to limit the scope of the invention.

Claims (15)

1. A display method, comprising:
at a first moment, the terminal equipment displays a test object on a display screen;
The terminal equipment determines smear compensation parameters based on a first type image and a second type image, wherein the first type image is an image displayed by the terminal equipment before the position of the test object moves, the second type image is an image displayed by the terminal equipment in the position moving process of the test object, and the smear compensation parameters are used for adjusting display parameters of the display screen corresponding to the terminal equipment when the position of the test object moves.
2. The method of claim 1, wherein the terminal device displays the test object on a display screen, comprising:
the terminal equipment displays a first interface on a display screen, wherein the first interface comprises a first control;
The terminal equipment receives a first operation aiming at the first control;
And responding to the first operation, and displaying the test object on the display screen by the terminal equipment.
3. The method according to claim 1 or 2, wherein the smear compensation parameter comprises a first smear compensation parameter, and the first smear compensation parameter is used for adjusting a display parameter of the display screen corresponding to the terminal device when the terminal device displays at a first brightness and the test object moves at a first moving speed;
The terminal device determines smear compensation parameters based on the first type image and the second type image, and comprises:
the terminal equipment displays a first image, wherein the first image is the first type image displayed by the terminal equipment at a first brightness;
the terminal device displays a second image, wherein the second image is the second type image displayed by the terminal device at the first brightness while the test object moves at the first moving speed;
The terminal equipment processes the pixel parameters corresponding to the first image and the pixel parameters corresponding to the second image to obtain a first color coordinate corresponding to the first image and a second color coordinate corresponding to the second image;
The terminal equipment determines the first smear compensation parameter based on a first gray level difference and a corresponding relation, wherein the first gray level difference is related to the first color coordinate and the second color coordinate, and the corresponding relation is the relation between the gray level difference and the smear compensation parameter.
4. A method according to claim 3, wherein the terminal device processes the pixel parameter corresponding to the first image and the pixel parameter corresponding to the second image to obtain a first color coordinate corresponding to the first image and a second color coordinate corresponding to the second image, and the method comprises:
The terminal equipment obtains monochromatic gray scales corresponding to the first image according to the brightness of the pixels corresponding to the first image or the driving voltage applied to the pixels corresponding to the first image;
The terminal equipment obtains the first color coordinate according to a conversion relation of the monochromatic gray scale corresponding to the first image, wherein the conversion relation is the conversion relation of the gray scale and the color coordinate;
the terminal equipment obtains monochromatic gray scales corresponding to the second image according to the brightness of the pixels corresponding to the second image or the driving voltage applied to the pixels of the second image;
and the terminal device obtains the second color coordinate according to the conversion relation and the monochromatic gray scale corresponding to the second image.
5. The method according to claim 3 or 4, wherein,
The pixel parameters corresponding to the first image are the pixel parameters corresponding to the first region of the test object in the first image; and the pixel parameters corresponding to the second image are the pixel parameters corresponding to the first region of the test object in the second image.
6. The method according to any one of claims 3-5, further comprising:
the terminal device obtains an obvious color difference value based on the first color coordinate and the second color coordinate;
the terminal device determines the first smear compensation parameter based on a first gray level difference and a corresponding relation, including:
And when the obvious color difference value is larger than or equal to a preset threshold value, the terminal equipment determines the first smear compensation parameter based on the first gray difference and the corresponding relation.
7. The method of claim 6, wherein the terminal device obtains an obvious color difference value based on the first color coordinate and the second color coordinate, comprising:
The terminal equipment obtains a first distance based on the first color coordinate and the second color coordinate, wherein the first distance is the distance between the first color coordinate and the second color coordinate;
The terminal equipment obtains the obvious color difference value based on the first distance, wherein the obvious color difference value is the ratio of the first distance to the first threshold value.
8. The method according to any one of claims 3-7, wherein the smear compensation parameter comprises a second smear compensation parameter, and the second smear compensation parameter is used for adjusting a display parameter of the display screen corresponding to the terminal device when the terminal device displays at a second brightness and the test object moves at a second moving speed;
The terminal device determines smear compensation parameters based on the first type image and the second type image, and comprises:
the terminal equipment displays a third image, wherein the third image is the first type image displayed by the terminal equipment at the second brightness;
the terminal device displays a fourth image, wherein the fourth image is the second type image displayed by the terminal device at the second brightness while the test object moves at the second moving speed;
The terminal equipment processes the pixel parameters corresponding to the third image and the pixel parameters corresponding to the fourth image to obtain third color coordinates corresponding to the third image and fourth color coordinates corresponding to the fourth image;
the terminal equipment determines a second smear compensation parameter based on a second gray level difference and a corresponding relation, wherein the second gray level difference is related to the third color coordinate and the fourth color coordinate, and the corresponding relation is the relation between the gray level difference and the smear compensation parameter.
9. The method according to any one of claims 1-8, further comprising:
The terminal equipment acquires the brightness of a display screen;
and when the brightness of the display screen is smaller than or equal to a first threshold value, the terminal equipment displays an interface based on the smear compensation parameter.
10. The method of claim 9, wherein the terminal device obtains the brightness of the display screen, comprising:
and the terminal equipment acquires backlight and ambient light parameters of the display screen.
11. The method according to any one of claims 1-10, further comprising:
the terminal device adjusts the smear compensation parameter based on a third type image and a fourth type image, wherein the adjusted smear compensation parameter is used for adjusting the display parameter of the display screen corresponding to the terminal device when the test object moves after a second moment;
The third type of images are images displayed by the terminal equipment before the position of the test object moves, the fourth type of images are images displayed by the terminal equipment in the process of moving the position of the test object, and the difference value between the second moment and the first moment is larger than or equal to a preset duration.
12. The method of claim 11, wherein the adjusted smear compensation parameter comprises a third smear compensation parameter, and the third smear compensation parameter is used for adjusting a display parameter of the display screen corresponding to the terminal device when the terminal device displays at a first brightness and the test object moves at a first moving speed;
the terminal device adjusts the smear compensation parameter based on the third type image and the fourth type image, including:
the terminal equipment displays a fifth image, wherein the fifth image is the third type image displayed by the terminal equipment at the first brightness;
the terminal device displays a sixth image, wherein the sixth image is the fourth type image displayed by the terminal device at the first brightness while the test object moves at the first moving speed;
The terminal equipment processes the pixel parameters corresponding to the fifth image and the pixel parameters corresponding to the sixth image to obtain fifth color coordinates corresponding to the fifth image and sixth color coordinates corresponding to the sixth image;
And the terminal equipment determines a third smear compensation parameter based on a third gray scale difference and a corresponding relation, wherein the third gray scale difference is related to the fifth color coordinate and the sixth color coordinate.
13. An electronic device, comprising: a processor and a memory;
the memory stores computer-executable instructions;
The processor executing computer-executable instructions stored in the memory to cause the electronic device to perform the method of any one of claims 1-12.
14. A computer readable storage medium storing a computer program, which when executed by a processor performs the method according to any one of claims 1-12.
15. A computer program product comprising a computer program which, when run, causes a computer to perform the method of any of claims 1-12.
CN202211730500.1A 2022-12-30 Display method and related device Pending CN118277169A (en)

Publications (1)

Publication Number: CN118277169A (en)
Publication Date: 2024-07-02


Legal Events

Date Code Title Description
PB01 Publication