CN109389547B - Image display method and device - Google Patents

Image display method and device

Info

Publication number
CN109389547B
Authority
CN
China
Prior art keywords
display
display effect
effect processing
roi
image
Prior art date
Legal status
Active
Application number
CN201811160065.7A
Other languages
Chinese (zh)
Other versions
CN109389547A (en)
Inventor
纪东磊
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201811160065.7A priority Critical patent/CN109389547B/en
Publication of CN109389547A publication Critical patent/CN109389547A/en
Application granted granted Critical
Publication of CN109389547B publication Critical patent/CN109389547B/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00: General purpose image data processing
    • G06T 1/20: Processor architectures; Processor configuration, e.g. pipelining
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1423: Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F 3/1431: Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display using a single graphics controller

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The disclosure relates to an image display method and device. The method includes: determining at least one region of interest (ROI) for display effect processing in a first display image; performing display effect processing on each ROI to obtain a second display image; and displaying the second display image. In this way, display effect processing is applied only to the ROIs, which improves the accuracy of image display.

Description

Image display method and device
Technical Field
The present disclosure relates to the field of image processing, and in particular, to an image display method and apparatus.
Background
With the widespread use of intelligent terminals, the requirements on the display effect of terminal screens have become increasingly demanding.
In the prior art, in order to make the image content, the details of dark parts in the image, and the like visible in an outdoor environment with high illumination intensity, display effect processing can be performed on the whole picture to be displayed.
However, performing display effect processing on the whole picture to be displayed means that areas not requiring processing are also processed, which reduces the accuracy of image display.
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides an image display method and apparatus.
According to a first aspect of embodiments of the present disclosure, there is provided an image display method, the method including:
determining at least one region of interest, ROI, in the first display image for display effect processing;
performing display effect processing on each ROI to obtain a second display image;
and displaying the second display image.
Optionally, the determining at least one ROI for display effect processing in the first display image includes:
upon detecting that an initiator of drawing is ready to display the first display image, determining each of the ROIs included in the first display image; or
Upon detecting that a graphics processor GPU processes the first display image, each of the ROIs included in the first display image is determined.
Optionally, the determining at least one ROI for display effect processing in the first display image further includes:
and determining position information of each ROI in the first display image.
Optionally, the position information of the ROI in the first display image includes coordinates of a first point and coordinates of a second point, wherein the first point is an upper left corner of the ROI, and the second point is a lower right corner of the ROI.
Optionally, the performing display effect processing on each ROI includes:
determining a display effect processing mode corresponding to each ROI;
and performing display effect processing on each ROI by using the corresponding display effect processing mode.
Optionally, the display effect processing mode includes a software processing mode and/or a hardware processing mode;
the determining the display effect processing mode corresponding to each ROI comprises the following steps:
determining display effect processing requirements corresponding to all the ROIs;
and determining a display effect processing mode corresponding to each ROI by utilizing the corresponding display effect processing requirement.
Optionally, the display effect processing requirements corresponding to the ROIs are different, and the display effect processing modes corresponding to the different display effect processing requirements are different.
According to a second aspect of embodiments of the present disclosure, there is provided an image display apparatus, the apparatus including:
a determination module configured to determine at least one region of interest, ROI, in the first display image for display effect processing;
the processing module is configured to process the display effect of each ROI to obtain a second display image;
and a display module configured to display the second display image.
Optionally, the determining module includes:
a first determination sub-module configured to determine each of the ROIs included in the first display image upon detecting that an initiator of drawing is ready to display the first display image; or
A second determination sub-module configured to determine each of the ROIs included in the first display image upon detecting that a graphics processor GPU is processing the first display image.
Optionally, the determining module further includes:
a third determination submodule configured to determine positional information of each ROI in the first display image.
Optionally, the position information of the ROI in the first display image includes coordinates of a first point and coordinates of a second point, wherein the first point is an upper left corner of the ROI, and the second point is a lower right corner of the ROI.
Optionally, the processing module includes:
a fourth determining submodule configured to determine a display effect processing mode corresponding to each ROI;
and the processing submodule is configured to perform display effect processing on each ROI by utilizing the corresponding display effect processing mode.
Optionally, the display effect processing mode includes a software processing mode and/or a hardware processing mode; the fourth determination submodule includes:
a fifth determination submodule configured to determine display effect processing requirements corresponding to each ROI;
a sixth determination submodule configured to determine a display effect processing mode corresponding to each ROI by using the corresponding display effect processing requirement.
Optionally, the display effect processing requirements corresponding to the ROIs are different, and the display effect processing modes corresponding to the different display effect processing requirements are different.
According to a third aspect of embodiments of the present disclosure, there is provided an image display apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
determining at least one region of interest, ROI, in the first display image for display effect processing;
performing display effect processing on each ROI to obtain a second display image;
and displaying the second display image.
The technical scheme provided by the embodiment of the disclosure can comprise the following beneficial effects:
In the method, the user terminal can determine at least one ROI (region of interest) for display effect processing in the first display image, perform display effect processing on each ROI to obtain the second display image, and display the second display image, thereby realizing the function of performing display effect processing only on the ROIs and improving the accuracy of image display.
In the method, the user terminal can also determine the display effect processing mode corresponding to each ROI and perform display effect processing on each ROI using that mode, which enriches the flexibility of display effect processing and improves the reliability of display effect processing.
In the method, the user terminal can also determine the display effect processing requirement corresponding to each ROI and use that requirement to determine the display effect processing mode corresponding to each ROI, thereby meeting different display effect processing requirements and improving the accuracy of display effect processing.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flowchart of an image display method according to an exemplary embodiment of the present disclosure;
FIG. 2 is an application scenario diagram of an image display method according to an exemplary embodiment of the present disclosure;
FIG. 3 is a flowchart of another image display method according to an exemplary embodiment of the present disclosure;
FIG. 4 is a flowchart of another image display method according to an exemplary embodiment of the present disclosure;
FIG. 5 is a block diagram of an image display device according to an exemplary embodiment of the present disclosure;
FIG. 6 is a block diagram of another image display device according to an exemplary embodiment of the present disclosure;
FIG. 7 is a block diagram of another image display device according to an exemplary embodiment of the present disclosure;
FIG. 8 is a block diagram of another image display device according to an exemplary embodiment of the present disclosure;
fig. 9 is a block diagram of another image display apparatus according to an exemplary embodiment of the present disclosure;
fig. 10 is a schematic diagram of a structure for an image display device according to an exemplary embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in this disclosure to describe various information, the information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information and, similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure. Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
As shown in fig. 1, fig. 1 is a flowchart of an image display method according to an exemplary embodiment of the present disclosure, and fig. 2 is a scene diagram of an image display method according to an exemplary embodiment; the method can be used in a user terminal (e.g., a smart phone), and as shown in fig. 1, the method can include the following steps:
in step 110, at least one ROI (Region of Interest ) in the first display image for display effect processing is determined.
In the embodiment of the disclosure, the first display image may be a frame image to be displayed.
To meet display requirements in complex scenarios, for example in an outdoor environment with high illumination intensity (lux) where the image content and the dark details in the image still need to be visible, at least one ROI for display effect processing in the first display image may be determined first, and display effect processing may then be performed on these ROIs.
In one embodiment, when performing step 110, the following two implementations may be included, but are not limited to:
(1-1) upon detecting that the initiator of drawing (e.g., an APP) is ready to display the first display image, determining the respective ROIs for display effect processing included in the first display image; or
(1-2) upon detecting that the GPU (Graphics Processing Unit) processes the first display image, determining the respective ROIs for display effect processing included in the first display image.
In an embodiment, when performing the step 110, the method may further include:
(2-1) determining positional information of each of the ROIs in the first display image.
In an embodiment, the position information of the ROI in the first display image in (2-1) may include coordinates of a first point and coordinates of a second point, wherein the first point is an upper left corner of the ROI, and the second point is a lower right corner of the ROI.
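As an illustration only, such two-corner position information can be carried in a small rectangle type. The sketch below is a minimal, hypothetical Java representation; the class and member names are not taken from the patent.

```java
// Minimal, hypothetical sketch of the two-corner position information described above;
// the class and member names are illustrative, not taken from the patent.
public final class RoiRect {
    public final int left;   // horizontal pixel coordinate of the upper-left corner (first point)
    public final int top;    // vertical pixel coordinate of the upper-left corner (first point)
    public final int right;  // horizontal pixel coordinate of the lower-right corner (second point)
    public final int bottom; // vertical pixel coordinate of the lower-right corner (second point)

    public RoiRect(int left, int top, int right, int bottom) {
        this.left = left;
        this.top = top;
        this.right = right;
        this.bottom = bottom;
    }

    public int width()  { return right - left; }
    public int height() { return bottom - top; }

    public boolean contains(int x, int y) {
        return x >= left && x < right && y >= top && y < bottom;
    }
}
```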
In step 120, display effect processing is performed on each ROI, and a second display image is obtained.
In the embodiment of the disclosure, different ROIs may use the same or different display effect processing modes to perform display effect processing.
In an embodiment, when executing step 120, a display effect processing manner corresponding to each ROI may be determined first; and then performing display effect processing on each ROI by using the corresponding display effect processing mode (see the embodiment shown in FIG. 3).
In an embodiment, when determining the display effect processing mode corresponding to each ROI, the display effect processing requirement corresponding to each ROI may be determined first, and the display effect processing mode corresponding to each ROI may then be determined by using the corresponding display effect processing requirement (see the embodiment shown in FIG. 4).
In an embodiment, the display effect processing requirements corresponding to the ROIs are different, and the display effect processing modes corresponding to the different display effect processing requirements are different.
In step 130, a second display image is displayed.
In an exemplary scenario, as shown in fig. 2, take browsing a picture as an example: the resolution of the mobile phone screen is 1920×1080, but the picture has a 4:3 aspect ratio, so when displayed the picture does not cover the whole screen; instead, it is positioned in the middle of the mobile phone screen, with black filling above and below. Therefore, when drawing, a buffer for storing one frame of image can be applied for first, the position information of the ROI within that frame is calculated, and the ROI content is written into the designated area of the buffer. After the buffer has been written, the attribute (meta) information representing the position information of the ROI is sent to the display control logic module (SurfaceFlinger); SurfaceFlinger determines the ROI according to the meta information, performs display effect processing on the ROI to obtain a second display image, and displays the second display image.
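The flow above can be summarized in a simplified Java sketch. FrameBuffer, RoiMeta and Compositor are hypothetical stand-ins for the frame buffer, the attribute (meta) information and the display control logic module; they are not real Android or SurfaceFlinger APIs, and the RoiRect type is reused from the earlier sketch.

```java
// Simplified sketch of the drawing flow described above. All types here are
// hypothetical stand-ins, not real Android/SurfaceFlinger APIs.
final class FrameBuffer {
    final int width, height;
    final int[] pixels;                        // one full frame, stored row-major

    FrameBuffer(int width, int height) {       // "apply for" a buffer holding one frame
        this.width = width;
        this.height = height;
        this.pixels = new int[width * height];
    }

    void writeRegion(RoiRect roi, int[] src) { // write picture content into the ROI area
        for (int y = roi.top, i = 0; y < roi.bottom; y++)
            for (int x = roi.left; x < roi.right; x++, i++)
                pixels[y * width + x] = src[i];
    }
}

final class RoiMeta {                          // meta information carrying the ROI position
    final RoiRect roi;
    RoiMeta(RoiRect roi) { this.roi = roi; }
}

interface Compositor {
    // Plays the role of the display control logic: reads the meta information,
    // applies display effect processing only inside the ROI, then displays the frame.
    void queueBuffer(FrameBuffer buffer, RoiMeta meta);
}

final class RoiDrawingFlow {
    static void drawFrame(int screenW, int screenH, int picW, int picH,
                          int[] picturePixels, Compositor compositor) {
        FrameBuffer buffer = new FrameBuffer(screenW, screenH);        // 1. apply for a buffer
        int left = (screenW - picW) / 2, top = (screenH - picH) / 2;   // 2. ROI position (centered)
        RoiRect roi = new RoiRect(left, top, left + picW, top + picH);
        buffer.writeRegion(roi, picturePixels);                        // 3. write ROI content
        compositor.queueBuffer(buffer, new RoiMeta(roi));              // 4. send buffer + meta info
    }
}
```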
As shown in fig. 2, the position information of the ROI may include the coordinates of the upper left corner of the ROI, which are (0, 55), and the coordinates of the lower right corner of the ROI, which are (1080, 1364), where 0 and 55 are the horizontal and vertical pixel coordinates of the upper left corner, and 1080 and 1364 are the horizontal and vertical pixel coordinates of the lower right corner.
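Plugging these coordinates into the hypothetical RoiRect sketch above gives the pixel dimensions of the ROI:

```java
// Using the fig. 2 coordinates with the RoiRect sketch above:
// upper-left corner (0, 55), lower-right corner (1080, 1364).
public final class Fig2Example {
    public static void main(String[] args) {
        RoiRect roi = new RoiRect(0, 55, 1080, 1364);
        System.out.println(roi.width());  // 1080 pixels wide
        System.out.println(roi.height()); // 1309 pixels tall
    }
}
```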
As can be seen from the above embodiments, by determining at least one ROI for display effect processing in the first display image, performing display effect processing on each ROI to obtain a second display image, and displaying the second display image, the function of performing display effect processing on only the ROI is realized, and the accuracy of image display is also improved.
As shown in fig. 3, fig. 3 is a flowchart of another image display method according to an exemplary embodiment of the present disclosure, which may be used in a user terminal and is based on the method shown in fig. 1, and when step 120 is performed, the method may include the steps of:
In step 310, the display effect processing mode corresponding to each ROI is determined.
In the embodiment of the disclosure, the display effect processing manners corresponding to different ROIs may be the same or different.
The display effect processing mode may be a software processing mode, a hardware processing mode, or a software processing mode and a hardware processing mode.
For example, the required processing can be performed in SurfaceFlinger through a software algorithm, or the buffer can be sent to the display driver and correspondingly processed by the underlying hardware, after which the result is output to the display.
As shown in FIG. 2, there is only one ROI, which occupies only part of the mobile phone screen, with black above and below. If the whole picture were processed, the large amount of black content would be taken into account, and the processing of the real image would be skewed when the dynamic range is stretched; that is, data from the non-image portion would be referenced when processing the real image, causing errors. Therefore, in the present disclosure only the real image data (i.e., the ROI) is subjected to display effect processing, so that the surrounding black non-image area is not considered and the display effect processing is more accurate. That is, after display effect processing is applied to the ROI, black can be made darker and white whiter, achieving the display goal of making the picture content and the dark details in the picture visible.
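As a minimal illustration of why the statistics must be restricted to the ROI, the sketch below applies a simple linear dynamic-range stretch to single-channel gray values; the patent does not prescribe a specific display effect algorithm, so both the algorithm and the names here are assumptions, and the RoiRect type is reused from the earlier sketch.

```java
// Minimal sketch, restricted to the ROI: a simple linear dynamic-range stretch on
// single-channel gray values (0..255). This particular algorithm is an illustrative
// assumption; the patent does not prescribe the display effect algorithm.
final class RoiEffects {
    static void stretchDynamicRange(int[] frame, int frameWidth, RoiRect roi) {
        // 1. Gather min/max from ROI pixels only, so the black non-image bars
        //    outside the ROI cannot drag the minimum down and skew the stretch.
        int min = 255, max = 0;
        for (int y = roi.top; y < roi.bottom; y++) {
            for (int x = roi.left; x < roi.right; x++) {
                int v = frame[y * frameWidth + x];
                if (v < min) min = v;
                if (v > max) max = v;
            }
        }
        if (max <= min) return; // flat region, nothing to stretch

        // 2. Remap only the ROI pixels; the surrounding non-image area is left untouched.
        for (int y = roi.top; y < roi.bottom; y++) {
            for (int x = roi.left; x < roi.right; x++) {
                int v = frame[y * frameWidth + x];
                frame[y * frameWidth + x] = (v - min) * 255 / (max - min);
            }
        }
    }
}
```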
In step 320, display effect processing is performed on each ROI by using the display effect processing mode corresponding to that ROI.
The above embodiment shows that, by determining the display effect processing mode corresponding to each ROI and performing display effect processing on each ROI using the corresponding mode, the flexibility of display effect processing is enriched and the reliability of display effect processing is improved.
As shown in fig. 4, fig. 4 is a flowchart of another image display method according to an exemplary embodiment of the present disclosure, which may be used in a user terminal, where the display effect processing manner includes a software processing manner and/or a hardware processing manner; and based on the method of fig. 3, when performing step 310, the following steps may be included:
in step 410, display effect processing requirements corresponding to each ROI are determined.
In the embodiment of the disclosure, the display effect processing requirements corresponding to different ROIs may be the same or different.
In an embodiment, the display effect processing requirements corresponding to the ROIs are different, and the display effect processing modes corresponding to the different display effect processing requirements are different. That is, after the display effect processing is performed according to different display effect processing modes, black can be enabled to be darker and white can be enabled to be whiter, and the display purposes of needing to see the picture content, dark part details in the picture and the like are achieved.
In step 420, the display effect processing mode corresponding to each ROI is determined by using the display effect processing requirement corresponding to that ROI.
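A hypothetical sketch of this mapping is shown below; the concrete requirement names and the mapping itself are illustrative assumptions, since the patent only states that different requirements correspond to different modes (software, hardware, or both).

```java
// Hypothetical sketch of selecting a processing mode from a per-ROI requirement.
// The requirement names and the mapping are illustrative assumptions.
final class ModeSelection {
    enum Requirement { DYNAMIC_RANGE_STRETCH, COLOR_ENHANCEMENT, SHARPENING }
    enum Mode { SOFTWARE, HARDWARE, SOFTWARE_AND_HARDWARE }

    static Mode modeFor(Requirement requirement) {
        switch (requirement) {
            case DYNAMIC_RANGE_STRETCH: return Mode.HARDWARE;               // e.g. display driver path
            case COLOR_ENHANCEMENT:     return Mode.SOFTWARE;               // e.g. algorithm in the compositor
            case SHARPENING:            return Mode.SOFTWARE_AND_HARDWARE;
            default: throw new IllegalArgumentException("unknown requirement");
        }
    }
}
```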
According to the embodiment, the display effect processing requirements corresponding to all the ROIs are determined, and the display effect processing modes corresponding to all the ROIs are determined by utilizing the display effect processing requirements corresponding to all the ROIs, so that different display effect processing requirements are met, and the accuracy of display effect processing is improved.
Corresponding to the foregoing image display method embodiments, the present disclosure also provides embodiments of an image display apparatus.
As shown in fig. 5, fig. 5 is a block diagram of an image display apparatus according to an exemplary embodiment of the present disclosure, which may be applied to a user terminal (e.g., a smart phone) and used to perform the image display method shown in fig. 1, the apparatus may include:
a determining module 51 configured to determine at least one region of interest, ROI, in the first display image for display effect processing;
a processing module 52, configured to perform display effect processing on each ROI to obtain a second display image;
a display module 53 configured to display the second display image.
As can be seen from the above embodiments, by determining at least one ROI for display effect processing in the first display image, performing display effect processing on each ROI to obtain a second display image, and displaying the second display image, the function of performing display effect processing on only the ROI is realized, and the accuracy of image display is also improved.
As shown in fig. 6, fig. 6 is a block diagram of another image display apparatus according to an exemplary embodiment of the present disclosure, which is based on the embodiment shown in fig. 5 described above, the determining module 51 may include:
a first determination sub-module 61 configured to determine each of the ROIs included in the first display image upon detecting that an initiator of drawing is ready to display the first display image; or
A second determination submodule 62 configured to determine each of the ROIs included in the first display image upon detecting that a graphics processor GPU is processing the first display image.
As shown in fig. 7, fig. 7 is a block diagram of another image display apparatus according to an exemplary embodiment of the present disclosure, which may further include, on the basis of the foregoing embodiment shown in fig. 5 or 6:
a third determination submodule 71 configured to determine positional information of each of the ROIs in the first display image.
In an embodiment, the position information of the ROI in the first display image includes coordinates of a first point, which is an upper left corner of the ROI, and coordinates of a second point, which is a lower right corner of the ROI.
As shown in fig. 8, fig. 8 is a block diagram of another image display apparatus according to an exemplary embodiment of the present disclosure, which is based on the embodiment shown in fig. 5 described above, the processing module 52 may include:
a fourth determining submodule 81 configured to determine a display effect processing mode corresponding to each ROI;
and a processing sub-module 82 configured to perform display effect processing on each ROI by using the corresponding display effect processing manner.
The above embodiment shows that the display effect processing method corresponding to each ROI is determined, and the display effect processing is performed on each ROI by using the display effect processing method corresponding to each ROI, so that the flexibility of the display effect processing is enriched, and the reliability of the display effect processing is improved.
As shown in fig. 9, fig. 9 is a block diagram of another image display apparatus according to an exemplary embodiment of the present disclosure, which is based on the foregoing embodiment shown in fig. 8, in which the display effect processing manner includes a software processing manner and/or a hardware processing manner; the fourth determination submodule 81 may include:
a fifth determination submodule 91 configured to determine display effect processing requirements corresponding to the respective ROIs;
a sixth determination submodule 92 is configured to determine a display effect processing mode corresponding to each ROI by using the corresponding display effect processing requirements.
In an embodiment, the display effect processing requirements corresponding to the ROIs are different, and the display effect processing modes corresponding to the different display effect processing requirements are different.
According to the embodiment, the display effect processing requirements corresponding to all the ROIs are determined, and the display effect processing modes corresponding to all the ROIs are determined by utilizing the display effect processing requirements corresponding to all the ROIs, so that different display effect processing requirements are met, and the accuracy of display effect processing is improved.
Corresponding to fig. 5, the present disclosure also provides another image display apparatus, which may be applied to a user terminal, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
determining at least one region of interest, ROI, in the first display image for display effect processing;
performing display effect processing on each ROI to obtain a second display image;
and displaying the second display image.
The implementation process of the functions and roles of each unit in the above device is specifically shown in the implementation process of the corresponding steps in the above method, and will not be described herein again.
For the device embodiments, since they essentially correspond to the method embodiments, reference may be made to the description of the method embodiments for the relevant points. The device embodiments described above are merely illustrative: the units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the objectives of the disclosed solution. Those of ordinary skill in the art can understand and implement the present invention without undue burden.
As shown in fig. 10, fig. 10 is a schematic diagram of a structure for an image display device 1000 according to an exemplary embodiment of the present disclosure. For example, the apparatus 1000 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, exercise equipment, a personal digital assistant, or the like.
Referring to fig. 10, the apparatus 1000 may include one or more of the following components: a processing component 1002, a memory 1004, a power component 1006, a multimedia component 1008, an audio component 1010, an input/output (I/O) interface 1012, a sensor component 1014, and a communication component 1016.
The processing component 1002 generally controls overall operation of the apparatus 1000, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 1002 can include one or more processors 1020 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 1002 can include one or more modules that facilitate interaction between the processing component 1002 and other components. For example, the processing component 1002 can include a multimedia module to facilitate interaction between the multimedia component 1008 and the processing component 1002.
The memory 1004 is configured to store various types of data to support operations at the apparatus 1000. Examples of such data include instructions for any application or method operating on the device 1000, contact data, phonebook data, messages, pictures, videos, and the like. The memory 1004 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power supply component 1006 provides power to the various components of the device 1000. The power components 1006 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 1000.
The multimedia component 1008 includes a screen between the device 1000 and the user that provides an output interface. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or slide action, but also the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia assembly 1008 includes a front-facing camera and/or a rear-facing camera. The front camera and/or the rear camera may receive external multimedia data when the apparatus 1000 is in an operation mode, such as a photographing mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capabilities.
The audio component 1010 is configured to output and/or input audio signals. For example, the audio component 1010 includes a Microphone (MIC) configured to receive external audio signals when the device 1000 is in an operational mode, such as a call mode, a recording mode, and a speech recognition mode. The received audio signals may be further stored in memory 1004 or transmitted via communication component 1016. In some embodiments, the audio component 1010 further comprises a speaker for outputting audio signals.
The I/O interface 1012 provides an interface between the processing assembly 1002 and peripheral interface modules, which may be a keyboard, click wheel, buttons, and the like. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 1014 includes one or more sensors for providing status assessments of various aspects of the device 1000. For example, the sensor assembly 1014 may detect the on/off state of the device 1000 and the relative positioning of components, such as the display and keypad of the device 1000; the sensor assembly 1014 may also detect a change in the position of the device 1000 or of a component of the device 1000, the presence or absence of user contact with the device 1000, the orientation or acceleration/deceleration of the device 1000, and a change in the temperature of the device 1000. The sensor assembly 1014 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 1014 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 1014 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, a microwave sensor, or a temperature sensor.
The communication component 1016 is configured to facilitate communication between the apparatus 1000 and other devices, either wired or wireless. The device 1000 may access a wireless network based on a communication standard, such as WiFi,2G or 3G, or a combination thereof. In one exemplary embodiment, the communication component 1016 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 1016 further includes a Near Field Communication (NFC) module to facilitate short range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, ultra Wideband (UWB) technology, bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 1000 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital Signal Processors (DSPs), digital Signal Processing Devices (DSPDs), programmable Logic Devices (PLDs), field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for executing the methods described above.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as memory 1004, including instructions executable by processor 1020 of apparatus 1000 to perform the above-described method. For example, the non-transitory computer readable storage medium may be ROM, random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following the general principles thereof and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (11)

1. An image display method, the method comprising:
determining at least two regions of interest (ROIs) for display effect processing in a first display image;
performing display effect processing on each ROI to obtain a second display image;
displaying the second display image;
wherein the performing display effect processing on each ROI comprises:
determining display effect processing requirements corresponding to all the ROIs;
determining a display effect processing mode corresponding to each ROI by utilizing the corresponding display effect processing requirement;
performing display effect processing on each ROI by using the corresponding display effect processing mode; the display effect processing requirements corresponding to the ROIs are different, and the display effect processing modes corresponding to the different display effect processing requirements are different.
2. The method of claim 1, wherein the determining at least two ROIs for display effect processing in the first display image comprises:
upon detecting that an initiator of drawing is ready to display the first display image, determining each of the ROIs included in the first display image; or
Upon detecting that a graphics processor GPU processes the first display image, each of the ROIs included in the first display image is determined.
3. The method according to claim 1 or 2, wherein said determining at least two ROIs for display effect processing in the first display image further comprises:
and determining position information of each ROI in the first display image.
4. A method according to claim 3, wherein the position information of the ROI in the first display image comprises coordinates of a first point which is an upper left corner of the ROI and coordinates of a second point which is a lower right corner of the ROI.
5. The method of claim 1, wherein the display effect processing mode comprises a software processing mode and/or a hardware processing mode.
6. An image display device, the device comprising:
a determination module configured to determine at least two regions of interest, ROIs, for display effect processing in a first display image;
the processing module is configured to process the display effect of each ROI to obtain a second display image;
a display module configured to display the second display image;
the processing module comprises:
a fifth determination submodule configured to determine display effect processing requirements corresponding to each ROI;
a sixth determining submodule configured to determine a display effect processing mode corresponding to each ROI by using the corresponding display effect processing requirement;
the processing submodule is configured to process the display effect of each ROI by utilizing the corresponding display effect processing mode; the display effect processing requirements corresponding to the ROIs are different, and the display effect processing modes corresponding to the different display effect processing requirements are different.
7. The apparatus of claim 6, wherein the determining module comprises:
a first determination sub-module configured to determine each of the ROIs included in the first display image upon detecting that an initiator of drawing is ready to display the first display image; or
A second determination sub-module configured to determine each of the ROIs included in the first display image upon detecting that a graphics processor GPU is processing the first display image.
8. The apparatus of claim 6 or 7, wherein the determining module further comprises:
a third determination submodule configured to determine positional information of each ROI in the first display image.
9. The apparatus of claim 8, wherein the location information of the ROI in the first display image comprises coordinates of a first point and coordinates of a second point, the first point being an upper left corner of the ROI and the second point being a lower right corner of the ROI.
10. The apparatus of claim 6, wherein the display effect processing mode comprises a software processing mode and/or a hardware processing mode.
11. An image display device, the device comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
determining at least two regions of interest (ROIs) for display effect processing in a first display image;
performing display effect processing on each ROI to obtain a second display image;
displaying the second display image;
wherein the performing display effect processing on each ROI comprises:
determining display effect processing requirements corresponding to all the ROIs;
determining a display effect processing mode corresponding to each ROI by utilizing the corresponding display effect processing requirement;
performing display effect processing on each ROI by using the corresponding display effect processing mode; the display effect processing requirements corresponding to the ROIs are different, and the display effect processing modes corresponding to the different display effect processing requirements are different.
CN201811160065.7A 2018-09-30 2018-09-30 Image display method and device Active CN109389547B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811160065.7A CN109389547B (en) 2018-09-30 2018-09-30 Image display method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811160065.7A CN109389547B (en) 2018-09-30 2018-09-30 Image display method and device

Publications (2)

Publication Number Publication Date
CN109389547A CN109389547A (en) 2019-02-26
CN109389547B 2023-05-09

Family

ID=65419254

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811160065.7A Active CN109389547B (en) 2018-09-30 2018-09-30 Image display method and device

Country Status (1)

Country Link
CN (1) CN109389547B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115408071A (en) * 2021-05-26 2022-11-29 Huawei Technologies Co Ltd Dynamic effect calculation method and device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016044983A1 (en) * 2014-09-22 2016-03-31 Huawei Technologies Co Ltd Image processing method and apparatus and electronic device
CN107122189A (en) * 2017-04-27 2017-09-01 Beijing Xiaomi Mobile Software Co Ltd Method for displaying image and device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150193658A1 (en) * 2014-01-09 2015-07-09 Quentin Simon Charles Miller Enhanced Photo And Video Taking Using Gaze Tracking
CN107111749A (en) * 2014-12-22 2017-08-29 Novasight Ltd System and method for improved display
CN107896303A (en) * 2017-10-23 2018-04-10 Nubia Technology Co Ltd A kind of image-pickup method, system and equipment and computer-readable recording medium

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016044983A1 (en) * 2014-09-22 2016-03-31 Huawei Technologies Co Ltd Image processing method and apparatus and electronic device
CN107122189A (en) * 2017-04-27 2017-09-01 Beijing Xiaomi Mobile Software Co Ltd Method for displaying image and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Chen Junyan; Zhang Hongmei; Li Pengfei; He Yan. Implementation method of a multi-touch screen system based on multi-camera vision technology. Computer Measurement & Control, 2015, (01), full text. *

Also Published As

Publication number Publication date
CN109389547A (en) 2019-02-26

Similar Documents

Publication Publication Date Title
US9674395B2 (en) Methods and apparatuses for generating photograph
US10026381B2 (en) Method and device for adjusting and displaying image
EP3182716A1 (en) Method and device for video display
CN104918107B (en) The identification processing method and device of video file
CN104238911B (en) Load icon display method and device
EP3147802B1 (en) Method and apparatus for processing information
CN104238890B (en) Character displaying method and device
CN111078170B (en) Display control method, display control device, and computer-readable storage medium
EP3905660A1 (en) Method and device for shooting image, and storage medium
CN105678296B (en) Method and device for determining character inclination angle
CN107219989B (en) Icon processing method and device and terminal
CN109408022A (en) Display methods, device, terminal and storage medium
CN112331158B (en) Terminal display adjusting method, device, equipment and storage medium
CN108550127A (en) image processing method, device, terminal and storage medium
CN109389547B (en) Image display method and device
CN112565625A (en) Video processing method, apparatus and medium
CN112445348A (en) Expression processing method, device and medium
CN106375744B (en) Information projecting method and device
CN111246012B (en) Application interface display method and device and storage medium
CN111724398A (en) Image display method and device
CN111538447A (en) Information display method, device, equipment and storage medium
CN110876015B (en) Method and device for determining image resolution, electronic equipment and storage medium
CN110876013B (en) Method and device for determining image resolution, electronic equipment and storage medium
CN110955328B (en) Control method and device of electronic equipment and storage medium
CN112506393B (en) Icon display method and device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant