CN113079334A - Computer system and image compensation method thereof - Google Patents

Computer system and image compensation method thereof

Info

Publication number
CN113079334A
CN113079334A (application number CN202010009947.4A)
Authority
CN
China
Prior art keywords
image
resolution
level
resolution information
adjustment parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010009947.4A
Other languages
Chinese (zh)
Inventor
林子杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Acer Inc
Original Assignee
Acer Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Acer Inc filed Critical Acer Inc
Priority to CN202010009947.4A
Publication of CN113079334A
Legal status: Pending

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/76 - Television signal recording
    • H04N5/765 - Interface circuits between an apparatus for recording and another apparatus
    • H04N5/91 - Television signal processing therefor

Abstract

The invention provides a computer system and an image compensation method thereof. In the method, a difference between first resolution information and second resolution information is determined; in response to the difference, a first image is analyzed to generate an image adjustment parameter; and the first image is adjusted according to the image adjustment parameter and the second resolution information to form a second image. The first resolution information is related to the first image captured by an image capturing device, and the second resolution information is related to the second image presented by a display. The image adjustment parameter is related to enhancing image definition, and the difference is that the resolution corresponding to the first resolution information is lower than the resolution corresponding to the second resolution information. The second image is to be presented by the display. In this way, the definition of the displayed image can be improved.

Description

Computer system and image compensation method thereof
Technical Field
The present invention relates to image processing technologies, and more particularly, to a computer system and an image compensation method thereof.
Background
Devices such as cell phones, notebook computers, desktop computers, and Video See-Through Head-Mounted Displays (VST-HMDs) are typically equipped (externally or internally) with both a camera and a display. In applications of these devices, images can be captured by the camera and displayed on the display quickly (or in real time). Taking a video see-through head-mounted display (VST-HMD) as an example, the VST-HMD mainly combines a panel display and an optical lens into an image display system; it can capture the external environment through a camera on the device and then display the captured images, so that the user views the outside world through the camera. Such head-mounted displays are widely applicable to technologies such as Virtual Reality (VR), Mixed Reality (MR), Augmented Reality (AR), and Extended Reality (XR).
However, in use, these devices with cameras and displays may encounter situations in which the resolution of the source image (captured by the camera) is lower than the resolution of the display. Under such conditions, the displayed image may suffer from blurring, blocking artifacts, or flicker in dynamic scenes, degrading the user experience.
Disclosure of Invention
In view of this, embodiments of the present invention provide a computer system and an image compensation method thereof, which improve the resolution and definition of an image captured by a camera, and make the image displayed on a display clearer.
The image compensation method of the embodiment of the invention includes, but is not limited to, the following steps: determining a difference between the first resolution information and the second resolution information; analyzing the first image to generate an image adjustment parameter in response to a difference between the first resolution information and the second resolution information; and adjusting the first image according to the image adjustment parameter and the second resolution information to form a second image. The first resolution information is related to a first image captured by the image capturing device, and the second resolution information is related to a second image presented by the display. The image adjustment parameter is related to enhancing the image definition, and the difference is that the resolution corresponding to the first resolution information is lower than the resolution corresponding to the second resolution information. The second image is used for being presented by the display.
The computer system of the embodiment of the invention includes, but is not limited to, an image capturing device, a display and a processor. The image capturing device is used for capturing a first image. The display is used for presenting a second image. The processor is coupled with the image capturing device and the display. The processor is configured to perform the following steps: determining a difference between the first resolution information and the second resolution information; analyzing the first image to generate an image adjustment parameter in response to a difference between the first resolution information and the second resolution information; and adjusting the first image according to the image adjustment parameter and the second resolution information to form a second image. The first resolution information is associated with the first image, and the second resolution information is associated with the second image. The image adjustment parameter is related to enhancing the image definition, and the difference is that the resolution corresponding to the first resolution information is lower than the resolution corresponding to the second resolution information. The processor displays the second image on the display.
Based on the above, in the computer system and the image compensation method thereof according to the embodiment of the invention, if the corresponding resolution of the image obtained by the image capturing device is lower than that of the display, the image is analyzed to increase the resolution of the image and enhance the definition. Therefore, more efficient resource allocation can be used, and the image displayed by the display is clearer.
In order to make the aforementioned and other features and advantages of the invention more comprehensible, embodiments accompanied with figures are described in detail below.
Drawings
FIG. 1 is a block diagram of components of a computer system according to an embodiment of the invention.
FIG. 2 is a flowchart illustrating an image compensation method according to an embodiment of the invention.
Wherein:
100: a computer system;
110: an image capturing device;
120: a display;
130: a processor;
S210-S250: steps of the image compensation method.
Detailed Description
FIG. 1 is a block diagram of a computer system 100 according to an embodiment of the invention. Referring to FIG. 1, the computer system 100 includes, but is not limited to, an image capturing device 110, a display 120, and a processor 130. The computer system 100 may be a smartphone, tablet, notebook computer, desktop computer, head-mounted display, or the like.
The image capturing device 110 may be a camera, a video camera or other devices with image capturing function. In one embodiment, the image capturing device 110 is used for capturing images.
The display 120 is coupled to the processor 130, and the display 120 may be a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, an Organic Light Emitting Diode (OLED) display, or other type of display. In one embodiment, the display 120 is used for displaying images.
The Processor 130 is coupled to the image capturing device 110 and the display 120, respectively, and the Processor 130 may be a Central Processing Unit (CPU), or other programmable general purpose or special purpose Microprocessor (Microprocessor), Digital Signal Processor (DSP), programmable controller, Application-Specific Integrated Circuit (ASIC), or other similar devices or combinations thereof. In one embodiment, the processor 130 is configured to obtain an image from the image capturing device 110 and further process the image.
It is noted that, in some embodiments, the processor 130 may be embedded in the image capturing device 110 or the display 120. That is, the processor 130 is a processor of the image capturing device 110 or the display 120.
To facilitate understanding of the operation flow of the embodiment of the present invention, the following describes the flow of image processing in the embodiment of the present invention in detail with reference to various embodiments. The method of the present invention will be described with reference to the components and modules of the computer system 100. The various processes of the method may be adapted according to the implementation, and are not limited thereto.
FIG. 2 is a flowchart illustrating an image compensation method according to an embodiment of the invention. Referring to FIG. 2, the processor 130 determines a difference between first resolution information and second resolution information (step S210). Specifically, the processor 130 receives a first image captured by the image capturing device 110 and obtains first resolution information of the first image or of the image capturing device 110. That is, the first resolution information is related to the first image captured by the image capturing device 110. The first resolution information may be the number of pixels along the height and/or width of the first image (i.e., the height and/or width of its resolution). For example, the processor 130 obtains metadata of the first image, and the metadata records the image resolution. Alternatively, the processor 130 obtains device identification information of the image capturing device 110 from the image capturing device 110 or from a memory to learn the resolution supported by the image capturing device 110. On the other hand, the processor 130 obtains second resolution information of the display 120. For example, the processor 130 may obtain the second resolution information from the device identification information of the display 120 or from the operating system's current display-resolution setting. The second resolution information may be the number of pixels along the height and/or width at which the display presents a second image. That is, the second resolution information is related to the second image presented on the display 120.
Then, the processor 130 may compare the width of the resolution of the first image with the width of the resolution of the display 120, and/or compare the height of the resolution of the first image with the height of the resolution of the display 120. In one embodiment, the processor 130 may determine the ratio of the two. For example, the width ratio of the resolutions, Ratio_RW, is:

Ratio_RW = Resolution_Width_Display / Resolution_Width_Source    (1)

where Resolution_Width_Display is the width of the resolution of the display 120 and Resolution_Width_Source is the width of the resolution of the first image. Likewise, the height ratio of the resolutions, Ratio_RH, is:

Ratio_RH = Resolution_Height_Display / Resolution_Height_Source    (2)

where Resolution_Height_Display is the height of the resolution of the display 120 and Resolution_Height_Source is the height of the resolution of the first image. In another embodiment, the processor 130 may directly compare the heights and/or widths of the two resolutions.
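For illustration, the ratio-based comparison of equations (1) and (2) reduces to two divisions. The following minimal sketch uses illustrative function and variable names that do not appear in the patent:

```python
# A minimal sketch of equations (1) and (2); names are illustrative only.
def resolution_ratios(source_w, source_h, display_w, display_h):
    """Return the width and height ratios of the display resolution to the source resolution."""
    ratio_rw = display_w / source_w   # equation (1)
    ratio_rh = display_h / source_h   # equation (2)
    return ratio_rw, ratio_rh

# Example: a 720x480 camera image shown on a 1920x1080 display.
print(resolution_ratios(720, 480, 1920, 1080))  # (2.666..., 2.25)
```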
Then, in response to the difference between the first resolution information and the second resolution information, the processor 130 analyzes the first image to generate an image adjustment parameter (step S230). The difference of the embodiment of the invention is that the resolution corresponding to the first resolution information is lower than the resolution corresponding to the second resolution information. It is noted that the lower resolution images may appear on the display 120 as blurs, blocking artifacts, or image flicker. Therefore, if the two pieces of information are different as a result of the determination in step S210, the first image needs to be further subjected to image sharpness compensation.
In one embodiment, the processor 130 may compare the ratio of equation (1) and/or equation (2) with the corresponding threshold value. If the ratio is greater than the corresponding threshold, the processor 130 determines that there is a difference between the two pieces of information; otherwise, the processor 130 determines that there is no difference or very little difference between the two pieces of information.
For example, assume that the aspect ratio of the first image is not changed. If the width of the first image's resolution is greater than its height, the processor 130 performs image sharpness compensation when the width ratio Ratio_RW is greater than the first threshold. Conversely, if the height of the first image's resolution is greater than its width, the processor 130 performs image sharpness compensation when the height ratio Ratio_RH is greater than the second threshold. It should be noted that the values of the first and second thresholds may vary depending on the actual situation. For example, the first threshold and/or the second threshold may be set to 1 (i.e., the processor 130 performs image sharpness compensation whenever the resolution (height or width) of the display 120 is greater than the corresponding resolution of the first image captured by the image capturing device 110). As another example, the first threshold and/or the second threshold may be set to 1.5.
In another embodiment, the processor 130 may directly compare the heights and/or widths of the two resolutions. If the resolution corresponding to the first resolution information is lower than the resolution corresponding to the second resolution information, or the gap between the two resolutions is larger than a specific value (e.g., 100, 200, or 300 pixels), the processor 130 may determine that there is a difference (i.e., that image sharpness compensation is required).
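For illustration only, the decision of step S210 could be sketched as follows, combining the ratio test and the direct pixel-gap test described above; an actual implementation would typically use only one of them, and all threshold values here are the example values from the text:

```python
# Hypothetical decision helper for step S210; thresholds are the example values above.
def needs_compensation(source_w, source_h, display_w, display_h,
                       first_threshold=1.0, second_threshold=1.0, pixel_gap=100):
    ratio_rw = display_w / source_w   # width ratio, equation (1)
    ratio_rh = display_h / source_h   # height ratio, equation (2)
    if source_w >= source_h:
        # Width is the dominant dimension: compare the width ratio with the first threshold.
        if ratio_rw > first_threshold:
            return True
    else:
        # Height is the dominant dimension: compare the height ratio with the second threshold.
        if ratio_rh > second_threshold:
            return True
    # Alternative embodiment: compare the absolute resolution gap in pixels.
    return (display_w - source_w) > pixel_gap or (display_h - source_h) > pixel_gap

print(needs_compensation(720, 480, 1920, 1080))  # True
```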
If image sharpness compensation is required, the processor 130 further determines an image adjustment parameter for enhancing image sharpness. The processor 130 may analyze the first image to obtain the corresponding image adjustment parameter. The image adjustment parameter may be associated with contrast enhancement, sharpness enhancement, or other image processing that improves image sharpness.
In one embodiment, the processor 130 may determine the corresponding image adjustment parameter according to a comparison result between the first resolution information and the second resolution information. Specifically, the processor 130 may determine the magnitude relationship between the display resolution of the display 120 and the resolution of the first image according to the ratio calculated by equation (1) or equation (2). If the ratio is greater than 1, the display resolution of the display 120 is greater than the resolution of the first image. The larger the ratio, the more severe phenomena such as blurring and blocking artifacts may become, so the processor 130 applies a correspondingly stronger image sharpness adjustment to the first image (i.e., the higher the ratio, the higher the intensity of the image adjustment parameter). If the ratio is less than 1, the display resolution of the display 120 is less than the resolution of the first image, i.e., degradation of the image presented on the display 120 is unlikely to occur, and the processor 130 may disable the adjustment of the first image.
In another embodiment, the processor 130 determines a texture level of the first image. Specifically, the processor 130 may count the edge intensity corresponding to each pixel in the first image and obtain the texture level of the first image from the statistical result. For example, the texture level Texture_Level of the first image can be obtained by equation (3):

Texture_Level = ( Σ_{EdgeLV=0}^{255} EdgeLV × PixelCount_EdgeLV ) / ( Σ_{EdgeLV=0}^{255} PixelCount_EdgeLV )    (3)

where EdgeLV is an edge intensity level (0 to 255, for example) and PixelCount_EdgeLV is the number of pixels whose edge intensity level equals EdgeLV. That is, the texture level is the arithmetic average of the edge intensity levels of all pixels in the first image.
It should be noted that, in other embodiments, the texture level may also be a median, a mode or other representative value of the edge intensity levels of all the pixels in the first image. In addition, in some embodiments, the processor 130 may perform edge enhancement on the first image before counting the edge intensity.
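For illustration, equation (3) could be computed as follows. This sketch assumes a gradient-magnitude edge operator, which the patent does not prescribe:

```python
import numpy as np

# Sketch of equation (3): texture level as the mean edge-intensity level over all pixels.
def texture_level(gray):
    """gray: 2-D numpy array of one grayscale frame with values in 0-255."""
    gy, gx = np.gradient(gray.astype(np.float32))
    edge_lv = np.clip(np.hypot(gx, gy), 0, 255).astype(np.uint8)  # quantize to levels 0-255
    # Histogram form of equation (3): sum(EdgeLV * PixelCount_EdgeLV) / total pixel count.
    hist = np.bincount(edge_lv.ravel(), minlength=256)
    return float((np.arange(256) * hist).sum() / hist.sum())
```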
After determining the texture level, the processor 130 may determine the corresponding image adjustment parameter according to the texture level. A higher texture level of the first image (possibly determined by comparison with a threshold) indicates that the first image itself may contain more, or more complex, textures, so the benefit of subsequent adjustment (e.g., image sharpness enhancement) is higher, and the processor 130 performs a relatively stronger image adjustment on the first image. That is, the higher the texture level, the higher the intensity of the image adjustment parameter. On the other hand, a lower texture level of the first image (possibly determined by comparison with a threshold) indicates that the first image may be a relatively smooth frame and the benefit of subsequent adjustment is low, so the first image does not need much image adjustment, or even none at all. That is, the lower the texture level, the lower the intensity of the image adjustment parameter. In addition, for texture levels between the lowest and the highest, the processor 130 may determine corresponding image adjustment parameters according to the aforementioned trend.
In another embodiment, the processor 130 may determine a dynamic level of the first image. Specifically, the processor 130 may count the motion vector corresponding to at least one block (composed of one or more pixels) in the first image and obtain the dynamic level of the first image from the statistical result. For example, the processor 130 may use the average magnitude of the motion vectors of the blocks in the first image as the dynamic value or dynamic level (hereinafter referred to as the dynamic level) of the whole first image. As another example, the dynamic level may be the median, mode, or another representative value of the motion vector magnitudes of the blocks in the first image.
After determining the dynamic level, the processor 130 may determine the corresponding image adjustment parameter according to the dynamic level. A lower dynamic level indicates that the first image is closer to a static picture; conversely, a higher dynamic level indicates that the first image is closer to a dynamic (moving) picture. Generally, the higher the scene dynamics of the first image, the higher the intensity of the adjustment should be. That is, the higher the dynamic level, the higher the intensity of the image adjustment parameter. On the other hand, the lower the scene dynamics of the first image, the less adjustment is needed, or even none at all. That is, the lower the dynamic level, the lower the intensity of the image adjustment parameter.
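For illustration, the dynamic level could be estimated with dense optical flow standing in for the per-block motion vectors; that substitution and the OpenCV dependency are assumptions, not part of the patent:

```python
import cv2
import numpy as np

# Sketch of the dynamic level: mean motion-vector magnitude between two consecutive frames.
def dynamic_level(prev_gray, curr_gray):
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.hypot(flow[..., 0], flow[..., 1])
    return float(magnitude.mean())  # the median or mode could be used instead, as noted above
```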
In another embodiment, the processor 130 determines a texture level and/or a dynamic level of the first image, and assigns corresponding weight values to the difference between the first resolution information and the second resolution information and to the texture level and/or the dynamic level. Specifically, the image adjustment parameters corresponding to different resolution ratios, different texture levels, and different dynamic levels may be determined in advance by experiment (e.g., compiled into a lookup table or an equation), or may be generated by training methods such as machine learning, which is not limited herein.
Further, the difference and these levels may be combined through weight assignment. In one embodiment, the weight values are determined according to the following mechanism: the higher the resolution difference, the texture level, or the dynamic level, the higher the corresponding weight value; the lower the resolution difference, the texture level, or the dynamic level, the lower the corresponding weight value. Then, the processor 130 may determine the intensity of the corresponding image adjustment parameter (e.g., the sharpening intensity or the contrast intensity) according to the difference and the corresponding weight values of the texture level and/or the dynamic level (e.g., by weight distribution or weight calculation).
For example, assuming the first threshold (or the second threshold) for the resolution comparison is set to 1, the processor 130 divides the ratio values that may be derived from equation (1) or equation (2) into fifty groups in advance (e.g., a first resolution group to a fiftieth resolution group, the first resolution group being the group with the lowest ratio values, the fiftieth resolution group being the group with the highest ratio values, and so on). The texture-level values that may be derived from equation (3) are divided into thirty groups (e.g., a first texture group to a thirtieth texture group, the first texture group being the group with the lowest values, the thirtieth texture group being the group with the highest values, and so on). In addition, the values that may be derived for the dynamic level are divided into twenty groups (e.g., a first dynamic group to a twentieth dynamic group, the first dynamic group being the group with the lowest values, the twentieth dynamic group being the group with the highest values, and so on). Furthermore, assume that the intensity of the image adjustment parameter for sharpness adjustment has one hundred levels (e.g., a first level to a one-hundredth level).
The processor 130 then determines the intensity of the final image adjustment parameter by weight assignment. For example, the resolution groups account for fifty percent, the texture groups for thirty percent, and the dynamic groups for twenty percent. The processor 130 weights the results of the three reference items (i.e., resolution difference, texture level, and dynamic level) according to these proportions (fifty, thirty, and twenty percent), so the three reference items differ in priority. The value of each reference item is positively correlated with the intensity of the image adjustment parameter. Therefore, when assigning weights, the processor 130 maps the highest group of each reference item to the maximum of that item's weight share (for example, the texture item accounts for thirty percent, and the thirtieth texture group corresponds to a weight value of thirty percent), maps the lowest group to the minimum of that item's weight share (for example, the first texture group corresponds to a weight value of one percent), and finally adds the weight values corresponding to the three reference items (i.e., the sum of the assigned weight values) to obtain the intensity of the image adjustment parameter.
For example, if the results obtained for the current first image are the first resolution group, the first texture group, and the first dynamic group, the intensity of the corresponding image adjustment parameter is the third level (one percent plus one percent plus one percent equals three percent). As another example, if the results obtained for the current first image are the fiftieth resolution group, the thirtieth texture group, and the twentieth dynamic group, the intensity of the corresponding image adjustment parameter is the one-hundredth level (fifty percent plus thirty percent plus twenty percent equals one hundred percent).
It should be noted that the foregoing weight ratio and the number of groups of each reference item are only used as examples and may vary according to actual situations, and the embodiments of the present invention are not limited thereto.
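Using those example values, the weight assignment could be sketched as follows; the group counts, percentages, and function names are the example values only and are not fixed by the claims:

```python
# Illustrative weight assignment following the fifty/thirty/twenty example above.
def adjustment_strength(res_group, tex_group, dyn_group,
                        res_groups=50, tex_groups=30, dyn_groups=20):
    """Each *_group argument is 1-based; returns an intensity level from 1 to 100."""
    res_part = res_group / res_groups * 50  # resolution difference: fifty percent weight
    tex_part = tex_group / tex_groups * 30  # texture level: thirty percent weight
    dyn_part = dyn_group / dyn_groups * 20  # dynamic level: twenty percent weight
    return round(res_part + tex_part + dyn_part)

print(adjustment_strength(1, 1, 1))     # 3   (third intensity level)
print(adjustment_strength(50, 30, 20))  # 100 (one-hundredth intensity level)
```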
Next, the processor 130 adjusts the first image according to the image adjustment parameter obtained in step S230 and the second resolution information to form a second image (step S250). Specifically, the second image is the image to be presented by the display 120. The processor 130 may increase the image resolution of the first image to the resolution corresponding to the second resolution information. For example, if the first resolution information is 480P (720×480) and the second resolution information is 1080P (1920×1080) (i.e., the display resolution of the display 120), the processor 130 may upscale the 480P first image to 1080P. It should be noted that, in other embodiments, the processor 130 may also adjust the image resolution of the first image to a resolution between the resolutions corresponding to the first and second resolution information. For example, if the first resolution information is 480P and the second resolution information is 1080P, the processor 130 may upscale the 480P first image to 720P (1280×720).
Next, the processor 130 performs image processing related to sharpness enhancement, such as sharpening or contrast enhancement, on the resolution-increased first image according to the image adjustment parameter obtained in step S230 to form the second image. That is, the second image is the first image after resolution increase and sharpness enhancement. The processor 130 may then display the second image on the display 120.
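A rough sketch of step S250 under stated assumptions follows; bicubic upscaling and unsharp masking are stand-ins, since the patent only requires a resolution increase followed by sharpness-related processing:

```python
import cv2

# Sketch of step S250: upscale to the display resolution, then sharpen according to
# the image adjustment parameter. The unsharp-mask formulation is an assumed choice.
def compensate(first_image, display_w, display_h, strength):
    """strength: intensity level 1-100 derived from the image adjustment parameter."""
    upscaled = cv2.resize(first_image, (display_w, display_h),
                          interpolation=cv2.INTER_CUBIC)
    blurred = cv2.GaussianBlur(upscaled, (0, 0), sigmaX=3)
    amount = strength / 100.0                       # map level 1-100 to 0.01-1.0
    second_image = cv2.addWeighted(upscaled, 1 + amount, blurred, -amount, 0)
    return second_image
```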
It should be noted that the time interval between the generation of the first image and the display of the second image may be less than a specific time threshold (e.g., 1 second, 500 milliseconds, or 100 milliseconds, etc.), so that the display 120 can display the image captured by the image capturing device 110 (e.g., Live View) in real time. In addition, if there is no difference or a small difference in the determination result of step S210, the processor 130 may directly use the first image as the second image to be displayed by the display 120. That is, the processor 130 displays the first image on the display 120.
In summary, in the computer system and the image compensation method thereof according to the embodiments of the invention, for the case that the resolution of the first image captured by the image capturing device is smaller than the resolution of the display, the difference between the first image and the display resolution, the texture level and the dynamic level of the first image are analyzed, and the image adjustment parameters corresponding to the reference items are obtained accordingly. Then, the resolution of the first image can be increased, and corresponding image definition enhancement adjustment can be performed according to the image adjustment parameters. Therefore, the display image of the display can be clearer by utilizing more efficient resource allocation.
Although the present invention has been described with reference to the above embodiments, it should be understood that various changes and modifications can be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. An image compensation method includes:
determining a difference between first resolution information and second resolution information, wherein the first resolution information is related to a first image captured by an image capturing device, and the second resolution information is related to a second image presented by a display;
analyzing the first image to generate an image adjustment parameter in response to the difference between the first resolution information and the second resolution information, wherein the image adjustment parameter is related to enhancing image definition, and the difference is that the resolution corresponding to the first resolution information is lower than the resolution corresponding to the second resolution information; and
adjusting the first image according to the image adjustment parameter and the second resolution information to form the second image, wherein the second image is to be presented by the display.
2. The image compensation method of claim 1, wherein the first resolution information comprises a height or width of a resolution of the first image, the second resolution information comprises a height or width of a resolution of the display, and the step of determining the difference between the first resolution information and the second resolution information comprises:
comparing a height of the resolution of the first image with a height of the resolution of the display, or comparing a width of the resolution of the first image with a width of the resolution of the display; the step of analyzing the first image to generate the image adjustment parameter includes:
determining the corresponding image adjustment parameter according to the comparison result.
3. The image compensation method of claim 1, wherein the step of analyzing the first image to generate the image adjustment parameter comprises:
determining a texture level of the first image, wherein an edge intensity corresponding to each pixel in the first image is counted, and the texture level of the first image is obtained according to the statistical result; and
determining the corresponding image adjustment parameter according to the texture level, wherein
the higher the texture level, the higher the intensity of the image adjustment parameter, and
the lower the texture level, the lower the intensity of the image adjustment parameter.
4. The image compensation method of claim 1, wherein the step of analyzing the first image to generate the image adjustment parameter comprises:
determining a dynamic level of the first image, wherein a motion vector corresponding to at least one block in the first image is counted, and the dynamic level of the first image is obtained according to the statistical result; and
determining the corresponding image adjustment parameter according to the dynamic level, wherein
the higher the dynamic level, the higher the intensity of the image adjustment parameter, and
the lower the dynamic level, the lower the intensity of the image adjustment parameter.
5. The image compensation method of claim 1, wherein the step of analyzing the first image to generate the image adjustment parameter comprises:
determining at least one of a texture level and a dynamic level of the first image, wherein the texture level is associated with an edge intensity and the dynamic level is associated with a motion vector;
assigning corresponding weight values to the difference between the first resolution information and the second resolution information and to the at least one of the texture level and the dynamic level, wherein
the higher the difference, the texture level, or the dynamic level, the higher the corresponding weight value, and
the lower the difference, the texture level, or the dynamic level, the lower the corresponding weight value; and
determining the corresponding image adjustment parameter according to the difference and the corresponding weight value of the at least one of the texture level and the dynamic level.
6. A computer system, comprising:
an image capturing device for capturing a first image;
a display for displaying a second image; and
a processor coupled to the image capture device and the display and configured to:
determining a difference between first resolution information and second resolution information, wherein the first resolution information is associated with the first image and the second resolution information is associated with the second image;
analyzing the first image to generate an image adjustment parameter in response to the difference between the first resolution information and the second resolution information, wherein the image adjustment parameter is related to enhancing image definition, and the difference is that the resolution corresponding to the first resolution information is lower than the resolution corresponding to the second resolution information; and
adjusting the first image according to the image adjustment parameter and the second resolution information to form the second image, wherein the second image is displayed on the display.
7. The computer system of claim 6, wherein the first resolution information includes a height or width of resolution of the first image, the second resolution information includes a height or width of resolution of the display, and the processor is further configured to:
comparing a height of the resolution of the first image with a height of the resolution of the display, or comparing a width of the resolution of the first image with a width of the resolution of the display; and
determining the corresponding image adjustment parameter according to the comparison result.
8. The computer system of claim 6, wherein the processor is further configured to:
determining a texture level of the first image, wherein an edge intensity corresponding to each pixel in the first image is counted, and the texture level of the first image is obtained according to the statistical result; and
determining the corresponding image adjustment parameter according to the texture level, wherein
the higher the texture level, the higher the intensity of the image adjustment parameter, and
the lower the texture level, the lower the intensity of the image adjustment parameter.
9. The computer system of claim 6, wherein the processor is further configured to:
determining a dynamic level of the first image, wherein a motion vector corresponding to at least one block in the first image is counted, and the dynamic level of the first image is obtained according to the statistical result; and
determining the corresponding image adjustment parameter according to the dynamic level, wherein
the higher the dynamic level, the higher the intensity of the image adjustment parameter, and
the lower the dynamic level, the lower the intensity of the image adjustment parameter.
10. The computer system of claim 6, wherein the processor is further configured to:
determining at least one of a texture level and a dynamic level of the first image, wherein the texture level is associated with an edge intensity and the dynamic level is associated with a motion vector;
assigning corresponding weight values to the difference between the first resolution information and the second resolution information and to the at least one of the texture level and the dynamic level, wherein
the higher the difference, the texture level, or the dynamic level, the higher the corresponding weight value, and
the lower the difference, the texture level, or the dynamic level, the lower the corresponding weight value; and
determining the corresponding image adjustment parameter according to the difference and the corresponding weight value of the at least one of the texture level and the dynamic level.
CN202010009947.4A 2020-01-06 2020-01-06 Computer system and image compensation method thereof Pending CN113079334A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010009947.4A CN113079334A (en) 2020-01-06 2020-01-06 Computer system and image compensation method thereof


Publications (1)

Publication Number Publication Date
CN113079334A 2021-07-06

Family

ID=76609036

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010009947.4A Pending CN113079334A (en) 2020-01-06 2020-01-06 Computer system and image compensation method thereof

Country Status (1)

Country Link
CN (1) CN113079334A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW200507619A (en) * 2003-08-14 2005-02-16 Inventec Multimedia & Telecom Image capture system
US20090213234A1 (en) * 2008-02-18 2009-08-27 National Taiwan University Method of full frame video stabilization
CN105631926A (en) * 2014-11-20 2016-06-01 三星电子株式会社 Image processing apparatus and method
CN105704398A (en) * 2016-03-11 2016-06-22 咸阳师范学院 Video processing method



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210706