CN115240613A - Display device and method for blurring a boundary line in an under-screen camera device


Publication number: CN115240613A
Application number: CN202110441299.4A
Authority: CN (China)
Legal status: Pending
Prior art keywords: pixel, sensor, display, pixels, blurred
Other languages: Chinese (zh)
Inventor: 庄启峰
Current assignee: Himax Technologies Ltd
Original assignee: Himax Technologies Ltd
Application filed by Himax Technologies Ltd

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/10: Intensity circuits
    • G09G 2320/00: Control of display operating conditions
    • G09G 2320/06: Adjustment of display parameters
    • G09G 2320/0626: Adjustment of display parameters for control of overall brightness


Abstract

A display device includes a sensor pixel group having a maximum sensor pixel luminance value SV, a display pixel group having a display pixel luminance value DV, and a blurred pixel group having a blurred pixel luminance value BV. The minimum distance between the display pixel group and the sensor pixel group is normalized to 1, the minimum distance between the blurred pixel group and the sensor pixel group is Z, and the minimum distance between the blurred pixel group and the display pixel group is (1-Z), such that BV = (1-Z) × SV + Z × DV is satisfied.

Description

Display device and method for blurring a boundary line in an under-screen camera device
[Technical Field]
The present invention generally relates to a display device and to a method of blurring a boundary line in a camera-under-display (CUD) device. In particular, the present invention relates to a method of blurring the boundary line between a sensor region and a display region in a display device by adjusting the gamma levels of pixel units, for use in an under-screen camera device.
[Background]
For better display quality, the camera of a mobile phone may advantageously be hidden under the display panel. However, some of the light incident on the camera may be blocked by the pixels, and transparent organic light-emitting diode (OLED) materials are still too expensive to be applied in general products.
To solve this problem, an incomplete sub-pixel layout may be adopted to hide the camera under the panel: certain pixels in the under-screen camera area of the panel are cut out so that the camera can function properly.
As shown in fig. 1, such an incomplete sub-pixel layout may cause a contour problem along the boundary line 11 between the under-screen camera area 10 and the normal area 20. Fig. 1 illustrates that a clear visual difference, i.e. the visual presence of the boundary line 11, is perceived between the under-screen camera area 10 and the normal area 20, so that the visual presentation quality of the panel 1 is compromised by the presence of the under-screen camera area 10.
[Summary of the Invention]
In view of the above, a first aspect of the present invention provides a novel display device that improves the visual presentation quality of an under-screen camera device in the presence of an under-screen camera area, for example by minimizing the visual presence of that area. To render the under-screen camera area visually imperceptible, the boundary line may be blurred so as to hide the area. A second aspect of the present invention proposes a novel method of blurring the boundary line in an under-screen camera device, likewise to improve the visual presentation quality of the device in the presence of the under-screen camera area.
In its first aspect, the present invention proposes a novel display device. The display device includes a display panel, an image sensor, a sensor pixel group, a display pixel group, and a blurred pixel group. The display panel includes a display region, a blurring region surrounded by the display region, and a sensor region surrounded by the blurring region. The image sensor is disposed in the sensor region. The sensor pixel group is disposed in the sensor region, adjacent to the blurring region, and has a maximum sensor pixel luminance value SV. The display pixel group is disposed in the display region, adjacent to the blurring region, and has a display pixel luminance value DV. The blurred pixel group is disposed in the blurring region, between the sensor pixel group and the display pixel group, and has a blurred pixel luminance value BV. The minimum distance between the display pixel group and the sensor pixel group is normalized to 1, the minimum distance between the blurred pixel group and the sensor pixel group is Z, and the minimum distance between the blurred pixel group and the display pixel group is (1-Z), such that BV = (1-Z) × SV + Z × DV.
In one embodiment of the present invention, the sensor pixel group includes n active sensor pixels and m inactive sensor pixels.
In another embodiment of the present invention, the display pixel group includes n + m active display pixels and no inactive display pixels.
In another embodiment of the present invention, each active display pixel has a display pixel luminance value DV such that DV = [n/(n+m)] × 100.
In another embodiment of the present invention, the blurring region is in the form of a hollow circle and includes an inner concentric circle and an outer concentric circle, the inner concentric circle having a center.
In another embodiment of the present invention, a straight line passes through the center of the inner concentric circle, the sensor pixel group, the blurred pixel group, and the display pixel group.
In another embodiment of the present invention, the display device further includes at least one pinhole disposed in the sensor region.
In another embodiment of the present invention, the at least one pinhole corresponds to the m inactive sensor pixels.
In another embodiment of the present invention, the blurred pixel luminance value BV represents a gamma level of the blurred pixels in the blurred pixel group.
In its second aspect, the present invention proposes a novel method of blurring the boundary line in an under-screen camera device. First, an under-screen camera device is provided. The under-screen camera device includes a display region, a blurring region surrounded by the display region, and a sensor region surrounded by the blurring region. A sensor pixel group is disposed in the sensor region, adjacent to the blurring region, and has a maximum sensor pixel luminance value SV. The sensor pixel group includes n active sensor pixels and m inactive sensor pixels. A display pixel group is disposed in the display region, adjacent to the blurring region, and has a display pixel luminance value DV. The display pixel group includes n + m active display pixels and no inactive display pixels. A blurred pixel group is disposed in the blurring region, between the sensor pixel group and the display pixel group, and has a blurred pixel luminance value BV. Next, the display pixel luminance value DV is determined according to DV = [n/(n+m)] × 100. Then, after the display pixel luminance value DV is determined, the blurred pixel luminance value BV is determined according to BV = (1-Z) × SV + Z × DV, where the minimum distance between the display pixel group and the sensor pixel group is normalized to 1, the minimum distance between the blurred pixel group and the sensor pixel group is Z, and the minimum distance between the blurred pixel group and the display pixel group is (1-Z).
In one embodiment of the present invention, the blurring region is in the form of a hollow circle and includes an inner concentric circle and an outer concentric circle, the inner concentric circle having a center.
In another embodiment of the present invention, the blurred pixel luminance value BV is determined so as to blur the boundary line of the inner concentric circle.
In another embodiment of the present invention, a straight line passes through the center of the inner concentric circle, the sensor pixel group, the blurred pixel group, and the display pixel group.
In another embodiment of the present invention, the blurred pixel luminance value BV represents a gamma level of a blurred pixel in the blurred pixel group.
In another embodiment of the present invention, the sensor pixel group includes a first sensor pixel having the maximum sensor pixel luminance value SV and a second sensor pixel having a minimum sensor pixel luminance value 0.
In another embodiment of the present invention, the blurred pixel group includes a first blurred pixel having a first blurred pixel luminance value BV1 and a second blurred pixel having a second blurred pixel luminance value BV2.
In another embodiment of the present invention, the first blurred pixel luminance value BV1 is different from the second blurred pixel luminance value BV2.
In another embodiment of the present invention, the first and second blurred pixel luminance values BV1 and BV2 each represent a gamma level.
In another embodiment of the present invention, the first sensor pixel corresponds to the first blurred pixel and the second sensor pixel corresponds to the second blurred pixel.
In another embodiment of the present invention, the total luminance of the display pixel group is equal to the total luminance of the blurred pixel group.
In another embodiment of the present invention, the total luminance of the sensor pixel group is equal to the total luminance of the blurred pixel group.
[Description of the Drawings]
Fig. 1 illustrates an example of the contour problem occurring along the boundary line between an under-screen camera area and a normal area, showing that a strong visual difference may be exhibited between the two areas, thereby compromising the visual display quality of the panel in the presence of the under-screen camera area.
Fig. 2 is an example of a flow chart of the method of blurring a boundary line in an under-screen camera device of the present invention.
Fig. 3 illustrates a top view of an under-screen camera device in accordance with an example of the present invention.
Fig. 4 illustrates a close-up view of the under-screen camera device of fig. 3 along a line in accordance with the present invention.
Fig. 5 illustrates the calculation results according to the first example of the present invention.
Fig. 6 illustrates the calculation results according to the second example of the present invention.
Fig. 7 illustrates the calculation results according to the third example of the present invention.
Fig. 8 illustrates an image corresponding to the image of fig. 1 after undergoing the operation of the present invention, as an example of mitigating the contour problem occurring along the boundary line between a sensor region (under-screen camera area) and a display region (normal area), thereby improving the visual presentation quality of the display panel in the presence of the under-screen camera area.
[Notation]
1: Panel
10: under-screen camera area
11: boundary line
20: normal area
100: Under-screen camera device/display device
101: display panel
102: straight line
103: Image
110: sensor area
111: center of circle
112: image sensor
113: Sensor pixel group/unit cell
114: sensor sub-pixel
115: sensor sub-pixel
116: sensor sub-pixel
117: sensor sub-pixel
120: Blurring region
120H: Hollow circle
121: Inner concentric circle
122: Outer concentric circle
123: Blurred pixel group/unit cell
124: Blurred sub-pixel
125: Blurred sub-pixel
126: Blurred sub-pixel
127: Blurred sub-pixel
130: display area
133: Display pixel group/unit cell
134: display sub-pixel
135: display sub-pixel
136: display sub-pixel
137: display sub-pixel
101: Step
201: Step
301: Step
401: Step
501: Step
601: Step
BV: Blurred pixel luminance value
DV: Display pixel luminance value
SV: Maximum sensor pixel luminance value
Z: Weight value
[Detailed Description]
In order to improve display quality, the present invention provides an adjustment method to blur the boundary line in an under-screen camera device in the presence of an under-screen camera area, for example to weaken or even eliminate the visual presence of the undesirable under-screen camera area. Fig. 2 is an illustration of a flow chart of the method of blurring a boundary line in an under-screen camera device of the present invention. Fig. 3 to fig. 4 illustrate an example of an operating procedure for blurring the boundary line in an under-screen camera device according to the present invention.
Referring to fig. 2, step 101 is performed first. Step 101 represents inputting a plurality of original pixel units. The original pixel units may be pixels or sub-pixels of a display device. The display device may correspond to the under-screen camera device 100 shown in fig. 3. For example, there are a plurality of pixel units in the under-screen camera device 100. One pixel unit may be one pixel or one sub-pixel having a predetermined color and a predetermined luminance. One pixel may include a plurality of sub-pixels, and each sub-pixel may emit light of a certain color, for example red (referred to as R), green (referred to as G), blue (referred to as B), or another suitable color, but the present invention is not limited thereto. In other words, step 101 may also be referred to as inputting R/G/B information.
Please refer to fig. 3, which illustrates a top view of an under-screen camera device in accordance with an example of the present invention. The under-screen camera device 100 is provided first. The under-screen camera device 100 may include a display panel 101 for displaying a picture or image, such as the image 103 shown in fig. 1. The under-screen camera device 100 may further include other suitable elements such as an input unit (not shown), an output unit (not shown), or a control unit (not shown), but the present invention is not limited thereto. The display panel 101 may include a plurality of functional regions, such as a sensor region 110, a blurring region 120, and a display region 130, but the present invention is not limited thereto. In some embodiments of the present invention, the sensor region 110 may be surrounded by the blurring region 120, and the blurring region 120 may be surrounded by the display region 130. In some embodiments of the present invention, the blurring region 120 may be in the form of a hollow circle 120H. The hollow circle 120H may include an inner concentric circle 121 and an outer concentric circle 122. The inner concentric circle 121 may have a center 111, which may also be the center of the outer concentric circle 122.
The display panel 101 may include an image sensor 112 disposed in the sensor region 110. The image sensor 112 may be disposed at least partially within the sensor region 110, or entirely within the sensor region 110. The image sensor 112 may serve as the camera in the under-screen camera device 100.
As previously mentioned, there are multiple pixels or sub-pixels in the display device 100. Different pixels or different sub-pixels in different regions of the display device 100 may form different pixel groups (sets). In some embodiments of the present invention, at least one sensor pixel group 113 may be disposed in the sensor region 110 adjacent to the blurring region 120. In other words, the sensor pixel group 113 may be disposed at the inner concentric circle 121, i.e. the boundary line between the sensor region 110 and the blurring region 120. The sensor pixel group 113 may include one or more sensor pixel units. Fig. 4 illustrates that the sensor pixel group 113 may include a plurality of sensor sub-pixels, but the present invention is not limited thereto.
In some embodiments of the present invention, at least one blurred pixel group 123 may be disposed in the blurring region 120. In other words, the blurred pixel group 123 may be disposed between the inner concentric circle 121 and the outer concentric circle 122, i.e. between the sensor region 110 and the display region 130. The blurred pixel group 123 may include one or more blurred pixel units. Fig. 4 illustrates that the blurred pixel group 123 may include a plurality of blurred sub-pixels, but the present invention is not limited thereto.
In some embodiments of the present invention, at least one display pixel group 133 may be disposed in the display region 130 adjacent to the blurring region 120. For example, the display pixel group 133 may be disposed at the outer concentric circle 122, i.e. the boundary line between the display region 130 and the blurring region 120. The display pixel group 133 may include one or more display pixel units. Fig. 4 illustrates that the display pixel group 133 may include a plurality of display sub-pixels, but the present invention is not limited thereto.
In some embodiments of the present invention, there may also be a straight line 102 passing through the sensor pixel group 113, the blurred pixel group 123, and the display pixel group 133. In some embodiments of the present invention, the straight line 102 may further pass through the center 111 of the inner concentric circle 121, the sensor pixel group 113, the blurred pixel group 123, and the display pixel group 133, such that the blurred pixel group 123 is disposed between the sensor pixel group 113 and the display pixel group 133. In some embodiments of the present invention, fig. 4 illustrates that the minimum distance between the display pixel group 133 and the sensor pixel group 113 is normalized to 1, the minimum distance between the blurred pixel group 123 and the sensor pixel group 113 is Z, and the minimum distance between the blurred pixel group 123 and the display pixel group 133 is (1-Z).
Fig. 4 illustrates a close-up view of the under-screen camera device 100 of fig. 3 along the line 102 in accordance with the present invention. The sensor pixel group 113 may include one or more sensor pixel units to form a unit cell. In one embodiment of the present invention, the active sensor pixels and the inactive sensor pixels in the sensor region 110 may collectively form a pattern or may be regularly arranged. One sensor pixel group 113 represents a unit cell that includes active sensor pixels and inactive sensor pixels and corresponds to a minimum repeating unit of the pattern or arrangement. For example, the sensor pixel group 113 may include n active sensor pixels and m inactive sensor pixels, where n is an integer not less than 1 and m is an integer not less than 1. In another embodiment of the present invention, fig. 4 illustrates that the sensor pixel group 113 may include four sensor sub-pixels, i.e. n + m = 4, but the present invention is not limited thereto. For example, the sensor pixel group 113 may include the sensor sub-pixel 114, the sensor sub-pixel 115, the sensor sub-pixel 116, and the sensor sub-pixel 117, regardless of the colors of the four sensor sub-pixels.
Similarly, the blurred pixel group 123 may include one or more blurred pixel units to form one unit cell. The number of blurred pixel units in one unit cell is the same as the number of sensor pixel units in one unit cell. For example, fig. 4 illustrates that the blurred pixel group 123 may include four blurred sub-pixels to correspond to the four sensor sub-pixels in the sensor pixel group 113, regardless of the colors of the four blurred sub-pixels. For example, the blurred pixel group 123 may include the blurred sub-pixel 124, the blurred sub-pixel 125, the blurred sub-pixel 126, and the blurred sub-pixel 127, but the present invention is not limited thereto.
Similarly, the display pixel group 133 may include one or more display pixel units to form one unit cell. The number of display pixel units in one unit cell is the same as the number of sensor pixel units in one unit cell. For example, fig. 4 illustrates that the display pixel group 133 may include four display sub-pixels to correspond to the four sensor sub-pixels in the sensor pixel group 113, regardless of the colors of the four display sub-pixels. For example, one display pixel group 133 may include the display sub-pixel 134, the display sub-pixel 135, the display sub-pixel 136, and the display sub-pixel 137, but the present invention is not limited thereto.
As described above, the image sensor 112 is disposed in the sensor region 110, and a pixel unit occupied by a portion of the image sensor 112 may be a pinhole (indicated by a dot) that allows incident light to reach the image sensor 112 to form part of an image. Due to the presence of one or more pinholes (i.e. the one or more dots shown in the under-screen camera area 10 in fig. 1), one or more sensor pixel units in the sensor region 110 become inactive sensor pixel units, because the inactive sensor pixel units corresponding to the dots can no longer emit light. Fig. 1 illustrates that the visually dot-like areas corresponding to the sensor region 110 may correspond to pinholes at inactive sensor pixel units.
Thus, the sensor region 110 including the image sensor 112 may visually present some active sensor pixels and some inactive sensor pixels. An active sensor pixel refers to a sensor pixel unit capable of emitting light of any suitable color. An inactive sensor pixel refers to a sensor pixel unit that cannot emit light at all. For example, an inactive sensor pixel may be a pixel unit in the sensor region 110 that is occupied by a pinhole of the image sensor 112, the pinhole rendering the sensor pixel unit functionally inactive. The collection of inactive sensor pixels in the sensor region 110 may adversely alter the predetermined visual presentation of a given pattern, as shown in fig. 1.
Because an inactive sensor pixel has no luminance (no illumination available), the overall luminance of the sensor pixel group 113 (i.e. one unit cell) is inevitably reduced by the presence of inactive sensor pixels in the sensor pixel group 113; the more inactive sensor pixels there are, the lower the overall luminance of the sensor pixel group 113 becomes.
To balance the reduction in illumination due to the presence of inactive sensor pixels, all active sensor pixels in the sensor pixel group 113 may have a maximum sensor pixel luminance value SV, while all inactive sensor pixels in the sensor pixel group 113 have a minimum sensor pixel luminance value 0. The maximum pixel luminance value may represent the maximum gray level 255, equal to 100% intensity. The minimum pixel luminance value may refer to the minimum gray level 0, equal to 0% intensity. The concepts of gray level and intensity are well known in the art and therefore are not described in detail. In other words, the luminance value of a sensor pixel unit is either SV or 0, and SV may be set equal to the luminance value 100.
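As an illustrative aside (not part of the original disclosure), the luminance convention above can be sketched in Python; the linear mapping between the two stated endpoints (luminance 100 corresponds to the maximum gray level 255, luminance 0 to the gray level 0) and the function name are assumptions for illustration:

```python
def luminance_to_gray(luminance: float) -> int:
    """Map a luminance value on the 0-100 scale used in this description
    to an 8-bit gray level, fixing the two endpoints stated above:
    luminance 100 -> maximum gray level 255 (100% intensity), and
    luminance 0 -> minimum gray level 0 (0% intensity). The linear
    interpolation between the endpoints is assumed for illustration."""
    if not 0.0 <= luminance <= 100.0:
        raise ValueError("luminance must lie in [0, 100]")
    return round(luminance / 100.0 * 255)

assert luminance_to_gray(100) == 255  # maximum sensor pixel luminance value SV
assert luminance_to_gray(0) == 0      # inactive sensor pixel
```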
Thus, the present invention provides the following procedure to mitigate or even eliminate the adverse visual interference of the inactive sensor pixels in the sensor region 110. First, as shown in fig. 2, step 201 is performed: the display pixel luminance value DV is determined. The display pixel luminance value DV is preferably related to the sensor pixel luminance value SV.
Due to the presence of one or more pinholes, not every sensor pixel unit in the sensor region 110 is an active sensor pixel. In contrast, the display pixel group 133 is disposed in the display region 130, which is an ordinary display region, so each display pixel unit in the display region 130 is an active display pixel capable of emitting light of any suitable color, with no inactive display pixels. In some embodiments of the present invention, if the sensor pixel group 113 includes n active sensor pixels and m inactive sensor pixels, the display pixel group 133 may correspondingly include n + m active display pixels and no inactive display pixels.
To exhibit visually uniform luminance, the visual luminance of the display pixel group 133 is preferably close or equal to the visual luminance of the sensor pixel group 113, and the display pixel luminance value DV is determined accordingly. For example, the display pixel luminance value DV may be reduced in proportion to the sensor pixel luminance value SV. The sensor pixel luminance value SV thus represents the result dictated by the under-screen camera.
If the sensor pixel group 113 includes n active sensor pixels and m inactive sensor pixels, the display pixel luminance value DV of a display pixel unit may be determined according to DV = [n/(n+m)] × (maximum pixel luminance value SV). For example, if SV is set to 100, DV = [n/(n+m)] × 100. A display pixel luminance value DV satisfying the foregoing relationship ensures visual luminance uniformity of the display region 130 with respect to the sensor region 110. After the foregoing step, the luminance of the display pixel units in the display region 130 can be uniformized and normalized, so they share the same display pixel luminance value DV and exhibit uniform luminance in the display region 130. Thus, the display pixel luminance value DV may represent a normal or normalized result.
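A minimal sketch of this determination of DV, assuming the 0-100 luminance scale used throughout (SV = 100); the function name and signature are illustrative, not part of the patent:

```python
def display_pixel_luminance(n_active: int, m_inactive: int, sv: float = 100.0) -> float:
    """Determine the display pixel luminance value DV shared by every
    active display pixel in a unit cell, per DV = [n/(n+m)] x SV, so
    that the total luminance of the display pixel group matches that
    of a sensor pixel group containing m inactive (dark) sensor pixels."""
    if n_active < 1 or m_inactive < 1:
        raise ValueError("n and m must each be integers not less than 1")
    return n_active / (n_active + m_inactive) * sv

# First example below: n = 3 active and m = 1 inactive sensor pixels -> DV = 75.
assert display_pixel_luminance(3, 1) == 75.0
```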
Next, as shown in fig. 2, step 301 is performed after step 201. After the display pixel luminance value DV of the display pixel units is determined, the blurred pixel luminance value BV of each blurred pixel unit is determined. The blurred pixel luminance value BV may be determined in such a way that it bridges the luminance from the maximum sensor pixel luminance value SV to the display pixel luminance value DV. This is an adjustment operation that aligns the luminance of the blurring region 120 with respect to the sensor region 110 and with respect to the display region 130.
In other words, to balance the luminance difference of the pixel units between the sensor region 110 and the display region 130, the illumination, e.g. the luminance of each blurred pixel unit in the blurring region 120, may have a particular blurred pixel luminance value BV depending on the display pixel luminance value DV and the sensor pixel luminance value SV. For example, the blurred pixel luminance value BV may be no greater than the maximum sensor pixel luminance value SV and no less than the corresponding display pixel luminance value DV, to form a smooth luminance gradient from SV to DV. A luminance gradient of the pixel units may thus be formed along the straight line 102, so that the poor visual appearance of a given image caused by the inactive sensor pixels in the sensor region 110 gradually weakens and converges toward the display region 130 via the blurring region 120. In other words, the luminance gradient of the pixel units may be used to blur the under-screen camera area, such as the distinct boundary line 11 shown in fig. 1, so that the original boundary line becomes less visually distinct, or even substantially invisible.
The blurred pixel group 123 forming a unit cell is disposed in the blurring region 120 and is located between the sensor pixel group 113 and the display pixel group 133, for example directly between them. Each pixel unit in the blurred pixel group 123 has an individual blurred pixel luminance value BV. Each individual blurred pixel luminance value BV is calculated to align with SV and DV. The respective blurred pixel luminance values BV are determined so that the boundary line of the inner concentric circle 121 becomes blurred. One unit cell may include n + m sub-pixels regardless of the colors of the sub-pixels, and the n + m sub-pixels may be arranged to form a pattern or in order, for example by their sites (loci).
For example, one unit cell in the sensor pixel group 113 may include n active sensor pixels and m inactive sensor pixels, where n is an integer not less than 1 and m is an integer not less than 1, so one unit cell in the blurred pixel group 123 correspondingly includes n + m blurred pixels, and one unit cell in the display region 130 correspondingly includes n + m display pixels. A given blurred pixel in one unit cell exclusively corresponds to a specific sensor pixel and to a specific display pixel. For example, the blurred pixel at the first site in one unit cell corresponds, along the straight line 102, to the sensor pixel at the first site in another unit cell and to the display pixel at the first site in yet another unit cell; the blurred pixel at the second site in one unit cell corresponds, along the straight line 102, to the sensor pixel at the second site and to the display pixel at the second site; and so on.
Then, as shown in fig. 2, step 401 is performed after step 301. Step 401 may be an adjustment calculation, for example one or more weighting calculations. Such weighting calculations may involve one or more weight values. The one or more weight values may relate to a dimensional measurement such as distance, so that the luminance of the sub-pixels in the blurring region 120 may be adjusted according to the distance to a reference point, but the present invention is not limited thereto.
For example, a particular blurred pixel luminance value BV may be determined based on the weight Z, the corresponding sensor pixel luminance value SV, and the corresponding display pixel luminance value DV, to satisfy the linear weighting calculation BV = (1-Z) × SV + Z × DV, but the present invention is not limited thereto. The blurred pixel luminance value BV may represent a gamma level of the blurred pixels in the blurred pixel group 123, but the present invention is not limited thereto. Along the straight line 102, the minimum distance between one display pixel group 133 and one sensor pixel group 113 is normalized to 1, the minimum distance between one blurred pixel group 123 and one sensor pixel group 113 is Z and is used as the weight in this example, and the minimum distance between one blurred pixel group 123 and one display pixel group 133 is (1-Z), but the present invention is not limited thereto. When implementing the method of the present invention, the actual minimum distance between the display pixel group 133 and the sensor pixel group 113 depends on the implementation and can be determined by one of ordinary skill in the art, as long as the minimum distance is sufficient for implementing the present invention. In other words, the present invention introduces weights associated with the under-screen camera to determine the blurred pixel luminance values BV, but the present invention is not limited thereto.
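The linear weighting calculation of step 401 can likewise be sketched, under the same illustrative assumptions as above; the per-site sensor luminance (SV for an active site, 0 for an inactive one) is passed in explicitly, matching the site-by-site correspondence described above:

```python
def blurred_pixel_luminance(z: float, sv_site: float, dv_site: float) -> float:
    """Determine the blurred pixel luminance value BV of one blurred
    pixel unit, per BV = (1-Z) x SV + Z x DV, where sv_site is the
    luminance of the corresponding sensor sub-pixel (SV = 100 for an
    active one, 0 for an inactive one) and dv_site is the corresponding
    display pixel luminance value DV."""
    if not 0.0 <= z <= 1.0:
        raise ValueError("Z is a normalized distance and must lie in [0, 1]")
    return (1.0 - z) * sv_site + z * dv_site

# First example below: Z = 0.25, active sensor site (SV = 100), DV = 75 -> BV = 93.75.
assert blurred_pixel_luminance(0.25, 100.0, 75.0) == 93.75
```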
As shown in fig. 4, some exemplary calculations for determining different blurred pixel luminance values BV are proposed in accordance with some embodiments of the present invention. Although the following calculation examples relate to the embodiment of n + m = 4, the present invention is not limited to that embodiment; any embodiment with n + m ≥ 2 is considered to fall within the scope and concept of the present invention.
First example
Fig. 4 illustrates a sensor pixel group 113 including a first sensor sub-pixel 114, a second sensor sub-pixel 115, a third sensor sub-pixel 116, and a fourth sensor sub-pixel 117, which form one unit cell 113 regardless of the colors of the four sensor sub-pixels. The four sensor sub-pixels are each arranged at a specific site in the unit cell. A site is the position within a given unit cell at which a particular sub-pixel is arranged along a given line, or the position of interest when determining a pixel luminance value such as DV or BV. For example, the sensor pixel group 113 including the first sensor sub-pixel 114, the second sensor sub-pixel 115, the third sensor sub-pixel 116, and the fourth sensor sub-pixel 117 may include four corresponding sites, i.e. the site of the first sensor sub-pixel 114, the site of the second sensor sub-pixel 115, the site of the third sensor sub-pixel 116, and the site of the fourth sensor sub-pixel 117, but the present invention is not limited thereto. In the first example, the first sensor sub-pixel 114, the second sensor sub-pixel 115, and the third sensor sub-pixel 116 are active sensor pixels, and the fourth sensor sub-pixel 117 is an inactive sensor pixel, so n = 3 and m = 1.
First, according to the aforementioned principle, the first sensor sub-pixel 114, the second sensor sub-pixel 115, and the third sensor sub-pixel 116 each have the maximum sensor pixel luminance value SV (luminance 100), and the fourth sensor sub-pixel 117 has the minimum sensor pixel luminance value 0. These initial conditions may be abbreviated as [sub-pixel (corresponding luminance value)]:
[114(100),115(100),116(100),117(0)]
Next, fig. 4 illustrates a display pixel group 133 including a first display sub-pixel 134, a second display sub-pixel 135, a third display sub-pixel 136, and a fourth display sub-pixel 137, which form one unit cell 133 regardless of the colors of the four display sub-pixels. The four display sub-pixels are each arranged at a specific site in the unit cell. The display pixel luminance values DV of the four display sub-pixels are determined according to DV = [n/(n+m)] × 100.
DV134 = [3/(3+1)] × 100 = 75, because the first display sub-pixel 134 corresponds in site to the first sensor sub-pixel 114;
DV135 = [3/(3+1)] × 100 = 75, because the second display sub-pixel 135 corresponds in site to the second sensor sub-pixel 115;
DV136 = [3/(3+1)] × 100 = 75, because the third display sub-pixel 136 corresponds in site to the third sensor sub-pixel 116;
DV137 = [3/(3+1)] × 100 = 75, because the fourth display sub-pixel 137 corresponds in site to the fourth sensor sub-pixel 117.
The above results can be simplified as:
[134(75),135(75),136(75),137(75)]
Third, fig. 4 illustrates a blurred pixel group 123 including a first blurred sub-pixel 124, a second blurred sub-pixel 125, a third blurred sub-pixel 126, and a fourth blurred sub-pixel 127, which form one unit cell 123 regardless of the colors of the four blurred sub-pixels. The four blurred sub-pixels are each arranged at a specific site in the unit cell. Assuming Z = 0.25, which means that the blurred pixel group 123 is relatively close to the sensor pixel group 113, the blurred pixel luminance value BV1 or BV2 of each blurred sub-pixel is determined so as to satisfy the relationship BV = (1-Z) × SV + Z × DV. The first and second blurred pixel luminance values BV1 and BV2 may each represent a kind of gamma level.
BV124 = (1-0.25) × 100 + 0.25 × 75 = 93.75, because the first blurred sub-pixel 124 corresponds in site to the first sensor sub-pixel 114 (SV = 100);
BV125 = (1-0.25) × 100 + 0.25 × 75 = 93.75, because the second blurred sub-pixel 125 corresponds in site to the second sensor sub-pixel 115 (SV = 100);
BV126 = (1-0.25) × 100 + 0.25 × 75 = 93.75, because the third blurred sub-pixel 126 corresponds in site to the third sensor sub-pixel 116 (SV = 100);
BV127 = (1-0.25) × 0 + 0.25 × 75 = 18.75, because the fourth blurred sub-pixel 127 corresponds in site to the fourth sensor sub-pixel 117 (SV = 0).
The above adjustment results can be simplified as follows:
[124(93.75),125(93.75),126(93.75),127(18.75)]
The calculation results are shown in fig. 5.
Note that because the total luminance in each unit cell 113/123/133 is the same (300), uniform visual luminance quality is produced across different pixel groups.
Second example
Fig. 4 illustrates a sensor pixel group 113 including a first sensor sub-pixel 114, a second sensor sub-pixel 115, a third sensor sub-pixel 116, and a fourth sensor sub-pixel 117, which form one unit cell 113 regardless of the colors of the four sensor sub-pixels. The four sensor sub-pixels are each arranged at a specific site in the unit cell. In the second example, the first sensor sub-pixel 114 and the fourth sensor sub-pixel 117 are active sensor pixels, while the second sensor sub-pixel 115 and the third sensor sub-pixel 116 are inactive sensor pixels, so n = 2 and m = 2.
First, according to the aforementioned principle, the first sensor sub-pixel 114 and the fourth sensor sub-pixel 117 each have the maximum sensor pixel luminance value SV (luminance 100), while the second sensor sub-pixel 115 and the third sensor sub-pixel 116 each have the minimum sensor pixel luminance value 0. These initial conditions may be abbreviated as [sub-pixel (corresponding luminance value)]:
[114(100),115(0),116(0),117(100)]
Next, fig. 4 illustrates a display pixel group 133 including a first display sub-pixel 134, a second display sub-pixel 135, a third display sub-pixel 136, and a fourth display sub-pixel 137, which form one unit cell 133 regardless of the colors of the four display sub-pixels. The four display sub-pixels are each arranged at a specific site in the unit cell. The display pixel luminance values DV of the four display sub-pixels are then determined according to DV = [n/(n+m)] × 100.
DV134 = [2/(2+2)] × 100 = 50, because the first display sub-pixel 134 corresponds in site to the first sensor sub-pixel 114;
DV135 = [2/(2+2)] × 100 = 50, because the second display sub-pixel 135 corresponds in site to the second sensor sub-pixel 115;
DV136 = [2/(2+2)] × 100 = 50, because the third display sub-pixel 136 corresponds in site to the third sensor sub-pixel 116;
DV137 = [2/(2+2)] × 100 = 50, because the fourth display sub-pixel 137 corresponds in site to the fourth sensor sub-pixel 117.
The above results can be simplified as:
[134(50),135(50),136(50),137(50)]
Third, fig. 4 illustrates a blurred pixel group 123 including a first blurred sub-pixel 124, a second blurred sub-pixel 125, a third blurred sub-pixel 126, and a fourth blurred sub-pixel 127, which form one unit cell 123 regardless of the colors of the four blurred sub-pixels. The four blurred sub-pixels are each arranged at a specific site in the unit cell. Let Z = 0.50, which means that the blurred pixel group 123 is located equidistant between the sensor pixel group 113 and the display pixel group 133. The blurred pixel luminance value BV of each blurred sub-pixel is determined according to BV = (1-Z) × SV + Z × DV.
BV124 = (1-0.50) × 100 + 0.50 × 50 = 75, because the first blurred sub-pixel 124 corresponds in site to the first sensor sub-pixel 114 (SV = 100);
BV125 = (1-0.50) × 0 + 0.50 × 50 = 25, because the second blurred sub-pixel 125 corresponds in site to the second sensor sub-pixel 115 (SV = 0);
BV126 = (1-0.50) × 0 + 0.50 × 50 = 25, because the third blurred sub-pixel 126 corresponds in site to the third sensor sub-pixel 116 (SV = 0);
BV127 = (1-0.50) × 100 + 0.50 × 50 = 75, because the fourth blurred sub-pixel 127 corresponds in site to the fourth sensor sub-pixel 117 (SV = 100).
The above adjustment result can be simplified as follows:
[124(75),125(25),126(25),127(75)]
The calculation results are shown in fig. 6.
Note that because the total luminance in each unit cell 113/123/133 is the same (200), uniform visual luminance quality is produced in different pixel groups.
Third example
Fig. 4 illustrates a sensor pixel group 113 including a first sensor sub-pixel 114, a second sensor sub-pixel 115, a third sensor sub-pixel 116, and a fourth sensor sub-pixel 117, which form one unit cell 113 regardless of the colors of the four sensor sub-pixels. The four sensor sub-pixels are each arranged at a specific site in the unit cell. In the third example, the first sensor sub-pixel 114, the second sensor sub-pixel 115, and the third sensor sub-pixel 116 are inactive sensor pixels, and the fourth sensor sub-pixel 117 is an active sensor pixel, so n = 1 and m = 3.
First, according to the aforementioned principle, the first sensor sub-pixel 114, the second sensor sub-pixel 115, and the third sensor sub-pixel 116 each have the minimum sensor pixel luminance value 0, and the fourth sensor sub-pixel 117 has the maximum sensor pixel luminance value SV (luminance 100). These initial conditions may be abbreviated as [sub-pixel (corresponding luminance value)]:
[114(0),115(0),116(0),117(100)]
Next, fig. 4 illustrates a display pixel group 133 including a first display sub-pixel 134, a second display sub-pixel 135, a third display sub-pixel 136, and a fourth display sub-pixel 137, which form one unit cell 133 regardless of the colors of the four display sub-pixels. The four display sub-pixels are each arranged at a specific site in the unit cell. The display pixel luminance values DV of the four display sub-pixels are then determined according to DV = [1/(1+3)] × 100.
DV134 = [1/(1+3)] × 100 = 25, because the first display sub-pixel 134 corresponds in site to the first sensor sub-pixel 114;
DV135 = [1/(1+3)] × 100 = 25, because the second display sub-pixel 135 corresponds in site to the second sensor sub-pixel 115;
DV136 = [1/(1+3)] × 100 = 25, because the third display sub-pixel 136 corresponds in site to the third sensor sub-pixel 116;
DV137 = [1/(1+3)] × 100 = 25, because the fourth display sub-pixel 137 corresponds in site to the fourth sensor sub-pixel 117.
The above results can be simplified as:
[134(25),135(25),136(25),137(25)]
Third, fig. 4 illustrates a blurred pixel group 123 including a first blurred sub-pixel 124, a second blurred sub-pixel 125, a third blurred sub-pixel 126, and a fourth blurred sub-pixel 127, which form one unit cell 123 regardless of the colors of the four blurred sub-pixels. The four blurred sub-pixels are each arranged at a specific site in the unit cell. Assuming Z = 0.75, which means that the blurred pixel group 123 is located closer to the display pixel group 133, the blurred pixel luminance value BV of each blurred sub-pixel is determined according to BV = (1-Z) × SV + Z × DV.
BV124 = (1-0.75) × 0 + 0.75 × 25 = 18.75, because the first blurred sub-pixel 124 corresponds in site to the first sensor sub-pixel 114 (SV = 0);
BV125 = (1-0.75) × 0 + 0.75 × 25 = 18.75, because the second blurred sub-pixel 125 corresponds in site to the second sensor sub-pixel 115 (SV = 0);
BV126 = (1-0.75) × 0 + 0.75 × 25 = 18.75, because the third blurred sub-pixel 126 corresponds in site to the third sensor sub-pixel 116 (SV = 0);
BV127 = (1-0.75) × 100 + 0.75 × 25 = 43.75, because the fourth blurred sub-pixel 127 corresponds in site to the fourth sensor sub-pixel 117 (SV = 100).
The above adjustment result can be simplified as follows:
[124(18.75),125(18.75),126(18.75),127(43.75)]
The calculation results are shown in fig. 7.
Note that because the total luminance in each unit cell 113/123/133 is the same (100), uniform visual luminance quality is produced in different pixel groups.
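The three worked examples can be reproduced together. The following sketch, under the same illustrative assumptions as the earlier snippets, recomputes DV and the per-site BV values of figs. 5 to 7 and checks the equal-total-luminance property noted after each example:

```python
SV = 100.0  # maximum sensor pixel luminance value

# (Z, per-site sensor luminance values for sub-pixel sites 114/115/116/117)
examples = [
    (0.25, [SV, SV, SV, 0.0]),   # first example:  n = 3, m = 1 (fig. 5)
    (0.50, [SV, 0.0, 0.0, SV]),  # second example: n = 2, m = 2 (fig. 6)
    (0.75, [0.0, 0.0, 0.0, SV]), # third example:  n = 1, m = 3 (fig. 7)
]

for z, sensor in examples:
    n = sum(1 for s in sensor if s > 0)          # active sensor pixels
    m = len(sensor) - n                          # inactive sensor pixels
    dv = n / (n + m) * SV                        # DV = [n/(n+m)] x SV
    bv = [(1 - z) * s + z * dv for s in sensor]  # BV at each corresponding site
    print(f"Z={z}: DV={dv}, BV={bv}")
    # Total luminance is the same in each unit cell 113/123/133.
    assert sum(sensor) == sum(bv) == dv * (n + m)
```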
Next, as shown in fig. 2, step 501 is performed after step 401: the luminance of each pixel unit in the blurring region 120 is decided, and the weighted result of each pixel unit in the blurring region 120 is obtained. The newly weighted results of the pixel units in the blurring region 120 represent a gradient used to weaken the contour, so as to blur the distinct boundary line of the under-screen camera area shown in fig. 1. For example, in the first example, the luminance of the illustrated pixel units in the blurring region 120 is decided to obtain the weighted results [124(93.75), 125(93.75), 126(93.75), 127(18.75)]. In the second example, the weighted results are [124(75), 125(25), 126(25), 127(75)]. In the third example, the weighted results are [124(18.75), 125(18.75), 126(18.75), 127(43.75)].
The weighted results of a pixel unit group are the set of weighted results corresponding to each pixel unit in the blurring region 120. For example, the set in the first example is [124(93.75), 125(93.75), 126(93.75), 127(18.75)], the set in the second example is [124(75), 125(25), 126(25), 127(75)], and the set in the third example is [124(18.75), 125(18.75), 126(18.75), 127(43.75)]. Each pixel unit in the blurring region 120 whose luminance carries the weighted result becomes an adjusted pixel unit corresponding to the associated pixel units in the other regions. For example, in the first example, the adjusted first blurred sub-pixel 124 forms the luminance sequence [114(100) - 124(93.75) - 134(75)] together with the first sensor sub-pixel 114 (under-screen camera) and the normal first display sub-pixel 134. In the second example, the adjusted second blurred sub-pixel 125 forms the sequence [115(0) - 125(25) - 135(50)] together with the second sensor sub-pixel 115 (under-screen camera) and the normal second display sub-pixel 135. In the third example, the adjusted fourth blurred sub-pixel 127 forms the sequence [117(100) - 127(43.75) - 137(25)] together with the fourth sensor sub-pixel 117 (under-screen camera) and the normal fourth display sub-pixel 137. An improved luminance gradient is thus obtained in the first example, the second example, and the third example, respectively.
After the foregoing steps, as shown in fig. 2, step 601 is performed to output a plurality of adjusted pixel units. Each sub-pixel on the display panel 101, with its respective R/G/B information obtained after the foregoing steps, is output. The R/G/B information may include different types of information in terms of luminance, independent of the respective color information. For example, the R/G/B information in the sensor region 110 may relate to the results dictated by the original under-screen camera, such as the maximum pixel luminance value SV or the minimum pixel luminance value 0. The R/G/B information in the display region 130 may relate to a normal result, such as the uniform display pixel luminance value DV obtained from DV = [n/(n+m)] × (maximum pixel luminance value SV). The R/G/B information in the blurring region 120 may relate to a set of gradient results, e.g. the set of blurred pixel luminance values BV obtained from BV = (1-Z) × SV + Z × DV, where Z is a variable corresponding to the position of a given blurred pixel unit.
In particular, the blurred pixel luminance value BV is a weighted result according to Z, and the weighted results corresponding to each pixel unit in the blurring region 120 are collected; all the weighted results therefore become a set of luminance information on all pixel units in the blurring region 120 for outputting the plurality of adjusted pixel units for display purposes.
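Since Z corresponds to the position of a given blurred pixel unit, one way to realize steps 401 to 601 for the whole blurring region is to derive Z from the radial position of each unit cell between the inner concentric circle 121 and the outer concentric circle 122. The linear radial interpolation below is an assumption for illustration (the disclosure fixes only that Z is the normalized minimum distance from the blurred pixel group to the sensor pixel group), and the radii used in the usage example are hypothetical:

```python
def weight_from_radius(r: float, r_inner: float, r_outer: float) -> float:
    """Derive the weight Z for a blurred pixel unit cell at radial
    distance r from the center 111, where r_inner is the radius of the
    inner concentric circle 121 and r_outer is the radius of the outer
    concentric circle 122: Z = 0 at the sensor-side boundary and Z = 1
    at the display-side boundary (assumed linear in r)."""
    if not r_inner <= r <= r_outer:
        raise ValueError("the unit cell must lie inside the blurring region")
    return (r - r_inner) / (r_outer - r_inner)

def adjusted_unit_cell(sensor_cell, dv, r, r_inner, r_outer):
    """Output the adjusted luminance values (the weighted results BV) of
    one blurred pixel unit cell at radial distance r: each blurred
    sub-pixel is weighted between the luminance of the sensor sub-pixel
    at the same site and the display pixel luminance value DV."""
    z = weight_from_radius(r, r_inner, r_outer)
    return [(1 - z) * s + z * dv for s in sensor_cell]

# Hypothetical radii: with r_inner = 10 and r_outer = 50, a unit cell at
# r = 20 has Z = 0.25 and reproduces the first example's weighted set
# [124(93.75), 125(93.75), 126(93.75), 127(18.75)].
print(adjusted_unit_cell([100.0, 100.0, 100.0, 0.0], 75.0, 20.0, 10.0, 50.0))
```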
After the foregoing adjustment method, a novel display device 100 is provided to optimize the visual presentation quality of the under-screen camera device in the presence of the under-screen camera area, for example to minimize the visual presence of the under-screen camera area. Fig. 3 illustrates a top view of an under-screen camera device in accordance with an example of the present invention. Fig. 4 illustrates a close-up view of the under-screen camera device of fig. 3 along the line 102. The display device 100 of the present invention includes a display panel 101, an image sensor 112, a sensor pixel group 113, a blurred pixel group 123, and a display pixel group 133.
The display device 100 may be a device in which a camera is disposed below the display panel. The under-screen camera device may include a display panel 101 for displaying pictures or images (e.g. the image 103). The under-screen camera device may further include other suitable elements such as an input unit (not shown), an output unit (not shown), or a control unit (not shown), but the present invention is not limited thereto.
The display panel 101 may include a plurality of functional regions such as a sensor region 110, a blurring region 120, and a display region 130, but the present invention is not limited thereto. In some embodiments of the present invention, the sensor region 110 may be surrounded by the blurring region 120, and the blurring region 120 may be surrounded by the display region 130. In some embodiments of the present invention, the blurring region 120 may be in the form of a hollow circle 120H. The hollow circle 120H may include an inner concentric circle 121 and an outer concentric circle 122. The inner concentric circle 121 may have a center 111, which may also be the center of the outer concentric circle 122.
The display panel 101 may include an image sensor 112 disposed in the sensor region 110, and a pixel unit occupied by a portion of the image sensor 112 may be a pinhole (dot) that allows incident light to pass through to the image sensor 112 to form an image. The image sensor 112 may be disposed at least partially within the sensor region 110, or entirely within the sensor region 110. The image sensor 112 may form a plurality of dots and may serve as the camera in the sensor region 110 of the under-screen camera device 100. Due to the presence of the one or more pinholes, one or more sensor pixel units in the sensor region 110 become inactive sensor pixel units, because the inactive sensor pixel units are not capable of emitting light. Fig. 1 illustrates that the visually dot-like areas corresponding to the sensor region 110 correspond to pinholes at inactive sensor pixel units.
The display device 100 has a plurality of pixels or sub-pixels. Different pixels or different sub-pixels in different regions of the display device 100 form different pixel groups. In some embodiments of the present invention, the sensor pixel group 113 may be disposed in the sensor region 110 and adjacent to the blurring region 120. In other words, the sensor pixel group 113 may be disposed at the inner concentric circle 121, i.e. adjacent to the boundary line between the sensor region 110 and the blurring region 120. The sensor pixel group 113 may include one or more sensor pixel units.
Thus, the sensor region 110 including the image sensor 112 may visually exhibit some active sensor pixels and some inactive sensor pixels. An active sensor pixel refers to a sensor pixel unit capable of emitting light of any suitable color. An inactive sensor pixel refers to a sensor pixel unit that does not emit light at all. For example, an inactive sensor pixel may be a pixel unit occupied by a pinhole of the image sensor 112 in the sensor region 110, the pinhole rendering the original function of the sensor pixel unit inactive. The collection of inactive sensor pixels in the sensor region 110 may adversely alter the predetermined visual appearance of a given image, as shown in fig. 1.
One sensor pixel group 113 may include one or more sensor pixel units to form one unit cell. In one embodiment of the present invention, the active sensor pixels and the inactive sensor pixels in the sensor region 110 may collectively form a pattern or may be regularly arranged. One sensor pixel group 113 represents a unit cell that includes active sensor pixels and inactive sensor pixels and corresponds to a minimum repeating unit of the pattern or arrangement. For example, the sensor pixel group 113 may include n active sensor pixels and m inactive sensor pixels, where n is an integer not less than 1 and m is an integer not less than 1. In another embodiment of the present invention, fig. 4 illustrates that the sensor pixel group 113 may include four sensor sub-pixels, so n + m = 4, but the present invention is not limited thereto. For example, the sensor pixel group 113 may include the sensor sub-pixel 114, the sensor sub-pixel 115, the sensor sub-pixel 116, and the sensor sub-pixel 117, regardless of the colors of the four sensor sub-pixels.
The active sensor pixels in the sensor pixel group 113 may have the maximum sensor pixel luminance value SV. The inactive sensor pixels in the sensor pixel group 113 may have the minimum sensor pixel luminance value 0. The maximum pixel luminance value may represent the maximum gray level 255, equal to 100% intensity. The minimum pixel luminance value may refer to the minimum gray level 0, equal to 0% intensity. The concepts of gray level and intensity are well known in the art and therefore are not described in detail. In other words, the luminance value of a sensor pixel unit is either SV or 0, and SV may be set equal to the luminance value 100.
In some embodiments of the present invention, the display pixel group 133 may be disposed in the display region 130 and adjacent to the blurring region 120. In other words, the display pixel group 133 may be disposed at the outer concentric circle 122, i.e. adjacent to the boundary line between the display region 130 and the blurring region 120. The display pixel group 133 may include one or more display pixel units. Fig. 3 illustrates that each display pixel group 133 may include a plurality of display sub-pixels, but the present invention is not limited thereto.
Similarly, one display pixel group 133 may include one or more display pixel units to form one unit cell. The number of display pixel units in one unit cell is the same as the number of sensor pixel units in another unit cell. For example, fig. 4 illustrates that the display pixel group 133 may include four display sub-pixels to correspond to the four sensor sub-pixels in the sensor pixel group 113, regardless of the colors of the sub-pixels. For example, the display pixel group 133 may include the display sub-pixel 134, the display sub-pixel 135, the display sub-pixel 136, and the display sub-pixel 137, but the present invention is not limited thereto.
If the sensor pixel group 113 includes n active sensor pixels and m inactive sensor pixels, the display pixel luminance value DV of a display pixel unit may be determined to satisfy the relationship DV = [n/(n+m)] × (maximum pixel luminance value SV). A display pixel luminance value DV satisfying the foregoing relationship in the display region 130 ensures visual luminance uniformity with respect to the sensor region 110. Thus, the display pixel luminance value DV may represent a normal or normalized result.
In some embodiments of the present invention, the blurred pixel group 123 may be disposed in the blurred region 120. In other words, the blurred pixel group 123 may be disposed between the inner concentric circle 121 and the outer concentric circle 122, i.e., between the sensor area 110 and the display area 130. One group 123 of blurred pixels may comprise one or more blurred pixel units. Fig. 3 illustrates that each of the blurred pixel groups 123 may include a plurality of blurred sub-pixels, but the present invention is not limited thereto.
Similarly, one blurred pixel group 123 may include one or more blurred pixel cells to form one unit cell. The number of blurred pixel cells in one unit cell is the same as the number of sensor pixel cells in another unit cell. For example, FIG. 4 illustrates that the blurred pixel group 123 may include four blurred sub-pixels to correspond to the four sensor sub-pixels in the sensor pixel group 113, regardless of the colors of the sensor sub-pixels. For example, one blurred pixel group 123 may include blurred sub-pixel 124, blurred sub-pixel 125, blurred sub-pixel 126, and blurred sub-pixel 127, but the present invention is not limited thereto.
In some embodiments of the present invention, there may be a straight line 102 passing through the sensor pixel group 113, the blurred pixel group 123, and the display pixel group 133. In some embodiments of the present invention, the straight line 102 may further pass through the center 111 of the inner concentric circle 121 as well as the sensor pixel group 113, the blurred pixel group 123, and the display pixel group 133. FIG. 3 illustrates that, along the straight line 102, the minimum distance between one display pixel group 133 and one sensor pixel group 113 is normalized to 1, the minimum distance between one blurred pixel group 123 and one sensor pixel group 113 is Z, which serves as a weighting factor, and the minimum distance between one blurred pixel group 123 and one display pixel group 133 is (1-Z), but the present invention is not limited thereto. When implementing the method of the present invention, the actual minimum distance between the display pixel group 133 and the sensor pixel group 113 depends on the implementation and can be determined by one of ordinary skill in the art, as long as it is sufficient for carrying out the present invention.
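For illustration only, a brief sketch of how the weight Z may be derived by normalizing the sensor-to-display minimum distance to 1; the coordinates and the helper name are hypothetical:

```python
import math


def normalized_weight(sensor_xy, blur_xy, display_xy):
    """Normalize the sensor-to-display minimum distance to 1 and express
    the blurred pixel group's position as the weight Z, i.e. its distance
    to the sensor pixel group in that normalized unit."""
    total = math.dist(sensor_xy, display_xy)
    z = math.dist(sensor_xy, blur_xy) / total
    return z, 1.0 - z


# A blurred pixel group 40% of the way from the sensor pixel group to the
# display pixel group along straight line 102: Z = 0.4 and 1 - Z = 0.6.
z, w = normalized_weight((0, 0), (4, 0), (10, 0))
```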
Depending on the display pixel luminance value DV and the maximum sensor pixel luminance value SV, each blurred pixel cell in the blurring region 120 may have a particular blurred pixel luminance value BV. The blurred pixel luminance value BV may represent the gamma level of a blurred pixel in the blurred pixel group 123. For example, the blurred pixel luminance value BV is not greater than the maximum sensor pixel luminance value SV and not less than the corresponding display pixel luminance value DV, satisfying the linear weighting relationship BV = (1-Z) × SV + Z × DV in terms of Z, SV, and DV, but the present invention is not limited thereto.
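For illustration only, the linear weighting relationship may be sketched as follows; the sampled Z values and the SV/DV figures are assumed examples:

```python
def blurred_pixel_luminance(z: float, sv: float, dv: float) -> float:
    """BV = (1 - Z) * SV + Z * DV, running from SV at the sensor side
    (Z = 0) down to DV at the display side (Z = 1)."""
    if not 0.0 <= z <= 1.0:
        raise ValueError("Z must lie in [0, 1]")
    return (1.0 - z) * sv + z * dv


# With SV = 255 and DV = 127.5, sampling Z = 0, 0.25, 0.5, 0.75, 1 yields
# the monotone gradient 255.0, 223.125, 191.25, 159.375, 127.5.
gradient = [blurred_pixel_luminance(k / 4, 255.0, 127.5) for k in range(5)]
```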
Thus, a luminance gradient in pixel cells may be formed along the straight line 102 from the sensor pixel group 113 to the display pixel group 133, so that the degraded visual appearance of a given image caused by the inactive pixels in the sensor region 110 is gradually attenuated across the blurring region 120 and converges toward the display region 130. In other words, the pixel cell luminance gradient may make the distinct boundary line 11 of the under-screen camera area 10, as shown in FIG. 1, appear blurred, so that the original boundary line 11 becomes less visually distinct, or even substantially invisible.
One blurred pixel group 123 forming one unit cell is disposed in the blurring region 120, between the sensor pixel group 113 and the display pixel group 133. Each pixel cell in one blurred pixel group 123 has an individual blurred pixel luminance value BV, and each blurred pixel luminance value BV is calculated with respect to the corresponding sensor pixel luminance value SV. One unit cell may include n + m sub-pixels regardless of the colors of the sub-pixels, and the n + m sub-pixels may be arranged to form a pattern or be regularly arranged according to, for example, their positions.
For example, one unit cell in the sensor pixel group 113 may include n active sensor pixels and m inactive sensor pixels, where n and m are each integers not less than 1; correspondingly, one unit cell in the blurred pixel group 123 may include n + m blurred pixels, and one unit cell in the display region 130 may include n + m display pixels. Each blurred pixel in one unit cell may exclusively correspond to a specific sensor pixel and a specific display pixel. For example, the blurred pixel at a first position in one unit cell may correspond, along the straight line 102, to the sensor pixel at the first position of another unit cell and to the display pixel at the first position of yet another unit cell; the blurred pixel at a second position may correspond to the sensor pixel at the second position and the display pixel at the second position, and so on.
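For illustration only, a minimal sketch of this position-wise correspondence, under the same assumed 2-active/2-inactive unit cell as above:

```python
def blurred_unit_cell(sensor_cell, dv, z):
    """Position-wise blend for one unit cell: the blurred pixel at
    position k corresponds, along straight line 102, to the sensor pixel
    at position k (luminance SV or 0) and to the display pixel at
    position k (luminance DV), i.e. BV_k = (1 - Z) * S_k + Z * DV."""
    return [(1.0 - z) * s + z * dv for s in sensor_cell]


# An assumed unit cell with two active (SV = 255) and two inactive (0)
# sensor pixels, DV = 127.5 and Z = 0.5, gives [191.25, 191.25, 63.75,
# 63.75]: BV1 differs from BV2 (cf. claims 15-16), and the cell totals
# 510, equal to both the sensor cell total (2 * 255) and the display
# cell total (4 * 127.5), consistent with claims 19-20.
cell = blurred_unit_cell([255.0, 255.0, 0.0, 0.0], dv=127.5, z=0.5)
```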
After step 601, an adjusted image 103 with improved, or further optimized, visual presentation quality may be obtained. FIG. 8 illustrates an example of the image 103 with updated R/G/B information, in which the contouring problem of FIG. 1 is reduced after the method of the present invention is carried out. As shown in FIG. 8, the image 103 exhibits better visual presentation quality and minimizes the visual effect of black dots compared with FIG. 1. For example, in the presence of an under-screen camera area, after the method of the present invention is performed on the display panel 101, the contouring problem of the image 103 in FIG. 8 occurring at the boundary line (now substantially invisible) between the sensor area, such as the under-screen camera area, and the display area 130, such as the general area, is alleviated.
The above description presents only preferred embodiments of the present invention, and all equivalent changes and modifications made in accordance with the claims of the present invention shall fall within the scope of the present invention.

Claims (20)

1. A display device, comprising:
a display panel including a display region, a blurring region surrounded by the display region, and a sensor region surrounded by the blurring region;
an image sensor disposed in the sensor region;
a sensor pixel group disposed in the sensor region adjacent to the blurring region and having a maximum sensor pixel luminance value SV;
a display pixel group disposed in the display region adjacent to the blurring region and having a display pixel luminance value DV; and
a blurred pixel group disposed in the blurring region, located between the sensor pixel group and the display pixel group, and having a blurred pixel luminance value BV;
wherein a minimum distance between the display pixel group and the sensor pixel group is 1, a minimum distance between the blurring pixel group and the sensor pixel group is Z, and a minimum distance between the blurring pixel group and the display pixel group is (1-Z), such that BV = (1-Z) × SV + Z × DV.
2. The display device of claim 1, wherein the sensor pixel group comprises n active sensor pixels and m inactive sensor pixels, and the display pixel group comprises n + m active display pixels and no inactive display pixels.
3. The display device of claim 2, wherein each of the active display pixels has the display pixel luminance value DV such that DV = [n/(n + m)] × SV, where SV is the maximum sensor pixel luminance value.
4. The display device of claim 1, wherein the blurring region is in the form of a hollow circle comprising an inner concentric circle and an outer concentric circle, and the inner concentric circle has a center.
5. The display device of claim 4, wherein a straight line passes through the center of the inner concentric circle, the sensor pixel group, the blurred pixel group, and the display pixel group.
6. The display device of claim 2, further comprising:
at least one pinhole disposed in the sensor region.
7. The display device of claim 6, wherein the at least one pinhole represents the m inactive sensor pixels.
8. The display device of claim 1, wherein the blurred pixel luminance value BV represents a gamma level of a blurred pixel in the blurred pixel group.
9. A method of blurring a boundary line in an under-screen camera device, comprising:
providing an under-screen camera device, comprising:
a display region, a blurring region surrounded by the display region, and a sensor region surrounded by the blurring region;
a sensor pixel group disposed in the sensor region adjacent to the blurring region and having a maximum sensor pixel luminance value SV, wherein the sensor pixel group comprises n active sensor pixels and m inactive sensor pixels;
a display pixel group disposed in the display region adjacent to the blurring region and having a display pixel luminance value DV, wherein the display pixel group comprises n + m active display pixels and no inactive display pixels; and
a blurred pixel group disposed in the blurring region, located between the sensor pixel group and the display pixel group, and having a blurred pixel luminance value BV;
determining the display pixel luminance value DV according to DV = [n/(n + m)] × SV, where SV is the maximum sensor pixel luminance value; and
after determining the display pixel luminance value DV, determining the blurred pixel luminance value BV according to BV = (1-Z) × SV + Z × DV, wherein a minimum distance between the display pixel group and the sensor pixel group is 1, a minimum distance between the blurred pixel group and the sensor pixel group is Z, and a minimum distance between the blurred pixel group and the display pixel group is (1-Z).
10. The method of blurring a boundary line in an under-screen camera device of claim 9, wherein the blurring region is in the form of a hollow circle comprising an inner concentric circle and an outer concentric circle, and the inner concentric circle has a center.
11. The method of blurring a boundary line in an under-screen camera device of claim 10, wherein determining the blurred pixel luminance value BV serves to blur the boundary line of the inner concentric circle.
12. The method of blurring a boundary line in an under-screen camera device of claim 10, wherein a straight line passes through the center of the inner concentric circle, the sensor pixel group, the blurred pixel group, and the display pixel group.
13. The method of blurring a boundary line in an under-screen camera device of claim 9, wherein the blurred pixel luminance value BV represents a gamma level of a blurred pixel in the blurred pixel group.
14. The method of blurring a boundary line in an under-screen camera device of claim 9, wherein the sensor pixel group comprises a first sensor pixel having the maximum sensor pixel luminance value SV and a second sensor pixel having a minimum sensor pixel luminance value 0.
15. The method of blurring a boundary line in an under-screen camera device of claim 14, wherein the blurred pixel group comprises a first blurred pixel having a first blurred pixel luminance value BV1 and a second blurred pixel having a second blurred pixel luminance value BV2.
16. The method of blurring a boundary line in an under-screen camera device of claim 15, wherein the first blurred pixel luminance value BV1 is different from the second blurred pixel luminance value BV2.
17. The method of blurring a boundary line in an under-screen camera device of claim 16, wherein the first blurred pixel luminance value BV1 and the second blurred pixel luminance value BV2 each represent a gamma level.
18. The method of blurring a boundary line in an under-screen camera device of claim 15, wherein the first sensor pixel corresponds to the first blurred pixel and the second sensor pixel corresponds to the second blurred pixel.
19. The method of blurring a boundary line in an under-screen camera device of claim 9, wherein a total luminance of the display pixel group is equal to a total luminance of the blurred pixel group.
20. The method of blurring a boundary line in an under-screen camera device of claim 9, wherein a total luminance of the sensor pixel group is equal to a total luminance of the blurred pixel group.

Priority Applications (1)

Application Number: CN202110441299.4A
Priority Date: 2021-04-23
Filing Date: 2021-04-23
Title: Method for boundary line between display device and camera device under fuzzy screen

Publications (1)

Publication Number: CN115240613A
Publication Date: 2022-10-25

Family ID: 83666807

Family Applications (1)

Application Number: CN202110441299.4A (Pending)



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination