CN114279332B - Method and device for positioning sensor under display screen and electronic equipment

Info

Publication number
CN114279332B
CN114279332B (application CN202111620884.7A)
Authority
CN
China
Prior art keywords
pixels
sensor
sensing data
assembly tolerance
type
Prior art date
Legal status
Active
Application number
CN202111620884.7A
Other languages
Chinese (zh)
Other versions
CN114279332A
Inventor
钟松锦
杜灿鸿
付恩琪
Current Assignee
Shenzhen Goodix Technology Co Ltd
Original Assignee
Shenzhen Goodix Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Goodix Technology Co Ltd
Priority to CN202111620884.7A
Publication of CN114279332A
Application granted
Publication of CN114279332B
Legal status: Active

Landscapes

  • Position Input By Displaying (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The embodiment of the application provides a method and a device for positioning a sensor under a display screen and electronic equipment, which can effectively improve the performance of the sensor, thereby improving the performance of the electronic equipment. The method comprises the following steps: acquiring a corresponding relation between a first value and a position of a sensor, wherein the first value is a value obtained based on first sensing data and second sensing data of the sensor, the first sensing data is data obtained after the sensor receives a first light signal sent by a first type of pixel in an assembly tolerance area, the second sensing data is data obtained after the sensor receives a second light signal sent by a second type of pixel in the assembly tolerance area, the assembly tolerance area is an area in a display screen arranged above the sensor, the assembly tolerance area corresponds to the assembly tolerance of the sensor, and the colors of the first type of pixel and the second type of pixel are different; and determining the position of the target sensor according to the first sensing data and the second sensing data of the target sensor and the corresponding relation.

Description

Method and device for positioning sensor under display screen and electronic equipment
Technical Field
The embodiment of the application relates to the technical field of positioning, in particular to a method, a device and electronic equipment for positioning a sensor under a display screen.
Background
With the development of the electronic device industry, in order to provide better user experience, many electronic devices are provided with ambient light sensors, so as to realize some intelligent functions through the ambient light sensors, such as automatically adjusting the brightness of a display screen, compensating the color of the display screen, correcting the photographing color, and the like.
At present, users place increasingly high performance requirements on electronic devices, so how to improve the performance of the ambient light sensor, and thereby the performance of the electronic device, is a problem to be solved urgently.
Disclosure of Invention
The embodiment of the application provides a method and a device for positioning a sensor under a display screen and electronic equipment, which can effectively improve the performance of the sensor, thereby improving the performance of the electronic equipment.
In a first aspect, a method for positioning an under-display sensor is provided, the method comprising: acquiring a corresponding relation between a first value and a position of a sensor, wherein the first value is a value obtained based on first sensing data and second sensing data of the sensor, the first sensing data is data obtained after the sensor receives a first light signal sent by a first type of pixel in an assembly tolerance area, the second sensing data is data obtained after the sensor receives a second light signal sent by a second type of pixel in the assembly tolerance area, the assembly tolerance area is an area arranged in a display screen above the sensor, the assembly tolerance area corresponds to the assembly tolerance of the sensor, and the colors of the first type of pixel and the second type of pixel are different; and determining the position of the target sensor according to the first sensing data and the second sensing data of the target sensor and the corresponding relation.
According to the embodiment of the application, the position of the target sensor is determined by using the sensing data corresponding to two types of pixels with different colors. Because the sensing data that the target sensor obtains from the received light signals sent by pixels of different colors differ, the position of the target sensor determined on the basis of such sensing data is more accurate. In addition, the parameters of a sensor are generally determined based on the position of the sensor; since the position accuracy of the target sensor is improved, the performance of the target sensor is improved, and the performance of the electronic device including the target sensor is thus effectively improved.
Further, the embodiment of the application also determines the position of the target sensor according to the corresponding relation between the sensing data and the position of the sensor. Under normal conditions, the corresponding relation is obtained based on a plurality of sample data, so that the obtained corresponding relation is accurate, and the position accuracy of the target sensor is further improved.
In one possible implementation, the first value is a ratio between the first sensed data and the second sensed data; the determining the position of the target sensor according to the first sensing data and the second sensing data of the target sensor and the corresponding relation comprises the following steps: and determining the position of the target sensor according to the ratio between the first sensing data and the second sensing data of the target sensor and the corresponding relation.
According to the technical scheme, the position of the target sensor is determined according to the ratio of the first sensing data to the second sensing data, the implementation is simple, and the operation efficiency can be effectively improved.
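As a non-limiting illustration of this ratio-based lookup, a minimal sketch in Python follows; the correspondence table, the helper name, and the numeric values are hypothetical and are not part of the disclosed embodiments.

```python
# Minimal sketch of the ratio-based position lookup (illustrative only).
# The hypothetical correspondence table maps the first value (first sensing
# data / second sensing data) to the sensor position relative to the
# reference position, in rows of pixels.
CORRESPONDENCE = [
    (1.30, 0), (1.18, 1), (1.05, 2), (0.94, 3), (0.82, 4),
]

def locate_sensor(first_data: float, second_data: float, reference_row: int) -> int:
    """Return the absolute row position of the target sensor."""
    ratio = first_data / second_data
    # Pick the stored entry whose ratio is closest to the measured ratio.
    _, relative_row = min(CORRESPONDENCE, key=lambda entry: abs(entry[0] - ratio))
    return reference_row + relative_row

# Example: sensing data 410 and 500 give a ratio of 0.82 -> relative position 4.
print(locate_sensor(410.0, 500.0, reference_row=100))  # prints 104
```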
In one possible implementation, the correspondence relationship includes a correspondence relationship between the first value and a relative position, the relative position being a position of the sensor relative to a reference position; the determining the position of the target sensor according to the first sensing data and the second sensing data of the target sensor and the corresponding relation comprises the following steps: determining a relative position between the position of the target sensor and the reference position according to the first sensing data and the second sensing data of the target sensor and the corresponding relation; the position of the target sensor is determined from the relative position between the position of the target sensor and the reference position, and from the reference position.
In one possible implementation manner, the acquiring the correspondence between the first value and the position of the sensor includes: determining the reference position; sequentially illuminating a plurality of groups of pixels in the assembly tolerance area based on the reference position, wherein each group of pixels comprises the first type of pixels and the second type of pixels, and the positions between every two groups of pixels in the plurality of groups of pixels are different; when a group of pixels of the first type and the second type are lightened, acquiring a group of first sensing data and second sensing data; determining the first value according to the first sensing data and the second sensing data; and determining the corresponding relation between the first value and the relative position according to the first value and the relative position of the sensor corresponding to the first value relative to the reference position.
In one possible implementation manner, the sequentially illuminating the first type of pixels and the second type of pixels at different positions within the assembly tolerance region based on the reference position includes: the plurality of sets of pixels are illuminated row by row, starting from the first row of pixels within the assembly tolerance region, based on the reference position.
In one possible implementation, the determining the reference position includes: acquiring the assembly tolerance area; sequentially lighting pixels at different positions in the assembly tolerance area; acquiring a plurality of third sensing data of the sensor when the pixels at different positions are sequentially lightened; and determining the reference position according to the third sensing data.
According to the technical scheme, the pixels at different positions in the assembly tolerance area are sequentially lit, and the sensing data of the sensor corresponding to the pixels at different positions are acquired. The position of the sensor lies within the region corresponding to the assembly tolerance area, and the sensing data of the sensor is closely related to the position of the pixel; for example, under the same conditions, the closer the pixel is to the sensor, the larger the sensing data. Therefore, the accuracy of the reference position determined based on the sensing data corresponding to the pixels at the different positions is high.
Further, since the reference position is more accurate, the corresponding relation determined based on the reference position is also more accurate, so that the accuracy of the position of the target sensor determined according to the corresponding relation is effectively improved.
In one possible implementation, the sequentially illuminating pixels at different positions within the assembly tolerance region includes: the pixels at the different positions are sequentially lighted up by using white bright stripes.
According to the technical scheme, white has a higher brightness than other colors. Lighting the pixels with white bright stripes therefore increases the intensity of the light signal that the sensor receives from the pixels, which makes the position of the sensor easier to determine accurately.
In a possible implementation manner, the determining the reference position according to the plurality of third sensing data includes: and determining the reference position according to the maximum value in the third sensing data.
According to the technical scheme, the closer a pixel is to the sensor, the larger the sensing data of the sensor. The maximum value among the plurality of third sensing data therefore indicates that the pixel corresponding to it within the assembly tolerance area is closest to the sensor, which further improves the accuracy of the determined sensor position.
In one possible implementation, the sequentially illuminating pixels at different positions within the assembly tolerance region includes: starting from the first row of pixels within the assembly tolerance region, the pixels within the assembly tolerance region are lit up row by row.
In one possible implementation, the pixels lit each time include a plurality of rows of pixels.
According to the technical scheme, the plurality of rows of pixels are lightened each time, so that the time consumption for determining the position of the sensor can be greatly reduced, and the processing speed is improved. Further, the plurality of rows of pixels simultaneously emit light signals, so that the intensity of the light signals received by the sensor is obviously increased, the obtained sensing data is larger, and the accuracy of the position of the sensor is further improved.
In one possible implementation, the total width of the rows of pixels is equal to the width of the sensor.
According to the technical scheme, making the total width of the plurality of rows of pixels equal to the width of the sensor avoids the problem that, when the number of rows lit at a time is too large or too small, the sensing data obtained at different lightings remain essentially unchanged, which produces a large error; the accuracy of the position of the sensor is thereby further improved.
In one possible implementation, the number of times the pixels at the different positions are lit is at least three, and the plurality of third sensing data includes at least three data.
In one possible implementation, the number of rows of pixels lit each time is less than or equal to a second value, which is the difference between the number of rows of pixels occupied by the assembly tolerance region and 2.
In one possible implementation, the number of rows of pixels lit each time is greater than or equal to a third value, which is the quotient of the number of rows of pixels occupied by the assembly tolerance region and the number of times the pixels at the different positions are lit.
In one possible implementation, a first distance is different from a second distance, the first distance being a distance between the first type of pixel and the sensor, the second distance being a distance between the second type of pixel and the sensor.
In the above technical solution, since the sensing data of the sensor is closely related to the position of the pixel (for example, under the same conditions, the closer the pixel is to the sensor, the larger the sensing data), determining the position of the target sensor based on the sensing data corresponding to pixels at different positions within the assembly tolerance region can effectively improve the position accuracy of the target sensor.
In one possible implementation, the spacing between the first type of pixels and the second type of pixels is the largest among the spacings between the two types of pixels within the assembly tolerance region of the display screen.
According to the technical scheme, the farther the sensor is from a pixel, the longer the optical path traveled by the optical signal sent by that pixel to reach the sensor, and the smaller the sensing data of the sensor under the same conditions. Setting the two types of pixels with the largest spacing as the first type of pixels and the second type of pixels therefore maximizes the optical path difference of the two optical signals received by the target sensor, which makes the difference between the first sensing data and the second sensing data more pronounced and the obtained position of the target sensor more accurate.
In one possible implementation, the correspondence is preconfigured on the target sensor.
According to the technical scheme, the corresponding relation is preconfigured on the target sensor, so that when the position of the target sensor is determined according to the corresponding relation, the corresponding relation can be directly obtained from the target sensor, and the operation complexity is reduced.
In one possible implementation, the method further includes: the first type pixels and the second type pixels in a target assembly tolerance area in a target display screen arranged above the target sensor are lightened once; and when the first type pixels and the second type pixels in the target assembly tolerance area are lightened, acquiring first sensing data and second sensing data of the target sensor.
According to the technical scheme, the position of the target sensor can be determined by only lighting the first type pixels and the second type pixels once, so that the operation is simple, and the time consumption for determining the target sensor is greatly reduced.
In one possible implementation, the first type of pixels and the second type of pixels within the target assembly tolerance region are illuminated for a period of 50ms.
According to the technical scheme, the time length for lighting the first type pixels and the second type pixels is set to be 50ms, so that the accuracy of the determined position of the target sensor and the time spent can be balanced.
In one possible implementation, the number of rows of pixels occupied by the assembly tolerance region is between 10 rows of pixels and 50 rows of pixels.
In one possible implementation, the method further includes: the assembly tolerance zone is covered by a light blocking object.
According to the technical scheme, the light blocking object covers the assembly tolerance area, so that the target sensor can be blocked from receiving other light signals except the light signals sent by the lighted pixels, and the position accuracy of the determined target sensor is further improved.
In one possible implementation, the target sensor is an ambient light sensor for detecting the intensity of an ambient light signal.
In a second aspect, there is provided an apparatus for positioning an under-screen sensor, comprising: the processing unit is used for acquiring a corresponding relation between a first value and a position of a sensor, wherein the first value is a value obtained based on first sensing data and second sensing data of the sensor, the first sensing data is data obtained after the sensor receives a first light signal sent by a first type of pixel in an assembly tolerance area, the second sensing data is data obtained after the sensor receives a second light signal sent by a second type of pixel in the assembly tolerance area, the assembly tolerance area is an area arranged in a display screen above the sensor, the assembly tolerance area corresponds to the assembly tolerance of the sensor, and the colors of the first type of pixel and the second type of pixel are different; the processing unit is further used for determining the position of the target sensor according to the first sensing data and the second sensing data of the target sensor and the corresponding relation.
In a third aspect, an apparatus for positioning an under-display sensor is provided, including: a memory for storing a program; a processor for executing the program stored in the memory, and when the program stored in the memory is executed, the processor is configured to perform the method in the first aspect or the implementation manner thereof.
In a fourth aspect, there is provided an electronic device comprising: a display screen; a sensor arranged below the display screen; and the apparatus in the second aspect or its implementations, or the apparatus in the third aspect.
Drawings
Fig. 1 is a schematic view of an assembly position of an under-screen sensor.
FIG. 2 is a schematic flow chart of a method of sensor positioning under a display screen in accordance with an embodiment of the application.
FIG. 3 is a schematic diagram of the location and assembly tolerance regions of a sensor according to an embodiment of the present application.
FIG. 4 is another schematic view of the location and assembly tolerance regions of a sensor according to an embodiment of the present application.
FIG. 5 is a schematic flow chart of determining a reference position according to an embodiment of the present application.
FIG. 6 is a schematic diagram of the positions of rows of pixels and sensors illuminated at a time in an assembly tolerance zone of one embodiment of the present application.
FIG. 7 is a schematic diagram of the positions of rows of pixels and sensors illuminated at a time in an assembly tolerance zone of another embodiment of the present application.
Fig. 8 is a schematic diagram of the positions of rows of pixels and sensors illuminated at a time in an assembly tolerance zone of yet another embodiment of the present application.
Fig. 9 is a schematic diagram of the positions of rows of pixels and sensors illuminated at a time in the assembly tolerance zone of yet another embodiment of the present application.
Fig. 10 is a schematic diagram of the correspondence of the embodiment of the present application.
FIG. 11 is a schematic illustration of an embodiment of the present application covering an assembly tolerance zone with a light blocking object.
FIG. 12 is a schematic block diagram of an apparatus for sensor positioning under a display screen in accordance with an embodiment of the application.
Fig. 13 is a schematic block diagram of an electronic device of an embodiment of the application.
Detailed Description
The technical scheme of the application will be described below with reference to the accompanying drawings.
With the development of the electronic device industry, ambient light sensors have become standard sensors for electronic devices. The ambient light sensor can be used for detecting the intensity of an ambient light signal so that the brightness of a display screen of the electronic device is automatically adjusted along with the change of the ambient light. For example, in the case where the intensity of the ambient light signal is strong, the electronic device may dim the brightness of the display screen to reduce the power consumption of the electronic device. The ambient light sensor can also be used for detecting the color temperature of the ambient light signal, based on which the color of the display screen can be compensated and corrected, thereby providing a better display effect. In addition, the color temperature of the ambient light signal can also be used for correcting the photographing color so that the color of a photograph taken by the electronic device is closer to the actual color.
With the development and popularity of full screens, there is an increasing demand from users for ambient light sensors mounted under the display screen. Typically, when an engineer assembles an ambient light sensor disposed under a display screen, the pre-assembled position of the ambient light sensor is known in advance; for example, as shown in fig. 1, the pre-assembled position of the ambient light sensor is A. However, due to assembly tolerances, the engineer may actually assemble the ambient light sensor at position B.
After the ambient light sensor is assembled at position B, the engineer will still consider the ambient light sensor to be assembled at position A, and will therefore configure a series of parameters for the ambient light sensor based on position A, so that the ambient light sensor detects the intensity, color temperature, etc. of the ambient light signal based on the configured parameters.
The actual mounting position of the ambient light sensor is B, but the parameters are determined based on the pre-mounting position a, which can seriously affect the performance of the ambient light sensor. For example, if the parameters of the ambient light sensor are determined based on the actual assembly position, the ambient light sensor may detect the ambient light signal when the pixels of the display screen are in the dormant state, so that the influence of the brightness of the display screen on the detection accuracy can be avoided. If the parameters of the ambient light sensor are not determined based on the actual assembly position, it may happen that the ambient light sensor detects the ambient light signal while the pixels of the display screen are in the lit state, and thus the brightness of the display screen may affect the accuracy of the detection.
For example, when the ambient light sensor detects the ambient light signal while the pixels of the display screen are in the lit state, the ambient light sensor may receive, in addition to the ambient light signal, the light signal emitted downward from the display screen, i.e., light leakage. The intensity of the light leakage typically ranges from 0 lux to 10 lux, depending on the brightness and display content of the display screen. The intensity of the ambient light signal to be detected is generally below 10 lux, and the transmittance of the display screen is generally 1%-8%; taking 3% as an example, an ambient light signal with an intensity of 10 lux has an intensity of only 0.3 lux after passing through the display screen. An ambient light signal of 0.3 lux is too weak relative to the 0 lux-10 lux light leakage of the display screen, which causes the ambient light signal detected by the ambient light sensor to be inaccurate, thereby affecting the performance of the electronic device.
In view of this, the embodiment of the application provides a method for positioning a sensor under a display screen, which can effectively determine the position of the sensor in electronic equipment, thereby improving the performance of the sensor.
The following describes in detail the method for positioning the sensor under the display screen according to the embodiment of the present application with reference to fig. 2 to 11.
FIG. 2 is a schematic flow chart of a method 200 of sensor positioning under a display screen in accordance with an embodiment of the application. Alternatively, the method 200 may be performed by a processor in an electronic device. The processor may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
As shown in fig. 2, method 200 includes at least some of the following.
210: a correspondence between the first value and the position of the sensor is obtained. The first value is a value obtained based on first sensing data and second sensing data of the sensor, the first sensing data is data obtained after the sensor receives first light signals sent by first type pixels in an assembly tolerance area, the second sensing data is data obtained after the sensor receives second light signals sent by second type pixels in the assembly tolerance area, the assembly tolerance area is an area in a display screen arranged above the sensor, the assembly tolerance area corresponds to the assembly tolerance of the sensor, and colors of the first type pixels and the second type pixels are different.
220: determining the position of the target sensor according to the first sensing data and the second sensing data of the target sensor and the corresponding relation.
According to the embodiment of the application, the position of the target sensor is determined by using the sensing data corresponding to two types of pixels with different colors. Because the sensing data that the target sensor obtains from the received light signals sent by pixels of different colors differ, the position of the target sensor determined on the basis of such sensing data is more accurate. In addition, the parameters of a sensor are generally determined based on the position of the sensor; since the position accuracy of the target sensor is improved, the performance of the target sensor is improved, and the performance of the electronic device including the target sensor is thus effectively improved.
Further, the embodiment of the application also determines the position of the target sensor according to the corresponding relation between the sensing data and the position of the sensor. Under normal conditions, the corresponding relation is obtained based on a plurality of sample data, so that the obtained corresponding relation is accurate, and the position accuracy of the target sensor is further improved.
Alternatively, the sensor may be an ambient light sensor, a fingerprint sensor (e.g., an optical fingerprint sensor), or the like. As previously described, an ambient light sensor may be used to detect the intensity, color temperature, etc. of ambient light. The optical fingerprint sensor may be adapted to receive an optical signal carrying a user's fingerprint and to convert the received optical signal into a corresponding electrical signal, i.e. a fingerprint identification signal. Based on the fingerprint identification signal, fingerprint image data can be obtained, and fingerprint matching verification is further carried out, so that an optical fingerprint identification function is realized in the electronic equipment.
The sensor is arranged below the display screen. The display screen may be a display screen having a self-luminous display unit, such as an Organic Light-Emitting Diode (OLED) display screen, a Micro-Light-Emitting Diode (Micro-LED) display screen, or a Mini-LED display screen.
It should be noted that, in the embodiments of the present application, "sensor" refers to the target sensor or to another sensor, such as a test sensor used for obtaining the correspondence relation. For example, the foregoing statement that the sensor is arranged below the display screen may indicate that the target sensor is arranged below the display screen, and may also indicate that the test sensor is arranged below the display screen.
Assembly tolerances can occur when assembling the sensor. Due to the assembly tolerance, the sensor can be assembled at any position in the gray area shown in fig. 3, and the area on the display screen corresponding to the gray area is the assembly tolerance area. The customer can provide the assembly tolerance area in advance.
Alternatively, the number of rows of pixels of the display screen occupied by the assembly tolerance region may be between 10 and 50 rows. Illustratively, the number of rows of pixels occupied by the assembly tolerance region shown in FIG. 3 is 30 rows.
Optionally, the assembly tolerance region corresponds to an assembly tolerance. Assuming that the number of rows of pixels occupied by the assembly tolerance region is 30 rows, then the assembly tolerance is 30 rows of pixels.
It should be understood that the color of the first type pixel and the second type pixel in the embodiment of the present application is not specifically limited. For example, as shown in fig. 4, the first type of pixels may be blue pixels and the second type of pixels may be red pixels.
It should also be appreciated that the pixels in the assembly tolerance region may include only the first type of pixels and the second type of pixels, or the assembly tolerance region may include other pixels in addition to the first type of pixels and the second type of pixels. Referring again to fig. 4, the assembly tolerance area includes green pixels in addition to red and blue pixels, and the red pixels, the blue pixels, and the green pixels each occupy 10 rows of pixels.
When the assembly tolerance area includes other pixels in addition to the first type of pixels and the second type of pixels, it should be considered that the farther a pixel is from the sensor, the longer the optical path traveled by the optical signal emitted by that pixel to reach the sensor, and the smaller the sensing data of the sensor under the same conditions; conversely, the closer a pixel is to the sensor, the shorter the optical path, and the larger the sensing data of the sensor. The more pronounced the difference between the first sensing data and the second sensing data, the more accurate the obtained position of the target sensor.
Thus, the pitch between the first type of pixels and the second type of pixels should be the largest among the pitches between any two types of pixels within the assembly tolerance region. For example, as shown in fig. 4, among the pitch between the red and green pixels, the pitch between the green and blue pixels, and the pitch between the red and blue pixels, the pitch between the red and blue pixels is the largest; therefore, the embodiment of the present application sets the red pixels and the blue pixels as the first type of pixels and the second type of pixels.
It should be understood that the spacing between two types of pixels in embodiments of the present application can be understood as: the spacing between the centers of each of the two types of pixels.
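To illustrate the selection of the two pixel types with the largest center-to-center spacing, a short sketch follows; the center rows given for each color are hypothetical values chosen only to resemble the layout of fig. 4.

```python
# Illustrative sketch: choose the pixel-type pair with the largest
# center-to-center spacing as the first type and second type of pixels.
from itertools import combinations

# Hypothetical center rows for the three 10-row pixel bands in fig. 4.
pixel_center_rows = {"red": 5, "green": 15, "blue": 25}

pair = max(combinations(pixel_center_rows, 2),
           key=lambda p: abs(pixel_center_rows[p[0]] - pixel_center_rows[p[1]]))
print(pair)  # ('red', 'blue') -- the pair with the largest spacing
```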
Alternatively, in an embodiment of the present application, the first distance and the second distance are different. The first distance is the distance between the first type of pixels and the sensor, and the second distance is the distance between the second type of pixels and the sensor.
According to the technical scheme, since the sensing data of the sensor is closely related to the position of the pixel (for example, under the same conditions, the closer the pixel is to the sensor, the larger the sensing data), determining the position of the target sensor based on the sensing data corresponding to pixels at different positions within the assembly tolerance region can effectively improve the position accuracy of the target sensor.
Optionally, in the embodiment of the present application, the light signal from the pixels received by the sensor may be light leakage of the display screen, or it may be reflected light formed when an object, such as a user's finger, is placed above the display screen and the light signal emitted upward by the display screen is reflected off the surface of the object and then received by the sensor. After the sensor receives the optical signals sent by the pixels of the display screen, the sensor can convert the optical signals into electric signals, and these electric signals are the sensing data.
Alternatively, the first value may be a ratio between the first sensing data and the second sensing data. At this time, step 220 may specifically be: determining the position of the target sensor according to the ratio between the first sensing data and the second sensing data of the target sensor and the corresponding relation. Alternatively, the first value may be a difference between the first sensing data and the second sensing data. At this time, step 220 may specifically be: determining the position of the target sensor according to the difference between the first sensing data and the second sensing data of the target sensor and the corresponding relation.
In the following, the first value is taken as the ratio between the first sensing data and the second sensing data as an example. The position of the target sensor is determined according to the ratio of the first sensing data to the second sensing data, so that the method is simple to realize and can effectively improve the operation efficiency.
In one embodiment, the correspondence between the first value and the position of the sensor may be preconfigured on the target sensor. For example, the correspondence may be obtained during a testing phase of the electronic device, and the correspondence obtained in the testing phase is then preconfigured on the target sensor in the mass production phase of the electronic device. In this way, when determining the position of the target sensor, the position of the target sensor can be determined directly according to the correspondence preconfigured on the target sensor, so that the time required for determining the position of the target sensor can be reduced.
In the embodiment, the corresponding relation is preconfigured on the target sensor, so that when the position of the target sensor is determined according to the corresponding relation, the corresponding relation can be directly obtained from the target sensor, and the operation complexity is reduced.
In another embodiment, the correspondence between the first value and the location of the sensor may be stored at the cloud. For example, after the corresponding relationship is obtained in the test stage, the corresponding relationship is stored in the cloud. Thus, when the position of the target sensor is determined, the corresponding relation can be acquired from the cloud end, and the position of the target sensor is determined according to the corresponding relation.
Hereinafter, for convenience of description, the sensor of the test stage will be referred to as a test sensor, and similarly, the display screen and the assembly tolerance area will be referred to as a test display screen and a test assembly tolerance area.
Alternatively, the correspondence may be a correspondence between the first value and an absolute position of the test sensor. Based on this, after the first sensing data and the second sensing data of the target sensor are obtained, if the first value of the target sensor is x and the position of the sensor corresponding to x is M in the correspondence relationship, the position of the target sensor can be determined to be M.
Alternatively, the correspondence relationship may be a correspondence relationship between the first value and a relative position, which is a position of the test sensor with respect to a reference position. Based on this, after the first sensing data and the second sensing data of the target sensor are obtained, if the first value of the target sensor is x, the relative position corresponding to x in the correspondence relationship is M, and the reference position is d, the position of the target sensor can be determined to be (M+d).
As an example, the reference position may be a pre-assembled position of the test sensor. As previously described, the engineer knows in advance the preassembled position of the test sensor.
However, as also described above, although the pre-assembled position of the test sensor is A, as shown in fig. 1, due to assembly tolerances an engineer may actually assemble the test sensor at position B. The engineer nevertheless still regards position A as the assembly position and takes it as the reference position. Since the reference position itself then has an error, the correspondence relationship determined based on the reference position also has an error, and the determined position of the target sensor therefore includes the position error of the test sensor; as a result, the position error of the target sensor is large and its accuracy is low.
In view of the above, as another example, the reference position may be an actual assembly position of the test sensor. According to the technical scheme, the actual assembly position of the test sensor is used as the reference position, so that the problem of position error of the test sensor is solved to a great extent, and the position accuracy of the determined target sensor is improved.
The implementation of determining the actual assembly position of the test sensor is described in detail below.
As shown in fig. 5, determining the actual assembly position of the test sensor according to the embodiment of the present application may specifically include the following steps:
510: and acquiring a test assembly tolerance area in the test display screen.
520: the pixels at different locations within the test assembly tolerance region are sequentially illuminated.
530: and when the pixels at different positions are sequentially lightened, acquiring a plurality of third sensing data of the test sensor, wherein the plurality of third sensing data are obtained after the test sensor sequentially receives the optical signals sent by the pixels at different positions.
540: and determining the actual assembly position of the test sensor according to the third sensing data.
According to the technical scheme, in the test assembly tolerance area in the test display screen, the pixels at different positions in the area are sequentially lit, and the sensing data of the test sensor corresponding to the pixels at different positions are acquired. Since the position of the test sensor is within the region corresponding to the test assembly tolerance region, and the sensing data of the test sensor is closely related to the position of the pixel (for example, under the same conditions, the closer the pixel is to the test sensor, the larger the sensing data), the accuracy of the test sensor position determined based on the sensing data corresponding to the pixels at the different positions is high.
Alternatively, the test assembly tolerance zone may be pre-stored with the test sensor so that the test assembly tolerance zone can be obtained.
Alternatively, step 520 may specifically be: sequentially lighting pixels at different positions in the test assembly tolerance area using a single-color bright strip. Illustratively, the monochrome bright strip may be a white bright strip. White has a higher brightness than other colors; thus, lighting the pixels with white bright stripes increases the intensity of the light signal from the pixels received by the test sensor, which makes the position of the test sensor easier to determine accurately.
Of course, the single-color bright stripes can also be bright stripes of other colors, such as red bright stripes and the like.
Alternatively, the pixels within the test assembly tolerance region may be monochrome pixels. For example, a white pixel, a red pixel, or the like.
Since the sensing data is larger the closer the pixel is to the sensor, in one possible embodiment the position of the test sensor may be determined from the maximum value of the plurality of third sensing data. For example, the pixel position corresponding to the maximum value among the plurality of third sensing data may be used as the position of the test sensor.
According to the technical scheme, the maximum value in the sensing data indicates that the pixel in the corresponding test assembly tolerance area is nearest to the test sensor, so that the accuracy of the determined position of the test sensor can be further improved.
Alternatively, after the plurality of third sensing data of the test sensor are acquired, the plurality of third sensing data may be fitted, and the position of the test sensor determined according to the maximum value of the fitted curve.
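The scan-and-fit procedure of steps 510-540 can be sketched as follows; `light_rows` and `read_sensor` stand in for display and sensor drivers that the embodiment does not specify, so their names and signatures are assumptions, and the parabolic fit is one possible fitting choice.

```python
# Illustrative sketch of steps 520-540: slide a white bright stripe through
# the test assembly tolerance area and take the fitted maximum as the
# reference position. light_rows() and read_sensor() are hypothetical helpers.
import numpy as np

def find_reference_row(first_row: int, total_rows: int, stripe_rows: int) -> float:
    positions, data = [], []
    # Slide the stripe down one row at a time through the tolerance area.
    for start in range(first_row, first_row + total_rows - stripe_rows + 1):
        light_rows(start, stripe_rows, color="white")  # hypothetical driver call
        positions.append(start + stripe_rows / 2.0)    # center of the lit stripe
        data.append(read_sensor())                     # one third-sensing-data sample
    # Fit a parabola through the samples; its vertex approximates the row
    # closest to the test sensor, where the sensing data peaks.
    a, b, _c = np.polyfit(positions, data, 2)
    return -b / (2.0 * a)
```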
Alternatively, the duration for which the pixels at different positions are lit each time is not particularly limited in the embodiment of the present application. It should be appreciated that pixels at different positions may include pixels at different row positions. Illustratively, the duration may be greater than or equal to 50 ms, such as 50 ms or 200 ms.
Optionally, step 520 may specifically include: pixels within the test assembly tolerance region are illuminated row by row. For example, pixels within the test assembly tolerance region may be illuminated row by row starting from an upper limit value of the assembly tolerance to a lower limit value of the assembly tolerance. I.e. pixels within the test assembly tolerance area can be illuminated row by row starting with the first row of pixels within the test assembly tolerance area.
As one example, pixels within a test assembly tolerance region may be illuminated one row at a time.
As another example, multiple rows of pixels within the test assembly tolerance region may be illuminated at a time. For example, 11 rows of pixels may be illuminated at a time. According to the technical scheme, the plurality of rows of pixels are lightened each time, so that the time consumption for determining the position of the test sensor can be greatly reduced, and the processing speed is improved. Further, the plurality of rows of pixels simultaneously emit light signals, so that the intensity of the light signals received by the test sensor is obviously increased, the obtained sensing data is larger, and the accuracy of the position of the test sensor is further improved.
Fig. 6-9 show schematic diagrams of the rows of pixels lit at a time and the position of the test sensor in a test assembly tolerance zone, where the test sensor is arranged below the test display screen. If the number of rows of pixels lit at a time (hereinafter, the target number of rows) is small, as shown in fig. 6 and 7, then whenever the rows of pixels lit at different times are all covered by the test sensor, the third sensing data corresponding to the pixels lit at those times are substantially the same. There may then be a plurality of candidate positions for the test sensor, which results in a large error in the determined position. Here, error = (width of the test sensor / pitch between two rows of pixels) - target number of rows.
If the target number of rows is large, as shown in fig. 8 and 9, then whenever the rows of pixels lit at different times all cover the test sensor, the third sensing data corresponding to the pixels lit at those times are likewise substantially the same, so that there may again be a plurality of candidate positions and a large error in the determined test sensor position. Here, error = target number of rows - (width of the test sensor / pitch between two rows of pixels).
Therefore, in order to improve the accuracy of the determined position of the test sensor, the embodiment of the present application may set (width of the test sensor) / (pitch between two rows of pixels) = target number of rows, that is, make the width of the test sensor equal to the total width of the rows of pixels lit at a time. For example, if the width of the test sensor is 700 um and the pitch between two rows of pixels is 63 um, then the target number of rows = 700 um / 63 um = 11.1, so the target number of rows is 11.
In one implementation, when determining the position of the test sensor based on the maximum value of the plurality of third sensing data, three values are typically sufficient to fit a parabola and obtain the maximum value between them. Therefore, in order to obtain the maximum value among the plurality of third sensing data, the number of times the pixels at different positions are lit is at least three, i.e., the plurality of third sensing data includes at least three data.
At this time, the target number of rows may be less than or equal to a second value, where the second value is the difference between the number of rows of pixels occupied by the assembly tolerance region and 2.
Further, the target number of rows may be greater than or equal to a third value, where the third value is the quotient of the number of rows of pixels occupied by the assembly tolerance region and the number of times the pixels at different positions within the test assembly tolerance region are lit, i.e., the third value = (number of rows of pixels occupied by the assembly tolerance region) / (number of lightings).
It should be understood that if the ratio of the width of the test sensor to the spacing between two rows of pixels in the test display screen (referred to as the first ratio) is between the second value and the third value, embodiments of the present application may determine the target row number as the first ratio. If the first ratio is not between the second value and the third value, the embodiment of the application can determine the value closest to the first ratio in the values between the second value and the third value as the target line number.
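A small sketch of this row-count selection, using the numbers from the example above (700 um sensor width, 63 um row pitch, a 30-row tolerance region, at least three lightings); the function name is illustrative only.

```python
# Illustrative sketch of choosing the target number of rows lit at a time.
def target_row_count(sensor_width_um: float, row_pitch_um: float,
                     tolerance_rows: int, lighting_times: int) -> int:
    first_ratio = sensor_width_um / row_pitch_um   # e.g. 700 / 63 = 11.1
    upper = tolerance_rows - 2                     # the "second value"
    lower = tolerance_rows / lighting_times        # the "third value"
    # Use the first ratio if it lies between the bounds; otherwise take the
    # in-range value closest to it.
    return int(min(max(round(first_ratio), lower), upper))

print(target_row_count(700, 63, 30, 3))  # prints 11
```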
It should also be appreciated that in the embodiments of the present application, "first," "second," and "third" are merely for distinguishing between different objects, and do not limit the scope of the embodiments of the present application.
After determining the reference position, a plurality of sets of pixels within the test assembly tolerance region may be sequentially illuminated based on the reference position, each set of pixels in the plurality of sets of pixels including a first type of pixel and a second type of pixel, the positions between each two sets of pixels in the plurality of sets of pixels being different. In other words, the positions of each of the plurality of sets of pixels are different.
As one example, groups of pixels may be illuminated row by row starting with the first row of pixels within the test assembly tolerance region.
When the plurality of groups of pixels are lit row by row starting from the first row of pixels in the test assembly tolerance area, taking fig. 4 as an example with the blue pixels as the first type of pixels and the red pixels as the second type of pixels: as the groups of pixels are lit row by row, the lit red pixels come closer to the test sensor while the lit blue pixels move farther from it, so the second sensing data obtained by the test sensor from the received light signals of the red pixels becomes larger, the first sensing data obtained from the received light signals of the blue pixels becomes smaller, and the ratio between the first sensing data and the second sensing data therefore becomes smaller, as shown by the ordinate in fig. 10.
Further, each time a group of pixels is lit, the ratio between the first sensing data and the second sensing data corresponding to that group of pixels and the relative position with respect to the reference position at that time may be stored, so that the corresponding relation, as shown in fig. 10, is obtained.
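The test-phase loop just described might look like the following sketch, where `light_pixel_group` and `read_two_channels` are assumed stand-ins for the display and sensor drivers, and the group layout is simplified to one group per row offset.

```python
# Illustrative sketch of building the corresponding relation in the test
# phase: light one group of first-type (blue) and second-type (red) pixels
# at a time, and record the sensed ratio against the relative position.
# light_pixel_group() and read_two_channels() are hypothetical helpers.
def build_correspondence(first_row: int, group_count: int, reference_row: int):
    table = []
    for offset in range(group_count):
        light_pixel_group(first_row + offset, colors=("blue", "red"))
        first_data, second_data = read_two_channels()  # first/second sensing data
        table.append((first_data / second_data, first_row + offset - reference_row))
    return table
```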
After the corresponding relation is determined, in a mass production stage, when the position of the target sensor is determined, a first type of pixels and a second type of pixels in the target assembly tolerance area can be lightened, the target sensor receives a first optical signal sent by the first type of pixels and obtains first sensing data based on the first optical signal, and receives a second optical signal sent by the second type of pixels and obtains second sensing data based on the second optical signal. Then, the position of the target sensor is determined according to the ratio of the first sensing data to the second sensing data and the corresponding relation, such as the corresponding relation in fig. 10.
Alternatively, the duration of lighting the first type of pixels and the second type of pixels within the target assembly tolerance region may be greater than or equal to 50ms. For example 50ms.
In general, under the same conditions, the longer the pixels are lit, the greater the amount of light emitted by the lit pixels that is received by the target sensor, and the more accurate the position of the target sensor determined based on that light signal. However, the longer the pixels are lit, the less efficient the determination of the target sensor position. Assuming that the duration of lighting the pixels once is 200 ms and a total of 8 lightings is required, the total time taken to determine the target sensor position is 200 ms × 8 = 1.6 s, which is inefficient. Therefore, weighing the accuracy and efficiency of determining the position of the target sensor, the embodiment of the present application may set the duration of lighting the pixels once to 50 ms. In this way, the accuracy and efficiency of determining the position of the target sensor can be balanced, i.e., the position of the target sensor can be accurately determined in a short time.
Alternatively, the first type of pixels and the second type of pixels within the target assembly tolerance region may be lit only once.
According to the technical scheme, the position of the target sensor can be determined by only lighting the first type pixels and the second type pixels once, so that the operation is simple, and the time consumption for determining the target sensor is greatly reduced. For example, if the duration of lighting the first type pixel and the second type pixel is 50ms, the time required to determine the position of the target sensor is only 50ms.
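In the mass production phase, the single 50 ms lighting described above could be combined with the stored corresponding relation roughly as follows; the driver helpers are again hypothetical, and the lookup mirrors the ratio-based sketch given earlier.

```python
# Illustrative sketch of the mass-production flow: light the first-type and
# second-type pixels once for 50 ms, read the target sensor, and look up the
# position. light_pixel_group_once() / read_two_channels() are hypothetical.
import time

def position_target_sensor(correspondence, reference_row: int) -> int:
    light_pixel_group_once(colors=("blue", "red"))  # hypothetical driver call
    time.sleep(0.050)                               # 50 ms lighting duration
    first_data, second_data = read_two_channels()   # hypothetical sensor read
    ratio = first_data / second_data
    # Nearest stored ratio gives the relative position in rows of pixels.
    _, relative_row = min(correspondence, key=lambda e: abs(e[0] - ratio))
    return reference_row + relative_row
```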
In the process of determining the position of the target sensor, the determination may be affected by the ambient light signal or other light signals; for example, the target sensor may receive not only the light signals emitted by the lit first type pixels and second type pixels but also the ambient light signal, so that the sensing data of the target sensor is seriously affected.
Thus, the method 200 may further comprise: the assembly tolerance area is covered by a light blocking object.
As shown in fig. 11, the light blocking object may be placed on a transparent protective cover plate of an electronic device. The transparent protective cover plate may be, for example, a glass cover plate or a sapphire cover plate, which is located above the display screen 120 and covers the front surface of the electronic device.
Alternatively, the light blocking object may be a black test head or a black glue layer, etc.
In this way, the sensor can be blocked from receiving other optical signals than those emitted by the lit pixel, so that the accuracy of determining the position of the sensor can be greatly improved.
To verify the method 200, the embodiment of the present application performed 20 trials based on the method 200; in each trial, the first type of pixels and the second type of pixels above the target sensor were lit only once. Table 1 shows the test results. The first type of pixels are blue pixels, and the second type of pixels are red pixels.
TABLE 1
As can be seen from table 1, the relative position between the target sensor and the reference position is 3, i.e., the relative position is 3 rows of pixels. If the reference position is d, the position of the target sensor is (d+3).
Further, the results of the 20 trials were identical: the relative position between the target sensor and the reference position was 3 in every trial. It can therefore be appreciated that the positioning error of the embodiment of the present application may be 0.
In the embodiment of the present application, the sequence number of each process does not mean the sequence of execution sequence, and the execution sequence of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation process of the embodiment of the present application.
On the premise of no conflict, the embodiments and/or technical features in the embodiments described in the present application can be combined with each other arbitrarily, and the combined technical solutions should also fall into the protection scope of the present application.
The method for positioning the sensor under the display screen in the embodiment of the application is described in detail, and the device for positioning the sensor under the display screen in the embodiment of the application is described below. It should be understood that the device for positioning an under-screen sensor in the embodiment of the present application may perform the method for positioning an under-screen sensor in the embodiment of the present application, and have a function of performing the corresponding method.
Fig. 12 shows a schematic block diagram of an apparatus 1200 for sensor positioning under a display screen according to an embodiment of the application. As shown in fig. 12, the device 1200 for positioning an under-screen sensor may include:
the processing unit 1210 is configured to obtain a correspondence between a first value and a position of a sensor, where the first value is a value obtained based on first sensing data and second sensing data of the sensor, the first sensing data is data obtained after the sensor receives a first light signal sent by a first type of pixel in an assembly tolerance area, the second sensing data is data obtained after the sensor receives a second light signal sent by a second type of pixel in the assembly tolerance area, the assembly tolerance area is an area in a display screen disposed above the sensor, the assembly tolerance area corresponds to an assembly tolerance of the sensor, and colors of the first type of pixel and the second type of pixel are different.
The processing unit 1210 is further configured to determine a position of the target sensor according to the first sensing data and the second sensing data of the target sensor and according to the correspondence.
Optionally, in an embodiment of the present application, the first value is a ratio between the first sensing data and the second sensing data; the processing unit 1210 is specifically configured to: determine the position of the target sensor according to the ratio between the first sensing data and the second sensing data of the target sensor and according to the correspondence.
Optionally, in an embodiment of the present application, the correspondence includes a correspondence between the first value and a relative position, where the relative position is a position of the sensor relative to a reference position; the processing unit 1210 is specifically configured to: determine the relative position between the position of the target sensor and the reference position according to the first sensing data and the second sensing data of the target sensor and according to the correspondence; and determine the position of the target sensor from the relative position between the position of the target sensor and the reference position, and from the reference position.
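As an aid to understanding, the lookup described in this embodiment can be sketched as follows. This is a minimal illustration only: it assumes the correspondence has already been obtained as a table of (first value, relative position) pairs, and the function name and the nearest-neighbor matching are illustrative choices, not details taken from the patent.

```python
def lookup_position(correspondence, first_value, reference_position):
    """Map a measured first value to an absolute sensor position.

    correspondence: list of (first_value, relative_position_in_pixel_rows) pairs
    first_value: ratio of the target sensor's first to second sensing data
    reference_position: the reference position d, as a pixel-row index
    """
    # Pick the calibration entry whose first value is closest to the measurement.
    _, relative_position = min(
        correspondence, key=lambda entry: abs(entry[0] - first_value)
    )
    # Absolute position = reference position + relative offset, e.g. d + 3.
    return reference_position + relative_position
```

For example, with a correspondence of [(0.8, -2), (1.0, 0), (1.3, 3)] and a measured ratio of 1.28, the function returns the reference position plus 3 rows of pixels.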
Optionally, in an embodiment of the present application, the processing unit 1210 is specifically configured to: determine the reference position. The apparatus 1200 further includes: a lighting unit, configured to sequentially light a plurality of groups of pixels in the assembly tolerance area based on the reference position, where each group of pixels includes the first type of pixels and the second type of pixels, and the positions of every two groups of pixels in the plurality of groups of pixels are different. The processing unit 1210 is specifically configured to: acquire a group of first sensing data and second sensing data each time the lighting unit lights a group of the first type of pixels and the second type of pixels; determine the first value according to the first sensing data and the second sensing data; and determine the correspondence between the first value and the relative position according to the first value and the relative position, relative to the reference position, of the sensor corresponding to the first value.
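A minimal sketch of this calibration flow is given below. It assumes a hypothetical measure_group(offset) interface that lights one group of first-type and second-type pixels at the given offset from the reference position and returns the two resulting sensing data values; the patent does not name such an interface.

```python
def build_correspondence(offsets, measure_group):
    """Build the (first value, relative position) correspondence table.

    offsets: relative positions, in pixel rows, of each pixel group with
             respect to the reference position, e.g. range(-5, 6)
    measure_group: lights one group of pixels at the given offset and returns
                   (first_sensing_data, second_sensing_data)
    """
    correspondence = []
    for offset in offsets:
        first_data, second_data = measure_group(offset)
        # The first value of this embodiment is the ratio of the two readings.
        correspondence.append((first_data / second_data, offset))
    return correspondence
```

The resulting table can be passed directly to a lookup such as the one sketched above, or preconfigured on the target sensor as described in a later embodiment.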
Optionally, in an embodiment of the present application, the lighting unit is specifically configured to: illuminate the plurality of groups of pixels row by row, starting from the first row of pixels within the assembly tolerance area, based on the reference position.
Optionally, in an embodiment of the present application, the processing unit 1210 is specifically configured to: acquire the assembly tolerance area. The lighting unit is further configured to sequentially light pixels at different positions in the assembly tolerance area. The processing unit 1210 is specifically configured to: acquire a plurality of third sensing data of the sensor when the lighting unit sequentially lights the pixels at the different positions; and determine the reference position according to the plurality of third sensing data.
Optionally, in an embodiment of the present application, the lighting unit is specifically configured to: sequentially light the pixels at the different positions using white bright stripes.
Optionally, in an embodiment of the present application, the processing unit 1210 is specifically configured to: determine the reference position according to the maximum value in the plurality of third sensing data.
Optionally, in an embodiment of the present application, the lighting unit is specifically configured to: light the pixels within the assembly tolerance area row by row, starting from the first row of pixels within the assembly tolerance area.
Optionally, in an embodiment of the present application, the pixels lit by the lighting unit each time include a plurality of rows of pixels.
Optionally, in an embodiment of the present application, a total width of the plurality of rows of pixels is equal to a width of the sensor.
Optionally, in an embodiment of the present application, the number of times the lighting unit lights the pixels at the different positions is at least three, and the plurality of third sensing data includes at least three data.
Optionally, in an embodiment of the present application, the number of rows of pixels lit each time is less than or equal to a second value, where the second value is the number of rows of pixels occupied by the assembly tolerance area minus 2.
Optionally, in an embodiment of the present application, the number of rows of pixels lit each time is greater than or equal to a third value, where the third value is the quotient of the number of rows of pixels occupied by the assembly tolerance area divided by the number of times the pixels at the different positions are lit.
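Taken together, the preceding embodiments suggest a reference-position scan along the following lines. This is a hedged sketch only: measure_stripe(start_row, rows) stands in for the display-driving and sensor-reading interfaces, which the patent does not specify, and the even spacing of the stripe positions is an illustrative choice.

```python
def find_reference_position(first_row, total_rows, stripe_rows, num_positions,
                            measure_stripe):
    """Step a white stripe through the assembly tolerance area and return the
    start row that yields the maximum (third) sensing data."""
    # Constraints taken from the embodiments above.
    assert num_positions >= 3                         # at least three readings
    assert stripe_rows <= total_rows - 2              # the "second value" bound
    assert stripe_rows * num_positions >= total_rows  # the "third value" bound

    step = max(1, (total_rows - stripe_rows) // (num_positions - 1))
    readings = []
    for i in range(num_positions):
        start_row = first_row + i * step
        readings.append((measure_stripe(start_row, stripe_rows), start_row))
    # The stripe position with the strongest response marks the reference position.
    _, reference_row = max(readings)
    return reference_row
```

With an assembly tolerance area of, say, 30 rows, a 10-row stripe and three positions satisfy both bounds (10 <= 28 and 10 x 3 >= 30).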
Optionally, in an embodiment of the present application, a first distance and a second distance are different, where the first distance is a distance between the first type of pixel and the sensor, and the second distance is a distance between the second type of pixel and the sensor.
Optionally, in an embodiment of the present application, within the assembly tolerance area of the display screen, the spacing between the first type of pixels and the second type of pixels is the largest among the spacings between any two types of pixels.
Optionally, in an embodiment of the present application, the correspondence is preconfigured on the target sensor.
Optionally, in an embodiment of the present application, the apparatus 1200 further includes: the lighting unit, configured to light, once, the first type of pixels and the second type of pixels in a target assembly tolerance area in a target display screen disposed above the target sensor; and the processing unit 1210 is further configured to: acquire first sensing data and second sensing data of the target sensor when the lighting unit lights the first type of pixels and the second type of pixels in the target assembly tolerance area.
Optionally, in an embodiment of the present application, a duration of lighting the first type pixels and the second type pixels in the target assembly tolerance area by the lighting unit is 50ms.
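A one-shot measurement on a finished device, following the two preceding embodiments, might look like the sketch below. The light_target_pixels and read_sensor callables are placeholders for the actual display-driving and sensor-reading interfaces, which the patent does not specify; only the single lighting of roughly 50 ms comes from the text.

```python
import time

def measure_once(light_target_pixels, read_sensor, duration_s=0.050):
    """Light the target pixels once for about 50 ms and sample the target sensor."""
    light_target_pixels(True)       # light the first-type and second-type pixels
    try:
        time.sleep(duration_s)      # hold for the 50 ms lighting duration
        first_data, second_data = read_sensor()
    finally:
        light_target_pixels(False)  # always turn the pixels back off
    return first_data, second_data
```

The returned pair feeds the ratio computation and position lookup sketched earlier.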
Optionally, in an embodiment of the present application, the number of rows of pixels occupied by the assembly tolerance region is between 10 rows of pixels and 50 rows of pixels.
Optionally, in an embodiment of the present application, the assembly tolerance area is covered by a light blocking object.
Optionally, in an embodiment of the present application, the target sensor is an ambient light sensor, and the ambient light sensor is used to detect an intensity of an ambient light signal.
It should be appreciated that the device 1200 for positioning an under-screen sensor may implement the corresponding operations in the method 200; for brevity, these are not described again here.
The embodiment of the application also provides a device for positioning the sensor under the display screen. The apparatus may include a memory and a processor.
The memory is coupled to the processor. The memory is configured to store a program, and the processor is configured to call the program stored in the memory, so that the apparatus executes the method for positioning a sensor under a display screen provided by any of the foregoing embodiments.
The embodiment of the application also provides an electronic device, as shown in fig. 13, the electronic device 1300 may include a display 1301, a sensor 1302, and an apparatus 1303 for positioning a sensor under the display.
Wherein the sensor 1302 is arranged below the display 1301, the means 1303 for positioning the sensor under the display may be the means for positioning the sensor under the display in the previous embodiment, and can be used to perform the content in the method embodiment shown in fig. 2.
Alternatively, the display 1301 may be a non-folding display or a foldable display, i.e., a flexible display.
By way of example and not limitation, the electronic device 1300 in the embodiments of the present application may be a portable or mobile computing device such as a terminal device, a mobile phone, a tablet computer, a notebook computer, a desktop computer, a game device, an in-vehicle electronic device, or a wearable smart device, or another electronic device such as an electronic database, an automobile, or a bank automated teller machine (Automated Teller Machine, ATM). Wearable smart devices include full-featured, large-sized devices that can realize all or part of their functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on only a certain type of application function and need to be used together with other devices such as smartphones, for example smart bracelets and smart jewelry for monitoring physical signs.
It is to be understood that the terminology used in the embodiments of the application and in the appended claims is for the purpose of describing particular embodiments only, and is not intended to be limiting of the embodiments of the application. For example, as used in the embodiments of the application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
Those of ordinary skill in the art will appreciate that the units of the examples described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or a combination of both. To clearly illustrate the interchangeability of hardware and software, the components and steps of the examples have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided in the present application, it should be understood that the disclosed system and apparatus may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices, or elements, or may be an electrical, mechanical, or other form of connection.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment of the present application.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such an understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
While the application has been described with reference to certain preferred embodiments, it will be understood by those skilled in the art that various changes and equivalent substitutions may be made without departing from the scope of the application. Therefore, the protection scope of the application is subject to the protection scope of the claims.

Claims (46)

1. A method of positioning an under-screen sensor, the method comprising:
acquiring a corresponding relation between a first value and a position of a sensor, wherein the first value is a value obtained based on first sensing data and second sensing data of the sensor, the first sensing data is data obtained after the sensor receives a first light signal sent by a first type of pixel in an assembly tolerance area, the second sensing data is data obtained after the sensor receives a second light signal sent by a second type of pixel in the assembly tolerance area, the assembly tolerance area is an area arranged in a display screen above the sensor, the assembly tolerance area corresponds to the assembly tolerance of the sensor, and the colors of the first type of pixel and the second type of pixel are different;
and determining the position of the target sensor according to the first sensing data and the second sensing data of the target sensor and the corresponding relation.
2. The method of claim 1, wherein the first value is a ratio between the first sensed data and the second sensed data;
the determining the position of the target sensor according to the first sensing data and the second sensing data of the target sensor and the corresponding relation comprises the following steps:
and determining the position of the target sensor according to the ratio between the first sensing data and the second sensing data of the target sensor and the corresponding relation.
3. The method according to claim 1 or 2, wherein the correspondence relation includes a correspondence relation between the first value and a relative position, the relative position being a position of the sensor with respect to a reference position;
the determining the position of the target sensor according to the first sensing data and the second sensing data of the target sensor and the corresponding relation comprises the following steps:
determining a relative position between the position of the target sensor and the reference position according to the first sensing data and the second sensing data of the target sensor and the corresponding relation;
the position of the target sensor is determined from the relative position between the position of the target sensor and the reference position, and from the reference position.
4. A method according to claim 3, wherein said obtaining a correspondence between the first value and the position of the sensor comprises:
determining the reference position;
sequentially illuminating a plurality of groups of pixels in the assembly tolerance area based on the reference position, wherein each group of pixels comprises the first type of pixels and the second type of pixels, and the positions between every two groups of pixels in the plurality of groups of pixels are different;
acquiring a group of the first sensing data and the second sensing data each time a group of the first type of pixels and the second type of pixels is lit;
determining the first value according to the first sensing data and the second sensing data;
and determining the corresponding relation between the first value and the relative position according to the first value and the relative position of the sensor corresponding to the first value relative to the reference position.
5. The method of claim 4, wherein sequentially illuminating the plurality of groups of pixels within the assembly tolerance zone based on the reference position comprises:
the plurality of sets of pixels are illuminated row by row, starting from the first row of pixels within the assembly tolerance region, based on the reference position.
6. The method of claim 4, wherein the determining the reference position comprises:
acquiring the assembly tolerance area;
sequentially lighting pixels at different positions in the assembly tolerance area;
acquiring a plurality of third sensing data of the sensor when the pixels at the different positions are sequentially lit;
and determining the reference position according to the third sensing data.
7. The method of claim 6, wherein sequentially illuminating pixels at different locations within the assembly tolerance region comprises:
sequentially lighting up the pixels at the different positions using white bright stripes.
8. The method of claim 6, wherein said determining said reference position from said plurality of third sensed data comprises:
and determining the reference position according to the maximum value in the third sensing data.
9. The method of claim 6, wherein sequentially illuminating pixels at different locations within the assembly tolerance region comprises:
illuminating the pixels within the assembly tolerance region row by row, starting from the first row of pixels within the assembly tolerance region.
10. The method of claim 9, wherein each lit pixel comprises a plurality of rows of pixels.
11. The method of claim 10, wherein the total width of the rows of pixels is equal to the width of the sensor.
12. The method of claim 10, wherein the number of times the pixels at the different locations are illuminated is at least three, and the plurality of third sensed data includes at least three data.
13. The method of claim 12, wherein the number of rows of pixels is less than or equal to a second value that is the difference between the number of rows of pixels occupied by the assembly tolerance region and 2.
14. The method of claim 12, wherein the number of rows of pixels is greater than or equal to a third value that is a quotient of the number of rows of pixels occupied by the assembly tolerance region and the number of times pixels at the different locations are illuminated.
15. A method according to claim 1 or 2, characterized in that a first distance, which is the distance between the first type of pixel and the sensor, and a second distance, which is the distance between the second type of pixel and the sensor, are different.
16. A method according to claim 1 or 2, wherein the spacing between the first type of pixels and the second type of pixels is greatest among the spacing between the two types of pixels within the assembly tolerance region of the display screen.
17. The method according to claim 1 or 2, wherein the correspondence is pre-configured on the target sensor.
18. The method according to claim 1 or 2, characterized in that the method further comprises:
lighting, once, the first type of pixels and the second type of pixels in a target assembly tolerance area in a target display screen arranged above the target sensor;
acquiring first sensing data and second sensing data of the target sensor when the first type of pixels and the second type of pixels in the target assembly tolerance area are lit.
19. The method of claim 18, wherein the first type of pixels and the second type of pixels within the target assembly tolerance region are illuminated for a period of 50ms.
20. A method according to claim 1 or 2, wherein the number of rows of pixels occupied by the assembly tolerance zone is between 10 rows of pixels and 50 rows of pixels.
21. The method according to claim 1 or 2, characterized in that the method further comprises:
the assembly tolerance zone is covered by a light blocking object.
22. The method according to claim 1 or 2, wherein the target sensor is an ambient light sensor for detecting the intensity of an ambient light signal.
23. An apparatus for positioning an under-screen sensor, comprising:
the processing unit is used for acquiring a corresponding relation between a first value and a position of a sensor, wherein the first value is a value obtained based on first sensing data and second sensing data of the sensor, the first sensing data is data obtained after the sensor receives a first light signal sent by a first type of pixel in an assembly tolerance area, the second sensing data is data obtained after the sensor receives a second light signal sent by a second type of pixel in the assembly tolerance area, the assembly tolerance area is an area arranged in a display screen above the sensor, the assembly tolerance area corresponds to the assembly tolerance of the sensor, and the colors of the first type of pixel and the second type of pixel are different;
the processing unit is further used for determining the position of the target sensor according to the first sensing data and the second sensing data of the target sensor and the corresponding relation.
24. The apparatus of claim 23, wherein the first value is a ratio between the first sensed data and the second sensed data;
the processing unit is specifically configured to:
determine the position of the target sensor according to the ratio between the first sensing data and the second sensing data of the target sensor and the corresponding relation.
25. The apparatus according to claim 23 or 24, wherein the correspondence relation includes a correspondence relation between the first value and a relative position, the relative position being a position of the sensor with respect to a reference position;
the processing unit is specifically configured to:
determining a relative position between the position of the target sensor and the reference position according to the first sensing data and the second sensing data of the target sensor and the corresponding relation;
the position of the target sensor is determined from the relative position between the position of the target sensor and the reference position, and from the reference position.
26. The apparatus according to claim 25, wherein the processing unit is specifically configured to:
determine the reference position;
the apparatus further comprises:
a lighting unit, configured to sequentially light a plurality of groups of pixels in the assembly tolerance area based on the reference position, where each group of pixels includes the first type of pixels and the second type of pixels, and positions between every two groups of pixels in the plurality of groups of pixels are different;
the processing unit is specifically configured to:
acquire a group of first sensing data and second sensing data each time the lighting unit lights a group of the first type of pixels and the second type of pixels;
determine the first value according to the first sensing data and the second sensing data;
and determine the correspondence between the first value and the relative position according to the first value and the relative position, relative to the reference position, of the sensor corresponding to the first value.
27. The apparatus of claim 26, wherein the lighting unit is specifically configured to:
illuminate the plurality of groups of pixels row by row, starting from the first row of pixels within the assembly tolerance region, based on the reference position.
28. The apparatus according to claim 26, wherein the processing unit is specifically configured to:
acquire the assembly tolerance area;
the lighting unit is also used for sequentially lighting pixels at different positions in the assembly tolerance area;
the processing unit is specifically configured to:
acquire a plurality of third sensing data of the sensor when the lighting unit sequentially lights the pixels at the different positions;
and determine the reference position according to the plurality of third sensing data.
29. The apparatus of claim 28, wherein the lighting unit is specifically configured to:
sequentially light the pixels at the different positions using white bright stripes.
30. The apparatus according to claim 28, wherein the processing unit is specifically configured to:
determine the reference position according to the maximum value in the plurality of third sensing data.
31. The apparatus of claim 28, wherein the lighting unit is specifically configured to:
light the pixels within the assembly tolerance region row by row, starting from the first row of pixels within the assembly tolerance region.
32. The apparatus of claim 31, wherein the pixels that are illuminated by the illumination unit each time comprise a plurality of rows of pixels.
33. The apparatus of claim 32, wherein a total width of the plurality of rows of pixels is equal to a width of the sensor.
34. The apparatus of claim 32, wherein the number of times the lighting unit lights up the pixels of the different positions is at least three, and the plurality of third sensing data includes at least three data.
35. The apparatus of claim 34, wherein the number of rows of pixels is less than or equal to a second value that is a difference between the number of rows of pixels occupied by the assembly tolerance region and 2.
36. The apparatus of claim 34, wherein the number of rows of pixels is greater than or equal to a third value that is a quotient of the number of rows of pixels occupied by the assembly tolerance region and the number of times pixels at the different locations are illuminated.
37. The apparatus of claim 23 or 24, wherein a first distance and a second distance are different, the first distance being a distance between the first type of pixel and the sensor and the second distance being a distance between the second type of pixel and the sensor.
38. The apparatus of claim 23 or 24, wherein the spacing between the first type of pixels and the second type of pixels is greatest among the spacing between two types of pixels within the assembly tolerance region of the display screen.
39. The apparatus of claim 23 or 24, wherein the correspondence is pre-configured on the target sensor.
40. The apparatus according to claim 23 or 24, further comprising:
the lighting unit is used for lighting first-class pixels and second-class pixels in a target assembly tolerance area in a target display screen arranged above the target sensor once;
the processing unit is further configured to:
and when the lighting unit lights the first type pixels and the second type pixels in the target assembly tolerance area, acquiring first sensing data and second sensing data of the target sensor.
41. The apparatus of claim 40, wherein the lighting unit lights the first type of pixels and the second type of pixels within the target assembly tolerance region for a duration of 50ms.
42. The apparatus of claim 23 or 24, wherein the assembly tolerance region occupies between 10 rows of pixels and 50 rows of pixels.
43. The apparatus of claim 23 or 24, wherein the assembly tolerance area is covered by a light blocking object.
44. The apparatus of claim 23 or 24, wherein the target sensor is an ambient light sensor for detecting the intensity of an ambient light signal.
45. An apparatus for positioning an under-screen sensor, comprising:
a memory for storing a program;
a processor for executing the program stored in the memory, the processor being adapted to perform the method of under-screen sensor positioning according to any one of claims 1 to 22 when the program stored in the memory is executed.
46. An electronic device, comprising:
a display screen;
the sensor is arranged below the display screen;
and means for positioning an under-display sensor as claimed in any one of claims 23 to 45.
CN202111620884.7A 2021-12-27 2021-12-27 Method and device for positioning sensor under display screen and electronic equipment Active CN114279332B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111620884.7A CN114279332B (en) 2021-12-27 2021-12-27 Method and device for positioning sensor under display screen and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111620884.7A CN114279332B (en) 2021-12-27 2021-12-27 Method and device for positioning sensor under display screen and electronic equipment

Publications (2)

Publication Number Publication Date
CN114279332A CN114279332A (en) 2022-04-05
CN114279332B (en) 2023-11-03

Family

ID=80876773

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111620884.7A Active CN114279332B (en) 2021-12-27 2021-12-27 Method and device for positioning sensor under display screen and electronic equipment

Country Status (1)

Country Link
CN (1) CN114279332B (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108509849A (en) * 2017-02-24 2018-09-07 三星电子株式会社 Utilize the electronic equipment and method of the pixel identification sensor position of display
CN108716950A (en) * 2018-05-16 2018-10-30 北京小米移动软件有限公司 Environmental light brightness acquisition methods and device
EP3570267A1 (en) * 2018-05-16 2019-11-20 Beijing Xiaomi Mobile Software Co., Ltd. Method and apparatus for acquiring ambient light brightness
CN111261074A (en) * 2018-11-30 2020-06-09 上海耕岩智能科技有限公司 Method for detecting whether screen uniformly emits light or not, storage medium and electronic equipment
CN112070018A (en) * 2018-12-14 2020-12-11 深圳市汇顶科技股份有限公司 Fingerprint identification device and electronic equipment
CN112599089A (en) * 2020-12-09 2021-04-02 深圳市汇顶科技股份有限公司 Display screen light leakage value acquisition method, electronic device and storage medium
CN113763856A (en) * 2021-07-19 2021-12-07 荣耀终端有限公司 Method and device for determining ambient illumination intensity and storage medium
CN113654459A (en) * 2021-07-22 2021-11-16 北京小米移动软件有限公司 Method and device for determining position of photosensitive sensor under screen and storage medium

Also Published As

Publication number Publication date
CN114279332A (en) 2022-04-05


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant