CN114279332A - Method and device for positioning sensor under display screen and electronic equipment - Google Patents


Info

Publication number
CN114279332A
Authority
CN
China
Prior art keywords
pixels
sensor
sensing data
type
assembly tolerance
Prior art date
Legal status
Granted
Application number
CN202111620884.7A
Other languages
Chinese (zh)
Other versions
CN114279332B (en)
Inventor
钟松锦
杜灿鸿
付恩琪
Current Assignee
Shenzhen Goodix Technology Co Ltd
Original Assignee
Shenzhen Goodix Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Goodix Technology Co Ltd
Priority to CN202111620884.7A
Publication of CN114279332A
Application granted
Publication of CN114279332B
Legal status: Active
Anticipated expiration

Landscapes

  • Position Input By Displaying (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The embodiments of the present application provide a method and an apparatus for positioning a sensor under a display screen, and an electronic device, which can effectively improve the performance of the sensor and thereby the performance of the electronic device. The method includes: acquiring a correspondence between a first value and a position of a sensor, where the first value is obtained based on first sensing data and second sensing data of the sensor; the first sensing data are obtained after the sensor receives a first optical signal emitted by a first type of pixel in an assembly tolerance region; the second sensing data are obtained after the sensor receives a second optical signal emitted by a second type of pixel in the assembly tolerance region; the assembly tolerance region is a region of the display screen arranged above the sensor and corresponds to the assembly tolerance of the sensor; and the color of the first type of pixel differs from that of the second type of pixel. The position of a target sensor is then determined according to the first sensing data and the second sensing data of the target sensor and the correspondence.

Description

Method and device for positioning sensor under display screen and electronic equipment
Technical Field
The embodiments of the present application relate to the field of positioning technologies, and in particular to a method and an apparatus for positioning a sensor under a display screen, and an electronic device.
Background
With the development of the electronic device industry, many electronic devices are equipped with an ambient light sensor to provide a better user experience, enabling intelligent functions such as automatically adjusting the brightness of the display screen, compensating the color of the display screen, and correcting the photographing color.
At present, users place ever higher demands on the performance of electronic devices. How to improve the performance of the ambient light sensor, and thereby the performance of the electronic device, is therefore an urgent problem to be solved.
Disclosure of Invention
The embodiments of the present application provide a method and an apparatus for positioning a sensor under a display screen, and an electronic device, which can effectively improve the performance of the sensor and thereby the performance of the electronic device.
In a first aspect, a method for positioning a sensor under a display screen is provided. The method includes: acquiring a correspondence between a first value and a position of a sensor, where the first value is obtained based on first sensing data and second sensing data of the sensor; the first sensing data are obtained after the sensor receives a first optical signal emitted by a first type of pixel in an assembly tolerance region; the second sensing data are obtained after the sensor receives a second optical signal emitted by a second type of pixel in the assembly tolerance region; the assembly tolerance region is a region of the display screen arranged above the sensor and corresponds to the assembly tolerance of the sensor; and the color of the first type of pixel differs from that of the second type of pixel; and determining the position of a target sensor according to the first sensing data and the second sensing data of the target sensor and the correspondence.
In the embodiments of the present application, the position of the target sensor is determined using sensing data corresponding to two types of pixels of different colors. Because the sensing data obtained by the target sensor from optical signals emitted by pixels of different colors differ, the position of the target sensor determined from these data is more accurate. Moreover, the parameters of a sensor are generally determined based on its position, so improving the accuracy of the target sensor's position improves the performance of the target sensor and thus effectively improves the performance of the electronic device that includes it.
Further, the position of the target sensor is determined according to a correspondence between sensing data and sensor position. Such a correspondence is generally obtained from many samples of data, so it is relatively accurate, which further improves the accuracy of the target sensor's position.
In one possible implementation, the first value is the ratio between the first sensing data and the second sensing data, and determining the position of the target sensor according to the first sensing data and the second sensing data of the target sensor and the correspondence includes: determining the position of the target sensor according to the ratio of the first sensing data to the second sensing data of the target sensor and the correspondence.
In this solution, the position of the target sensor is determined from the ratio of the first sensing data to the second sensing data, which is simple to implement and can effectively improve computational efficiency.
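As an illustration of this ratio-based lookup, a minimal sketch follows. The table values, function name, and linear-interpolation scheme are assumptions made for the sketch, not details from the patent; the correspondence is stored as (first value, relative position) pairs sorted by first value and queried by interpolation:

```python
from bisect import bisect_left

# Illustrative correspondence table: first value (e.g. blue/red reading
# ratio) -> sensor row offset relative to the reference position.
CORRESPONDENCE = [
    (0.50, -15),
    (0.80, -5),
    (1.00, 0),    # sensor at the reference position
    (1.25, 5),
    (2.00, 15),
]

def position_from_ratio(first_data: float, second_data: float) -> float:
    """Return the row offset by linear interpolation in the table."""
    ratio = first_data / second_data
    keys = [k for k, _ in CORRESPONDENCE]
    i = bisect_left(keys, ratio)
    if i == 0:                       # below the table: clamp to the edge
        return CORRESPONDENCE[0][1]
    if i == len(keys):               # above the table: clamp to the edge
        return CORRESPONDENCE[-1][1]
    (k0, p0), (k1, p1) = CORRESPONDENCE[i - 1], CORRESPONDENCE[i]
    return p0 + (ratio - k0) / (k1 - k0) * (p1 - p0)
```

A reading pair of (1.0, 1.0) would map to offset 0, i.e. the reference position itself; ratios outside the calibrated range are clamped to the table's edges.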
In one possible implementation, the correspondence includes a correspondence between the first value and a relative position, where the relative position is the position of the sensor relative to a reference position. Determining the position of the target sensor according to the first sensing data and the second sensing data of the target sensor and the correspondence includes: determining the relative position between the position of the target sensor and the reference position according to the first sensing data and the second sensing data of the target sensor and the correspondence; and determining the position of the target sensor based on that relative position and the reference position.
In one possible implementation, acquiring the correspondence between the first value and the position of the sensor includes: determining the reference position; sequentially lighting a plurality of groups of pixels in the assembly tolerance region based on the reference position, where each group of the plurality of groups includes the first type of pixels and the second type of pixels, and every two groups are at different positions; acquiring a set of the first sensing data and the second sensing data each time a group of the first type of pixels and the second type of pixels is lit; determining the first value according to the first sensing data and the second sensing data; and determining the correspondence between the first value and the relative position according to the first value and the position, relative to the reference position, of the sensor corresponding to the first value.
In one possible implementation, sequentially lighting the first type of pixels and the second type of pixels at different positions within the assembly tolerance region based on the reference position includes: lighting the plurality of groups of pixels row by row, starting from the first row of pixels within the assembly tolerance region, based on the reference position.
In one possible implementation, the determining the reference position includes: acquiring the assembly tolerance area; sequentially lightening pixels at different positions in the assembly tolerance area; acquiring a plurality of third sensing data of the sensor when the pixels at different positions are sequentially lighted; and determining the reference position according to the plurality of third sensing data.
In this solution, pixels at different positions within the assembly tolerance region are lit in sequence, and the sensing data of the sensor corresponding to each position are acquired. The sensor lies within the region corresponding to the assembly tolerance region, and its sensing data are closely related to the position of the lit pixels; for example, under otherwise identical conditions, the closer a pixel is to the test sensor, the larger the sensing data. The reference position determined from the sensing data corresponding to pixels at different positions is therefore highly accurate.
Furthermore, because the reference position is accurate, the correspondence determined based on it is also accurate, which effectively improves the accuracy of the target sensor's position determined from that correspondence.
In one possible implementation, the sequentially lighting pixels at different positions within the assembly tolerance region includes: and using the white bright bars to sequentially light the pixels at the different positions.
In this solution, white has higher luminance than other colors. Lighting the pixels with a white bright bar therefore increases the intensity of the optical signal that the sensor receives from the pixels, which facilitates determining the position of the sensor.
In one possible implementation manner, the determining the reference position according to the plurality of third sensing data includes: and determining the reference position according to the maximum value in the plurality of third sensing data.
In this solution, the closer a pixel is to the sensor, the larger the sensing data of the sensor. The maximum among the plurality of third sensing data therefore indicates that the corresponding lit pixels within the assembly tolerance region are closest to the sensor, which further improves the accuracy of the determined sensor position.
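The reference-position step reduces to an argmax over the scan readings. A minimal sketch, under the assumption that each lighting is summarized by the center row of the lit band and the corresponding third sensing reading (both names are illustrative):

```python
def reference_from_scan(scan):
    """scan: list of (center_row, third_sensing_data) pairs collected while
    lighting rows at different positions in the assembly tolerance region.
    Returns the center row of the lighting with the largest reading."""
    if len(scan) < 3:  # the scheme described above uses at least three lightings
        raise ValueError("need at least three scan readings")
    center_row, _ = max(scan, key=lambda item: item[1])
    return center_row
```

The row whose lighting produced the strongest reading is taken as the reference position, since that band sits closest to the sensor.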
In one possible implementation, the sequentially lighting pixels at different positions within the assembly tolerance region includes: starting from the first row of pixels in the fitting tolerance region, the pixels in the fitting tolerance region are lit up row by row.
In one possible implementation, the pixels lit at a time comprise a plurality of rows of pixels.
In this solution, lighting multiple rows of pixels at a time greatly reduces the time needed to determine the position of the sensor and improves processing speed. Furthermore, because the multiple rows of pixels emit optical signals simultaneously, the intensity of the optical signal received by the sensor is significantly increased, the resulting sensing data are larger, and the accuracy of the sensor's position is further improved.
In one possible implementation, the total width of the rows of pixels is equal to the width of the sensor.
In this solution, setting the total width of the multiple rows of pixels equal to the width of the sensor avoids lighting too many or too few rows at a time, which could cause the sensing data to remain unchanged between lightings or increase the error, so the accuracy of the sensor's position is further improved.
In a possible implementation manner, the number of times the pixel at the different position is lighted is at least three, and the plurality of third sensing data includes at least three data.
In one possible implementation, the number of rows of pixels lit at a time is less than or equal to a second value, the second value being the number of rows of pixels occupied by the assembly tolerance region minus 2.
In one possible implementation, the number of rows of pixels lit at a time is greater than or equal to a third value, the third value being the quotient of the number of rows of pixels occupied by the assembly tolerance region and the number of times the pixels at the different positions are lit.
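The two constraints above bound the number of rows lit per step. A small sketch of the arithmetic, assuming "quotient" means plain division (the function name and return convention are illustrative):

```python
def row_count_bounds(tolerance_rows: int, num_lightings: int):
    """Return (lower, upper) bounds on the rows lit at a time:
    upper is the "second value" (tolerance rows minus 2), lower is the
    "third value" (tolerance rows divided by the number of lightings)."""
    upper = tolerance_rows - 2
    lower = tolerance_rows / num_lightings
    return lower, upper
```

For the 30-row tolerance region used as an example elsewhere in the description, scanned in three lightings, this gives a band of 10 to 28 rows per lighting.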
In a possible implementation manner, a first distance and a second distance are different, the first distance is a distance between the first type of pixel and the sensor, and the second distance is a distance between the second type of pixel and the sensor.
In this solution, the sensing data of the sensor are closely related to the position of the pixels; for example, under the same conditions, the closer the pixel is to the sensor, the larger the sensing data. Determining the position of the target sensor based on sensing data corresponding to pixels at different positions within the assembly tolerance region can therefore effectively improve the accuracy of the target sensor's position.
In one possible implementation, within the assembly tolerance region of the display screen, the pitch between the first type of pixels and the second type of pixels is the largest among the pitches between any two types of pixels.
In this solution, the farther the sensor is from a pixel, the longer the optical path of the optical signal emitted by the pixel to reach the sensor, and the smaller the sensing data under the same conditions. Choosing the two types of pixels with the largest pitch as the first and second types therefore maximizes the difference between the optical paths of the two received optical signals, so the difference between the first sensing data and the second sensing data generated by the target sensor is more pronounced and the obtained position of the target sensor is more accurate.
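Selecting the maximum-pitch pair can be sketched as follows. The band centers below are illustrative numbers consistent with the Fig. 4 example (red, green, and blue bands of 10 rows each), not values from the patent:

```python
from itertools import combinations

def pick_pixel_pair(centers):
    """centers: mapping color -> center row of that color's band.
    Return the pair of colors whose center-to-center pitch is largest."""
    return max(combinations(centers, 2),
               key=lambda pair: abs(centers[pair[0]] - centers[pair[1]]))
```

With bands centered at rows 5.5 (red), 15.5 (green), and 25.5 (blue), the red/blue pair has the largest pitch and would be chosen as the first and second pixel types, matching the selection made in Fig. 4.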
In one possible implementation, the correspondence is preconfigured on the target sensor.
In this solution, the correspondence is pre-configured on the target sensor, so that when the position of the target sensor is determined according to the correspondence, the correspondence can be obtained directly from the target sensor, reducing computational complexity.
In one possible implementation, the method further includes: lighting, once, the first type of pixels and the second type of pixels within a target assembly tolerance region in a target display screen arranged above the target sensor; and acquiring the first sensing data and the second sensing data of the target sensor while the first type of pixels and the second type of pixels within the target assembly tolerance region are lit.
In this solution, the position of the target sensor can be determined by lighting the first type of pixels and the second type of pixels only once, which is simple to perform and greatly reduces the time needed to determine the position of the target sensor.
In one possible implementation, the time period for lighting the first type of pixels and the second type of pixels within the target assembly tolerance region is 50 ms.
In this solution, setting the duration for lighting the first type of pixels and the second type of pixels to 50 ms balances the accuracy of the determined position of the target sensor against the time spent.
In one possible implementation, the assembly tolerance region occupies a number of rows of pixels between 10 rows of pixels and 50 rows of pixels.
In one possible implementation, the method further includes: the assembly tolerance zone is covered by a light blocking object.
In this solution, covering the assembly tolerance region with a light blocking object prevents the target sensor from receiving optical signals other than those emitted by the lit pixels, further improving the accuracy of the determined position of the target sensor.
In one possible implementation, the target sensor is an ambient light sensor for detecting the intensity of an ambient light signal.
In a second aspect, an apparatus for positioning a sensor under a display screen is provided, including a processing unit configured to acquire a correspondence between a first value and a position of a sensor, where the first value is obtained based on first sensing data and second sensing data of the sensor; the first sensing data are obtained after the sensor receives a first optical signal emitted by a first type of pixel in an assembly tolerance region; the second sensing data are obtained after the sensor receives a second optical signal emitted by a second type of pixel in the assembly tolerance region; the assembly tolerance region is a region of the display screen arranged above the sensor and corresponds to the assembly tolerance of the sensor; and the color of the first type of pixel differs from that of the second type of pixel. The processing unit is further configured to determine the position of the target sensor according to the first sensing data and the second sensing data of the target sensor and the correspondence.
In a third aspect, an apparatus for positioning a sensor under a display screen is provided, comprising: a memory for storing a program; a processor configured to execute the program stored in the memory, and when the program stored in the memory is executed, the processor is configured to perform the method of the first aspect or each implementation manner thereof.
In a fourth aspect, an electronic device is provided, including: a display screen; a sensor arranged below the display screen; and the apparatus of the second aspect or any implementation thereof, or the apparatus of the third aspect.
Drawings
Fig. 1 is a schematic diagram of the mounting position of an under-screen sensor.
FIG. 2 is a schematic flow chart of a method of positioning an under-display sensor according to an embodiment of the present application.
FIG. 3 is a schematic diagram of the location and assembly tolerance regions of a sensor according to an embodiment of the present application.
FIG. 4 is another schematic illustration of the location and assembly tolerance region of a sensor of an embodiment of the present application.
Fig. 5 is a schematic flow chart of determining the reference position according to an embodiment of the present application.
FIG. 6 is a schematic diagram of the position of multiple rows of pixels and sensors lit at a time in the assembly tolerance region of one embodiment of the present application.
FIG. 7 is a schematic diagram of the position of multiple rows of pixels and sensors lit at a time in an assembly tolerance region according to another embodiment of the present application.
FIG. 8 is a schematic diagram of the position of multiple rows of pixels and sensors lit at a time in an assembly tolerance region of yet another embodiment of the present application.
FIG. 9 is a schematic diagram of the position of multiple rows of pixels and sensors per illumination in an assembly tolerance region of yet another embodiment of the present application.
Fig. 10 is a schematic diagram of a correspondence relationship of an embodiment of the present application.
FIG. 11 is a schematic diagram of covering an assembly tolerance area with a light blocking object according to an embodiment of the present application.
FIG. 12 is a schematic block diagram of an apparatus for under-display sensor positioning in an embodiment of the present application.
Fig. 13 is a schematic block diagram of an electronic device of an embodiment of the present application.
Detailed Description
The technical solution in the present application will be described below with reference to the accompanying drawings.
With the development of the electronic device industry, the ambient light sensor has become a standard sensor of the electronic device. The ambient light sensor can be used for detecting the intensity of the ambient light signal, so that the brightness of the display screen of the electronic equipment is automatically adjusted along with the change of the ambient light. For example, in the case where the intensity of the ambient light signal is strong, the electronic device may dim the brightness of the display screen to reduce the power consumption of the electronic device. The ambient light sensor can be used for detecting the intensity of the ambient light signal and the color temperature of the ambient light signal, and the color temperature of the ambient light signal can be used for compensating and correcting the color of the display screen, so that a more excellent display effect is provided. In addition, the color temperature of the ambient light signal can be used for correcting the photographing color so that the color of the photo shot by the electronic equipment is closer to the actual color.
With the development and popularization of the full screen, users' demand for an ambient light sensor assembled under the display screen is increasingly strong. Typically, when an engineer assembles an ambient light sensor arranged under the display screen, the pre-assembly position of the ambient light sensor is known in advance; for example, as shown in fig. 1, the pre-assembly position of the ambient light sensor is A. However, due to assembly tolerances, the engineer may actually assemble the ambient light sensor at position B.
After the ambient light sensor is mounted at the position B, the engineer may still consider that the ambient light sensor is mounted at the position a, and configure a series of parameters for the ambient light sensor based on the position a, so that the ambient light sensor detects the intensity, color temperature, and the like of the ambient light signal based on the configured parameters.
The actual assembly position of the ambient light sensor is B, but its parameters are determined based on the pre-assembly position A, which seriously affects the performance of the ambient light sensor. For example, if the parameters of the ambient light sensor are determined based on its actual assembly position, the ambient light sensor can detect the ambient light signal while the pixels of the display screen are in a sleep state, so that the influence of the display screen's brightness on the detection accuracy is avoided. If the parameters are not determined based on the actual assembly position, the ambient light sensor may detect the ambient light signal while the pixels of the display screen are lit, so the brightness of the display screen may affect the detection accuracy.
For example, when the ambient light sensor detects an ambient light signal while the pixels of the display screen are lit, it may receive, in addition to the ambient light signal, an optical signal emitted downward from the display screen, that is, light leakage. In general, the intensity of the light leakage varies with the brightness and display content of the display screen and is typically 0 lux to 10 lux. The minimum intensity of the ambient light signal to be detected is generally below 10 lux, and the transmittance of the display screen is generally 1% to 8%; taking the transmittance as 3%, an ambient light signal with an intensity of 10 lux retains only 0.3 lux after passing through the display screen. An ambient light signal of 0.3 lux is too small compared with light leakage of 0 lux to 10 lux, which may make the ambient light signal detected by the ambient light sensor inaccurate, thereby affecting the performance of the electronic device.
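The arithmetic behind these numbers can be checked directly (a trivial sketch; the function name is illustrative):

```python
def through_screen(ambient_lux: float, transmittance: float) -> float:
    """Intensity of an ambient light signal after passing through the display."""
    return ambient_lux * transmittance

# A 10 lux ambient signal through a display with 3% transmittance
# leaves 0.3 lux at the sensor, far below the 0-10 lux of light
# leakage it must compete with when the pixels are lit.
residual = through_screen(10.0, 0.03)
```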
In view of this, the embodiments of the present application provide a method for positioning a sensor under a display screen, which can effectively determine a position of the sensor in an electronic device, thereby improving performance of the sensor.
The method for positioning the sensor under the display screen according to the embodiment of the present application is described in detail below with reference to fig. 2 to 11.
FIG. 2 is a schematic flow chart of a method 200 for positioning an under-display sensor according to an embodiment of the present application. Optionally, the method 200 may be performed by a processor in an electronic device. The processor may be a Central Processing Unit (CPU), but may also be another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, and so on. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
As shown in fig. 2, the method 200 includes at least some of the following.
210: A correspondence between the first value and the position of the sensor is acquired. The first value is obtained based on first sensing data and second sensing data of the sensor; the first sensing data are obtained after the sensor receives a first optical signal emitted by a first type of pixel in the assembly tolerance region; the second sensing data are obtained after the sensor receives a second optical signal emitted by a second type of pixel in the assembly tolerance region; the assembly tolerance region is a region of the display screen arranged above the sensor and corresponds to the assembly tolerance of the sensor; and the first type of pixel and the second type of pixel differ in color.
220: and determining the position of the target sensor according to the ratio between the first sensing data and the second sensing data of the target sensor and the corresponding relation.
In the embodiments of the present application, the position of the target sensor is determined using sensing data corresponding to two types of pixels of different colors. Because the sensing data obtained by the target sensor from optical signals emitted by pixels of different colors differ, the position of the target sensor determined from these data is more accurate. Moreover, the parameters of a sensor are generally determined based on its position, so improving the accuracy of the target sensor's position improves the performance of the target sensor and thus effectively improves the performance of the electronic device that includes it.
Further, the position of the target sensor is determined according to a correspondence between sensing data and sensor position. Such a correspondence is generally obtained from many samples of data, so it is relatively accurate, which further improves the accuracy of the target sensor's position.
Alternatively, the sensor may be an ambient light sensor, a fingerprint sensor (e.g., an optical fingerprint sensor), or the like. As previously mentioned, the ambient light sensor may be used to detect the intensity, color temperature, etc. of the ambient light. The optical fingerprint sensor may be configured to receive an optical signal carrying a fingerprint of a user, and convert the received optical signal into a corresponding electrical signal, i.e., a fingerprint identification signal. Fingerprint image data can be obtained based on the fingerprint identification signal, and fingerprint matching verification is further carried out, so that the optical fingerprint identification function is realized in the electronic equipment.
The sensor is arranged below the display screen. The display screen may be a display screen having a self-luminous display unit, such as an Organic Light-Emitting Diode (OLED) display screen, a Micro-LED (Micro-LED) display screen, or a Mini-LED (Mini-LED) display screen.
It should be noted that the term "sensor" in the embodiments of the present application is used generically: it may refer to the target sensor or to another sensor, such as the test sensor, mentioned later, that is used to obtain the correspondence. For example, the foregoing statement that the sensor may be arranged below the display screen may mean that the target sensor is arranged below the display screen, or that the test sensor described below is arranged below the display screen.
When the sensor is assembled, assembly tolerances arise. Because of the assembly tolerance, the sensor may be assembled at any position within the gray region shown in fig. 3, and the region of the display screen corresponding to the gray region is the assembly tolerance region. The assembly tolerance region can be provided by the customer in advance.
Optionally, the number of rows of pixels of the display screen occupied by the assembly tolerance region may be between 10 and 50. Illustratively, the assembly tolerance region shown in fig. 3 occupies 30 rows of pixels.
Optionally, the assembly tolerance region corresponds to the assembly tolerance. Assuming that the assembly tolerance region occupies 30 rows of pixels, the assembly tolerance is 30 rows of pixels.
It should be understood that the color of the first type of pixel and the second type of pixel is not particularly limited in the embodiments of the present application. For example, as shown in fig. 4, the first type of pixels may be blue pixels and the second type of pixels may be red pixels.
It should also be understood that the pixels in the assembly tolerance region may include only the first type of pixels and the second type of pixels, or the assembly tolerance region may include other pixels in addition to these two types. Referring again to fig. 4, the assembly tolerance region includes green pixels in addition to red pixels and blue pixels; the red, blue, and green pixels each occupy 10 rows.
When the assembly tolerance region includes pixels other than the first type and the second type, consider that the farther the sensor is from a pixel, the longer the optical path the pixel's optical signal travels to reach the sensor, and the smaller the sensing data of the sensor under the same conditions; conversely, the closer the sensor is to a pixel, the shorter the optical path and the larger the sensing data under the same conditions. The more pronounced the difference between the first sensing data and the second sensing data, the more accurately the position of the target sensor can be obtained.
Therefore, in the assembly tolerance region, the first type of pixels and the second type of pixels are chosen so that the pitch between them is the largest among the pitches between any two different types of pixels. For example, as shown in fig. 4, among the pitch between the red and green pixels, the pitch between the green and blue pixels, and the pitch between the red and blue pixels, the pitch between the red pixels and the blue pixels is the largest; thus, the embodiments of the present application take the red pixels and the blue pixels as the first type of pixels and the second type of pixels.
It should be understood that the pitch between two types of pixels in the embodiments of the present application can be understood as follows: the distance between the respective centers of the two types of pixels.
Optionally, in an embodiment of the present application, the first distance and the second distance are different. The first distance is the distance between the first type of pixels and the sensor, and the second distance is the distance between the second type of pixels and the sensor.
According to the technical scheme, the sensing data of the sensor is closely related to the position of the pixel, for example, under the same condition, the closer the pixel is to the sensor, the larger the sensing data is. Therefore, the position of the target sensor is determined based on the sensing data corresponding to the pixels at different positions within the assembly tolerance region, and the position accuracy of the target sensor can be effectively improved.
Optionally, in this embodiment of the application, the optical signal sent by a pixel and received by the sensor may be light leakage of the display screen; alternatively, when an object such as a user's finger is placed above the display screen, the display screen emits an optical signal upwards, and the optical signal is reflected off the surface of the object to form reflected light that is received by the sensor. After receiving the optical signal sent by a pixel of the display screen, the sensor converts the optical signal into an electrical signal, and this electrical signal is the sensing data.
Alternatively, the first value may be a ratio between the first sensed data and the second sensed data. In this case, step 220 may specifically be: and determining the position of the target sensor according to the ratio between the first sensing data and the second sensing data of the target sensor and the corresponding relation. Alternatively, the first value may be a difference between the first sensed data and the second sensed data. In this case, step 220 may specifically be: and determining the position of the target sensor according to the difference between the first sensing data and the second sensing data of the target sensor and the corresponding relation.
In the following, the first value is taken as a ratio between the first sensing data and the second sensing data as an example to describe the scheme of the embodiment of the present application. The position of the target sensor is determined according to the ratio of the first sensing data to the second sensing data, the method is simple to implement, and the operation efficiency can be effectively improved.
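As a minimal sketch of the ratio-based determination described above (all names and values here are hypothetical, not from the patent): compute the ratio of the two sensing readings and look up the nearest entry in the correspondence obtained at the test stage.

```python
# Illustrative sketch: locate the target sensor from the ratio between the
# first and second sensing data, using a pre-obtained correspondence table.

def locate_sensor(first_data, second_data, correspondence):
    """correspondence: list of (ratio, position) pairs from the test stage."""
    ratio = first_data / second_data
    # Pick the entry whose stored ratio is closest to the measured ratio.
    _, position = min(correspondence, key=lambda rp: abs(rp[0] - ratio))
    return position

table = [(0.50, 0), (0.45, 1), (0.40, 2), (0.35, 3)]  # ratio -> row position
print(locate_sensor(120.0, 300.0, table))  # ratio 0.40 → position 2
```

A nearest-entry lookup is used here because a measured ratio will in practice fall between the stored calibration points.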
In one embodiment, the correspondence between the first value and the position of the sensor may be preconfigured on the target sensor. For example, the correspondence may be obtained during a testing phase of the electronic device. And then, in the stage of mass production of the electronic equipment, pre-configuring the corresponding relation obtained in the test stage on the target sensor. In this way, when the position of the target sensor is determined, the position of the target sensor can be directly determined according to the correspondence relationship pre-configured on the target sensor, so that the time consumption for determining the position of the target sensor can be reduced.
The embodiment pre-configures the corresponding relation on the target sensor, so that the corresponding relation can be directly obtained from the target sensor when the position of the target sensor is determined according to the corresponding relation, thereby reducing the operation complexity.
In another embodiment, the correspondence between the first value and the location of the sensor may be stored in a cloud. For example, after the corresponding relationship is obtained in the testing stage, the corresponding relationship is stored in the cloud. Therefore, when the position of the target sensor is determined, the corresponding relation can be acquired from the cloud, and the position of the target sensor is determined according to the corresponding relation.
Hereinafter, for convenience of description, the sensor at the test stage is referred to as a test sensor, and similarly, the display screen and the assembly tolerance region are referred to as a test display screen and a test assembly tolerance region.
Alternatively, the correspondence may be a correspondence between the first value and an absolute position of the test sensor. Based on this, after the first sensing data and the second sensing data of the target sensor are obtained, if the first value of the target sensor is obtained from the first sensing data and the second sensing data as x and the position of the sensor corresponding to x in the correspondence relationship is M, the position of the target sensor can be determined as M.
Alternatively, the correspondence may be a correspondence between the first value and a relative position, which is a position of the test sensor with respect to the reference position. Based on this, after the first sensing data and the second sensing data of the target sensor are obtained, if the first value of the target sensor is x, and the relative position corresponding to x is M and the reference position is d in the correspondence relationship, from the first sensing data and the second sensing data, the position of the target sensor can be determined to be (M + d).
As one example, the reference position may be a pre-assembly position of the test sensor. As previously described, the engineer knows the pre-assembly position of the test sensor in advance.
However, as also previously described, suppose the pre-assembly position of the test sensor is position A, as shown in fig. 1. Due to assembly tolerance, the engineer may actually assemble the test sensor at position B, while the position the engineer takes to be the actual assembly position is still position A, i.e., position A is taken as the reference position. Because the reference position then has an error, the correspondence determined based on it also has an error, so that the determined position of the target sensor includes the position error of the test sensor, which in turn makes the position error of the target sensor larger and the accuracy lower.
In view of the above, as another example, the reference position may be an actual mounting position of the test sensor. According to the technical scheme, the actual assembly position of the test sensor is used as the reference position, so that the problem of position error of the test sensor is solved to a great extent, and the position accuracy of the determined target sensor is improved.
The implementation of determining the actual mounting location of the test sensor is described in detail below.
As shown in fig. 5, the determining the actual mounting position of the test sensor according to the embodiment of the present application may specifically include the following steps:
510: a test assembly tolerance region in a test display screen is obtained.
520: and sequentially lighting pixels at different positions in the test assembly tolerance region.
530: when the pixels at different positions are sequentially lightened, a plurality of third sensing data of the test sensor are obtained, wherein the third sensing data are obtained after the test sensor sequentially receives optical signals sent by the pixels at different positions.
540: and determining the actual assembly position of the test sensor according to the plurality of third sensing data.
According to the technical scheme, in the test assembly tolerance region of the test display screen, pixels at different positions in the region are lit in turn, and the sensing data of the test sensor corresponding to the pixels at each position are acquired. Since the position of the test sensor is within the region corresponding to the test assembly tolerance region, and the sensing data of the test sensor is closely related to the position of the pixels (for example, under the same conditions, the closer a pixel is to the test sensor, the larger the sensing data), the accuracy of the test sensor position determined based on the sensing data corresponding to pixels at different positions is high.
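Steps 510-540 can be sketched as follows (all names are hypothetical, not from the patent): light pixels at different positions in turn, record the test sensor's reading for each, and take the position with the largest reading as the actual assembly position of the test sensor.

```python
# Illustrative sketch of steps 510-540: scan the test assembly tolerance
# region and locate the test sensor at the strongest response.

def scan_tolerance_region(positions, light_pixels, read_sensor):
    """positions: row positions within the test assembly tolerance region.
    light_pixels / read_sensor: hypothetical display/sensor driver callbacks."""
    readings = []
    for p in positions:
        light_pixels(p)                  # step 520: light pixels at position p
        readings.append(read_sensor())   # step 530: collect third sensing data
    best = max(range(len(readings)), key=lambda i: readings[i])
    return positions[best], readings     # step 540: position from the readings

# Simulated run: the sensor responds most strongly when row 12 is lit.
responses = [40, 90, 140, 95, 40]
pos, data = scan_tolerance_region([0, 6, 12, 18, 24],
                                  lambda p: None,
                                  lambda: responses.pop(0))
print(pos)  # → 12
```

The driver callbacks are stubbed with simulated readings here; on real hardware they would drive the display and read the sensor's electrical output.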
Alternatively, the test assembly tolerance region may be pre-configured on the test sensor, so that the test assembly tolerance region can be obtained from it.
Optionally, step 520 may specifically be: sequentially lighting pixels at different positions in the test assembly tolerance region with single-color bright bars. Illustratively, the single-color bright bars may be white bright bars. Since white has high brightness relative to other colors, lighting the pixels with white bright bars makes the intensity of the optical signal received by the test sensor higher, which is favorable for determining the position of the test sensor.
Of course, the single-color bright strip can also be a bright strip of other colors, such as a red bright strip.
Alternatively, the pixels within the test assembly tolerance region may be monochrome pixels, for example, white pixels or red pixels.
The closer the pixel is to the sensor, the larger the sensed data. Thus, in one possible embodiment, the location of the test sensor may be determined from a maximum of the plurality of third sensed data. For example, the pixel position corresponding to the maximum value of the third sensing data may be used as the position of the test sensor.
According to the technical scheme, the maximum value in the sensing data indicates that the pixel in the corresponding test assembly tolerance region is closest to the test sensor, so that the accuracy of the determined position of the test sensor can be further improved.
Or after the plurality of third sensing data of the test sensor are acquired, fitting may be performed on the plurality of third sensing data, so that the position of the test sensor is determined according to the fitted maximum value.
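A hedged sketch of the fitting variant just described: fit a parabola through the third sensing data and take its vertex as the sensor position, which can place the sensor between lit row groups. The values below are made up for illustration.

```python
# Quadratic fit over simulated third sensing data; the vertex of the fitted
# parabola gives a sub-row estimate of the test sensor position.
import numpy as np

positions = np.array([6.0, 12.0, 18.0])   # centers of the lit row groups
readings = np.array([90.0, 140.0, 95.0])  # corresponding third sensing data

a, b, _ = np.polyfit(positions, readings, 2)  # fit y = a*x^2 + b*x + c
vertex = -b / (2 * a)                         # abscissa of the fitted maximum
print(round(vertex, 2))  # → 12.16
```

Note the vertex lands slightly above 12 because the right-hand reading (95) is a bit larger than the left-hand one (90), which is exactly the refinement a plain argmax cannot provide.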
Optionally, in the embodiment of the present application, a time duration for lighting the pixels at different positions each time is not specifically limited. It should be understood that the pixels in different positions may include pixels in different row positions. Illustratively, the duration may be greater than or equal to 50ms, such as 50ms, 200ms, and so on.
Optionally, step 520 may specifically include: lighting the pixels within the test assembly tolerance region line by line. For example, the pixels may be lit line by line starting from the upper limit of the assembly tolerance down to the lower limit. That is, the pixels in the test assembly tolerance region may be lit line by line, starting with the first line of pixels in the test assembly tolerance region.
As an example, one row of pixels within the test assembly tolerance region may be lit at a time.
As another example, multiple rows of pixels within the test assembly tolerance region may be lit at a time, for example, 11 rows at a time. According to this technical scheme, lighting multiple rows of pixels each time greatly reduces the time consumed in determining the position of the test sensor and improves the processing speed. Furthermore, because the multiple rows of pixels emit optical signals simultaneously, the intensity of the optical signal received by the test sensor is significantly increased, the obtained sensing data is larger, and the accuracy of the determined test sensor position is further improved.
Figs. 6-9 show schematic diagrams of the rows of pixels lit at a time and the test sensor locations in the test assembly tolerance region, where the test sensor is disposed below the test display screen. If the number of pixel rows lit each time (hereinafter referred to as the target number of rows for convenience of description) is small, as shown in figs. 6 and 7, then whenever the rows lit at different times all fall within the footprint of the test sensor, the third sensing data corresponding to those different lightings are substantially the same; in that case there may be a plurality of candidate test sensor positions, and the error of the determined position may be large. Here the error is (width of the test sensor / pitch between two rows of pixels) − target number of rows.
If the target number of rows is large, as shown in figs. 8 and 9, then whenever the multiple rows of pixels lit at different times all cover the test sensor, the third sensing data corresponding to those different lightings are substantially the same; in that case there may again be a plurality of candidate sensor positions, resulting in a large error in the determined test sensor position. Here the error is the target number of rows − (width of the test sensor / pitch between two rows of pixels).
Therefore, in order to improve the accuracy of determining the position of the test sensor, in the embodiment of the present application, the width of the test sensor divided by the pitch between two rows of pixels may be made equal to the target number of rows, that is, the width of the test sensor is made equal to the total width of the multiple rows of pixels lit at a time. For example, if the width of the test sensor is 700 um and the pitch between two rows of pixels is 63 um, then 700 um / 63 um ≈ 11.1, so the target number of rows is 11.
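The sizing rule above can be checked with the figures from the text:

```python
# Worked check of the sizing rule: the target number of rows equals the
# sensor width divided by the row pitch, rounded (values from the text).
sensor_width_um = 700
row_pitch_um = 63

target_rows = round(sensor_width_um / row_pitch_um)  # 700 / 63 ≈ 11.1
print(target_rows)  # → 11
```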
In one implementation, when the position of the test sensor is determined from the maximum of the plurality of third sensing data, at least three values are generally needed to fit a parabola and thereby obtain the maximum between them. Therefore, in order to obtain the maximum in the third sensing data, the pixels at different positions are lit at least three times, i.e., the plurality of third sensing data includes at least three data.
At this time, the target line number may be less than or equal to a second value, where the second value is a difference between the line number of pixels occupied by the assembly tolerance region and 2.
Further, the target number of lines may be greater than or equal to a third value, where the third value is a quotient of the number of lines of pixels occupied by the assembly tolerance region and the number of times pixels at different positions within the test assembly tolerance region are lit, i.e., the third value is the number of lines of pixels occupied by the assembly tolerance region/the number of times pixels at different positions within the test assembly tolerance region are lit.
It should be understood that if the ratio of the width of the test sensor to the pitch between two rows of pixels in the test display screen (referred to as a first ratio) is between a second value and a third value, embodiments of the present application may determine the target number of rows as the first ratio. If the first ratio is not between the second value and the third value, the embodiment of the present application may determine, as the target row number, a value closest to the first ratio among values between the second value and the third value.
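The selection rule above can be sketched as follows (a hypothetical sketch; the function name and the clamping formulation are illustrative, not from the patent): keep the target row count between the third value (tolerance rows / number of lightings) and the second value (tolerance rows − 2), taking the value nearest the first ratio.

```python
# Illustrative sketch: choose the target number of rows to light at a time,
# bounded by the second and third values described in the text.

def choose_target_rows(sensor_width_um, row_pitch_um, tolerance_rows, lightings):
    first_ratio = sensor_width_um / row_pitch_um
    third_value = tolerance_rows / lightings   # lower bound on target rows
    second_value = tolerance_rows - 2          # upper bound on target rows
    # Clamping yields first_ratio itself when it lies inside the bounds,
    # otherwise the nearest bound.
    clamped = min(max(first_ratio, third_value), second_value)
    return round(clamped)

print(choose_target_rows(700, 63, 30, 5))  # ratio 11.1 within [6, 28] → 11
```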
It should also be understood that "first", "second", and "third" in the embodiments of the present application are merely for distinguishing different objects, and do not limit the scope of the embodiments of the present application.
After determining the reference position, a plurality of groups of pixels within the test assembly tolerance region may be sequentially illuminated based on the reference position, each group of pixels of the plurality of groups of pixels including a first type of pixels and a second type of pixels, the position between each two groups of pixels of the plurality of groups of pixels being different. In other words, each of the sets of pixels is located at a different position.
As an example, the groups of pixels may be lit row by row, starting with the first row of pixels in the test assembly tolerance region.
When the plurality of groups of pixels are lit row by row starting from the first row of pixels in the test assembly tolerance region, take fig. 4 as an example, with the blue pixels as the first type of pixels and the red pixels as the second type of pixels. As the groups are lit row by row, the lit red pixels get closer to the test sensor while the lit blue pixels get farther away. The second sensing data obtained by the test sensor from the optical signal emitted by the red pixels therefore grows larger, the first sensing data obtained from the optical signal emitted by the blue pixels grows smaller, and the ratio between the first sensing data and the second sensing data decreases, as shown by the vertical coordinate in fig. 10.
Further, each time a group of pixels is lit, the ratio between the first sensing data and the second sensing data corresponding to that group and its relative position with respect to the reference position can be stored, so that the corresponding relationship, such as that shown in fig. 10, can be obtained.
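The calibration loop just described can be sketched as follows (names and simulated readings are hypothetical, not from the patent): light each group of first/second type pixels in turn and store the ratio of the two sensing readings together with the group's relative position.

```python
# Illustrative sketch: build the correspondence table (ratio -> relative
# position) by lighting each pixel group and reading the sensor twice.

def build_correspondence(groups, light_group, read_first, read_second):
    """groups: relative positions (vs. the reference) of each pixel group."""
    table = []
    for rel_pos in groups:
        light_group(rel_pos)                  # light one blue/red pixel group
        ratio = read_first() / read_second()  # first data / second data
        table.append((ratio, rel_pos))
    return table

# Simulated readings: the ratio shrinks as the groups move down, as in fig. 10.
firsts, seconds = [200, 180, 160], [100, 120, 160]
tbl = build_correspondence([0, 1, 2], lambda p: None,
                           lambda: firsts.pop(0), lambda: seconds.pop(0))
print(tbl)  # → [(2.0, 0), (1.5, 1), (1.0, 2)]
```

At mass production, a measured ratio would then be matched against this table to recover the target sensor's relative position.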
After the corresponding relation is determined, in a mass production stage, when the position of the target sensor is determined, the first type of pixels and the second type of pixels in the target assembly tolerance area can be lightened, the target sensor receives a first optical signal sent by the first type of pixels and obtains first sensing data based on the first optical signal, and receives a second optical signal sent by the second type of pixels and obtains second sensing data based on the second optical signal. Then, the position of the target sensor is determined according to the ratio of the first sensing data and the second sensing data and the corresponding relationship, such as the corresponding relationship in fig. 10.
Alternatively, the time period for lighting the first type of pixels and the second type of pixels within the target assembly tolerance region may be greater than or equal to 50 ms. For example 50 ms.
In general, under the same conditions, the longer a pixel is lit, the greater the amount of the lit pixel's optical signal received by the target sensor, and the more accurate the position of the target sensor determined from that signal. However, the longer the pixel is lit, the lower the efficiency of determining the target sensor position. Assuming a pixel is lit for 200 ms each time and needs to be lit 8 times in total, the total time taken to determine the target sensor position is 200 ms × 8 = 1.6 s, which is inefficient. Therefore, taking both the accuracy and the efficiency of determining the position of the target sensor into account, the embodiment of the present application can set the duration for lighting a pixel once to 50 ms. In this way, accuracy and efficiency are balanced, that is, the position of the target sensor can be determined accurately in a short time.
Alternatively, the first type of pixels and the second type of pixels within the target assembly tolerance region may be lit only once.
According to the technical scheme, the position of the target sensor can be determined by only lighting the first type of pixels and the second type of pixels once, so that the operation is simple, and the time consumption for determining the target sensor is greatly reduced. For example, if the first type of pixels and the second type of pixels are lit for 50ms, the time required to determine the position of the target sensor is only 50 ms.
In the process of determining the position of the target sensor, the target sensor may be affected by the ambient light signal or other optical signals. For example, it may receive not only the optical signals emitted by the lit first type and second type pixels but also the ambient light signal, which seriously affects its sensing data.
Thus, the method 200 may further include: covering the assembly tolerance region with a light blocking object.
As shown in fig. 11, a light blocking object may be placed on a transparent protective cover of an electronic device. The transparent protective cover may be, for example, a glass cover or a sapphire cover, which is located above the display screen 120 and covers the front surface of the electronic device.
Alternatively, the light blocking object may be a black test head or a black glue layer, etc.
In this way, the sensor can be blocked from receiving other optical signals than the optical signal emitted by the lit pixel, thereby enabling the accuracy of the determined position of the sensor to be greatly improved.
In order to verify the method 200, the embodiment of the present application performed 20 tests based on the method 200, lighting the first type of pixels and the second type of pixels only once in each test; table 1 shows the test results. The first type of pixels are blue pixels, and the second type of pixels are red pixels.
TABLE 1
(Table 1 is reproduced as images in the original publication; for each of the 20 tests it lists the relative position determined between the target sensor and the reference position.)
As can be seen from table 1, the relative position between the target sensor and the reference position is 3, i.e., the relative position is 3 rows of pixels. If the reference position is d, the position of the target sensor is (d + 3).
Further, the results of the 20 tests were the same: the relative position between the target sensor and the reference position was 3 in every test. It can therefore be seen that the positioning error of the embodiment of the present application can be 0.
In the embodiment of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by the function and the inherent logic of the process, and should not constitute any limitation on the implementation process of the embodiment of the present application.
Moreover, in the present application, the technical features of the embodiments and/or the technical features of the embodiments may be arbitrarily combined with each other, and the technical solutions obtained after the combination also fall within the protection scope of the present application.
The method for positioning the sensor under the display screen according to the embodiment of the present application is described above in detail, and the apparatus for positioning the sensor under the display screen according to the embodiment of the present application will be described below. It should be understood that the device for positioning the sensor under the display screen in the embodiment of the present application may perform the method for positioning the sensor under the display screen in the embodiment of the present application, and has a function of performing the corresponding method.
FIG. 12 shows a schematic block diagram of an apparatus 1200 for under-display sensor positioning according to an embodiment of the present application. As shown in fig. 12, the apparatus 1200 for positioning an off-screen sensor may include:
the processing unit 1210 is configured to obtain a corresponding relationship between a first value and a position of a sensor, where the first value is a value obtained based on first sensing data and second sensing data of the sensor, the first sensing data is data obtained after the sensor receives a first optical signal sent by a first type of pixel in an assembly tolerance region, the second sensing data is data obtained after the sensor receives a second optical signal sent by a second type of pixel in the assembly tolerance region, the assembly tolerance region is a region in a display screen disposed above the sensor, the assembly tolerance region corresponds to an assembly tolerance of the sensor, and a color of the first type of pixel is different from a color of the second type of pixel.
The processing unit 1210 is further configured to determine a position of the target sensor according to the first sensing data and the second sensing data of the target sensor and according to the corresponding relationship.
Optionally, in this embodiment of the present application, the color of the first type of pixel is different from the color of the second type of pixel.
Optionally, in an embodiment of the present application, the first value is a ratio between the first sensing data and the second sensing data; the processing unit 1210 is specifically configured to: and determining the position of the target sensor according to the ratio of the first sensing data and the second sensing data of the target sensor and the corresponding relation.
Optionally, in this embodiment of the present application, the correspondence includes a correspondence between the first value and a relative position, where the relative position is a position of the sensor relative to a reference position; the processing unit 1210 is specifically configured to: determining a relative position between the position of the target sensor and the reference position according to the first sensing data and the second sensing data of the target sensor and according to the corresponding relation; determining the position of the target sensor based on the relative position between the position of the target sensor and the reference position and based on the reference position.
Optionally, in this embodiment of the present application, the processing unit 1210 is specifically configured to: determining the reference position; the apparatus 1200 further comprises: a lighting unit, configured to sequentially light, based on the reference position, a plurality of groups of pixels in the assembly tolerance region, where each group of pixels in the plurality of groups of pixels includes the first type of pixels and the second type of pixels, and positions of each two groups of pixels in the plurality of groups of pixels are different; the processing unit 1210 is specifically configured to: acquiring a set of the first sensing data and the second sensing data when the lighting unit lights each set of the first type pixels and the second type pixels; determining the first value according to the first sensing data and the second sensing data; and determining the corresponding relation between the first value and the relative position according to the first value and the relative position of the sensor corresponding to the first value relative to the reference position.
Optionally, in this embodiment of the application, the lighting unit is specifically configured to: illuminating the plurality of groups of pixels line by line starting with a first line of pixels within the fitting tolerance region based on the reference position.
Optionally, in this embodiment of the present application, the processing unit 1210 is specifically configured to: acquiring the assembly tolerance area; the lighting unit is further used for sequentially lighting pixels at different positions in the assembly tolerance area; the processing unit 1210 is specifically configured to: acquiring a plurality of third sensing data of the sensor when the lighting unit sequentially lights the pixels at the different positions; and determining the reference position according to the plurality of third sensing data.
Optionally, in this embodiment of the application, the lighting unit is specifically configured to: and using the white bright bars to sequentially light the pixels at the different positions.
Optionally, in this embodiment of the present application, the processing unit 1210 is specifically configured to: and determining the reference position according to the maximum value in the plurality of third sensing data.
Optionally, in this embodiment of the application, the lighting unit is specifically configured to: starting from the first row of pixels in the fitting tolerance region, the pixels in the fitting tolerance region are lit up row by row.
Optionally, in this embodiment of the application, the pixels lit by the lighting unit at a time include a plurality of rows of pixels.
Optionally, in an embodiment of the present application, a total width of the plurality of rows of pixels is equal to a width of the sensor.
Optionally, in this embodiment of the application, the number of times that the lighting unit lights the pixels at the different positions is at least three, and the plurality of third sensing data includes at least three data.
Optionally, in this embodiment of the application, the number of rows of the plurality of rows of pixels is less than or equal to a second value, where the second value is a difference between the number of rows of pixels occupied by the assembly tolerance area and 2.
Optionally, in this embodiment of the application, the number of rows of the plurality of rows of pixels is greater than or equal to a third value, where the third value is a quotient of the number of rows of the pixels occupied by the assembly tolerance area and the number of times of lighting the pixels at the different positions.
Optionally, in this embodiment of the present application, a first distance and a second distance are different, where the first distance is a distance between the first type of pixel and the sensor, and the second distance is a distance between the second type of pixel and the sensor.
Optionally, in this embodiment of the present application, in the assembly tolerance region of the display screen, the pitch between the first type of pixels and the second type of pixels is the largest among the pitches between any two different types of pixels.
Optionally, in this embodiment of the present application, the correspondence relationship is preconfigured on the target sensor.
Optionally, in an embodiment of the present application, the apparatus further includes: the lighting unit, configured to light, at one time, the first type of pixels and the second type of pixels in a target assembly tolerance region in a target display screen disposed above the target sensor; the processing unit 1210 is further configured to: acquire the first sensing data and the second sensing data of the target sensor when the lighting unit lights the first type of pixels and the second type of pixels in the target assembly tolerance region.
Optionally, in an embodiment of the present application, a time period for the lighting unit to light the first type of pixels and the second type of pixels in the target assembly tolerance region is 50 ms.
Optionally, in this embodiment of the present application, the number of rows of pixels occupied by the assembly tolerance area is between 10 rows of pixels and 50 rows of pixels.
Optionally, in this embodiment of the present application, the processing unit 1210 is further configured to: cover the assembly tolerance region with a light-blocking object.
Optionally, in this embodiment of the present application, the target sensor is an ambient light sensor, and the ambient light sensor is configured to detect an intensity of an ambient light signal.
It should be understood that the apparatus 1200 for positioning an off-screen sensor can implement the corresponding operations in the method 200; for brevity, details are not repeated here.
The embodiments of the present application further provide an apparatus for positioning a sensor under a display screen. The apparatus may include a memory and a processor.
The memory is coupled to the processor. The memory is configured to store a program, and the processor is configured to call the program stored in the memory, so that the apparatus executes the method for positioning a sensor under a display screen provided in any of the foregoing embodiments.
An electronic device is further provided in the embodiments of the present application, as shown in fig. 13, the electronic device 1300 may include a display screen 1301, a sensor 1302, and a device 1303 for positioning a sensor under the display screen.
Wherein the sensor 1302 is disposed below the display screen 1301, and the device 1303 for positioning the sensor under the display screen may be the device for positioning the sensor under the display screen in the foregoing embodiment, and can be used to execute the contents in the method embodiment shown in fig. 2.
Optionally, the display screen 1301 may be a non-foldable display or a foldable display, that is, a flexible display.
By way of example and not limitation, the electronic device 1300 in the embodiments of the present application may be a portable or mobile computing device such as a terminal device, a mobile phone, a tablet computer, a notebook computer, a desktop computer, a game device, an in-vehicle electronic device, or a wearable smart device, or another electronic device such as an electronic database, an automobile, or an Automated Teller Machine (ATM). The wearable smart devices include full-featured, large-sized devices that can implement all or some of their functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on only one type of application function and need to be used in cooperation with other devices such as smartphones, for example, various smart bracelets for vital-sign monitoring, smart jewelry, and the like.
It should be noted that, without conflict, the embodiments and/or technical features in the embodiments described in the present application may be arbitrarily combined with each other, and the technical solutions obtained after the combination also fall within the protection scope of the present application.
It is to be understood that the terminology used in the embodiments of the present application and the appended claims is for the purpose of describing particular embodiments only and is not intended to be limiting of the embodiments of the present application. For example, as used in the examples of this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
Those of ordinary skill in the art will appreciate that the elements of the examples described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the components and steps of the examples have been described above generally in terms of their functionality in order to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided in the present application, it should be understood that the disclosed system and apparatus may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may also be an electric, mechanical or other form of connection.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiments of the present application.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer-readable storage medium. Based on such understanding, the part of the technical solution of the present application that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disc.
The foregoing describes specific embodiments of the present application, but the protection scope of the present application is not limited thereto. Any equivalent modification or substitution that a person skilled in the art can readily conceive of within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
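By way of example and not limitation, the calibration flow described above (determining the reference position from the maximum of the third sensing data, then building the ratio-to-offset correspondence) may be sketched as follows. The hooks `light_rows`, `read_sensor`, and `read_pair` are hypothetical stand-ins for driving the display panel and reading the under-display sensor; they are not part of the specification:

```python
def find_reference_row(light_rows, read_sensor, region_rows: int, step: int) -> int:
    """Light the assembly tolerance region group by group (e.g. as a white
    bright bar) and return the starting row of the group that yields the
    largest sensing reading; per the method, that maximum marks the
    reference position."""
    best_row, best_reading = 0, float("-inf")
    for start in range(0, region_rows, step):
        light_rows(start, step)   # light `step` rows starting at row `start`
        reading = read_sensor()   # one "third sensing data" sample
        if reading > best_reading:
            best_row, best_reading = start, reading
    return best_row

def build_correspondence(read_pair, offsets):
    """For each group offset relative to the reference position, light the
    first-type and second-type pixels, read the two sensing values, and
    record a (first/second ratio, offset) pair. `read_pair` is a
    hypothetical hook that performs the lighting and returns the pair of
    readings for one group."""
    return [(first / second, off) for off in offsets
            for first, second in [read_pair(off)]]
```

A target sensor's position can then be found by computing the ratio of its own two readings and looking it up in the returned table, as described in the method embodiments.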

Claims (46)

1. A method of positioning an off-screen sensor, the method comprising:
acquiring a corresponding relation between a first value and a position of a sensor, wherein the first value is a value obtained based on first sensing data and second sensing data of the sensor, the first sensing data is data obtained after the sensor receives a first optical signal sent by a first type of pixel in an assembly tolerance region, the second sensing data is data obtained after the sensor receives a second optical signal sent by a second type of pixel in the assembly tolerance region, the assembly tolerance region is a region in a display screen arranged above the sensor, the assembly tolerance region corresponds to the assembly tolerance of the sensor, and the color of the first type of pixel is different from the color of the second type of pixel;
and determining the position of the target sensor according to the first sensing data and the second sensing data of the target sensor and the corresponding relation.
2. The method of claim 1, wherein the first value is a ratio between the first sensory data and the second sensory data;
determining the position of the target sensor according to the first sensing data and the second sensing data of the target sensor and according to the corresponding relationship, includes:
and determining the position of the target sensor according to the ratio of the first sensing data and the second sensing data of the target sensor and the corresponding relation.
3. The method according to claim 1 or 2, wherein the correspondence includes a correspondence between the first value and a relative position, the relative position being a position of the sensor relative to a reference position;
determining the position of the target sensor according to the first sensing data and the second sensing data of the target sensor and according to the corresponding relationship, includes:
determining a relative position between the position of the target sensor and the reference position according to the first sensing data and the second sensing data of the target sensor and according to the corresponding relation;
determining the position of the target sensor based on the relative position between the position of the target sensor and the reference position and based on the reference position.
4. The method of claim 3, wherein obtaining a correspondence between the first value and the position of the sensor comprises:
determining the reference position;
sequentially lighting a plurality of groups of pixels in the assembly tolerance region based on the reference position, wherein each group of pixels in the plurality of groups of pixels comprises the first type of pixels and the second type of pixels, and the positions of each two groups of pixels in the plurality of groups of pixels are different;
acquiring a set of the first sensing data and the second sensing data when each set of the first type pixels and the second type pixels is lighted;
determining the first value according to the first sensing data and the second sensing data;
and determining the corresponding relation between the first value and the relative position according to the first value and the relative position of the sensor corresponding to the first value relative to the reference position.
5. The method of claim 4, wherein sequentially lighting the plurality of groups of pixels within the assembly tolerance region based on the reference position comprises:
lighting the plurality of groups of pixels row by row, starting from the first row of pixels within the assembly tolerance region, based on the reference position.
6. The method of claim 4, wherein said determining the reference position comprises:
acquiring the assembly tolerance area;
sequentially lighting pixels at different positions in the assembly tolerance region;
acquiring a plurality of third sensing data of the sensor when the pixels at different positions are sequentially lighted;
and determining the reference position according to the plurality of third sensing data.
7. The method of claim 6, wherein the sequentially illuminating pixels at different locations within the assembly tolerance region comprises:
sequentially lighting the pixels at the different positions using a white bright bar.
8. The method of claim 6, wherein said determining the reference position from the plurality of third sensing data comprises:
and determining the reference position according to the maximum value in the plurality of third sensing data.
9. The method of claim 6, wherein the sequentially illuminating pixels at different locations within the assembly tolerance region comprises:
illuminating pixels within the assembly tolerance region row by row, starting with the first row of pixels within the assembly tolerance region.
10. The method of claim 9, wherein the pixels lit at a time comprise a plurality of rows of pixels.
11. The method of claim 10, wherein a total width of the plurality of rows of pixels is equal to a width of the sensor.
12. The method of claim 10, wherein the number of times the pixel at the different position is lighted is at least three, and the plurality of third sensing data includes at least three data.
13. The method of claim 12, wherein the number of rows of pixels is less than or equal to a second value that is a difference between the number of rows of pixels occupied by the assembly tolerance region and 2.
14. The method of claim 12, wherein the number of rows of pixels is greater than or equal to a third value, the third value being a quotient of the number of rows of pixels occupied by the assembly tolerance region and the number of times the differently positioned pixels are illuminated.
15. A method according to claim 1 or 2, characterized in that a first distance, which is the distance between the pixels of the first type and the sensor, and a second distance, which is the distance between the pixels of the second type and the sensor, are different.
16. The method according to claim 1 or 2, wherein the pitch between the first type of pixels and the second type of pixels is the largest among the pitches between the two types of pixels in the assembly tolerance region of the display screen.
17. The method according to claim 1 or 2, characterized in that the correspondence is preconfigured on the target sensor.
18. The method according to claim 1 or 2, characterized in that the method further comprises:
lighting, at one time, a first type of pixels and a second type of pixels in a target assembly tolerance region in a target display screen arranged above the target sensor;
and acquiring first sensing data and second sensing data of the target sensor when the first type of pixels and the second type of pixels in the target assembly tolerance area are lightened.
19. The method of claim 18, wherein the first type of pixel and the second type of pixel within the target assembly tolerance region are illuminated for a period of 50 ms.
20. A method according to claim 1 or 2, wherein the assembly tolerance region occupies a number of rows of pixels between 10 and 50 rows of pixels.
21. The method according to claim 1 or 2, characterized in that the method further comprises:
covering the assembly tolerance region with a light-blocking object.
22. The method of claim 1 or 2, wherein the target sensor is an ambient light sensor for detecting the intensity of an ambient light signal.
23. An apparatus for positioning a sensor under a display screen, comprising:
the processing unit is used for acquiring a corresponding relation between a first value and a position of a sensor, wherein the first value is a value obtained based on first sensing data and second sensing data of the sensor, the first sensing data is data obtained after the sensor receives a first optical signal sent by a first type of pixel in an assembly tolerance region, the second sensing data is data obtained after the sensor receives a second optical signal sent by a second type of pixel in the assembly tolerance region, the assembly tolerance region is a region in a display screen arranged above the sensor, the assembly tolerance region corresponds to the assembly tolerance of the sensor, and the color of the first type of pixel is different from the color of the second type of pixel;
the processing unit is further configured to determine a position of the target sensor according to the first sensing data and the second sensing data of the target sensor and according to the corresponding relationship.
24. The apparatus of claim 23, wherein the first value is a ratio between the first sensory data and the second sensory data;
the processing unit is specifically configured to:
and determining the position of the target sensor according to the ratio of the first sensing data and the second sensing data of the target sensor and the corresponding relation.
25. The apparatus of claim 23 or 24, wherein the correspondence comprises a correspondence between the first value and a relative position, the relative position being a position of the sensor relative to a reference position;
the processing unit is specifically configured to:
determining a relative position between the position of the target sensor and the reference position according to the first sensing data and the second sensing data of the target sensor and according to the corresponding relation;
determining the position of the target sensor based on the relative position between the position of the target sensor and the reference position and based on the reference position.
26. The apparatus according to claim 25, wherein the processing unit is specifically configured to:
determining the reference position;
the device further comprises:
a lighting unit, configured to sequentially light, based on the reference position, a plurality of groups of pixels in the assembly tolerance region, where each group of pixels in the plurality of groups of pixels includes the first type of pixels and the second type of pixels, and positions of each two groups of pixels in the plurality of groups of pixels are different;
the processing unit is specifically configured to:
acquiring a set of the first sensing data and the second sensing data when the lighting unit lights each set of the first type pixels and the second type pixels;
determining the first value according to the first sensing data and the second sensing data;
and determining the corresponding relation between the first value and the relative position according to the first value and the relative position of the sensor corresponding to the first value relative to the reference position.
27. The apparatus according to claim 26, wherein the lighting unit is specifically configured to:
lighting the plurality of groups of pixels row by row, starting with the first row of pixels within the assembly tolerance region, based on the reference position.
28. The apparatus according to claim 26, wherein the processing unit is specifically configured to:
acquiring the assembly tolerance area;
the lighting unit is further configured to sequentially light pixels at different positions in the assembly tolerance region;
the processing unit is specifically configured to:
acquiring a plurality of third sensing data of the sensor when the lighting unit sequentially lights the pixels at the different positions;
and determining the reference position according to the plurality of third sensing data.
29. The apparatus according to claim 28, wherein the lighting unit is specifically configured to:
sequentially light the pixels at the different positions using a white bright bar.
30. The apparatus according to claim 28, wherein the processing unit is specifically configured to:
and determining the reference position according to the maximum value in the plurality of third sensing data.
31. The apparatus according to claim 28, wherein the lighting unit is specifically configured to:
starting from the first row of pixels in the assembly tolerance region, light the pixels in the assembly tolerance region row by row.
32. The apparatus of claim 31, wherein the pixels illuminated at a time by the illumination unit comprise a plurality of rows of pixels.
33. The apparatus of claim 32, wherein a total width of the plurality of rows of pixels is equal to a width of the sensor.
34. The apparatus according to claim 32, wherein the number of times the lighting unit lights the pixel at the different position is at least three, and the plurality of third sensing data includes at least three data.
35. The apparatus of claim 34, wherein the number of rows of pixels is less than or equal to a second value that is a difference between the number of rows of pixels occupied by the assembly tolerance region and 2.
36. The apparatus of claim 34, wherein the number of rows of pixels is greater than or equal to a third value, the third value being a quotient of the number of rows of pixels occupied by the assembly tolerance region and the number of times the differently positioned pixels are illuminated.
37. The apparatus of claim 23 or 24, wherein a first distance and a second distance are different, the first distance being a distance between the first type of pixel and the sensor, and the second distance being a distance between the second type of pixel and the sensor.
38. The apparatus of claim 23 or 24, wherein the distance between the first type of pixels and the second type of pixels is the largest among the distances between the two types of pixels in the assembly tolerance region of the display screen.
39. The apparatus of claim 23 or 24, wherein the correspondence is preconfigured on the target sensor.
40. The apparatus of claim 23 or 24, further comprising:
the lighting unit is used for lighting the first type of pixels and the second type of pixels in a target assembly tolerance region in a target display screen arranged above the target sensor at one time;
the processing unit is further to:
acquiring first sensing data and second sensing data of the target sensor when the lighting unit lights the first type of pixels and the second type of pixels in the target assembly tolerance region.
41. The apparatus of claim 40, wherein the lighting unit lights the first type of pixels and the second type of pixels within the target assembly tolerance region for a period of 50 ms.
42. An apparatus as claimed in claim 23 or 24, wherein the assembly tolerance region occupies a number of rows of pixels between 10 and 50 rows of pixels.
43. The apparatus according to claim 23 or 24, wherein the processing unit is further configured to:
cover the assembly tolerance region with a light-blocking object.
44. The apparatus of claim 23 or 24, wherein the target sensor is an ambient light sensor for detecting the intensity of an ambient light signal.
45. An apparatus for positioning a sensor under a display screen, comprising:
a memory for storing a program;
a processor, configured to execute the program stored in the memory, wherein when the program stored in the memory is executed, the processor is configured to perform the method of positioning an off-screen sensor according to any one of claims 1 to 22.
46. An electronic device, comprising:
a display screen;
the sensor is arranged below the display screen;
and the apparatus for positioning a sensor under a display screen according to any one of claims 23 to 45.
CN202111620884.7A 2021-12-27 2021-12-27 Method and device for positioning sensor under display screen and electronic equipment Active CN114279332B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111620884.7A CN114279332B (en) 2021-12-27 2021-12-27 Method and device for positioning sensor under display screen and electronic equipment


Publications (2)

Publication Number Publication Date
CN114279332A true CN114279332A (en) 2022-04-05
CN114279332B CN114279332B (en) 2023-11-03

Family

ID=80876773

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111620884.7A Active CN114279332B (en) 2021-12-27 2021-12-27 Method and device for positioning sensor under display screen and electronic equipment



Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108509849A (en) * 2017-02-24 2018-09-07 三星电子株式会社 Utilize the electronic equipment and method of the pixel identification sensor position of display
CN108716950A (en) * 2018-05-16 2018-10-30 北京小米移动软件有限公司 Environmental light brightness acquisition methods and device
EP3570267A1 (en) * 2018-05-16 2019-11-20 Beijing Xiaomi Mobile Software Co., Ltd. Method and apparatus for acquiring ambient light brightness
CN111261074A (en) * 2018-11-30 2020-06-09 上海耕岩智能科技有限公司 Method for detecting whether screen uniformly emits light or not, storage medium and electronic equipment
CN112070018A (en) * 2018-12-14 2020-12-11 深圳市汇顶科技股份有限公司 Fingerprint identification device and electronic equipment
CN112599089A (en) * 2020-12-09 2021-04-02 深圳市汇顶科技股份有限公司 Display screen light leakage value acquisition method, electronic device and storage medium
CN113763856A (en) * 2021-07-19 2021-12-07 荣耀终端有限公司 Method and device for determining ambient illumination intensity and storage medium
CN113654459A (en) * 2021-07-22 2021-11-16 北京小米移动软件有限公司 Method and device for determining position of photosensitive sensor under screen and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant