CN113776787A - Screen uniformity testing method and system of virtual reality equipment and related device - Google Patents
- Publication number
- CN113776787A (application CN202111136320.6A)
- Authority
- CN
- China
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01M—TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
- G01M11/00—Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
- G01M11/02—Testing optical properties
- G01M11/0242—Testing optical properties by measuring geometrical properties or aberrations
- G01M11/0257—Testing optical properties by measuring geometrical properties or aberrations by analyzing the image formed by the object to be tested
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01M—TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
- G01M11/00—Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
- G01M11/02—Testing optical properties
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
Abstract
The application provides a screen uniformity testing method for a virtual reality device, comprising the following steps: acquiring a screen picture of the virtual reality device; determining the average gray value of each test point in the screen picture; substituting the average gray value into a calibration relation to obtain the photometric value of the test area corresponding to each test point; and determining the uniformity of the virtual reality device from the photometric values. Because the calibration relation between gray value and photometric value is determined in advance, a uniformity test only needs to capture the screen picture, compute the average gray value of each test point, and substitute it into the calibration relation to obtain the corresponding photometric value. No photometer is needed during the test itself, so the error that accumulates as a photometer is moved from point to point is avoided, test time is saved, test efficiency is effectively improved, and test precision is guaranteed. The application also provides a screen uniformity testing system for a virtual reality device, a computer-readable storage medium, and an electronic device, which have the same beneficial effects.
Description
Technical Field
The present disclosure relates to the field of product testing, and in particular, to a method, a system and a related device for testing screen uniformity of a virtual reality device.
Background
At present, Virtual Reality (VR) technology is widely used. The VR imaging principle is to magnify the image or video on the internal screen of the VR device, through an imaging system, so that it appears several meters away, giving the user an immersive, on-the-scene experience. This requires the uniformity of the internal screen to meet a standard: if the uniformity of the screen center differs greatly from that of the four corners and edges, the overall display effect will be very poor and the user experience will be seriously affected. Therefore, accurately and efficiently detecting the uniformity of a VR screen is very important.
The conventional testing method uses a photometer to measure a number of points on the left and right screens; at each point the photometer measures the photometric value within a circle of a specific radius. To cover all test points on both screens, the photometer must be moved to each point in turn. The more test points there are, the more the positioning error of each movement accumulates and the lower the accuracy; moreover, every added test point adds one point on each of the left and right screens, greatly reducing test efficiency.
Disclosure of Invention
The application aims to provide a screen uniformity testing method of virtual reality equipment, a screen uniformity testing system, a computer readable storage medium and electronic equipment, which can improve the screen uniformity testing efficiency.
In order to solve the technical problem, the application provides a method for testing the uniformity of virtual reality equipment, which has the following specific technical scheme:
acquiring a screen picture of the virtual reality equipment;
determining the gray level average value of each test point in the screen picture;
substituting the gray average value into the calibration relation to obtain a photometric value of a test area corresponding to the test point;
and determining the uniformity of the virtual reality equipment according to the luminosity value.
Optionally, before substituting the gray-scale average value into the calibration relationship, the method further includes:
measuring, with a photometer, the brightness value of the target test area at different gray values;
shooting screen pictures corresponding to different gray values by using a calibration camera, and calculating an average gray value corresponding to the target test area;
and taking the mathematical relationship between the average gray value of the target test area under different gray values and the measured brightness value of the photometer as the calibration relationship.
Optionally, calculating the average gray value corresponding to the target test area includes:
taking the test point as the center of a circle and a circular area with the preset length as the radius as the target test area; the test points comprise screen center points and screen edge points;
determining the amplification scale factor of the calibration camera according to the screen distance from the calibration camera to the virtual reality equipment;
and determining the target test area contained in the screen, amplifying the target test area by the amplification scale factor, and calculating the average gray value in the target test area.
Optionally, taking the mathematical relationship between the average gray value of the target test area under different gray values and the measured brightness value of the photometer as the calibration relationship includes:
and determining a function expression of the average gray value and the measured brightness value by taking the average gray value as an abscissa and the measured brightness value as an ordinate, and taking the function expression as the calibration relation.
Optionally, after obtaining the calibration relationship of the test point, the method further includes:
determining an estimated brightness value of a second screen according to the calibration relation, comparing it with the actual brightness value of the second screen measured by a photometer, and judging whether the difference between the two is within a preset error range;
if so, determining that the accuracy of the calibration relation meets a preset standard;
if not, the calibration relation is verified until the precision of the calibration relation meets the preset standard.
Optionally, if the estimated brightness value and the actual brightness measurement value exceed the preset error range, the method further includes:
and changing a data relation model of the data relation, and correcting the calibration relation.
Optionally, the determining the uniformity of the virtual reality device according to the luminosity value includes:
substituting the luminosity value into a uniformity formula to determine the uniformity of the virtual reality equipment; wherein the photometric values include a maximum photometric value and a minimum photometric value;
the Uniformity formula is (Max-Min)/Average (%), Uniformity is Uniformity, Max is the maximum luminance value, and Min is the minimum luminance value.
The application also provides a screen uniformity testing system of a virtual reality device, including:
the picture acquisition module is used for acquiring a screen picture of the virtual reality equipment;
the gray value determining module is used for determining the average gray value of each test point in the screen picture;
the photometric value calculation module is used for substituting the gray average value into the calibration relation to obtain the photometric value of the test area corresponding to the test point;
and the uniformity calculation module is used for determining the uniformity of the virtual reality equipment according to the luminosity value.
The present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method set forth above.
The present application further provides an electronic device, comprising a memory and a processor, wherein the memory stores a computer program, and the processor implements the steps of the method described above when calling the computer program in the memory.
The application provides a method for testing the uniformity of virtual reality equipment, which comprises the following steps: acquiring a screen picture of the virtual reality equipment; determining the gray level average value of each test point in the screen picture; substituting the gray average value into the calibration relation to obtain a photometric value of a test area corresponding to the test point; and determining the uniformity of the virtual reality equipment according to the luminosity value.
According to the method, the calibration relation between gray value and photometric value is determined in advance. When a uniformity test is executed, the screen picture of the virtual reality device is first captured, the average gray value of each target test point in the screen picture is calculated, and the values are substituted directly into the calibration relation to obtain the corresponding photometric values. No photometer is needed during the test itself, so the error accumulated during photometer measurement is avoided, test time is saved, test efficiency is effectively improved, and test precision is ensured.
The application also provides a screen uniformity testing system of the virtual reality device, a computer-readable storage medium and an electronic device, which have the same beneficial effects and are not described again here.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, it is obvious that the drawings in the following description are only embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the provided drawings without creative efforts.
Fig. 1 is a flowchart of a method for testing screen uniformity of a virtual reality device according to an embodiment of the present disclosure;
Fig. 2 is a flow chart of calibration relation determination provided in the embodiment of the present application;
Fig. 3 is a schematic diagram of test point positions provided in the embodiment of the present application;
Fig. 4 is a schematic structural diagram of a screen uniformity testing system of a virtual reality device according to an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to fig. 1, fig. 1 is a flowchart illustrating a method for testing screen uniformity of a virtual reality device according to an embodiment of the present disclosure, the method including:
s101: acquiring a screen picture of the virtual reality equipment;
the step is to obtain a screen picture of the virtual reality device, where the screen picture refers to a picture of the virtual reality device after an image of the virtual reality device is amplified by an imaging system, and may be an image on an LCD (Liquid Crystal Display) screen or an OLED (Organic Light-Emitting Diode) screen, for example. It should be noted that in general, a virtual reality device includes two display screens, and each display screen needs to be separately tested for uniformity.
S102: determining the gray level average value of each test point in the screen picture;
This step determines the average gray value of each test point in the screen picture. The test points are the points at which the screen uniformity test is performed; their number and positions are not particularly limited in this embodiment. To keep the uniformity test accurate, it is preferable to test at both the center and the edges of the screen. For example, if the screen is rectangular or nearly rectangular, nine points may be selected as test points: the center point, the four vertices, and the midpoints of the four edges. More or fewer test points, for example five or seventeen, may also be used; naturally, the more test points are selected, the more accurate the uniformity result.
Since a gray value is usually measured over an area, the average gray value of a test point can be taken as the gray value of a circular area centered on the test point with a preset radius. The specific gray measurement can be performed directly with a corresponding gray-level algorithm, which is not limited here.
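As an illustrative sketch only (not part of the patent), the circular-area gray averaging described above can be written as follows; the function name `circular_mean_gray` and its arguments are hypothetical, and the screen picture is assumed to already be available as a 2-D array of gray values:

```python
import numpy as np

def circular_mean_gray(gray_image, center, radius_px):
    """Average gray value inside a circular test area.

    gray_image: 2-D array of gray levels; center: (row, col) of the
    test point; radius_px: test-area radius in pixels.
    """
    rows, cols = np.indices(gray_image.shape)
    # Boolean mask selecting pixels within the circle around the test point.
    mask = (rows - center[0]) ** 2 + (cols - center[1]) ** 2 <= radius_px ** 2
    return float(gray_image[mask].mean())
```

For a uniform patch the mean simply returns the patch's gray level; on a real capture it averages out pixel-level noise within the circular test area.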
S103: substituting the gray average value into a calibration relation to obtain a photometric value of a test area corresponding to the test point;
after the gray level average value of each test point is obtained, the gray level average value can be substituted into the calibration relation, so that the corresponding luminosity value of the test point at the gray level average value is obtained.
It should be noted that the embodiment only requires that the calibration relation is determined before the step is executed, in other words, the determination of the calibration relation and the execution sequence of the steps are independent of each other. The determination of the calibration relationship can usually be performed beforehand.
The calibration relation comprises a conversion relation between the gray value and the photometric value, so that the gray average value can be directly converted into the corresponding photometric value after the gray average value of the test point is obtained in the previous step, and actual measurement is not required by a photometer.
S104: and determining the uniformity of the virtual reality equipment according to the luminosity value.
After the photometric value of each test point is determined, the uniformity can be computed directly from these values. How the uniformity is determined is not limited; for example, the photometric values can be substituted into a uniformity formula. The photometric values include a maximum value Max and a minimum value Min, and the uniformity formula is Uniformity = (Max - Min)/Average × 100%, where Average is the average photometric value of the test points.
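A minimal sketch of the uniformity computation just described; `uniformity` is a hypothetical helper name, and it assumes the per-point photometric values have already been obtained from the calibration relation:

```python
def uniformity(photometric_values):
    """Uniformity = (Max - Min) / Average * 100%, over the photometric
    values of the test points of one screen."""
    vals = list(photometric_values)
    average = sum(vals) / len(vals)
    return (max(vals) - min(vals)) / average * 100.0
```

A perfectly uniform screen gives 0%; larger values mean a bigger brightness spread between the best and worst test points.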
In the embodiment of the application, the calibration relation between gray value and photometric value is determined in advance. When a uniformity test is executed, the screen picture of the virtual reality device is first captured, the average gray value of each target test point is calculated, and the values are substituted directly into the calibration relation to obtain the corresponding photometric values. No photometer is needed during the test itself, so error accumulation in photometer measurement is avoided, test time is saved, test efficiency is effectively improved, and test precision is ensured.
On the basis of the foregoing embodiment, referring to Fig. 2, which is a flow chart of calibration relation determination provided in the embodiment of the present application, how to determine the calibration relation is described below:
S201: measuring, with a photometer, the brightness value of the target test area at different gray values;
In this step, graphic cards (test patterns) with different gray values can be prepared first. Note that gray value and luminance value are different: the luminance value refers to the brightness of the picture, in candela per square meter (cd/m²), i.e. nits, while the gray scale divides the range between white and black, according to a logarithmic relationship, into a number of levels, typically 0 to 255, with white at 255 and black at 0. The gray values of the cards can be evenly spaced; for example, with a step of 10, cards of gray value 0, 10, 20, ..., 240, 250 can be used.
Then, the brightness values of the test points at different gray values are further measured, and the gray values refer to the gray values of the corresponding graphic cards. When the brightness value is measured, the corresponding measurement can be carried out by using a photometer. It should be noted that the position of the test point may be the same as or different from that of the test point in the previous embodiment, and is not particularly limited.
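The constant-gray calibration cards described above could be generated as in this sketch; the resolution, the 8-bit gray range, and the `make_gray_cards` name are assumptions for illustration, not details from the patent:

```python
import numpy as np

def make_gray_cards(width=1920, height=1080, step=10):
    """One constant-gray image per calibration level: 0, 10, ..., 250."""
    return {g: np.full((height, width), g, dtype=np.uint8)
            for g in range(0, 256, step)}
```

Each card is displayed full-screen on the device under test while the photometer reading and the camera capture are taken.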
S202: shooting screen pictures corresponding to different gray values by using a calibration camera, and calculating an average gray value corresponding to the target test area;
Then, the calibration camera shoots the corresponding screen pictures at the different gray values, from which the average gray value of each test point is determined. The test points in this step are the same as those selected in step S201. Note that the target test area used here must also be the same as the test area used when computing a test point's average gray value in the previous embodiment; in other words, when determining the calibration relation, the selected target test area should match the test area used for that test point during actual testing.
Preferably, one way of performing this step is as follows:
S2021: taking the test point as the center and a circular area with a preset length as radius as the target test area;
S2022: determining the amplification scale factor of the calibration camera according to the distance from the calibration camera to the screen of the virtual reality device;
S2023: determining the target test area contained in the screen, amplifying it by the amplification scale factor, and calculating the average gray value within the target test area.
The specific value of the preset radius is not limited, and may be, for example, 4mm, 5mm, or 6mm, and may be set by those skilled in the art according to the actual screen size to be tested. In order to ensure the accuracy of the calibration relationship, the selected test points include the screen center point and the screen edge points, although the number of the screen edge points is not limited. The magnification scale factor is used for magnifying the image on the screen, and is convenient for calculating the average gray value in the target test area. And taking the obtained average gray value as the gray value corresponding to the test point.
S203: and taking the mathematical relationship between the average gray value of the target test area under different gray values and the measured brightness value of the photometer as the calibration relationship.
After the average gray value corresponding to each test point is determined, the calibration relation between the average gray value and the measured brightness value is determined according to the average gray value obtained by actual measurement and the measured brightness value. In particular, a corresponding mathematical approach may be used for fitting to determine a mathematical relationship between the average gray value and the measured brightness value. For example, a functional expression of the average gradation value and the measured luminance value may be determined with the average gradation value as an abscissa and the measured luminance value as an ordinate, and the functional expression may be taken as the calibration relationship. And if the average gray value and the measured brightness value are in a linear relation, the corresponding calibration relation is a linear expression.
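The fitting step can be sketched as an ordinary least-squares line fit; `fit_calibration` is a hypothetical name, and a first-degree polynomial is assumed because the text notes the relation may be linear:

```python
import numpy as np

def fit_calibration(avg_gray_values, measured_luminances):
    """Fit luminance = a * gray + b by least squares and return (a, b)."""
    a, b = np.polyfit(avg_gray_values, measured_luminances, 1)
    return float(a), float(b)
```

For a non-linear screen response, a higher polynomial degree or a lookup table could replace the straight line; the patent leaves the mathematical form open.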
In addition, in order to ensure the accuracy of the calibration relationship, after the calibration relationship is obtained, verification and correction of the calibration relationship may be performed, which specifically includes the following steps:
S301: determining the estimated brightness value of a second screen according to the calibration relation, comparing it with the actual brightness value of the second screen measured by a photometer, and judging whether the difference between the two is within a preset error range; if yes, entering S302; if not, entering S303;
S302: determining that the accuracy of the calibration relation meets the preset standard;
S303: checking the calibration relation until its accuracy meets the preset standard.
The second screen in this embodiment may be any screen available for testing and is used to evaluate the accuracy of the calibration relation: the average gray value of the test points on the second screen is determined, the estimated brightness value is computed through the calibration relation, and the photometer then measures the actual brightness of the same test points. If the difference between the two is within the preset error range, the accuracy of the calibration relation is considered to meet the preset standard; otherwise the calibration relation is checked until it does. What is checked is not specifically limited: whether the average gray value of each test point was computed correctly, whether the corresponding brightness value was measured accurately, whether the mathematical relation between average gray value and measured brightness was fitted accurately, and so on. If the estimated brightness value and the actual measurement exceed the preset error range, the data relation model can be changed and the calibration relation corrected.
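The verification on a second screen can be sketched as a point-by-point error check; the helper name and the use of an absolute per-point tolerance are assumptions:

```python
def verify_calibration(a, b, avg_grays, measured_luminances, max_error):
    """True if the predicted luminance a * gray + b is within max_error
    of the photometer reading at every test point."""
    return all(abs(a * g + b - m) <= max_error
               for g, m in zip(avg_grays, measured_luminances))
```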
The following description will be given of a calibration process and a method for testing uniformity of virtual reality devices, which are provided by the present application, by taking a specific application of the present application as an example:
referring to fig. 3, fig. 3 is a schematic diagram of test point positions provided by the embodiment of the present application, and fig. 3 includes 18 test points, where P5 and P14 are center points, and the rest are screen edge points.
The whole calibration process is divided into a left screen and a right screen for calibration respectively, firstly, a photometer is used for measuring the brightness values of 18 test points under different brightness, then a camera is used for shooting pictures of the left screen and the right screen under different brightness, then, a program is used for calculating the gray level average value in 18 circular areas (the same as the test area and the test area used by a photometer for measurement) with the radius of 5mm by taking each test point as the center of a circle, then, the gray level average value of the camera under different gray levels is taken as the horizontal coordinate of each test point, the measured value of the photometer under different gray levels is taken as the vertical coordinate of each test point to calculate the mapping relation of 18 points, the brightness values of other screens obtained by the mapping relation of right calibration are calculated and compared with the actually measured value of the photometer, and whether the calibration relation is accurate or not is verified. The deviation between the photometric value obtained from the calibration relation and the photometric value measured by the photometer is within a preset error range, which proves that the accuracy of the calibration relation meets the preset standard.
The specific steps are as follows, taking the calibration and measurement process of one point as an example:
firstly, making brightness graphic cards under different gray values, determining the relative positions of a calibration camera and a photometer, and ensuring that the test areas of the calibration camera and the photometer are the same.
Secondly, measuring the photometric values of 18 points under different graphic cards by using a photometer;
thirdly, shooting left and right screen pictures of different picture cards by a camera;
Fourthly, finding on the camera image the areas corresponding to the 18 photometer points. The photometer measures a circle of 5 mm radius on the screen at each test point, so the amplification scale factor between the calibration camera and the screen is determined from the position of, and distance between, the camera and the product; the corresponding target test area is then located from each center point, and the average gray value within that area is computed as the test value of the point;
Taking point P1 as an example, the average gray value at P1 corresponding to the different brightness levels can be calculated, similar to the data in Table 1:
Table 1. Average gray values of point P1 under the different brightness charts
From the photometric values measured by the photometer at point P1 under the different chart brightnesses, shown in Table 2, the mapping between gray value and photometric value at the different brightness levels is calculated:
Table 2. Photometric values of point P1 measured by the photometer under different brightness levels
Taking the above data as an example, with the average gray value as abscissa and the measured brightness value as ordinate, the mathematical expression relating the two is determined.
The resulting mathematical expression of the calibration relation is f(x) = 0.3741x - 0.1989, where f(x) is the photometric value and x is the average gray value.
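Once fitted, applying the calibration is a single evaluation; this sketch hard-codes the example coefficients given in the text, and the helper name is hypothetical:

```python
def luminance_from_gray(x, a=0.3741, b=-0.1989):
    """Photometric value for average gray value x, using the example
    calibration expression f(x) = 0.3741x - 0.1989 from the text."""
    return a * x + b
```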
After the calibration relation is obtained, it can be verified on other screens to check whether the method is applicable to them. If the verification result meets the requirement, the calibration relation can be written into the test program. In a subsequent uniformity test, pictures of the product's left and right screens are captured, the average gray value of each test point is obtained, and the corresponding photometric values are computed; the maximum and minimum photometric values are then found by comparison and substituted into the uniformity formula, Uniformity = (Max - Min)/Average × 100%, giving the uniformity of each of the left and right screens.
The screen uniformity testing system of the virtual reality device provided in the embodiments of the present application is introduced below, and the screen uniformity testing system described below and the screen uniformity testing method of the virtual reality device described above may be referred to correspondingly.
Referring to fig. 4, fig. 4 is a schematic structural diagram of a screen uniformity testing system of a virtual reality device according to an embodiment of the present disclosure, and the present disclosure further provides a screen uniformity testing system of a virtual reality device, including:
the picture acquisition module is used for acquiring a screen picture of the virtual reality equipment;
the gray value determining module is used for determining the average gray value of each test point in the screen picture;
the photometric value calculation module is used for substituting the average gray value into the calibration relation to obtain the photometric value of the test area corresponding to the test point;
and the uniformity calculation module is used for determining the uniformity of the virtual reality equipment according to the luminosity value.
Based on the above embodiment, as a preferred embodiment, the system further includes:
the calibration relation determining module is used for measuring, with a photometer, the brightness value of the target test area at different gray values; shooting screen pictures corresponding to the different gray values with a calibration camera and calculating the average gray value of the target test area; and taking the mathematical relationship between the average gray value of the target test area at different gray values and the brightness value measured by the photometer as the calibration relation.
Based on the foregoing embodiment, as a preferred embodiment, the calibration relation determining module includes:
the gray value measuring unit is used for taking a circular area centered on the test point, with a preset length as its radius, as the target test area, the test points including the screen center point and screen edge points; determining the amplification scale factor of the calibration camera according to the distance from the calibration camera to the screen of the virtual reality device; and locating the target test area within the screen, scaling it by the amplification scale factor, and calculating the average gray value within the target test area.
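A possible implementation of the circular target-area averaging described by this unit is sketched below; the radius is assumed to have already been multiplied by the camera's amplification scale factor, and the function name and interface are illustrative, not from the source:

```python
import numpy as np

def circular_mean_gray(image, center, radius_px):
    """Average gray value inside a circular target test area.

    image     : 2-D uint8 array (grayscale photo of the screen)
    center    : (row, col) of the test point in the photo
    radius_px : preset radius in pixels, assumed already scaled by the
                camera's amplification scale factor.
    """
    rows, cols = np.ogrid[:image.shape[0], :image.shape[1]]
    mask = (rows - center[0]) ** 2 + (cols - center[1]) ** 2 <= radius_px ** 2
    return float(image[mask].mean())

# Sanity check: a uniform gray-128 image averages to exactly 128.
img = np.full((100, 100), 128, dtype=np.uint8)
print(circular_mean_gray(img, center=(50, 50), radius_px=10))
```

On a real screen photo the same call would be made once per test point (center and edge points), producing the average gray values fed into the calibration relation.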
Based on the foregoing embodiment, as a preferred embodiment, the calibration relation determining module includes:
and the calibration relation determining unit is used for determining a function expression of the average gray value and the measurement brightness value by taking the average gray value as an abscissa and the measurement brightness value as an ordinate, and taking the function expression as the calibration relation.
Based on the above embodiment, as a preferred embodiment, the system further includes:
the calibration verification module is used for determining an estimated brightness value of a second screen according to the calibration relation, comparing the estimated brightness value with the actual brightness value of the second screen measured by a photometer, and judging whether the two values are within a preset error range; if so, determining that the accuracy of the calibration relation meets the preset standard; if not, correcting the calibration relation and verifying it again until its accuracy meets the preset standard.
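The check performed by this module might be sketched as follows; the tolerance value and the sample readings are assumptions for illustration, not values given in the source:

```python
def calibration_ok(avg_gray, measured_lum, a=0.3741, b=-0.1989, tol=0.5):
    """Second-screen check of the calibration relation: estimate the
    luminance from the average gray value and compare it with the
    photometer reading. `tol` (the preset error bound, in the
    photometer's units) is an assumed value, not specified in the
    source."""
    estimated = a * avg_gray + b
    return abs(estimated - measured_lum) <= tol

# Hypothetical second-screen spot checks.
print(calibration_ok(150.0, 55.9))   # estimate ~55.92, within tolerance
print(calibration_ok(150.0, 60.0))   # off by ~4 units, out of tolerance
```

If the check fails, the calibration correction module described below the figure would adjust the underlying data relationship model and the check would be repeated.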
Based on the above embodiment, as a preferred embodiment, the system further includes:
the calibration correction module is used for, when the calibration verification module judges that the estimated brightness value and the actual measured brightness value exceed the preset error range, changing the underlying data relationship model and correcting the calibration relation.
Based on the above embodiment, as a preferred embodiment, the uniformity calculation module is a module for substituting the photometric values into a uniformity formula to determine the uniformity of the virtual reality device, wherein the photometric values include a maximum photometric value and a minimum photometric value;
the uniformity formula is Uniformity = (Max - Min) / Average × 100%, where Uniformity is the uniformity, Max is the maximum photometric value, Min is the minimum photometric value, and Average is the average photometric value.
The present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed, can implement the steps provided by the above embodiments. The storage medium may include various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The application further provides an electronic device, which may include a memory and a processor, where the memory stores a computer program, and the processor may implement the steps provided by the foregoing embodiments when calling the computer program in the memory. Of course, the electronic device may also include various network interfaces, power supplies, and the like.
The embodiments are described in a progressive manner in the specification, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. For the system provided by the embodiment, the description is relatively simple because the system corresponds to the method provided by the embodiment, and the relevant points can be referred to the method part for description.
The principles and embodiments of the present application are explained herein using specific examples, which are provided only to help understand the method and the core idea of the present application. It should be noted that, for those skilled in the art, it is possible to make several improvements and modifications to the present application without departing from the principle of the present application, and such improvements and modifications also fall within the scope of the claims of the present application.
It is further noted that, in this specification, relational terms such as first and second are used solely to distinguish one entity or action from another and do not necessarily require or imply any actual such relationship or order between the entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Claims (10)
1. A screen uniformity testing method of virtual reality equipment is characterized by comprising the following steps:
acquiring a screen picture of the virtual reality equipment;
determining the average gray value of each test point in the screen picture;
substituting the average gray value into a calibration relation to obtain the photometric value of the test area corresponding to the test point;
and determining the uniformity of the virtual reality equipment according to the luminosity value.
2. The screen uniformity testing method of claim 1, wherein, before substituting the average gray value into the calibration relation, the method further comprises:
measuring, with a photometer, the brightness value of a target test area at different gray values;
shooting screen pictures corresponding to different gray values by using a calibration camera, and calculating an average gray value corresponding to the target test area;
and taking the mathematical relationship between the average gray value of the target test area under different gray values and the measured brightness value of the photometer as the calibration relationship.
3. The screen uniformity testing method of claim 2, wherein calculating the average gray value corresponding to the target test area comprises:
taking a circular area centered on the test point, with a preset length as its radius, as the target test area, wherein the test points comprise a screen center point and screen edge points;
determining the amplification scale factor of the calibration camera according to the distance from the calibration camera to the screen of the virtual reality equipment;
and determining the target test area contained in the screen, amplifying the target test area by the amplification scale factor, and calculating the average gray value in the target test area.
4. The screen uniformity testing method of claim 3, wherein taking the mathematical relationship of the average gray value of the target test area at different gray values and the measured brightness value of the photometer as the calibration relationship comprises:
and determining a function expression of the average gray value and the measured brightness value by taking the average gray value as an abscissa and the measured brightness value as an ordinate, and taking the function expression as the calibration relation.
5. The screen uniformity testing method of any of claims 1-4, further comprising, after obtaining the calibration relationship of the test points:
determining an estimated brightness value of a second screen according to the calibration relation, comparing the estimated brightness value with an actual brightness measurement value of the second screen by a photometer, and judging whether the estimated brightness value and the actual brightness measurement value are within a preset error range;
if so, determining that the accuracy of the calibration relation meets a preset standard;
if not, correcting the calibration relation and verifying it again until its accuracy meets the preset standard.
6. The screen uniformity testing method of claim 4, wherein, if the estimated brightness value and the actual brightness measurement value exceed the preset error range, the method further comprises:
changing the data relationship model underlying the calibration relation, and correcting the calibration relation.
7. The screen uniformity testing method of claim 1, wherein said determining the uniformity of the virtual reality device from the luminosity values comprises:
substituting the luminosity value into a uniformity formula to determine the uniformity of the virtual reality equipment; wherein the photometric values include a maximum photometric value and a minimum photometric value;
the Uniformity formula is (Max-Min)/Average (%), Uniformity is Uniformity, Max is the maximum luminance value, and Min is the minimum luminance value.
8. A screen uniformity testing system of a virtual reality device, comprising:
the picture acquisition module is used for acquiring a screen picture of the virtual reality equipment;
the gray value determining module is used for determining the average gray value of each test point in the screen picture;
the photometric value calculation module is used for substituting the average gray value into the calibration relation to obtain the photometric value of the test area corresponding to the test point;
and the uniformity calculation module is used for determining the uniformity of the virtual reality equipment according to the luminosity value.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the screen evenness testing method of a virtual reality device according to any one of claims 1 to 7.
10. An electronic device, comprising a memory having a computer program stored therein and a processor, wherein the processor when calling the computer program in the memory implements the steps of the screen uniformity testing method of the virtual reality device according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111136320.6A CN113776787A (en) | 2021-09-27 | 2021-09-27 | Screen uniformity testing method and system of virtual reality equipment and related device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113776787A true CN113776787A (en) | 2021-12-10 |